WO2021193421A1 - Information processing device, information processing method, program, and information processing system - Google Patents

Info

Publication number
WO2021193421A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
tactile
tactile presentation
control unit
virtual object
Application number
PCT/JP2021/011342
Other languages
French (fr)
Japanese (ja)
Inventor
諒 横山 (Ryo Yokoyama)
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Application filed by Sony Group Corporation (ソニーグループ株式会社)
Publication of WO2021193421A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Description

  • This technology relates to information processing devices, information processing methods, programs and information processing systems.
  • Enhancing the interaction between the user and virtual objects in a virtual space such as AR (Augmented Reality) is indispensable for improving the value of the experience. For example, even if a user touches a virtual object placed in the virtual space with a hand, the sense of reality is diminished if there is no tactile feedback.
  • To improve the sense of reality of virtual objects, presentation of a "feeling of touch" through tactile feedback technology is therefore required. For example, Patent Document 1 (Japanese Unexamined Patent Publication No. 2013-125487) discloses a technique that, when displaying the trajectory of the pen tip of an electronic pen on a virtual plane, notifies the user whether or not the pen tip is in contact with the virtual plane by vibrating the electronic pen.
  • However, the technique of Patent Document 1 merely informs the user of the state of the electronic pen's tip relative to the virtual plane by vibrating the pen. A technique capable of even better tactile presentation is therefore desired.
  • One of the purposes of this technology is to provide an information processing device, an information processing method, a program, and an information processing system capable of performing excellent tactile presentation.
  • This technology, in one aspect, is an information processing device including a control unit that generates control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • In another aspect, this technology is an information processing method in which a processor performs control to generate control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • In another aspect, this technology is a program that causes a computer to realize a control function of generating control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • In yet another aspect, this technology is an information processing system including a display device that displays a virtual object, a tactile presentation device that presents a tactile sensation, and an information processing device connected to the display device and the tactile presentation device, in which the information processing device generates control information for controlling the operation of the tactile presentation device according to the distance between the virtual object and a real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information.
  • FIG. 1 is a diagram showing a configuration example of an information processing system according to the present embodiment.
  • FIG. 2 is a block diagram showing a configuration example of the information processing device.
  • FIG. 3 is a functional block diagram showing an example of a functional configuration controlled by the control unit.
  • FIG. 4 is a diagram showing a configuration example for grasping the position of the operating body.
  • FIG. 5 is a diagram showing another configuration example for grasping the position of the operating body.
  • FIG. 6 is a diagram showing still another configuration example for grasping the position of the operating body.
  • FIG. 7 is a diagram for explaining an operation example of the enlargement / reduction UI.
  • FIG. 8 is a flowchart showing an example of the processing flow by the control unit.
  • FIG. 9 is a diagram showing a specific example of using the information processing system.
  • The embodiments described below are preferred specific examples of the present technology, and various technically preferable limitations are attached to them. However, the scope of the present technology is not limited to the following embodiments unless a statement limiting the present technology is made. In this specification and the drawings, elements having substantially the same functional configuration are given the same reference numerals, and duplicate description is omitted as appropriate. The description proceeds in the following order:
    1. Embodiment
    1-1. Configuration of the information processing system
    1-2. Configuration of the information processing device
    1-3. Functions of the information processing device
    1-4. Specific examples of tactile presentation
    1-5. Processing of the information processing device
    1-6. Specific example of the information processing system
    2. Summary
  • FIG. 1 shows a configuration example of an information processing system according to the present embodiment.
  • The information processing system 1 shown in FIG. 1 includes a display device 2, a tactile presentation device 3, an information processing device 4, and an information sharing server S.
  • The information processing device 4 is connected to the display device 2, the tactile presentation device 3, and the information sharing server S, and is configured to be capable of information communication with each of them.
  • This connection may use any communication method, wired or wireless.
  • The display device 2 has a function of displaying images.
  • The display device 2 is composed of a device including a display unit such as a display panel.
  • The images include still images and moving images (video).
  • Specifically, the display device 2 displays a virtual space such as an AR (Augmented Reality) space, a VR (Virtual Reality) space, or an MR (Mixed Reality) space based on the image information provided by the information processing device 4. That is, the display device 2 expresses a virtual space by displaying images.
  • Here, the virtual space is a virtual three-dimensional space constructed by information processing executed by the information processing device 4. Examples of the content displayed on the display device 2 include games using the virtual space, live distribution, sports broadcasting, navigation, education, tourist information, shopping, and other hands-on content.
  • Examples of the display device 2 include mobile terminals (for example, smartphones, smart tablets, mobile phones, and portable game machines) and wearable display devices (for example, head-mounted displays (HMDs), AR glasses, and VR glasses).
  • Note that the display device 2 may be a stationary display.
  • The tactile presentation device 3 has a tactile presentation function for presenting a tactile sensation to the user.
  • The tactile presentation device 3 is composed of a device including a tactile presentation unit such as a vibrator, an electro-tactile device, an electrical muscle stimulation device, or a Peltier element.
  • Specifically, the tactile presentation device 3 presents the user with tactile sensations relating to phenomena in the virtual space (described in detail later) based on the control information provided by the information processing device 4.
  • Here, sensations are roughly classified into three types: special senses such as sight and hearing; visceral sensations such as visceral pain; and somatic sensations from the skin, mucous membranes, and deep tissues of the body such as muscles, tendons, and joints, which include touch (tactile sensation in the narrow sense), pressure, vibration, position, movement, force, temperature, and pain.
  • "Tactile sensation" in this specification and the drawings means tactile sensation in the broad sense and refers to these somatic sensations. That is, the tactile presentation function described above is a function that gives the user these somatic sensations.
  • The tactile presentation unit described above may be anything that can give these somatic sensations to the user.
  • Examples of the tactile presentation device 3 include the above-mentioned mobile terminals, pen-type electronic devices (so-called AR pens), grip-type electronic devices such as controllers, and wearable electronic devices of glove type (so-called haptic gloves), bracelet type, ring type, and the like.
  • The tactile presentation device 3 may have any configuration capable of presenting a tactile sensation to the user.
  • Note that the information processing system 1 may include a plurality of tactile presentation devices 3 that can be used by a single user.
  • The information processing device 4 has a function of controlling the display device 2 and the tactile presentation device 3, and a function of communicating information with each of the display device 2, the tactile presentation device 3, and the information sharing server S.
  • Examples of the information processing device 4 include the above-mentioned mobile terminals, personal computers, game machines, and the like. The details of the information processing device 4 will be described later.
  • The information sharing server S has a configuration in which information such as image information and control information can be shared between the information processing device 4 and another information processing device (not shown). Note that the clients may instead communicate directly with each other without providing the information sharing server S. For example, in the illustrated example, the information processing device 4, which is a client, may be configured to communicate directly with another information processing device. Further, when sharing is not required, the information sharing server S may be omitted.
  • Here, the information processing system 1 may be one in which at least two of the display device 2, the tactile presentation device 3, and the information processing device 4 are integrally configured.
  • For example, the display device 2 may be provided with the functional configuration of the information processing device 4, or the tactile presentation device 3 may be provided with the functional configuration of the information processing device 4.
  • Further, the display device 2 may be provided with the functional configurations of both the information processing device 4 and the tactile presentation device 3, or with the functional configuration of the tactile presentation device 3 alone.
  • The information processing system 1 may also include an audio output device (not shown) having an audio output unit, such as a speaker, that outputs audio.
  • The audio output device may be configured separately from the other devices, or may be configured integrally with another device such as the display device 2. Examples of the audio output device include speakers, headphones, and wireless earphones.
  • As shown in the figure, the display device 2 and the tactile presentation device 3 are not limited to being connected to the information sharing server S indirectly via the information processing device 4; they may be connected directly.
  • For example, the display device 2, the tactile presentation device 3, the information processing device 4, and the information sharing server S may be connected to one another using a network such as a LAN (Local Area Network).
  • As described above, the information processing system 1 is intended to let the user feel phenomena in the virtual space not only visually but also by touch. As a result, the user can enjoy an advanced virtual experience close to reality, as if having a real experience that cannot be obtained by looking at the virtual space alone.
  • FIG. 2 is a block diagram showing a configuration example of the information processing device 4.
  • The information processing device 4 includes a storage unit 5, a control unit 6, a communication unit 7, and a detection unit 8.
  • The storage unit 5 is composed of, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a hard disk, or the like.
  • The storage unit 5 stores information necessary for processing by the control unit 6, such as a program and the data used by the program.
  • The control unit 6 is composed of, for example, a CPU (Central Processing Unit, that is, a processor) or the like.
  • The control unit 6 reads and executes the program stored in the storage unit 5, and controls each component of the information processing device 4.
  • Note that the program may be stored in external storage such as a USB memory, may be provided via a network, or may be partially executed by another device via the network.
  • The communication unit 7 is composed of, for example, a communication device, and communicates with each of the display device 2, the tactile presentation device 3, and the information sharing server S.
  • The communication unit 7 gives the information obtained from the information sharing server S, for example image information and control information, to the control unit 6. The communication unit 7 also gives the information obtained from the control unit 6 (for example, image information and control information) to the display device 2, the tactile presentation device 3, and the information sharing server S. Specifically, the communication unit 7 gives image information to the display device 2 and the information sharing server S, and gives control information to the tactile presentation device 3 and the information sharing server S.
  • The detection unit 8 is composed of, for example, an imaging device or the like, and provides the control unit 6 with detection information for detecting the above-mentioned phenomena in the virtual space.
  • Note that the detection unit 8 may be provided in the display device 2 or the tactile presentation device 3.
  • FIG. 3 is a functional block diagram showing an example of a functional configuration controlled by the control unit 6.
  • The control unit 6 mainly includes a display control unit 61, a transmission / reception control unit 62, an operating body recognition unit 63, and a tactile presentation control unit 64. These functional blocks function, for example, when the control unit 6 executes the above-mentioned program.
  • The display control unit 61 controls the operation of the display device 2 to display the virtual space on the display device 2. Specifically, the display control unit 61 generates image information for displaying the virtual space and provides the generated image information to the display device 2. As a result, the display device 2 displays the virtual space (display objects such as virtual objects and real objects) based on the image information. This image information is generated, for example, by executing a content program.
  • Note that the tactile presentation program may be included in the content program or may be separate.
  • The transmission / reception control unit 62 shares the display of the virtual space among users. Specifically, the transmission / reception control unit 62 controls the communication unit 7 so as to send image information to the information sharing server S. As a result, the virtual space based on the image information on the information sharing server S can be displayed on another display device (not shown), as on the display device 2, and seen by another user.
  • The transmission / reception control unit 62 also shares the tactile sensation among users. Specifically, the transmission / reception control unit 62 controls the communication unit 7 so as to send the control information generated by the tactile presentation control unit 64, described later, to the information sharing server S. As a result, another tactile presentation device (not shown) can be made to perform tactile presentation based on the control information on the information sharing server S, similarly to the tactile presentation device 3, and it can be perceived by another user.
  • The operating body recognition unit 63 grasps the position of the real object (operating body) that operates a virtual object in the virtual space.
  • Here, the operating body is a separate body from the tactile presentation device 3.
  • Note, however, that the operating body 10 may include the functional configuration of a tactile presentation device 3; in that case too, the information processing system 1 includes a tactile presentation device 3 separate from the operating body.
  • The virtual object means a virtual object (specifically, a display object without a physical entity) that is perceived by the user as if it existed in the real space.
  • Virtual objects are represented by two-dimensional or three-dimensional computer graphics and placed in the virtual space.
  • Virtual objects include virtual items, virtual UIs, and the like. They include not only objects visible to the user but also invisible ones (so to speak, transparent objects).
  • A real object means an object (specifically, a display object with a physical entity) that actually exists in the real space.
  • Real objects also include the human body.
  • FIG. 4 is a diagram showing a configuration example for grasping the position of the operating body.
  • The operating body 10 shown in FIG. 4 is used by the user to cause (manipulate) phenomena in the virtual space.
  • Examples of the operating body 10 include a pen-shaped operating instrument (for example, an AR pen) as shown in the figure, and part or all of the body, such as the user's hands, feet, and head.
  • The operation unit 11 is the portion of the operating body 10 that serves as the point of contact with a virtual object. For example, the operation unit 11 is the pen tip when the operating body 10 is a pen-type instrument.
  • When the operating body 10 is a user's finger, the operation unit 11 corresponds to the fingertip; when it is the user's hand, it corresponds to the palm or the like.
  • The operating body recognition unit 63 obtains the position of the operation unit 11 of the operating body 10 (its distance and direction from a predetermined position, etc.) using the detection unit 8.
  • In the example shown in FIG. 4, the information processing device 4, which is the main body, is composed of a smartphone, and the detection unit 8 is composed of an imaging device built into the smartphone.
  • The operating body recognition unit 63 obtains the position of the operation unit 11 by detecting the operation unit 11 in the detection information (imaging information in this example) from the detection unit 8.
  • For example, if the detection unit 8 is configured as a depth camera capable of acquiring depth information by ToF (Time of Flight) or the like, detecting the operation unit 11 makes it possible to specify three-dimensional coordinate values (x, y, z) in an XYZ three-dimensional coordinate system whose reference point is a predetermined position. In this way, the position of the operation unit 11 can be specified satisfactorily.
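  • As a rough illustration of the coordinate calculation above, a ToF depth camera returns a depth value for the detected pixel, and the camera intrinsics turn that into the (x, y, z) values described here. The following is a minimal sketch; the pinhole-camera intrinsics (fx, fy, cx, cy) and the prior pen-tip detection step are assumptions, not details given in this publication.

```python
import numpy as np

def pixel_to_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected operation-unit pixel (u, v) with its ToF
    depth (meters) into camera-frame XYZ coordinates (pinhole model)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example: pen tip detected at pixel (640, 360), 0.45 m from the camera.
tip_xyz = pixel_to_xyz(640, 360, 0.45, fx=900.0, fy=900.0, cx=640.0, cy=360.0)
```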
  • The operating body 10 shown in FIG. 4 is provided with an operation button 12 on its grip portion.
  • The operation button 12 is used, for example, to notify the information processing device 4 of an operation start (writing start) trigger.
  • The operation button 12 is not limited to an electric switch and may be a mechanical switch.
  • For example, the structure may be such that when the operation button 12 is pressed, the operation unit 11 pops out and a reflective material is exposed.
  • By using a mechanical switch, no battery, electric circuit, or the like is required, and the operating body 10 can be made smaller and lighter.
  • FIG. 5 is a diagram showing another configuration example for grasping the position of the operating body 10.
  • In this example as well, the operating body 10 is composed of a pen-shaped instrument, and a plurality of markers (for example, invisible markers) 13 are attached to the surface of the operating body 10. The detection unit 8 of the information processing device 4 is configured as an imaging device, as described above. The operating body recognition unit 63 can thereby estimate the angle, position, and so on of the operating body 10 by detecting the markers 13 of the operating body 10 in the detection information, and specify the position of the operation unit 11 at the pen tip. At this time, for example, the orientation and angle of the operating body 10 may be obtained from changes in the apparent shape and size of the markers 13. In the case shown in FIG. 5 as well, the size and weight reductions described above are possible.
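  • One common way to realize this kind of marker-based estimation is to solve a Perspective-n-Point problem from the detected marker positions. The publication does not specify an algorithm; the sketch below uses OpenCV's solvePnP purely as an illustration, and the marker layout and pen-tip offset are assumed values.

```python
import numpy as np
import cv2  # OpenCV, used here only as an illustration

# Known marker positions in the pen's own frame and the pen-tip position
# in that frame (all values are illustrative assumptions).
MARKER_POINTS_PEN = np.array([[0.000,  0.005, 0.02],
                              [0.005,  0.000, 0.05],
                              [0.000, -0.005, 0.08],
                              [-0.005, 0.000, 0.11]], dtype=np.float32)
PEN_TIP_PEN = np.array([0.0, 0.0, 0.0])  # tip at the pen-frame origin

def pen_tip_position(image_points, camera_matrix, dist_coeffs):
    """Estimate the pen pose from the detected 2D marker positions and
    return the pen-tip position in camera coordinates, or None."""
    ok, rvec, tvec = cv2.solvePnP(MARKER_POINTS_PEN, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    return rot @ PEN_TIP_PEN + tvec.ravel()
```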
  • FIG. 6 is a diagram showing still another configuration example for grasping the position of the operating body 10.
  • In this example, the operating body 10 is the user's hand (finger), and the fingertip of the operating body 10 is used as the operation unit 11.
  • With the detection unit 8 configured as an imaging device, the operating body recognition unit 63 can specify the position of the operation unit 11 from the detection information of the detection unit 8. At this time, for example, the position of the fingertip can be specified from the shape of each finger of the user's hand.
  • As described above, the operating body 10 need not be any particular fixed object; it may be an instrument that the user can operate in the air, or the user's own body. It is sufficient that the operation unit 11 can be brought close to a display object such as a virtual object and that its position can be grasped. That is, the operating body 10 and the operation unit 11 may be determined as appropriate according to the operation content and the like; position grasping is not limited to any specific method, and known techniques can be used.
  • When the operating body 10 is an instrument such as an AR pen or a haptic glove, the operating body 10 itself may also be used as a tactile presentation device 3.
  • Even in that case, the information processing system 1 is provided with a tactile presentation device 3 separate from the operating body 10, as described above.
  • The tactile presentation control unit 64 shown in FIG. 3 controls the operation of the tactile presentation device 3.
  • The tactile presentation control unit 64 detects phenomena in the virtual space according to the distance between a virtual object and the operating body 10, and generates control information according to the detected phenomenon. That is, the tactile presentation control unit 64 generates a control signal that controls the operation of the tactile presentation device 3 according to the distance between the virtual object and the operating body 10.
  • For example, when contact between a virtual object and the operating body 10 is detected, the tactile presentation control unit 64 causes the tactile presentation device 3 to present a tactile sensation representing the feel of the virtual object. The tactile presentation control unit 64 also causes the tactile presentation device 3 to present tactile sensations representing state changes of the virtual object, and tactile sensations representing operations performed on the virtual object using the operating body 10.
  • Specifically, the tactile presentation control unit 64 compares the position of the operation unit 11 specified by the operating body recognition unit 63 with the placement position of the virtual object in the virtual space (for example, both as three-dimensional coordinate values in the XYZ three-dimensional coordinate system described above), and detects contact by calculating the distance between the two. For example, if the distance between the two is within a predetermined value, they can be determined to be close to each other, and if the distance is zero, they can be determined to be in contact.
  • Note that the setting of the distance for determining contact (the contact / non-contact determination range) may be changed according to a predetermined condition; for example, this predetermined condition can be set to brightness.
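  • In code form, the determination above reduces to a simple distance comparison. A minimal sketch follows; the threshold value and the exact brightness-dependent rule are assumptions, since the publication only states that the determination range may change with a condition such as brightness.

```python
import numpy as np

def contact_state(op_pos, obj_pos, base_range=0.01, brightness=1.0):
    """Classify the relationship between the operation unit and a virtual
    object from their 3D coordinates. A darker scene widens the
    determination range (hypothetical rule)."""
    determination_range = base_range / max(brightness, 0.1)
    d = float(np.linalg.norm(np.asarray(op_pos) - np.asarray(obj_pos)))
    if d < 1e-6:
        return "contact"         # distance of zero: in contact
    if d <= determination_range:
        return "close"           # within the predetermined value: close
    return "apart"
```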
  • When a click operation (selection, decision, etc.) is performed on a virtual object, for example when an operation on a virtual UI (User Interface) is decided, tactile presentation is performed by generating vibration or the like representing that operation.
  • The method of the click operation is not particularly limited; examples include pressing the operation button 12 (see FIG. 4) of the operating body 10.
  • At this time, for example, a vibration or the like giving a crisp clicking or ticking feel may be generated.
  • During a drag or drag-and-drop operation, tactile presentation is likewise performed by generating vibration or the like representing the operation.
  • For example, the tactile presentation device 3 may be made to produce a "tick" vibration or the like reminiscent of the marks on a scale.
  • Vibration or the like indicating that the virtual object is moving may also be generated.
  • When the moving virtual object collides with something, vibration or the like representing the collision may be generated; when the moving virtual object is released in the air, it may fall under gravity, and when it collides with a real object such as a desk, vibration or the like representing that collision may be generated.
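  • These UI events map naturally onto a small table of vibration patterns. The sketch below uses assumed values; neither the pattern parameters nor the device's play() interface is specified in the publication.

```python
# Hypothetical vibration patterns as (duration_s, amplitude 0..1).
PATTERNS = {
    "ui_click":  (0.03, 0.8),  # crisp click on selection / decision
    "drag_tick": (0.02, 0.4),  # scale-like tick while dragging
    "moving":    (0.10, 0.3),  # low rumble while the object moves
    "collision": (0.08, 1.0),  # strong thud on impact
}

def on_virtual_event(event, haptic_device):
    """Send the pattern for a virtual-space event to a tactile
    presentation device exposing an assumed play(duration, amplitude)."""
    duration, amplitude = PATTERNS[event]
    haptic_device.play(duration, amplitude)
```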
  • A tactile sensation representing the drawing feel (texture) of the operating body 10 on a virtual plane is also presented by the tactile presentation device 3 through vibration or the like.
  • At this time, a tactile sensation corresponding to the pen-type setting of the operating body 10 may be presented by vibration or the like. For example, if a pencil is assumed, a scratchy feel is produced; if a felt-tip pen is assumed, a squeaky feel is produced.
  • When contact with a virtual object is detected, a tactile sensation is presented by generating vibration or the like expressing the feel of that object.
  • At this time, a tactile sensation corresponding to the material setting of the touched virtual object may be presented. For example, if a hard material such as plastic is assumed, a short-cycle vibration like a "click" against a hard object is generated; if a soft material is assumed, a continuous vibration conveying steady resistance is generated in response to changes in the penetration depth of the operation unit 11.
  • Further, the tactile presentation device 3 may be made to present a tactile sensation whose strength corresponds to the collision speed of the operating body 10 against the virtual object. For example, the vibration is strong if the collision speed when touching the virtual object is high, and weak if it is low. This lets the user feel an impact corresponding to the strength of the collision.
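  • A sketch of how such contact vibrations could be synthesized; the material names, frequencies, decay rates, and speed scaling are all illustrative assumptions rather than values from the publication.

```python
import numpy as np

def contact_vibration(material, collision_speed, penetration,
                      sample_rate=8000):
    """Return an illustrative vibration waveform for a detected contact:
    a short decaying click for hard materials, a sustained vibration
    scaled by penetration depth for soft ones; amplitude follows speed."""
    amp = float(np.clip(collision_speed / 2.0, 0.1, 1.0))
    t = np.arange(0, 0.08, 1.0 / sample_rate)
    if material == "hard":
        return amp * np.exp(-120.0 * t) * np.sin(2 * np.pi * 250.0 * t)
    depth_term = min(penetration / 0.02, 1.0)
    return amp * depth_term * np.sin(2 * np.pi * 80.0 * t)
```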
  • Further, a force sense may be presented by generating vibration or the like representing the force received from the virtual object.
  • For example, when the virtual object is struck, a vibration that feels like receiving a reaction force on the side opposite to the struck direction may be generated.
  • When the virtual object is poked so that it rotates, vibration or the like conveying the rotational force may be generated.
  • For example, vibration that feels like receiving a rotational reaction force may be generated according to the on-screen display position of the poked virtual object. This makes it possible for the user to perceive the direction and movement of the force received from the virtual object.
  • When the tactile presentation device 3 is made to present such a force sense, for example when the virtual object is grasped, the force sense is presented to the hand on the side opposite to the hand grasping the virtual object. This is realized by having a tactile presentation device 3 separate from the operating body 10 perform the tactile presentation, which eliminates the need to vibrate the operating body 10.
  • Note that the tactile sensation is not limited to vibration; it may be presented by pressure, for example by the tightening of a VR device or the like.
  • Further, the tactile presentation device 3 may be made to present a tactile sensation indicating that contact or approach is prohibited.
  • For example, for an object that is dangerous to touch (for example, a hot object such as fire), a tactile sensation indicating that the object is dangerous may be presented before the user actually touches it.
  • Similarly, for a fragile real object such as a vase, a tactile sensation warning against contact may be presented.
  • Further, the tactile presentation device 3 may be made to present a tactile sensation corresponding to the gripping force with which the user holds the operating body 10.
  • For example, the tactile sensation is changed according to how strongly the user holds the operating body 10, which allows the user to select a desired tactile sensation.
  • Further, the tactile presentation device 3 may be made to present a tactile sensation whose strength corresponds to the distance between the detection unit 8 and the operating body 10. For example, weak tactile presentation is performed when the distance is long, and strong tactile presentation when the distance is short. As a result, a sense of distance can be felt by touch, for example.
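  • Both rules can be captured by a simple scaling function. The sketch below uses assumed curves; the publication states only the qualitative relationships (closer means stronger, grip force modulates the feel).

```python
def presentation_amplitude(distance_m, grip_force, max_amp=1.0):
    """Scale vibration amplitude so that a short camera-to-operating-body
    distance gives strong presentation and a long one weak presentation,
    modulated by grip force. The curves are illustrative assumptions."""
    distance_term = max(0.0, 1.0 - distance_m / 2.0)  # fades out by ~2 m
    grip_term = min(max(grip_force, 0.0), 1.0)        # normalized 0..1
    return max_amp * distance_term * (0.5 + 0.5 * grip_term)
```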
  • Note that the information processing system 1 may be configured to include a plurality of tactile presentation devices 3.
  • In that case, control information for controlling the operation of each of the plurality of tactile presentation devices 3 can be generated, and the tactile presentation device 3 that presents the tactile sensation can be selected according to predetermined conditions.
  • For example, when the operating body 10 is an AR pen provided with a tactile presentation device 3, tactile feedback for UI operations and for the drawing feel of pen operations is given by vibrating the tactile presentation device 3 of the AR pen, while other tactile feedback is given by vibrating or otherwise actuating the tactile presentation device 3 on the main-body side, that is, not the AR pen.
  • In this way, the user can intuitively grasp from the tactile sensation which type of phenomenon occurred in the virtual space.
  • Further, when the battery of one tactile presentation device 3 (for example, the AR pen) runs low, another tactile presentation device 3 (for example, the smartphone on the main-body side) can take over, so that tactile presentation can be performed without worrying about the battery state.
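  • A sketch of how such per-device routing and battery handover could look; the routing policy, the play() call, and the battery_level attribute are assumed interfaces, not details from the publication.

```python
class HapticRouter:
    """Route tactile events to one of several presentation devices."""

    def __init__(self, pen_device, body_device):
        self.pen = pen_device    # e.g. AR-pen vibrator
        self.body = body_device  # e.g. smartphone (main-body) vibrator

    def present(self, event, duration, amplitude):
        # Pen-local feedback for UI operation and drawing feel;
        # main-body feedback for everything else, as described above.
        device = self.pen if event in ("ui_click", "drawing") else self.body
        if getattr(device, "battery_level", 1.0) < 0.05:
            device = self.body   # hand over when the pen battery is low
        device.play(duration, amplitude)
```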
  • Further, the tactile sensation may also be expressed by sound and images.
  • For example, the sound is generated from the vibration waveform used for tactile presentation; conversely, the vibration for presenting the tactile sensation may be generated from a sound waveform representing the tactile sensation.
  • An image can express the sense of touch visually, for example by shaking the display with a shake effect.
  • In this way, a device that cannot present tactile sensations may be made to output sound expressing the tactile sensation and to display such images.
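  • One plausible realization of the waveform-to-sound direction is to resample the tactile waveform to audio rate and shift its energy into an audible band. The carrier approach below is an assumption; the publication only says the sound is generated from the vibration waveform.

```python
import numpy as np
from scipy import signal  # resampling helper

def vibration_to_audio(vib, vib_rate=1000, audio_rate=44100):
    """Derive an audible signal from a tactile vibration waveform by
    resampling it to audio rate and modulating an audible carrier."""
    audio = signal.resample(vib, int(len(vib) * audio_rate / vib_rate))
    t = np.arange(len(audio)) / audio_rate
    carrier = np.sin(2 * np.pi * 440.0 * t)  # move energy to ~440 Hz
    return np.clip(audio * carrier, -1.0, 1.0)
```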
  • FIG. 8 is a flowchart showing an example of the processing flow by the control unit 6. The order of the following processes can be changed as long as each process is not hindered.
  • First, the operating body recognition unit 63 recognizes the position, posture, and the like of the operating body 10 in the virtual space (step S1). That is, the position, posture, and so on in the virtual space of the operating body 10 (operation unit 11), such as an AR pen, a haptic glove, or a fingertip, are recognized.
  • Next, the contact state between the operating body 10 (operation unit 11), which is a real object, and a virtual object such as an operation UI or a virtual item is identified (step S2), and it is determined whether or not there is contact (step S3). As described above, the presence or absence of contact can be determined from the distance between the operating body 10 (operation unit 11) and the virtual object.
  • If it is determined that there is contact, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S4).
  • For example, the tactile presentation control unit 64 vibrates an operating body 10 that includes a tactile presentation device 3, such as an AR pen or a haptic glove, to present tactile sensations relating to the operation feel of the virtual UI, the drawing feel, texture, temperature, and so on.
  • Note that the tactile sensation may instead be presented by a tactile presentation device 3 other than the operating body 10.
  • In that case, the operating body 10 requires no battery, electric circuit, or the like, and can be made smaller and lighter.
  • After the tactile presentation in step S4, or when it is determined in step S3 that there is no contact, a state change of the virtual object, such as moving or falling to the floor, is identified (step S5), and it is determined whether or not there is a state change (step S6).
  • If it is determined that there is a state change, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S7).
  • For example, the tactile presentation control unit 64 vibrates a tactile presentation device 3 different from the operating body 10, such as a mobile terminal or a head-mounted display, to perform tactile presentation of recoil such as tilting, the impact of falling to the ground, and the like.
  • After the tactile presentation in step S7, or when it is determined in step S6 that there is no state change, the processing by the control unit 6 ends.
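  • Put together, the flowchart reduces to a short control loop. A sketch under the same assumed recognizer / scene / haptics interfaces as the earlier snippets, reusing contact_state() from the sketch above:

```python
def control_step(recognizer, scene, haptics):
    """One pass through the flow of FIG. 8 (steps S1-S7)."""
    tip_pos = recognizer.locate_operation_unit()             # S1: position/posture
    state = contact_state(tip_pos, scene.object_position())  # S2: contact state
    if state == "contact":                                   # S3: contact?
        haptics.present("ui_click", 0.03, 0.8)               # S4: contact feedback
    if scene.object_state_changed():                         # S5 / S6: state change?
        haptics.present("collision", 0.08, 1.0)              # S7: state-change feedback
```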
  • Next, the information processing system 1 will be described with reference to a specific example.
  • Here, a case where the display device 2, the tactile presentation device 3, and the information processing device 4 described above are integrally configured will be described as an example.
  • As the content, content that lets the user draw lines, characters, and pictures in the AR space will be described as an example.
  • FIG. 9 shows a specific example of using the information processing system 1.
  • In this example, one smartphone (hereinafter simply referred to as the smartphone 100) integrally constitutes the display device 2, tactile presentation device 3, and information processing device 4 described above. That is, the smartphone 100 includes the display device 2, the tactile presentation device 3, and the information processing device 4, configured integrally.
  • The smartphone 100 includes a display 21 as a component of the display device 2 described above, a vibrator 31 as a component of the tactile presentation device 3 described above, and an imaging device 41 as a component of the detection unit 8 of the information processing device 4 described above. The smartphone 100 also includes an audio output device, with a speaker 101 as its component.
  • The smartphone 100 performs processing for displaying characters and pictures that the user draws in the air in the virtual space shown on the display 21, through the control unit 6 described above executing the program stored in the storage unit 5.
  • Specifically, the user draws a picture on a virtual plane in the air (not necessarily a strict plane) using the AR pen 110, which is the operating body 10 held by the user (indicated by a broken line in the figure).
  • The control unit 6 detects the trajectory of the pen tip of the AR pen 110 and performs control so that the drawn picture (a picture invisible in the real space) is projected onto the virtual space displayed on the display 21.
  • The smartphone 100 also lets the user feel objects (display objects) in the virtual space described above, through the control unit 6 executing the program stored in the storage unit 5.
  • Specifically, the smartphone 100 presents a tactile sensation to a body part (here, the left hand) different from the part operating the virtual object in the virtual space (the part in contact with the AR pen 110).
  • In the illustrated example, the writing feel produced when the user draws a picture in space with the AR pen 110 in the right hand is presented tactilely, by the vibration of the vibrator 31 of the smartphone 100, to the left hand holding the smartphone 100. That is, the tactile presentation is performed not on the right hand holding the AR pen 110 but on the left hand holding the smartphone 100.
  • This tactile sensation makes the user feel, for example, as if drawing with a pen on a real plane.
  • Further, not only the sense of touch but also the senses of hearing and sight are used.
  • For example, by outputting from the speaker 101 a sound related to the writing feel (for example, a scratchy sound), the object in the virtual space is made perceptible by hearing.
  • In addition, by showing on the display 21 an image related to the writing feel (in the illustrated example, a wavy, jagged display), it is made perceptible by sight.
  • In this way, phenomena in the virtual space can be felt visually, aurally, and tactilely, and the user can have a realistic, more advanced virtual experience similar to that in the real space.
  • As described above, in the information processing device 4 of the present embodiment, the control unit 6 performs control to generate control information for controlling the operation of the tactile presentation device 3 according to the distance between a virtual object and the real object that operates the virtual object.
  • Since the information processing system 1 is provided with a tactile presentation device 3 separate from the operating body 10 and the control unit 6 controls the operation of that tactile presentation device 3, tactile presentation on the operating body 10 itself becomes unnecessary.
  • Accordingly, the operating body 10 requires no battery, electric circuit, or the like and can be made smaller and lighter. Tactile presentation can also be performed for the user even when the operating body 10 is part of the human body, such as the user's hand.
  • Note that the body part serving as the user's operating body 10 and the part that comes into contact with the operating body 10 are not limited to the hands and fingers. The same applies to the part where the tactile sensation is presented by the tactile presentation device 3. For example, by making these parts different, excellent tactile presentation becomes possible.
  • The present technology is not limited to the above-described embodiment, and various modifications based on the technical idea of the present technology are possible; for example, the modifications described below. One or more arbitrarily selected ones of the following modifications may also be combined as appropriate. Further, the configurations, methods, processes, shapes, materials, numerical values, and the like of the above-described embodiment can be combined with one another as long as they do not deviate from the gist of the present technology.
  • In the above embodiment, phenomena in the virtual space are mainly caused by contact between a virtual object and the operating body 10, but the phenomena are not limited to contact and may be caused by approach within a predetermined distance. That is, tactile presentation can be performed without the operating body 10 contacting the virtual object; it may be performed when the two approach within a predetermined distance.
  • Further, a temperature sensation may be presented by changing the temperature of the Peltier element provided in the tactile presentation device 3. As a result, the sense of reality can be further enhanced.
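  • A temperature presentation of this kind could, for example, map the virtual object's material setting to a target temperature for the Peltier element. The table and the set_target_celsius() driver interface below are assumptions, not details from the publication.

```python
# Illustrative material-to-temperature table (degrees Celsius).
MATERIAL_TEMPERATURE_C = {"metal": 18.0, "wood": 26.0, "fire": 45.0}

def present_temperature(material, peltier_driver):
    """Drive a Peltier element toward a temperature associated with the
    material setting of the touched virtual object."""
    target = MATERIAL_TEMPERATURE_C.get(material, 30.0)
    peltier_driver.set_target_celsius(target)
```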
  • In the above embodiment, tactile presentation to a single user has been described as the processing of the information processing device 4, but the present technology is not limited to this; the display of the display device 2 and the tactile presentation of the tactile presentation device 3 may be shared with another user's devices.
  • For example, the transmission / reception control unit 62 may control the display device 2 and another display device different from the display device 2 (a display device for another user) so that the image information can be shared between them.
  • Similarly, the tactile presentation device 3 and another tactile presentation device different from the tactile presentation device 3 (a tactile presentation device for another user) may be controlled so that the control information can be shared.
  • At this time, for example, tactile presentation regarding an operation of user A's operating body 10 can be performed on user A's side by the operating body 10 that includes a tactile presentation device 3, and on user B's side by the main-body-side tactile presentation device 3 rather than user B's operating body 10; fine-tuned tactile presentation of this kind is possible.
  • (1) An information processing device including a control unit that generates control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • (2) The information processing device according to (1), wherein the control unit controls the operation of a tactile presentation device different from the real object.
  • (3) The information processing device according to (1) or (2), wherein the virtual object is a virtual plane, the real object is an operating body that draws a locus on the virtual plane, and the control unit causes the tactile presentation device to present a tactile sensation representing the drawing feel of the operating body on the virtual plane.
  • The information processing device according to any one of (1) to (8), wherein the control unit causes the tactile presentation device to present a tactile sensation according to a setting of a pen type of the operating body.
  • The information processing device wherein the control unit controls the tactile presentation device and another tactile presentation device different from the tactile presentation device so that the control information can be shared.
  • (11) The information processing device according to any one of (1) to (10), wherein the control unit generates control information for causing the tactile presentation device to present a sensation including any of tactile sensation in the narrow sense, pressure sensation, vibration sensation, position, movement, force sensation, temperature sensation, and pain sensation.
  • The information processing device according to any one of (1) to (11), wherein, when contact between the virtual object and the real object is detected, the control unit causes the tactile presentation device to present a tactile sensation whose strength corresponds to the collision speed of the real object against the virtual object.
  • The information processing device according to any one of (1) to (13), wherein the real object is an operating instrument.
  • The information processing device wherein the control unit changes, according to a predetermined condition, the setting of the distance for determining that the virtual object and the real object are in contact with each other.
  • The information processing device according to any one of (1) to (15), wherein the control unit causes the tactile presentation device to present a tactile sensation whose strength corresponds to the distance between the detection unit and the real object.
  • An information processing method in which a processor performs control to generate control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • A program that causes a computer to realize a control function of generating control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
  • (19) An information processing system including: a display device that displays a virtual object; a tactile presentation device that presents a tactile sensation; and an information processing device connected to the display device and the tactile presentation device, wherein the information processing device generates control information for controlling the operation of the tactile presentation device according to the distance between the virtual object and a real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This information processing device includes a control unit for controlling the operation of a haptic presentation device in accordance with the distance between a virtual object and a real object (operating body) which operates the virtual object. (FIG. 1)

Description

情報処理装置、情報処理方法、プログラム及び情報処理システムInformation processing equipment, information processing methods, programs and information processing systems
 本技術は、情報処理装置、情報処理方法、プログラム及び情報処理システムに関する。 This technology relates to information processing devices, information processing methods, programs and information processing systems.
 AR(拡張現実:Augmented Reality)等の仮想空間においてユーザと仮想物体とのインタラクション(相互作用)を高めることは、その体験価値を向上させるうえで必要不可欠である。例えば、仮想空間内に配置された仮想物体をユーザが手で触れたとしても触覚に対するフィードバックがないと実在感が薄れてしまう。 It is indispensable to enhance the interaction between the user and the virtual object in the virtual space such as AR (Augmented Reality) in order to improve the experience value. For example, even if a user touches a virtual object arranged in a virtual space by hand, the sense of reality is diminished if there is no feedback on the sense of touch.
 そこで、仮想物体の実在感を向上させるために触覚フィードバック技術による「触れた感」の提示が求められている。例えば、下記の特許文献1には、仮想平面上の電子ペンのペン先の軌跡を表示するに際し、電子ペンを振動させることで仮想平面にペン先が接しているか否かをユーザに知らせる技術について開示されている。 Therefore, in order to improve the sense of reality of the virtual object, it is required to present the "feeling of touch" by the tactile feedback technology. For example, Patent Document 1 below describes a technique for notifying the user whether or not the pen tip is in contact with the virtual plane by vibrating the electronic pen when displaying the trajectory of the pen tip of the electronic pen on the virtual plane. It is disclosed.
特開2013-125487号公報Japanese Unexamined Patent Publication No. 2013-125487
 しかしながら、特許文献1の技術は、仮想平面に対する電子ペンのペン先の状態をその電子ペンを振動させるなどしてユーザに知らせるものに過ぎない。そこで、さらに優れた触覚提示を行うことができる技術が望まれる。 However, the technique of Patent Document 1 merely informs the user of the state of the pen tip of the electronic pen with respect to the virtual plane by vibrating the electronic pen. Therefore, a technique capable of performing even better tactile presentation is desired.
 本技術の目的の一つは、優れた触覚提示を行うことができる情報処理装置、情報処理方法、プログラム及び情報処理システムを提供することにある。 One of the purposes of this technology is to provide an information processing device, an information processing method, a program, and an information processing system capable of performing excellent tactile presentation.
 本技術は、
 仮想オブジェクトと前記仮想オブジェクトを操作する実オブジェクトとの距離に応じて触覚提示装置の動作を制御する制御情報を生成する制御部を備える
 情報処理装置である。
This technology
It is an information processing device including a control unit that generates control information that controls the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object.
 本技術は、
 プロセッサが、
 仮想オブジェクトと前記仮想オブジェクトを操作する実オブジェクトとの距離に応じて触覚提示装置の動作を制御する制御情報を生成する制御を行う
 情報処理方法である。
This technology
The processor
This is an information processing method that controls to generate control information that controls the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object.
 本技術は、
 コンピュータに、
 仮想オブジェクトと前記仮想オブジェクトを操作する実オブジェクトとの距離に応じて触覚提示装置の動作を制御する制御情報を生成する制御機能を実現させる
 プログラムである。
This technology
On the computer
This is a program that realizes a control function that generates control information that controls the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object.
 本技術は、
 仮想オブジェクトを表示する表示装置と、
 触覚を提示する触覚提示装置と、
 前記表示装置及び前記触覚提示装置と接続された情報処理装置と、
 を備え、
 前記情報処理装置は、前記仮想オブジェクトと前記仮想オブジェクトを操作する実オブジェクトとの距離に応じて前記触覚提示装置の動作を制御する制御情報を生成し、
 前記触覚提示装置は、前記制御情報に基づく触覚を提示する
 情報処理システムである。
This technology
A display device that displays virtual objects and
A tactile presentation device that presents tactile sensations,
An information processing device connected to the display device and the tactile presentation device,
With
The information processing device generates control information that controls the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object.
The tactile presentation device is an information processing system that presents a tactile sensation based on the control information.
図1は、本実施の形態に係る情報処理システムの構成例を示す図である。FIG. 1 is a diagram showing a configuration example of an information processing system according to the present embodiment. 図2は、情報処理装置の構成例を示すブロック図である。FIG. 2 is a block diagram showing a configuration example of the information processing device. 図3は、制御部の制御による機能構成例を示す機能ブロック図である。FIG. 3 is a functional block diagram showing an example of a functional configuration controlled by the control unit. 図4は、操作体の位置把握を行う構成例を示す図である。FIG. 4 is a diagram showing a configuration example for grasping the position of the operating body. 図5は、操作体の位置把握を行う他の構成例を示す図である。FIG. 5 is a diagram showing another configuration example for grasping the position of the operating body. 図6は、操作体の位置把握を行うさらに他の構成例を示す図である。FIG. 6 is a diagram showing still another configuration example for grasping the position of the operating body. 図7は、拡大/縮小UIの操作例について説明するための図である。FIG. 7 is a diagram for explaining an operation example of the enlargement / reduction UI. 図8は、制御部による処理の流れの一例を示すフローチャートである。FIG. 8 is a flowchart showing an example of the processing flow by the control unit. 図9は、情報処理システムを利用する具体例を示す図である。FIG. 9 is a diagram showing a specific example of using the information processing system.
 以下に説明する実施の形態は、本技術の好適な具体例であり、技術的に好ましい種々の限定が付されている。しかしながら、本技術の範囲は、以下の説明において、特に本技術を限定する旨の記載がない限り、以下の実施の形態に限定されないものとする。なお、本明細書及び図面において、実質的に同一の機能構成を有するものについては同一の符号を付し、重複説明を適宜省略する。本技術の説明は、以下の順序で行う。
1.実施の形態
1-1.情報処理システムの構成
1-2.情報処理装置の構成
1-3.情報処理装置の機能
1-4.触覚提示の具体例
1-5.情報処理装置の処理
1-6.情報処理システムの具体例
2.まとめ
The embodiments described below are suitable specific examples of the present technology, and are provided with various technically preferable limitations. However, the scope of the present technology shall not be limited to the following embodiments unless otherwise stated in the following description to limit the present technology. In the present specification and the drawings, those having substantially the same functional configuration are designated by the same reference numerals, and duplicate description will be omitted as appropriate. The present technology will be described in the following order.
1. 1. Embodiment 1-1. Information processing system configuration 1-2. Configuration of information processing device 1-3. Information processing device functions 1-4. Specific examples of tactile presentation 1-5. Information processing device processing 1-6. Specific example of information processing system 2. summary
<1.実施の形態>
[1-1.情報処理システムの構成]
 まず、図1を参照して本実施の形態に係る情報処理システムの構成について説明する。図1は、本実施の形態に係る情報処理システムの構成例を示す。図1に示す情報処理システム1は、表示装置2、触覚提示装置3、情報処理装置4及び情報共有サーバSを備える。情報処理装置4は、表示装置2、触覚提示装置3及び情報共有サーバSとそれぞれ接続されており、表示装置2、触覚提示装置3及び情報共有サーバSの各々と情報通信可能に構成されている。なお、この接続は、有線、無線などの通信方式を問わない。
<1. Embodiment>
[1-1. Information processing system configuration]
First, the configuration of the information processing system according to the present embodiment will be described with reference to FIG. FIG. 1 shows a configuration example of an information processing system according to the present embodiment. The information processing system 1 shown in FIG. 1 includes a display device 2, a tactile presentation device 3, an information processing device 4, and an information sharing server S. The information processing device 4 is connected to the display device 2, the tactile presentation device 3, and the information sharing server S, respectively, and is configured to be capable of information communication with each of the display device 2, the tactile presentation device 3, and the information sharing server S. .. The connection may be made regardless of the communication method such as wired or wireless.
 表示装置2は、画像を表示する機能を備える。表示装置2は、例えば表示パネル等の表示部を含む機器で構成される。画像は、静止画像と動画像(映像)を含む。表示装置2は、具体的には、情報処理装置4から提供される画像情報に基づきAR(Augmented Reality)空間、VR(Virtual Reality)空間、MR(Mixed Reality)空間等の仮想空間を表示する。つまり、表示装置2は、画像を表示することで仮想空間を表現する。ここで、仮想空間とは、情報処理装置4が実行する情報処理によって構築された仮想上の三次元空間である。表示装置2に表示されるコンテンツとしては、例えば、仮想空間を利用したゲーム、ライブ配信、スポーツ中継、ナビゲーション、教育、観光案内、ショッピング、その他体験型コンテンツなどがあげられる。 The display device 2 has a function of displaying an image. The display device 2 is composed of a device including a display unit such as a display panel. The image includes a still image and a moving image (video). Specifically, the display device 2 displays a virtual space such as an AR (Augmented Reality) space, a VR (Virtual Reality) space, and an MR (Mixed Reality) space based on the image information provided by the information processing device 4. That is, the display device 2 expresses a virtual space by displaying an image. Here, the virtual space is a virtual three-dimensional space constructed by information processing executed by the information processing device 4. Examples of the content displayed on the display device 2 include games using virtual space, live distribution, sports broadcasting, navigation, education, tourist information, shopping, and other hands-on content.
 表示装置2としては、例えば、携帯端末(例えば、スマートフォン、スマートタブレット、携帯電話、携帯ゲーム機等)、ウェアラブル型表示機器(例えば、ヘッドマウントディスプレイ(HMD)、ARグラス、VRグラス等)などがあげられる。なお、表示装置2は、据え置き型のディスプレイであっても構わない。 Examples of the display device 2 include a mobile terminal (for example, a smartphone, a smart tablet, a mobile phone, a portable game machine, etc.), a wearable display device (for example, a head-mounted display (HMD), AR glass, a VR glass, etc.). can give. The display device 2 may be a stationary display.
 触覚提示装置3は、ユーザに触覚を提示する触覚提示機能を備える。触覚提示装置3は、例えば振動子、電気触覚装置、電気的筋肉刺激装置、ペルチェ素子等の触覚提示部を含む機器で構成される。触覚提示装置3は、具体的には、情報処理装置4から提供される制御情報に基づきユーザに仮想空間の現象(詳しくは後述する)に関する触覚を提示する。ここで、感覚は大きく3種類に分類される。視覚,聴覚などの特殊感覚と、内臓痛覚などの内臓感覚と、それ以外の体の表面の皮膚、粘膜や体の深部の筋、腱、関節などからの触覚(狭義の触覚)、圧覚、振動感覚や、位置、動き、力の感覚、温度感覚、痛覚などの体性感覚である。本明細書及び図面中における「触覚」とは、広義の意味での触覚(広義の触覚)であり、この体性感覚のことをいうものとする。つまり、上述した触覚提示機能は、ユーザにこれらの体性感覚を与える機能のことをいう。上述した触覚提示部は、この体性感覚をユーザに与えられるものであればよい。 The tactile presentation device 3 has a tactile presentation function for presenting a tactile sensation to a user. The tactile presentation device 3 is composed of a device including a tactile presentation unit such as a vibrator, an electric tactile device, an electric muscle stimulator, and a Pelche element. Specifically, the tactile presentation device 3 presents the user with a tactile sensation regarding a phenomenon in the virtual space (details will be described later) based on the control information provided by the information processing device 4. Here, sensations are roughly classified into three types. Special senses such as sight and hearing, visceral sensations such as visceral pain, and tactile sensations (tactile sensations in a narrow sense), pressure sensations, and vibrations from the skin, mucous membranes, and deep muscles, tendons, and joints of the body. It is a somatic sensation such as sensation, position, movement, force sensation, temperature sensation, and pain sensation. The term "tactile sensation" in the present specification and drawings means tactile sensation in a broad sense (tactile sensation in a broad sense), and refers to this somatosensory sensation. That is, the above-mentioned tactile presentation function refers to a function that gives the user these somatosensory sensations. The above-mentioned tactile presentation unit may be any as long as it can give this somatosensory to the user.
 触覚提示装置3としては、例えば、上述した携帯端末や、ペン型の電子機器(いわゆるARペン等)、コントローラ等の把持型の電子機器、グローブ型(いわゆるハプティックグローブ等)、腕輪型、指輪型等のウェアラブル型の電子機器などがあげられる。触覚提示装置3は、ユーザに触覚を提示できる構成であればよい。なお、情報処理システム1は、ユーザ1人で使用できる複数の触覚提示装置3を備えていてもよい。 Examples of the tactile presentation device 3 include the above-mentioned mobile terminal, a pen-type electronic device (so-called AR pen, etc.), a grip-type electronic device such as a controller, a glove type (so-called haptic glove, etc.), a bracelet type, and a ring type. Such as wearable electronic devices. The tactile presentation device 3 may have a configuration capable of presenting a tactile sensation to the user. The information processing system 1 may include a plurality of tactile presentation devices 3 that can be used by one user.
 The information processing device 4 has a function of controlling the display device 2 and the tactile presentation device 3, and a function of communicating with each of the display device 2, the tactile presentation device 3, and the information sharing server S. Examples of the information processing device 4 include the above-mentioned mobile terminals, personal computers, and game machines. The details of the information processing device 4 will be described later.
 The information sharing server S is configured so that information such as image information and control information can be shared between the information processing device 4 and other information processing devices (not shown). Alternatively, the clients may communicate directly with each other without the information sharing server S; in the illustrated example, the information processing device 4, which is a client, may communicate directly with other information processing devices. When sharing is unnecessary, the information sharing server S may be omitted.
 The information processing system 1 may also integrate at least two of the display device 2, the tactile presentation device 3, and the information processing device 4 into one unit. For example, the display device 2 may incorporate the functional configuration of the information processing device 4, or the tactile presentation device 3 may do so. Likewise, the display device 2 may incorporate the functional configurations of both the information processing device 4 and the tactile presentation device 3, or just that of the tactile presentation device 3.
 The information processing system 1 may further include an audio output device (not shown) having an audio output unit, such as a speaker, that outputs sound. The audio output device may be separate from the other devices or integrated with another device such as the display device 2. Examples of the audio output device include speakers, headphones, and wireless earphones.
 The display device 2 and the tactile presentation device 3 are not limited to being connected to the information sharing server S indirectly through the information processing device 4 as illustrated; they may be connected to it directly. For example, the display device 2, the tactile presentation device 3, the information processing device 4, and the information sharing server S may be interconnected via a network such as a LAN (Local Area Network).
 As described above, the information processing system 1 lets the user perceive phenomena in the virtual space not only visually but also through touch. The user can thus enjoy an advanced virtual experience close to reality, as if having a real experience that could not be obtained merely by looking at the virtual space.
[1-2. Configuration of the information processing device]
 The configuration of the information processing device 4 described above will now be explained in detail. FIG. 2 is a block diagram showing a configuration example of the information processing device 4. The information processing device 4 includes a storage unit 5, a control unit 6, a communication unit 7, and a detection unit 8.
 The storage unit 5 is composed of, for example, RAM (Random Access Memory), ROM (Read Only Memory), a hard disk, or the like. It stores information necessary for processing by the control unit 6, such as programs and the data they use.
 The control unit 6 is composed of, for example, a CPU (Central Processing Unit, i.e., a processor). It reads and executes the programs stored in the storage unit 5 and controls each component of the information processing device 4. A program may also be stored in external storage such as a USB memory, provided via a network, or partially executed on another device over the network.
 The communication unit 7 is composed of, for example, a communication device and communicates with each of the display device 2, the tactile presentation device 3, and the information sharing server S. It passes information obtained from the information sharing server S, such as image information and control information, to the control unit 6, and delivers information obtained from the control unit 6 (e.g., image information, control information) to the display device 2, the tactile presentation device 3, and the information sharing server S. Specifically, the communication unit 7 sends image information to the display device 2 and the information sharing server S, and control information to the tactile presentation device 3 and the information sharing server S.
 The detection unit 8 is composed of, for example, an imaging device and supplies the control unit 6 with detection information for detecting the above-mentioned phenomena in the virtual space. The detection unit 8 may instead be provided in the display device 2 or the tactile presentation device 3.
[1-3. Functions of the information processing device]
 Next, the functions of the information processing device 4 will be described in detail. FIG. 3 is a functional block diagram showing an example of the functional configuration under the control of the control unit 6. The control unit 6 mainly comprises a display control unit 61, a transmission/reception control unit 62, an operating body recognition unit 63, and a tactile presentation control unit 64. These functional blocks operate, for example, when the control unit 6 executes the above-mentioned programs.
 The display control unit 61 controls the operation of the display device 2 to make it display the virtual space. Specifically, it generates image information for displaying the virtual space and provides it to the display device 2, which then displays the virtual space (display objects such as virtual objects and real objects) based on that image information. The image information is generated, for example, by executing a content program. The program performing the tactile presentation may be included in the content program or may be separate.
 The transmission/reception control unit 62 lets users share the display of the virtual space. Specifically, it controls the communication unit 7 to send image information to the information sharing server S. This allows another display device (not shown) to display the virtual space based on the image information held by the information sharing server S, just as the display device 2 does, so that another user can see it.
 The transmission/reception control unit 62 also lets users share tactile sensations. Specifically, it controls the communication unit 7 to send the control information generated by the tactile presentation control unit 64 (described later) to the information sharing server S. This allows another tactile presentation device (not shown) to perform tactile presentation based on the control information held by the information sharing server S, just as the tactile presentation device 3 does, so that another user can perceive it.
 The operating body recognition unit 63 tracks, among other things, the position of the real object (the operating body) that operates virtual objects in the virtual space. The operating body is distinct from (separate from) the tactile presentation device 3. However, when the information processing system 1 includes a plurality of tactile presentation devices 3, the operating body 10 may itself incorporate the functional configuration of a tactile presentation device 3. In that case, the information processing system 1 still includes a tactile presentation device 3 different from the operating body.
 A virtual object means something virtual that the user perceives as if it existed in the real space (specifically, a display object without physical substance). For example, a virtual object is rendered by two-dimensional or three-dimensional computer graphics and placed in the virtual space. Virtual objects include virtual bodies, virtual UIs, and so on. Virtual bodies include not only ones visible to the user but also invisible ones (transparent objects, so to speak). A real object means a physical object that actually exists in the real space (specifically, a display object with physical substance); real objects include the human body.
 FIG. 4 is a diagram showing a configuration example for tracking the position of the operating body. The operating body 10 shown in FIG. 4 is what the user uses to cause (operate) phenomena in the virtual space. Examples of the operating body 10 include a pen-type operating instrument such as the illustrated one (e.g., an AR pen) and part or all of the user's body, such as the hands, feet, or head. The operation unit 11 is the part of the operating body 10 that serves as the point of contact with virtual objects. For example, when the operating body 10 is a pen-type instrument, the operation unit 11 is the pen tip; when the operating body 10 is the user's finger, it is the fingertip; when it is the user's hand, it is the palm, and so on.
 Specifically, the operating body recognition unit 63 uses the detection unit 8 to obtain the position of the operation unit 11 of the operating body 10 (its distance and direction from a predetermined position, etc.). In the illustrated example, the information processing device 4 serving as the main body is a smartphone, and the detection unit 8 is the imaging device built into the smartphone. The operating body recognition unit 63 obtains the position of the operation unit 11 by detecting it in the detection information (imaging information in this example) from the detection unit 8. For example, if the detection unit 8 is a depth camera capable of acquiring depth information by ToF (Time-of-Flight) or the like, detecting the operation unit 11 yields its three-dimensional coordinate values (x, y, z) in an XYZ coordinate system whose origin is a predetermined reference point. Applying a retroreflective material to the operation unit 11, for example, makes its position easier to pin down.
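 To make the back-projection concrete, here is a minimal Python sketch under an assumed pinhole-camera model; the class name, function name, and all numeric values are illustrative assumptions, not taken from the disclosure. A detected pen-tip pixel and its ToF depth value map to camera-centered coordinate values (x, y, z):

```python
from dataclasses import dataclass

@dataclass
class CameraIntrinsics:
    fx: float  # focal length in pixels, x axis
    fy: float  # focal length in pixels, y axis
    cx: float  # principal point, x
    cy: float  # principal point, y

def tip_position(u: float, v: float, depth_m: float, k: CameraIntrinsics):
    """Back-project the detected pen-tip pixel (u, v) and its ToF depth
    into camera-centered XYZ coordinates, in meters."""
    x = (u - k.cx) * depth_m / k.fx
    y = (v - k.cy) * depth_m / k.fy
    return (x, y, depth_m)

# Example: a tip detected at the image center, 0.45 m from the camera.
k = CameraIntrinsics(fx=600.0, fy=600.0, cx=640.0, cy=360.0)
print(tip_position(640.0, 360.0, 0.45, k))  # -> (0.0, 0.0, 0.45)
```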
 The operating body 10 shown in FIG. 4 has an operation button 12 on its grip. The operation button 12 is used, for example, to notify the information processing device 4 of a trigger to start an operation (start writing). The operation button 12 is not limited to an electrical switch; it may be a mechanical one. For example, the structure may be such that pressing the operation button 12 makes the operation unit 11 pop out, exposing the reflective material. Using a mechanical switch eliminates the need for a battery, electric circuitry, and the like, making the operating body 10 smaller and lighter.
 FIG. 5 is a diagram showing another configuration example for tracking the position of the operating body 10. In the illustrated example, the operating body 10 is a pen-type instrument, and a plurality of markers 13 (for example, invisible markers) are applied to its surface. The detection unit 8 of the information processing device 4 is an imaging device, as described above. The operating body recognition unit 63 can then detect the markers 13 in the detection information, estimate the angle and position of the operating body 10, and identify the position of the operation unit 11 at the pen tip. The markers 13 may, for example, differ in shape, size, and so on so that the orientation and angle of the operating body 10 can be determined. The configuration of FIG. 5 likewise allows the operating body to be small and light, as described above.
 FIG. 6 is a diagram showing yet another configuration example for tracking the position of the operating body 10. In the illustrated example, the user's hand (finger) serves as the operating body 10, and its fingertip serves as the operation unit 11. With the detection unit 8 configured as an imaging device, the operating body recognition unit 63 can identify the position of the operation unit 11 from the detection information of the detection unit 8, for example from the shape of each finger of the user's hand.
 Thus no particular object need be used as the operating body 10; it may be an instrument the user can operate in midair, or the user's own body. The operation unit 11 only needs to be able to come close to a display object such as a virtual object and to have its position tracked. In other words, the operating body 10 and the operation unit 11 can be chosen to suit the operation at hand, and position tracking is not limited to any specific method; known techniques can be used. When the operating body 10 is an instrument such as an AR pen or a haptic glove, the operating body 10 may double as a tactile presentation device 3. In that case, however, the information processing system 1 is provided with a tactile presentation device 3 separate from the operating body 10, as described above.
 The tactile presentation control unit 64 shown in FIG. 3 controls the operation of the tactile presentation device 3. It detects phenomena in the virtual space that depend on the distance between a virtual object and the operating body 10, and generates control information according to the detected phenomenon. That is, the tactile presentation control unit 64 generates a control signal that controls the operation of the tactile presentation device 3 according to the distance between the virtual object and the operating body 10.
 具体的には、触覚提示制御部64は、仮想オブジェクトと操作体10との接触を検出した場合に仮想オブジェクトの触感を表す触覚を触覚提示装置3に提示させる。また、触覚提示制御部64は、仮想オブジェクトの状態変化を表す触覚を触覚提示装置3に提示させる。さらに、触覚提示制御部64は、仮想オブジェクトへの操作体10を用いた操作を表す触覚を触覚提示装置3に提示させる。 Specifically, the tactile presentation control unit 64 causes the tactile presentation device 3 to present the tactile sensation representing the tactile sensation of the virtual object when the contact between the virtual object and the operating body 10 is detected. Further, the tactile presentation control unit 64 causes the tactile presentation device 3 to present a tactile sensation representing a state change of the virtual object. Further, the tactile presentation control unit 64 causes the tactile presentation device 3 to present a tactile sensation representing an operation using the operating body 10 on the virtual object.
 Concretely, the tactile presentation control unit 64 detects contact by computing the distance between the position of the operation unit 11 identified by the operating body recognition unit 63 and the placement position of the virtual object in the virtual space (for example, its three-dimensional coordinate values in the XYZ coordinate system described above). For example, the two can be judged to be approaching each other if their distance is within a predetermined value, and to be in contact if their distance is zero. The distance at which contact is judged to occur (the contact/non-contact decision range) may be varied according to a predetermined condition, for example brightness. Widening the presentation range in a dark place (specifically, setting the threshold above zero) then lets the user touch virtual objects easily even when the operating body 10 itself cannot be seen.
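 As a rough sketch of this decision logic; the thresholds and the brightness rule below are assumed values for illustration, not figures from the disclosure:

```python
import math

def contact_state(tip_xyz, obj_xyz, ambient_lux: float) -> str:
    """Classify the tip/virtual-object relation as 'contact', 'approach',
    or 'far'. In the dark the contact band is widened above zero so the
    object can be touched even when the operating body cannot be seen."""
    contact_band = 0.0 if ambient_lux > 10.0 else 0.02  # meters, assumed
    approach_band = 0.10                                # meters, assumed
    d = math.dist(tip_xyz, obj_xyz)
    if d <= contact_band:
        return "contact"
    if d <= approach_band:
        return "approach"
    return "far"

# 1 cm away in the dark still counts as contact with the widened band.
print(contact_state((0.0, 0.0, 0.45), (0.0, 0.0, 0.46), ambient_lux=2.0))
```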
[1-4. Specific examples of tactile presentation]
 Here, operations (gestures) with the operating body 10 and the tactile presentations that accompany them are described with specific examples.
(1) Click operations (selection, confirmation, etc.)
 At the moment of what would be a click in mouse terms, performed in the virtual space, a vibration or the like representing the operation provides the tactile presentation. Examples include selecting, by a click operation, a virtual object (e.g., a virtual body) that overlaps the operation unit 11 of the operating body 10, and confirming an operation on a virtual UI (User Interface), such as pressing a virtual button. The click can be performed in any way, for example by pressing the operation button 12 of the operating body 10 (see FIG. 4). In this case, a vibration representing the operation (for example, the two halves of a crisp "click") may be generated when the operation button 12 is pressed and again when it is released.
(2) Drag and drag-and-drop operations (manipulating, moving, drawing, etc.)
 As with the click operations above, a drag or drag-and-drop produces a vibration or the like representing the operation. For example, while operating a scale (zoom) UI or scroll bar like the one shown in FIG. 7, the tactile presentation device 3 may produce a ticking vibration reminiscent of scale marks, as in the sketch below. A vibration indicating that a virtual object is in motion may be produced while it is being moved. Likewise, when a moving virtual object collides with another virtual object or with a real object, a vibration representing the collision may be produced; and when a moving virtual object is released in midair, falls under gravity, and hits a real object such as a desk, a vibration representing that impact may be produced.
 When the operating body 10 is used to draw a trail of lines, characters, numbers, symbols, figures, pictures, and so on on a virtual plane in midair, the tactile presentation device 3 presents, by vibration or the like, a tactile sensation representing how the operating body 10 feels to draw with on that virtual plane (its texture). The sensation may reflect the pen-type setting of the operating body 10: for example, a gritty, scratchy feel when a pencil is assumed, and a smooth squeak when a felt-tip pen is assumed.
(3) Poking, touching, moving, etc.
 When the operating body 10 touches a virtual object (for example, at the moment of contact), a vibration or the like representing the feel of that object provides the tactile presentation. The sensation may reflect the material setting of the touched virtual object: for a hard material such as plastic, short-period vibrations like the clacking of striking something rigid; for a soft material, continuous vibrations that vary with how far the operation unit 11 sinks in, felt as a squashy resistance. When contact between the virtual object and the operating body 10 is detected, the tactile presentation device 3 may also present a sensation whose strength depends on the collision speed of the operating body 10 against the virtual object: a strong vibration for a fast impact and a weak one for a slow impact, letting the user feel an impact matched to the strength of the collision.
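 The material- and speed-dependent behavior can be sketched as a lookup of per-material vibration presets scaled by impact speed; all presets and the speed cap below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Vibration:
    frequency_hz: float  # carrier / pulse-repetition frequency
    amplitude: float     # drive strength, 0..1
    duration_s: float

# Hard materials get short, clicky pulses; soft ones a longer, lower rumble.
MATERIALS = {
    "plastic": Vibration(frequency_hz=250.0, amplitude=0.8, duration_s=0.02),
    "soft":    Vibration(frequency_hz=60.0,  amplitude=0.5, duration_s=0.25),
}

def touch_feedback(material: str, impact_speed_mps: float,
                   max_speed_mps: float = 1.0) -> Vibration:
    """Scale the material preset's amplitude by collision speed, so a
    fast poke feels stronger than a slow one."""
    base = MATERIALS[material]
    scale = min(impact_speed_mps / max_speed_mps, 1.0)
    return Vibration(base.frequency_hz, base.amplitude * scale, base.duration_s)

print(touch_feedback("plastic", impact_speed_mps=0.8))
```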
(4) Force feedback
 When a virtual object is moved, a vibration or the like representing the force received from that object may present a force sensation. For example, poking a virtual object may produce a vibration felt as a reaction force opposite to the poking direction. If the virtual object is poked in a way that makes it rotate, a vibration conveying the rotational force may be produced; a vibration conveying a rotational reaction force may also be produced according to where the poked virtual object is displayed within the screen. In this way the user can also perceive the direction and motion of the force received from the virtual object.
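 The direction of the presented recoil can be derived directly from the poke: a vector opposite to the push, as in this tiny sketch (the gain is an assumed tuning constant):

```python
def reaction_force(push_vec, gain: float = 1.0):
    """Recoil vector opposite to the push direction: F = -gain * push.
    push_vec is an XYZ tuple; the result can drive a directional cue."""
    return tuple(-gain * c for c in push_vec)

# Poking the virtual object along +x yields recoil along -x.
print(reaction_force((0.5, 0.0, 0.0)))  # (-0.5, -0.0, -0.0)
```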
(5) Other applications
 When the tactile presentation device 3 presents force sensations, the force sensation on the side of a grasped virtual body may, for example, be presented to the hand opposite the one doing the grasping. This is achieved by having a tactile presentation device 3 different from the operating body 10 perform the presentation, which removes the need to vibrate the operating body 10 itself. Tactile presentation is not limited to vibration; it may also use pressure, for example the tightening of a VR device.
 When the operating body 10 comes within a predetermined distance of a designated contact- or approach-prohibited object (a real object), the tactile presentation device 3 presents a sensation indicating that contact or approach is prohibited. For example, if there is something dangerous to touch (such as something hot, like fire), a sensation indicating danger is presented once the operating body comes within the predetermined distance, before it actually touches. Similarly, a corresponding sensation may be presented when the operating body comes within a predetermined distance of a fragile real object such as a vase. In this way the virtual experience can proceed without trouble even when virtual bodies closely resemble (are hard to tell apart from) real ones.
 When the operating body 10 is an operating instrument, the tactile presentation device 3 presents a sensation that depends on the gripping force with which the user holds it. For example, when the operating body 10 is a pen-type instrument, the sensation changes with how tightly the user is gripping it, allowing the user to select the desired tactile sensation.
 The tactile presentation device 3 presents a sensation whose strength depends on the distance between the detection unit 8 and the operating body 10: weak when the distance is large, strong when it is small. This lets the user sense distance through touch.
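 A linear fade is one plausible mapping from that distance to vibration strength; the near and far bounds here are assumed tuning values:

```python
def intensity_from_distance(distance_m: float,
                            near_m: float = 0.2, far_m: float = 1.0) -> float:
    """Map detector-to-operating-body distance to an amplitude in 0..1:
    full strength at `near_m`, fading linearly to zero at `far_m`."""
    if distance_m <= near_m:
        return 1.0
    if distance_m >= far_m:
        return 0.0
    return (far_m - distance_m) / (far_m - near_m)

print(intensity_from_distance(0.6))  # 0.5, halfway through the fade
```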
 As noted above, the information processing system 1 may include a plurality of tactile presentation devices 3. In that case, control information is generated to control each of them, and which device presents the sensation is chosen according to predetermined conditions. For example, when the operating body 10 is an AR pen equipped with a tactile presentation device 3, tactile feedback for pen operations such as UI manipulation and drawing feel is given by vibrating the AR pen's own device, while feedback for the behavior of the virtual object after the pen operation (tilting, falling, etc.) is given by vibrating the tactile presentation device 3 on the main body side rather than the AR pen. The user can then intuitively tell from the sensation which kind of virtual-space phenomenon is occurring. Also, if the battery of the device currently presenting sensations (for example, the AR pen) runs out, presentation can switch to another tactile presentation device 3 (for example, the main-body smartphone), so tactile presentation continues without the user worrying about battery state.
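 The routing and battery fallback could look like the following sketch; the event names, device interface, and 10% cutoff are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class HapticDevice:
    name: str
    battery: float  # state of charge, 0..1

    def play(self, cue: str) -> None:
        print(f"{self.name}: {cue}")

def route_cue(event_kind: str, pen: HapticDevice, body: HapticDevice) -> HapticDevice:
    """Pen-side events (UI operation, stroke texture) go to the AR pen;
    object behavior after the operation (tilting, falling) goes to the
    main body. A device with a flat battery falls back to the other."""
    preferred = pen if event_kind in ("ui_operation", "stroke") else body
    fallback = body if preferred is pen else pen
    return preferred if preferred.battery > 0.1 else fallback

pen = HapticDevice("AR pen", battery=0.05)
phone = HapticDevice("smartphone", battery=0.80)
route_cue("stroke", pen, phone).play("stroke texture")  # falls back to phone
```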
 Together with the tactile sensation, at least one of a sound and an image representing it is presented. The sound is generated, for example, from the vibration waveform used for the tactile presentation, which makes it easy to produce audio that matches the sensation; conversely, the vibration for tactile presentation may be generated from a sound waveform representing the sensation. The image expresses the sensation visually, for example by shaking the display with a shake effect. A device incapable of tactile presentation may instead output the sound or display the image representing the sensation.
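 Reusing the haptic drive waveform as audio can be as simple as writing the same samples out as PCM. A sketch with a sine burst standing in for the drive signal (all parameters assumed):

```python
import math
import struct
import wave

def vibration_waveform(freq_hz: float, duration_s: float, rate: int = 8000):
    """A plain sine burst standing in for a haptic drive signal."""
    n = int(rate * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / rate) for i in range(n)], rate

def waveform_to_wav(samples, rate: int, path: str) -> None:
    """Write the drive waveform out as 16-bit mono PCM so it can double
    as the sound presented together with the vibration."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(rate)
        f.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))

samples, rate = vibration_waveform(freq_hz=180.0, duration_s=0.05)
waveform_to_wav(samples, rate, "tap_sound.wav")
```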
 Tactile presentation may be withheld for phenomena at locations not included in the displayed image (for example, not captured by the camera); conversely, it may be performed even for phenomena at such locations. Making these settings selectable, for example, improves usability for the user.
[1-5. Processing of the information processing device]
 Next, the processing in the information processing device 4 will be described. FIG. 8 is a flowchart showing an example of the flow of processing by the control unit 6. The order of the following steps may be changed as long as doing so does not interfere with any of them.
 First, the operating body recognition unit 63 recognizes the position, posture, and so on of the operating body 10 in the virtual space (step S1). That is, the position and posture in the virtual space of the operating body 10 (operation unit 11), such as an AR pen, haptic glove, or fingertip, are recognized.
 Next, the contact state between the operating body 10 (operation unit 11), which is a real object, and a virtual object such as an operation UI or virtual body is identified (step S2), and it is determined whether contact has occurred (step S3). As described above, whether contact has occurred can be determined from the distance between the operating body 10 (operation unit 11) and the virtual object.
 If it is determined in step S3 that contact has occurred, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S4). For example, the tactile presentation control unit 64 vibrates an operating body 10 equipped with a tactile presentation device 3, such as an AR pen or haptic glove, to present sensations related to the operation feel of the virtual UI, the drawing feel, texture, temperature, and so on. The tactile presentation may instead be performed by a tactile presentation device 3 other than the operating body 10, in which case the operating body 10 needs no battery, electric circuitry, or the like and can be made smaller and lighter.
 After the tactile presentation in step S4, or if it is determined in step S3 that no contact occurred, state changes of the virtual object, such as moving or falling to the floor, are identified (step S5), and it is determined whether a state change has occurred (step S6).
 If it is determined in step S6 that a state change has occurred, the tactile presentation control unit 64 controls the tactile presentation device 3 to perform tactile presentation (step S7). For example, the tactile presentation control unit 64 vibrates a tactile presentation device 3 different from the operating body 10, such as a mobile terminal or head-mounted display, to present sensations such as the recoil of tilting or the shock of hitting the ground.
 After the tactile presentation in step S7, or if it is determined in step S6 that no state change occurred, the processing by the control unit 6 ends.
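 The flow of FIG. 8 condenses into a short per-frame routine. In this sketch the five callables are assumed interfaces standing in for the recognition, detection, and presentation blocks, not an API from the disclosure:

```python
def process_frame(recognize_pose, detect_contact, detect_state_change,
                  pen_haptics, body_haptics) -> None:
    """One pass of FIG. 8: S1 recognize the operating body; S2/S3 test
    contact with a virtual object; S4 texture feedback on the pen side;
    S5/S6 test for object state changes; S7 feedback on the body side."""
    pose = recognize_pose()                       # S1
    contact = detect_contact(pose)                # S2 / S3
    if contact is not None:                       # S4
        pen_haptics(f"texture of {contact}")
    change = detect_state_change()                # S5 / S6
    if change is not None:                        # S7
        body_haptics(f"impact: {change}")

# Tiny stand-in environment for a single frame.
process_frame(
    recognize_pose=lambda: (0.0, 0.0, 0.45),
    detect_contact=lambda pose: "virtual button",
    detect_state_change=lambda: "fell onto the desk",
    pen_haptics=lambda cue: print("AR pen:", cue),
    body_haptics=lambda cue: print("phone:", cue),
)
```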
[1-6. Specific example of the information processing system]
 The information processing system 1 will now be described with a specific example. The following description takes as its example the case where the display device 2, tactile presentation device 3, and information processing device 4 described above are integrated into one unit, with content that lets the user draw lines, characters, and pictures in an AR space.
 FIG. 9 shows a specific example of using the information processing system 1. In the illustrated example, a single smartphone 100 integrally constitutes the display device 2, tactile presentation device 3, and information processing device 4 described above. That is, the smartphone 100 includes the display device 2, the tactile presentation device 3, and the information processing device 4 as one integrated unit.
 The smartphone 100 includes a display 21 as a component of the display device 2 described above, a vibrator 31 as a component of the tactile presentation device 3, and an imaging device 41 as a component of the detection unit 8 of the information processing device 4. It further includes an audio output device with a speaker 101 as its component.
 With the control unit 6 executing the program stored in the storage unit 5, the smartphone 100 projects characters and pictures the user draws in the air into the virtual space displayed on the display 21. In the illustrated example, the user draws a picture on a virtual plane in midair (it need not be an exact plane) using an AR pen 110, held in the right hand, as the operating body 10 (shown by the broken line in the figure). The control unit 6 detects the trajectory of the pen tip of the AR pen 110 and controls the display so that the drawn picture (invisible in the real space) appears in the virtual space shown on the display 21.
 Also, with the control unit 6 executing the program stored in the storage unit 5, the smartphone 100 makes the user feel the objects (display objects) in the virtual space described above. In this usage example, specifically, the smartphone 100 presents the tactile sensation to a part of the user (the left hand) different from the part operating the virtual object (the part the AR pen 110 touches). In the illustrated example, the feel of drawing a picture in space with the AR pen 110 in the right hand is presented, through the vibration of the vibrator 31, to the user's left hand holding the smartphone 100. That is, the tactile presentation goes not to the right hand gripping the AR pen 110 but to the left hand gripping the smartphone 100. The sensation, for example, makes the user feel as if drawing with a pen on a real, physical surface.
 The illustrated example uses not only touch but also hearing and sight. For example, outputting a drawing sound (such as a scratching noise) from the speaker 101 lets the user perceive the virtual-space object by ear, and displaying imagery related to the drawing feel on the display 21 (a wavy, jagged indication in the illustrated example) lets the user perceive it by eye. Phenomena in the virtual space can thus be felt through sight, hearing, and touch, giving the user a more realistic, advanced virtual experience comparable to the real space.
<2. Summary>
 As described above, the control unit 6 performs control to generate control information that controls the operation of the tactile presentation device 3 according to the distance between a virtual object and the real object operating it. This enables the wide variety of excellent tactile presentations described above. For example, by providing the information processing system 1 with a tactile presentation device 3 separate from the operating body 10 and having the control unit 6 control that device, tactile presentation on the operating body 10 itself becomes unnecessary. The operating body 10 then needs no battery, electric circuitry, or the like and can be made smaller and lighter, and tactile presentation remains possible even when the operating body 10 is part of the user's body, such as a hand. The body part serving as the operating body 10 or touching it is not limited to the hands and fingers, and neither is the part to which the tactile presentation device 3 presents sensations; making these two parts different, for example, enables excellent tactile presentation.
 Although an embodiment of the present technology has been described concretely above, the present technology is not limited to that embodiment, and various modifications based on its technical idea are possible, such as those described below. One or more arbitrarily selected modifications below may also be combined as appropriate, and the configurations, methods, steps, shapes, materials, numerical values, and the like of the embodiment described above can be combined with one another without departing from the gist of the present technology.
 In the embodiment described above, phenomena in the virtual space are caused mainly by contact between a virtual object and the operating body 10, but they may also be caused by approach within a predetermined distance rather than by contact. Tactile presentation can then occur without the operating body 10 touching the virtual object; for example, when the virtual object is something hot such as fire, the presentation may be triggered by approaching within the predetermined distance. For virtual objects associated with temperature in this way, it is preferable to have the tactile presentation device 3 present a temperature sensation: when the operating body 10 touches or approaches something cold (or warm), the temperature of the tactile presentation device 3 may be changed, for example by changing the temperature of a Peltier element it contains. This further heightens the sense of reality.
 In the embodiment described above, the processing of the information processing device 4 was explained for tactile presentation to a single user, but the display of the display device 2 and the tactile presentation of the tactile presentation device 3 may also be shared with other users' devices. In that case, the transmission/reception control unit 62 controls the image information so that it can be shared between the display device 2 and another display device (a display device for another user), and the control information so that it can be shared between the tactile presentation device 3 and another tactile presentation device (one for another user). For example, when two or more people (call them A and B) view the same shared virtual space, fine-grained tactile presentation becomes possible, such as presenting feedback about the operation of A's operating body 10 both on A's operating body 10 equipped with a tactile presentation device 3 and on B's main-body-side tactile presentation device 3 that is not an operating body.
 The present technology can also adopt the following configurations.
(1)
 An information processing device including a control unit that generates control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
(2)
 The information processing device according to (1), wherein the control unit controls the operation of a tactile presentation device different from the real object.
(3)
 The information processing device according to (1) or (2), wherein the virtual object is a virtual plane, the real object is an operating body that draws a trail on the virtual plane, and the control unit causes the tactile presentation device to present a tactile sensation representing the feel of drawing with the operating body on the virtual plane.
(4)
 The information processing device according to (3), wherein the control unit causes the tactile presentation device to present a tactile sensation according to a pen-type setting of the operating body.
(5)
 The information processing device according to any one of (1) to (4), wherein the control unit causes the tactile presentation device to present a tactile sensation representing the feel of the virtual object when contact between the virtual object and the real object is detected.
(6)
 The information processing device according to (5), wherein the control unit causes the tactile presentation device to present a tactile sensation according to a material setting of the virtual object.
(7)
 The information processing device according to any one of (1) to (6), wherein the control unit causes the tactile presentation device to present a tactile sensation representing a state change of the virtual object.
(8)
 The information processing device according to any one of (1) to (7), wherein the control unit causes the tactile presentation device to present a tactile sensation representing an operation performed on the virtual object with the real object.
(9)
 The information processing device according to any one of (1) to (8), wherein the control unit generates control information for controlling the operations of a plurality of tactile presentation devices and selects, according to a predetermined condition, which tactile presentation device presents the tactile sensation.
(10)
 The information processing device according to any one of (1) to (9), wherein the control unit performs control so that the control information can be shared between the tactile presentation device and another tactile presentation device different from it.
(11)
 The information processing device according to any one of (1) to (10), wherein the control unit generates control information for causing the tactile presentation device to present a sensation including any of touch in the narrow sense, pressure, a vibration sensation, position, movement, a sense of force, a temperature sensation, and pain.
(12)
 The information processing device according to any one of (1) to (11), wherein, when contact between the virtual object and the real object is detected, the control unit causes the tactile presentation device to present a tactile sensation whose strength depends on the collision speed of the real object against the virtual object.
(13)
 The information processing device according to any one of (1) to (12), wherein the control unit causes the tactile presentation device to present a tactile sensation indicating that contact or approach is prohibited when the real object comes within a predetermined distance of a contact- or approach-prohibited object.
(14)
 The information processing device according to any one of (1) to (13), wherein the real object is an operating instrument and the control unit causes the tactile presentation device to present a tactile sensation according to the gripping force with which a user holds the operating instrument.
(15)
 The information processing device according to any one of (1) to (14), wherein the control unit changes, according to a predetermined condition, the setting of the distance at which the virtual object and the real object are judged to be in contact.
(16)
 The information processing device according to any one of (1) to (15), further including a detection unit that detects the position of the real object, wherein the control unit causes the tactile presentation device to present a tactile sensation whose strength depends on the distance between the detection unit and the real object.
(17)
 An information processing method in which a processor performs control to generate control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
(18)
 A program that causes a computer to realize a control function of generating control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
(19)
 An information processing system including: a display device that displays a virtual object; a tactile presentation device that presents tactile sensations; and an information processing device connected to the display device and the tactile presentation device, wherein the information processing device generates control information for controlling the operation of the tactile presentation device according to the distance between the virtual object and a real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information.
 1 ... information processing system, 2 ... display device, 3 ... tactile presentation device, 4 ... information processing device, 5 ... storage unit, 6 ... control unit, 7 ... communication unit, 8 ... detection unit, 10 ... operating body, 11 ... operation unit, 62 ... transmission/reception control unit, 63 ... operating body recognition unit, 64 ... tactile presentation control unit, S ... information sharing server

Claims (19)

  1.  仮想オブジェクトと前記仮想オブジェクトを操作する実オブジェクトとの距離に応じて触覚提示装置の動作を制御する制御情報を生成する制御部を備える
     情報処理装置。
    An information processing device including a control unit that generates control information that controls the operation of the tactile presentation device according to the distance between the virtual object and the real object that operates the virtual object.
  2.  前記制御部は、前記実オブジェクトとは異なる触覚提示装置の動作を制御する
     請求項1に記載の情報処理装置。
    The information processing device according to claim 1, wherein the control unit controls the operation of a tactile presentation device different from the real object.
3. The information processing device according to claim 1, wherein the virtual object is a virtual plane, the real object is an operating body that draws a locus on the virtual plane, and the control unit causes the tactile presentation device to present a tactile sensation representing the feel of drawing with the operating body on the virtual plane.
4. The information processing device according to claim 3, wherein the control unit causes the tactile presentation device to present a tactile sensation according to a pen type setting of the operating body.
5. The information processing device according to claim 1, wherein the control unit causes the tactile presentation device to present a tactile sensation representing the texture of the virtual object when contact between the virtual object and the real object is detected.
6. The information processing device according to claim 5, wherein the control unit causes the tactile presentation device to present a tactile sensation according to a material setting of the virtual object.
7. The information processing device according to claim 1, wherein the control unit causes the tactile presentation device to present a tactile sensation representing a state change of the virtual object.
8. The information processing device according to claim 1, wherein the control unit causes the tactile presentation device to present a tactile sensation representing an operation performed on the virtual object using the real object.
9. The information processing device according to claim 1, wherein the control unit generates control information for controlling the operation of a plurality of tactile presentation devices and selects, according to a predetermined condition, which tactile presentation device is caused to present a tactile sensation.
10. The information processing device according to claim 1, wherein the control unit performs control such that the control information can be shared between the tactile presentation device and another tactile presentation device different from the tactile presentation device.
11. The information processing device according to claim 1, wherein the control device generates control information for causing the tactile presentation device to present a sensation including any of a tactile sensation in a narrow sense, a pressure sensation, a vibration sensation, a sense of position, a sense of movement, a sense of force, a temperature sensation, and a pain sensation.
12. The information processing device according to claim 1, wherein the control unit, when contact between the virtual object and the real object is detected, causes the tactile presentation device to present a tactile sensation having a strength corresponding to the collision speed of the real object with the virtual object.
13. The information processing device according to claim 1, wherein the control unit causes the tactile presentation device to present a tactile sensation indicating prohibition of contact or approach when the real object comes within a predetermined distance of an object with which contact or approach is prohibited.
14. The information processing device according to claim 1, wherein the real object is an operating instrument, and the control unit causes the tactile presentation device to present a tactile sensation corresponding to a gripping force with which a user grips the operating instrument.
15. The information processing device according to claim 1, wherein the control unit changes, according to a predetermined condition, the setting of the distance for determining that the virtual object and the real object are in contact with each other.
16. The information processing device according to claim 1, further comprising a detection unit that detects the position of the real object, wherein the control unit causes the tactile presentation device to present a tactile sensation having a strength corresponding to the distance between the detection unit and the real object.
17. An information processing method in which a processor performs control to generate control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
18. A program that causes a computer to realize a control function of generating control information for controlling the operation of a tactile presentation device according to the distance between a virtual object and a real object that operates the virtual object.
19. An information processing system comprising: a display device that displays a virtual object; a tactile presentation device that presents a tactile sensation; and an information processing device connected to the display device and the tactile presentation device, wherein the information processing device generates control information for controlling the operation of the tactile presentation device according to the distance between the virtual object and a real object that operates the virtual object, and the tactile presentation device presents a tactile sensation based on the control information (an illustrative sketch follows the claims).
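To make the system of claim 19 concrete, the following is a hedged end-to-end sketch, again in Python. The claim prescribes only the device roles and the distance-dependent control information; every interface below (Display.draw, TactileDevice.play, the dict used as control information, the 0.01 m threshold) is an assumption made for this example, not part of the application.

```python
# Illustrative sketch of the claim 19 system; every interface below is assumed.
import math

class Display:
    """2: display device - renders the virtual object."""
    def draw(self, virtual_pos):
        pass  # rendering omitted in this sketch

class TactileDevice:
    """3: tactile presentation device - presents a tactile sensation."""
    def play(self, command):
        print(f"presenting: {command}")

class InformationProcessingDevice:
    """4: generates control information from the virtual/real distance."""
    def __init__(self, display, tactile_devices):
        self.display = display
        self.tactile_devices = tactile_devices  # several devices, cf. claims 9 and 10

    def update(self, virtual_pos, real_pos, contact_threshold_m=0.01):
        self.display.draw(virtual_pos)
        distance = math.dist(virtual_pos, real_pos)
        if distance <= contact_threshold_m:  # contact judged by distance
            command = {"waveform": "contact", "intensity": 1.0}
            for device in self.tactile_devices:  # control info shared, cf. claim 10
                device.play(command)

# Usage: two tactile devices receive the same control information on contact.
ipd = InformationProcessingDevice(Display(), [TactileDevice(), TactileDevice()])
ipd.update(virtual_pos=(0.0, 0.0, 0.0), real_pos=(0.0, 0.0, 0.005))
```

Distributing one command to every registered device is one plausible reading of the shared control information of claim 10; per-device commands selected under a predetermined condition (claim 9) would be an equally valid structure.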
PCT/JP2021/011342 2020-03-27 2021-03-19 Information processing device, information processing method, program, and information processing system WO2021193421A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-057037 2020-03-27
JP2020057037 2020-03-27

Publications (1)

Publication Number Publication Date
WO2021193421A1 true WO2021193421A1 (en) 2021-09-30

Family

ID=77892522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011342 WO2021193421A1 (en) 2020-03-27 2021-03-19 Information processing device, information processing method, program, and information processing system

Country Status (1)

Country Link
WO (1) WO2021193421A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009011748A (en) * 2007-07-09 2009-01-22 Nintendo Co Ltd Game program, and game apparatus
JP2010287221A (en) * 2009-05-11 2010-12-24 Univ Of Tokyo Haptic device
JP2014222492A (en) * 2013-05-14 2014-11-27 株式会社東芝 Drawing device and drawing system
JP2015212946A (en) * 2014-05-05 2015-11-26 イマージョン コーポレーションImmersion Corporation Systems and methods for viewport-based augmented reality haptic effects
JP2016062593A (en) * 2015-07-09 2016-04-25 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP2018088260A (en) * 2008-07-15 2018-06-07 イマージョン コーポレーションImmersion Corporation System and method for shifting haptic feedback function between passive and active modes
WO2018116544A1 (en) * 2016-12-19 2018-06-28 ソニー株式会社 Information processing device, information processing method, and program
WO2018193650A1 (en) * 2017-04-18 2018-10-25 株式会社ソニー・インタラクティブエンタテインメント Vibration control device


Similar Documents

Publication Publication Date Title
JP6616546B2 (en) Tactile device incorporating stretch characteristics
US10338681B2 (en) Systems and methods for multi-output electrostatic haptic effects
US10564730B2 (en) Non-collocated haptic cues in immersive environments
EP3425481B1 (en) Control device
US9134797B2 (en) Systems and methods for providing haptic feedback to touch-sensitive input devices
US10317997B2 (en) Selection of optimally positioned sensors in a glove interface object
JP2022000154A (en) Game device and information processing device
US10474238B2 (en) Systems and methods for virtual affective touch
JP6761225B2 (en) Handheld information processing device
KR20050021500A (en) Hand-held computer interactive device
EP3333674A1 (en) Systems and methods for compliance simulation with haptics
EP3367216A1 (en) Systems and methods for virtual affective touch
WO2021193421A1 (en) Information processing device, information processing method, program, and information processing system
KR101528485B1 (en) System and method for virtual reality service based in smart device
WO2024090303A1 (en) Information processing device and information processing method
WO2023189405A1 (en) Input/output device
JP5773818B2 (en) Display control apparatus, display control method, and computer program
WO2022065120A1 (en) Information processing device, information processing method, and program
JP2020072806A (en) Game device and information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775665

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP