WO2018216327A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2018216327A1
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
information processing
display
remote user
display control
Prior art date
Application number
PCT/JP2018/010433
Other languages
English (en)
Japanese (ja)
Inventor
高橋 慧
石川 毅
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2018216327A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Communication tools for remote communication are used in various fields such as business, education, and entertainment.
  • There is demand for presenting a user existing in a remote place (a remote user) to a user existing in a local environment (a local user), thereby realizing natural communication closer to reality.
  • Patent Literature 1 describes a technique for presenting an interaction between a virtual object and a real object.
  • In general remote communication, the position of the camera that captures the local user differs from the position of the display device that displays the remote user. For this reason, the users may feel that their lines of sight do not meet even though they are looking at each other. Even if the AR technology described above is applied to a communication tool and a remote user is presented as a virtual object in real space, depending on the position at which the remote user is presented, the users may still feel that their lines of sight do not match. As a result, smooth communication may be hindered.
  • the present disclosure proposes an information processing apparatus, an information processing method, and a program capable of realizing smoother communication in remote communication.
  • According to the present disclosure, there is provided an information processing apparatus including a display control unit that controls display so that an object indicating a remote user is visually recognized by a local user at a position in the real space corresponding to the position of an imaging device existing in the real space, based on relative position information indicating a relative positional relationship with the imaging device.
  • According to the present disclosure, there is also provided an information processing method including controlling, by a processor, display so that an object indicating a remote user is visually recognized by a local user at a position in the real space corresponding to the position of the imaging device, based on relative position information indicating a relative positional relationship with an imaging device existing in the real space.
  • Further, according to the present disclosure, there is provided a program for realizing a function of controlling display so that an object indicating a remote user is visually recognized by a local user at a position in the real space corresponding to the position of the imaging device, based on relative position information indicating a relative positional relationship with the imaging device existing in the real space.
  • FIG. 8 is a schematic diagram illustrating the local user's field of view V10 when the display control unit 135 according to Modification 1 displays an avatar in the vicinity of the imaging device 30.
  • FIG. 9 is a block diagram illustrating a configuration example of a communication system 1-2 according to Modification 2. FIG. 10 is an explanatory diagram showing an example of the animation displayed by the display control unit 135 according to the same modification.
  • FIG. 11 is a block diagram illustrating a configuration example of a communication system 1-3 according to Modification 3. FIG. 12 is an explanatory diagram showing an example in which the display control unit 135 according to the same modification displays the avatars of a plurality of remote users. FIG. 13 is an explanatory diagram showing a hardware configuration example.
  • In the present specification and drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by adding different letters after the same reference numeral.
  • However, when it is not necessary to particularly distinguish each of a plurality of constituent elements having substantially the same functional configuration, only the same reference numeral is given.
  • FIGS. 1 and 2 are explanatory diagrams for describing an overview of a communication system according to an embodiment of the present disclosure.
  • the communication system according to the present embodiment is an information processing system that realizes remote communication between a local user LU1 existing in the local environment 1000 shown in FIG. 1 and a remote user RU1 existing in the remote environment 2000.
  • The local environment 1000 and the remote environment 2000 may be any environments in real space where real objects may exist. Further, the local environment 1000 and the remote environment 2000 may be far apart; for example, they may be in a situation where direct face-to-face communication is difficult.
  • the local terminal 10 shown in FIG. 1 is an information processing apparatus that exists in the local environment 1000 and is used by the local user LU1.
  • The local terminal 10 is a glasses-type device that has a transmissive (optical see-through) display unit arranged in front of one or both eyes of the local user LU1 and is worn on the head of the local user LU1.
  • the remote terminal 20 shown in FIG. 1 is an information processing apparatus that exists in the remote environment 2000 and is used by the remote user RU1.
  • The remote terminal 20 is an immersive HMD (Head Mounted Display) that has a display unit arranged in front of one or both eyes of the remote user RU1 and is worn on the head of the remote user RU1.
  • the imaging device 30 shown in FIG. 1 is a so-called omnidirectional camera that can acquire 360-degree omnidirectional images in all directions in the vertical and horizontal directions by imaging.
  • Note that an image is not limited to a still image and is used herein as an expression that includes a moving image.
  • the local terminal 10, the remote terminal 20, and the imaging device 30 according to the present embodiment are connected to each other via a communication network (not shown).
  • The communication system according to the present embodiment realizes remote communication by the display function described below, in addition to, for example, a message intercommunication function and a message output (display or sound output) function between the local terminal 10 and the remote terminal 20.
  • the apparatus used for remote communication is not limited to the example shown in FIG. 1, and examples of other apparatuses will be described later.
  • The remote terminal 20 displays a display image G20 based on the imaging of the imaging device 30 existing in the local environment 1000.
  • the remote terminal 20 may generate a display image G20 by cutting out an area corresponding to the face direction of the remote user RU1 from the omnidirectional image acquired by imaging of the imaging device 30.
  • the remote user RU1 can observe the local environment 1000 in all directions from the viewpoint of the imaging device 30 by changing the face orientation.
  • the remote user RU1 can obtain a sense of facing the local user LU1 by turning his face so that the local user LU1 is included in the display image G20.
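  • The patent does not give an implementation of this cut-out step; the following is a minimal illustrative sketch in Python (all names are hypothetical), assuming the omnidirectional image is stored in equirectangular format and skipping true perspective reprojection.

        import numpy as np

        def crop_viewport(pano: np.ndarray, yaw_deg: float, pitch_deg: float,
                          h_fov_deg: float = 90.0, v_fov_deg: float = 60.0) -> np.ndarray:
            # Select the part of an equirectangular panorama that matches the
            # remote user's face orientation (no perspective reprojection).
            height, width = pano.shape[:2]
            cx = int((yaw_deg + 180.0) / 360.0 * width) % width   # viewport center x
            cy = int((90.0 - pitch_deg) / 180.0 * height)         # viewport center y
            half_w = int(h_fov_deg / 360.0 * width) // 2
            half_h = int(v_fov_deg / 180.0 * height) // 2
            rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, height - 1)
            cols = np.arange(cx - half_w, cx + half_w) % width    # wraps at the 360-degree seam
            return pano[np.ix_(rows, cols)]

        # Example: a 2048x1024 panorama, remote user facing 30 degrees to the right.
        pano = np.zeros((1024, 2048, 3), dtype=np.uint8)
        view = crop_viewport(pano, yaw_deg=30.0, pitch_deg=0.0)   # shape (340, 512, 3)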
  • the local terminal 10 displays a virtual object (hereinafter referred to as an avatar) indicating the remote user RU1.
  • an avatar a virtual object indicating the remote user RU1.
  • Since the local terminal 10 has a transmissive display unit (not shown), the local user LU1 wearing the local terminal 10 can visually recognize the avatar together with the real space through the display unit. The local user LU1 can therefore obtain a feeling as if the avatar of the remote user RU1 exists in the real space in front of him or her.
  • the field of view V10 viewed by the local user LU1 through the display unit of the local terminal 10 includes the avatar A1 of the remote user RU1 together with the real space.
  • The local terminal 10 according to the present embodiment displays the avatar A1 on the transmissive display unit so that it is visually recognized at a position in the real space corresponding to the position of the imaging device 30, based on relative position information indicating the relative positional relationship with the imaging device 30 in the real space. For example, in the field of view V10 shown in FIG. 1, the avatar A1 is superimposed on the imaging device 30 and is visually recognized by the local user LU1.
  • Therefore, the local user LU1 turns his or her line of sight toward the imaging device 30 when trying to look at the avatar A1.
  • Then, when the remote user RU1 sees the local user LU1 in the display image G20 displayed on the remote terminal 20, the remote user RU1 can obtain a feeling that his or her line of sight meets that of the local user LU1.
  • Further, the local user LU1 feels as if the remote user RU1 exists at the position of the imaging device 30 and observes the local environment 1000 from that position. Since the remote user RU1 actually sees an image based on the imaging of the imaging device 30, this feeling is consistent with reality, and smooth communication can be realized.
  • the local terminal 10 may change the orientation (face orientation) of the face A11 of the avatar A1 to be displayed according to the face orientation of the remote user RU1.
  • Information regarding the face orientation of the remote user RU1 may be acquired by a sensor included in the remote terminal 20, for example, and provided to the local terminal 10.
  • FIG. 2 shows a display example of the local terminal 10 and the remote terminal 20 when the remote user RU1 faces the right side.
  • When the face orientation of the remote user RU1 changes, the region that the remote terminal 20 cuts out from the omnidirectional image as the display image also changes. Therefore, the display image G20 differs between FIG. 1 and FIG. 2.
  • the orientation of the face A11 of the avatar A1 displayed by the local terminal 10 also changes according to the face orientation of the remote user RU1.
  • With this configuration, the local user LU1 can grasp the angle of view of the imaging device 30 corresponding to the field of view of the remote user RU1, that is, what range of the local environment 1000 the remote user RU1 is viewing. The local user LU1 can thus communicate while grasping the scenery that the remote user RU1 is viewing, which makes communication smoother.
  • FIG. 3 is a block diagram illustrating a configuration example of the communication system according to the present embodiment.
  • As illustrated in FIG. 3, the communication system 1 is an information processing system including a local terminal 10, a remote terminal 20, an imaging device 30, a sensor device 40, a distribution server 50, a sensor device 60, and a communication network 70.
  • the local terminal 10 is a glasses-type device that is worn on the head of a local user existing in the local environment 1000 as described with reference to FIG.
  • the local terminal 10 has a transmissive display unit that displays a remote user's avatar at a position corresponding to the position of the imaging device 30 in real space. A more detailed configuration of the local terminal 10 will be described later with reference to FIG.
  • the remote terminal 20 is an immersive HMD attached to the head of a remote user existing in the remote environment 2000 as described with reference to FIG. Further, the remote terminal 20 displays an image based on the imaging of the imaging device 30. A more detailed configuration of the remote terminal 20 will be described later with reference to FIG.
  • the imaging device 30 is an omnidirectional camera that can acquire 360-degree omnidirectional images in all directions in the vertical and horizontal directions by imaging.
  • the imaging device 30 may have a plurality of imaging units, for example, and may acquire an omnidirectional image by performing image processing for combining images obtained by the plurality of imaging units.
  • the imaging device 30 transmits the omnidirectional image to the distribution server 50 via the communication network 70.
  • The imaging device 30 may not have an image processing function; in such a case, for example, another information processing apparatus connected to the imaging device 30, or the distribution server 50 described later, may have the image processing function instead.
  • the sensor device 40 acquires sensing data related to the local environment 1000 by sensing.
  • The sensor device 40 may include a plurality of sensors, such as a camera, an infrared camera, a microphone, a depth sensor, an illuminance sensor, and a human presence sensor.
  • the sensor device 40 provides (transmits) the acquired sensing data to the local terminal 10.
  • the distribution server 50 is an information processing apparatus that distributes (transmits) the omnidirectional image received from the imaging apparatus 30 via the communication network 70 to another apparatus (for example, the remote terminal 20).
  • the distribution server 50 may perform streaming distribution while caching the omnidirectional image received from the imaging device 30.
  • the sensor device 60 acquires sensing data related to the remote environment 2000 by sensing.
  • The sensor device 60 may include a plurality of sensors, such as a camera, an infrared camera, a microphone, a depth sensor, an illuminance sensor, and a human presence sensor.
  • the sensor device 60 provides (transmits) the acquired sensing data to the remote terminal 20.
  • the communication network 70 is a wired or wireless transmission path for information transmitted from a device or system connected to the communication network 70.
  • the communication network 70 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various LANs (Local Area Network) including Ethernet (registered trademark), WAN (Wide Area Network), and the like.
  • the communication network 70 may include a dedicated line network such as IP-VPN (Internet Protocol-Virtual Private Network).
  • FIG. 4 is a block diagram illustrating a configuration example of the local terminal 10 according to the present embodiment.
  • the local terminal 10 according to the present embodiment is an information processing apparatus including a sensor unit 11, a communication unit 12, a control unit 13, a display unit 14, a sound output unit 15, and a storage unit 16.
  • the sensor unit 11 acquires sensing data regarding the local user wearing the local terminal 10 and the surrounding environment (local environment 1000) by sensing.
  • the sensor unit 11 may include, for example, an acceleration sensor, a gyro sensor, a camera, a microphone, a geomagnetic sensor, a force sensor, and the like.
  • The sensing data acquired by the sensor unit 11 may include information on the position and orientation of the local terminal 10.
  • the sensor unit 11 provides the acquired sensing data to the control unit 13.
  • the communication unit 12 is a communication interface that mediates communication between the local terminal 10 and other devices.
  • the communication unit 12 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device, for example, via the communication network 70 described with reference to FIG. 3 or directly.
  • the control unit 13 controls the operation of each component of the local terminal 10.
  • the control unit 13 also functions as a communication control unit 131, a relative position acquisition unit 133, a display control unit 135, and a sound output control unit 137, as shown in FIG.
  • The communication control unit 131 shown in FIG. 4 controls communication by the communication unit 12, acquiring various types of information from other devices and transmitting information to other devices.
  • For example, the communication control unit 131 causes voice data acquired by a microphone included in the sensor unit 11 to be transmitted to the remote terminal 20 as a message.
  • the communication control unit 131 may cause text data input via an input device (not shown) to be transmitted to the remote terminal 20 as a message.
  • the communication control unit 131 may receive user information regarding the remote user wearing the remote terminal 20 from the remote terminal 20.
  • The user information may include, for example, remote user identification information, remote user status information, remote user posture information, and messages (text data, voice data, etc.) transmitted by the remote user.
  • the communication control unit 131 may receive sensing data from the sensor device 40 described with reference to FIG.
  • the relative position acquisition unit 133 acquires relative position information indicating a relative positional relationship with the imaging device 30 in real space.
  • The relative position information may be, for example, information (coordinates) indicating the position of the imaging device 30 expressed in a coordinate system with the current local terminal 10 as a reference.
  • the relative position acquisition unit 133 can acquire the relative position information by various methods. Several examples of the relative position information acquisition method will be described below.
  • the relative position acquisition unit 133 may acquire the relative position information by detecting the imaging device 30 from the image acquired by the camera included in the sensor unit 11.
  • To facilitate detection, the imaging device 30 may be provided with a detection marker, or the imaging device 30 may include a light emitting unit that emits light with a predetermined light emission pattern.
  • the relative position acquisition unit 133 may acquire the relative position information by detecting the local terminal 10 by image recognition from the omnidirectional image acquired by the imaging device 30.
  • Likewise, the local terminal 10 may be provided with a detection marker, or the local terminal 10 may include a light emitting unit that emits light with a predetermined light emission pattern.
  • the local terminal 10 may acquire an omnidirectional image directly from the imaging device 30 or via the communication network 70 or the distribution server 50.
  • The relative position acquisition unit 133 can also acquire the relative position information based on the coordinates of the imaging device 30 in an absolute coordinate system (hereinafter referred to as absolute coordinates) and the position and orientation information of the local terminal 10.
  • The position and orientation information of the local terminal 10 may be included in the sensing data acquired by the sensor unit 11, or may be specified based on the sensing data using a self-position estimation technique such as SLAM (Simultaneous Localization and Mapping).
  • the absolute coordinates of the imaging device 30 may be stored, for example, in the storage unit 16 described later, or may be acquired from another device (for example, the imaging device 30) via the communication network 70.
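  • As one illustrative sketch of this computation (not taken from the patent; the names and frame conventions are assumptions), the camera's absolute coordinates can be converted into the terminal's own frame with a single rigid-body transform:

        import numpy as np

        def camera_in_terminal_frame(cam_world: np.ndarray,
                                     terminal_pos: np.ndarray,
                                     terminal_rot: np.ndarray) -> np.ndarray:
            # terminal_rot maps terminal-frame vectors to world-frame vectors
            # (e.g. from a SLAM pose estimate), so its transpose does the inverse.
            return terminal_rot.T @ (cam_world - terminal_pos)

        # Example: camera 2 m ahead of the world origin; terminal 1 m to the
        # side, rotated 90 degrees about the vertical axis.
        th = np.pi / 2
        R = np.array([[np.cos(th), 0.0, np.sin(th)],
                      [0.0,        1.0, 0.0],
                      [-np.sin(th), 0.0, np.cos(th)]])
        rel = camera_in_terminal_frame(np.array([0.0, 0.0, 2.0]),
                                       np.array([1.0, 0.0, 0.0]), R)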
  • the relative position acquisition unit 133 may detect the imaging device 30 from the image acquired by the camera included in the sensor device 40 and specify the absolute coordinates of the imaging device 30.
  • a detection marker may be provided in the imaging device 30, or the imaging device 30 may include a light emitting unit to facilitate detection by a light emission pattern.
  • Alternatively, the relative position acquisition unit 133 may specify the absolute coordinates of the imaging device 30 by matching the omnidirectional image acquired by the imaging device 30 against an image of the local environment 1000 captured in advance. By dynamically specifying the absolute coordinates of the imaging device 30 in this way, the relative position information can be acquired even when the imaging device 30 moves.
  • the relative position information may be acquired by a method other than the above.
  • the relative position information may be acquired by detecting the imaging device 30 from sensing data other than the image acquired by the sensor unit 11.
  • the relative position information may be acquired by the communication unit 12 receiving the relative position information specified by another device.
  • The display control unit 135 controls display by the transmissive display unit 14 so that an object indicating the remote user is visually recognized by the local user at a position in the real space corresponding to the position of the imaging device 30 existing in the real space.
  • the object indicating the remote user may be, for example, a virtual avatar, or may be an image of the remote user captured by the camera included in the sensor device 60 described with reference to FIG.
  • the display control unit 135 displays a remote user's avatar as an object indicating the remote user.
  • the display control unit 135 may display the avatar of the remote user so that the imaging device 30 and the avatar in the real space are superimposed and viewed by the local user.
  • Here, the imaging device 30 and the avatar being superimposed and visually recognized means that at least a part of the imaging device 30 and at least a part of the avatar overlap as seen by the local user.
  • With such a configuration, the local user turns his or her line of sight toward the imaging device 30 when trying to look at the remote user's avatar. Then, when the remote user sees the local user in the image displayed on the remote terminal 20, the remote user obtains a feeling that his or her line of sight meets that of the local user.
  • Further, the local user feels as if the remote user is present at the position of the imaging device 30 and observes the local environment 1000 with the position of the imaging device 30 as a viewpoint. Since the remote user actually sees an image based on the imaging of the imaging device 30 (the image displayed on the remote terminal 20), this feeling is consistent with reality, and smoother communication can be realized.
  • The display control unit 135 may also display the avatar so that an imaging unit (not shown) of the imaging device 30 and the eyes of the avatar are superimposed and visually recognized. With such a configuration, the local user can more easily imagine the viewpoint of the remote user, and smoother communication can be realized.
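  • A minimal sketch of this placement rule (hypothetical names; the patent specifies only the visual outcome): position the avatar root so that its eyes coincide with the camera's imaging unit.

        import numpy as np

        def place_avatar(lens_pos: np.ndarray, eye_offset: np.ndarray) -> np.ndarray:
            # Root position such that root + eye_offset == lens position,
            # i.e. the avatar's eyes land exactly on the imaging unit.
            return lens_pos - eye_offset

        # Example: lens 1.1 m high, 2 m ahead; eyes 1.55 m above the avatar root.
        root = place_avatar(np.array([0.0, 1.1, 2.0]), np.array([0.0, 1.55, 0.0]))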
  • The display control unit 135 may control the posture of the avatar based on posture information indicating the posture of the remote user, received from the remote terminal 20 via the communication unit 12. For example, when the posture information includes information on the face orientation of the remote user and the avatar includes a face (a part that appears to be a face), the display control unit 135 may control the orientation of the displayed face based on the information on the face orientation.
  • With such a configuration, the local user can grasp what range of the local environment 1000 the remote user is viewing by checking the face direction of the avatar. The local user can then communicate while grasping the scenery viewed by the remote user, which makes communication smoother.
  • Note that posture control by the display control unit 135 is not limited to control of face orientation; the posture of any part of the avatar corresponding to the human body can be controlled according to information included in the posture information.
  • For example, the display control unit 135 may control the postures of the avatar's hands, arms, and body, respectively. With such a configuration, the local user can feel more strongly as if the remote user exists in the local environment 1000.
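  • The sketch below applies each received posture entry to the matching avatar part; the joint names are hypothetical, since the patent does not define a posture format.

        def apply_posture(avatar_joints: dict, posture_info: dict) -> dict:
            # Overwrite each joint with the received value; parts the avatar
            # lacks are ignored, unreported parts keep their previous pose.
            return {joint: posture_info.get(joint, angle)
                    for joint, angle in avatar_joints.items()}

        # Example: the remote terminal reports head yaw and a raised right arm.
        avatar = {"head_yaw_deg": 0.0, "head_pitch_deg": 0.0,
                  "left_arm_deg": 0.0, "right_arm_deg": 0.0}
        avatar = apply_posture(avatar, {"head_yaw_deg": 35.0, "right_arm_deg": 80.0})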
  • the display control unit 135 may display an avatar corresponding to the remote user.
  • For example, a plurality of avatars may be stored in the storage unit 16 in association with remote user identification information, and the display control unit 135 may select the avatar corresponding to the remote user identification information received from the remote terminal 20 via the communication unit 12 and display the selected avatar.
  • the local user can identify the remote user via the avatar.
  • the display control unit 135 may display a message transmitted by the remote user.
  • the message transmitted by the remote user is received from the remote terminal 20 via the communication unit 12, for example.
  • the message displayed by the display control unit 135 is not limited to text, and may include electronic data such as image data and document data.
  • the display control unit 135 may display an icon indicating electronic data.
  • the icon displayed by the display control unit 135 may be an image.
  • FIG. 5 is a schematic diagram showing the field of view V10 of the local user when the display control unit 135 displays a message.
  • In the example shown in FIG. 5, the display control unit 135 displays the message M10 in the vicinity of the avatar A1.
  • More specifically, the display control unit 135 displays the message M10 as a balloon extending from the position of the avatar A1.
  • the display control unit 135 may display an icon M11 indicating electronic data as shown in FIG.
  • the display control unit 135 may control the display according to the state of the remote user.
  • the display control unit 135 may display an indicator (for example, an icon or text) indicating the state of the remote user in the vicinity of the remote user's avatar.
  • the display control unit 135 may control parameters related to the display of the avatar according to the state of the remote user. Parameters relating to avatar display may include, for example, luminance, color, saturation, transparency, posture, and the like.
  • information indicating the status of the remote user may be received from the remote terminal 20 via the communication unit 12.
  • The information indicating the status of the remote user may include, for example, information on whether the remote user is online, whether the remote user is wearing the remote terminal 20, and whether the remote user has permitted display of the avatar.
  • the local user can grasp the status of the remote user.
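  • As one illustrative mapping (the flag names and parameter values are assumptions, not from the patent), such state information could drive avatar display parameters like this:

        def avatar_display_params(online: bool, wearing: bool, permitted: bool) -> dict:
            if not permitted:          # avatar display not allowed by the remote user
                return {"visible": False}
            if not online:             # offline: ghosted (translucent, desaturated)
                return {"visible": True, "alpha": 0.3, "saturation": 0.0}
            if not wearing:            # online but the remote terminal is taken off
                return {"visible": True, "alpha": 0.6, "badge": "away"}
            return {"visible": True, "alpha": 1.0}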
  • The display control unit 135 may also receive, from the distribution server 50 or the imaging device 30, information on whether the image of the omnidirectional camera is being transmitted normally to the remote terminal 20, and may control display based on that information. In such a case, the display control unit 135 may display an icon or text indicating whether the image is being transmitted normally, or may control parameters related to the display of the avatar.
  • Further, the display control unit 135 may control display according to whether voice communication between the local terminal 10 and the remote terminal 20 is being performed normally. In such a case, the display control unit 135 may display an icon or text indicating whether the voice communication is being performed normally, or may control parameters related to the display of the avatar.
  • The sound output control unit 137 may control the sound output unit 15 to output a message transmitted by the remote user as sound.
  • the sound output control unit 137 may control the sound output so that the message can be heard from the position in the real space where the avatar is visually recognized by the local user. With this configuration, the local user can obtain a stronger feeling as if the remote user exists in the local environment 1000.
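  • A rough sketch of such localized playback (assuming simple constant-power stereo panning; the patent does not prescribe a spatialization method):

        import math

        def stereo_gains(avatar_xz, listener_xz, listener_yaw_rad):
            # Bearing of the avatar relative to where the listener is facing
            # (x: right, z: forward), mapped to a constant-power stereo pan.
            dx = avatar_xz[0] - listener_xz[0]
            dz = avatar_xz[1] - listener_xz[1]
            rel = math.atan2(dx, dz) - listener_yaw_rad
            pan = max(-1.0, min(1.0, math.sin(rel)))   # -1 = hard left, +1 = hard right
            theta = (pan + 1.0) * math.pi / 4.0
            return math.cos(theta), math.sin(theta)    # (left gain, right gain)

        # Example: avatar at the camera position, 2 m ahead and 1 m to the right.
        left, right = stereo_gains((1.0, 2.0), (0.0, 0.0), 0.0)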
  • the display unit 14 is a display that is controlled by the display control unit 135 to display various information including objects such as avatars.
  • the display unit 14 may be a transmissive (optical see-through) display. With such a configuration, the local user wearing the local terminal 10 can view the real space and the information displayed on the display unit 14 at the same time.
  • the display unit 14 may be a non-transparent display unit, and such a case will be described later as a modified example.
  • The display unit 14 may be capable of presenting different images to the two eyes of the local user wearing the local terminal 10; in that case, the local user can recognize the shape and position of a displayed object three-dimensionally.
  • the sound output unit 15 performs sound output under the control of the sound output control unit 137.
  • The sound output unit 15 may include, for example, a plurality of speakers, and may be capable of outputting sound three-dimensionally (stereophonically).
  • the storage unit 16 stores a program for the control unit 13 to execute the above processes and various data.
  • the storage unit 16 may store the avatar information described above and a history of information received from the remote terminal 20 via the communication unit 12.
  • the function of the storage unit 16 may exist in an external device, and the local terminal 10 may receive information stored in a storage unit included in the external device via the communication unit 12, for example.
  • the configuration example of the local terminal 10 has been described above with reference to FIG. 4, but the example illustrated in FIG. 4 is an example, and the present technology is not limited to the example.
  • some functions shown in FIG. 4 may be provided in another information processing apparatus connected to the local terminal 10.
  • For example, some or all of the functions of the control unit 13 may be provided in another information processing apparatus connected to the local terminal 10, and the local terminal 10 may perform display or sound output under the control of the other information processing apparatus.
  • the relative position information may be information indicating a relative position between the local terminal 10 that performs display and the imaging device 30.
  • FIG. 6 is a block diagram illustrating a configuration example of the remote terminal 20 according to the present embodiment.
  • the remote terminal 20 according to the present embodiment is an information processing apparatus including a sensor unit 21, a communication unit 22, a control unit 23, a display unit 24, a sound output unit 25, and a storage unit 26.
  • the sensor unit 21 acquires sensing data related to the remote user wearing the remote terminal 20 and the surrounding environment (remote environment 2000) by sensing.
  • the sensor unit 21 may include, for example, an acceleration sensor, a gyro sensor, a camera, a microphone, a geomagnetic sensor, a force sensor, and the like.
  • The sensing data acquired by the sensor unit 21 may include posture information related to the posture (for example, the face orientation) of the remote user.
  • the sensor unit 21 provides the acquired sensing data to the control unit 23.
  • the communication unit 22 is a communication interface that mediates communication between the remote terminal 20 and other devices.
  • the communication unit 22 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device, for example, via the communication network 70 described with reference to FIG.
  • the control unit 23 controls the operation of each component of the remote terminal 20. Further, as shown in FIG. 6, the control unit 23 also functions as a communication control unit 231, a display control unit 235, and a sound output control unit 237.
  • The communication control unit 231 illustrated in FIG. 6 controls communication by the communication unit 22, acquiring various types of information from other devices and transmitting information to other devices.
  • For example, the communication control unit 231 transmits, to the local terminal 10, user information including information indicating the status of the remote user, posture information of the remote user, and messages (text data, voice data, and the like) transmitted by the remote user.
  • Further, the communication control unit 231 may cause voice data acquired by the microphone included in the sensor unit 21 to be transmitted to the local terminal 10 as a message, or may cause text data input via an input device (not shown) to be transmitted to the local terminal 10 as a message.
  • the communication control unit 231 may receive a message (text data, voice data, etc.) transmitted from the local terminal 10 by the local user.
  • the communication control unit 231 may receive the omnidirectional image from the distribution server 50 described with reference to FIG.
  • the communication control unit 231 may receive sensing data from the sensor device 60 described with reference to FIG.
  • the display control unit 235 controls display by the display unit 24.
  • The display control unit 235 may generate a display image by cutting out, from the omnidirectional image, a region corresponding to the face orientation of the remote user acquired by the sensor unit 21, and may display the display image on the display unit 24.
  • With such a configuration, a remote user wearing the remote terminal 20 can observe the local environment 1000 in all directions from the viewpoint of the imaging device 30 by changing his or her face orientation. Further, as shown in FIG. 1, the remote user can obtain a sense of facing the local user by turning his or her face so that the local user is included in the display image.
  • the display control unit 235 may further perform image processing on the area cut out from the omnidirectional image to generate a display image.
  • The image processing performed by the display control unit 235 may include, for example, processing that detects the local user in the region cut out from the omnidirectional image and processes the local user's area, or replaces the local user's area with another object.
  • The processing performed by the display control unit 235 may include, for example, processing that removes the local terminal 10 worn by the local user from the image and restores the facial expression of the local user.
  • the image processing performed by the display control unit 235 may include processing for synthesizing a message transmitted by the local user.
  • The sound output control unit 237 illustrated in FIG. 6 controls sound output by the sound output unit 25. For example, the sound output control unit 237 may control the sound output unit 25 to output a message transmitted by the local user as sound.
  • the sound output control unit 237 may control the sound output so that a message can be heard from the position of the local user displayed on the display unit 24. With this configuration, the remote user can obtain a more immersive feeling.
  • the display unit 24 is a display that displays a display image under the control of the display control unit 235.
  • The display unit 24 may be capable of presenting different images to the two eyes of a remote user wearing the remote terminal 20. With such a configuration, the remote user can observe the local environment 1000 three-dimensionally and obtain a more immersive feeling.
  • the sound output unit 25 performs sound output under the control of the sound output control unit 237.
  • The sound output unit 25 may include, for example, a plurality of speakers, and may be capable of outputting sound three-dimensionally (stereophonically).
  • the storage unit 26 stores a program for the control unit 23 to execute the above processes and various data.
  • the configuration example of the remote terminal 20 has been described above with reference to FIG. 6, but the example illustrated in FIG. 6 is an example, and the present technology is not limited to the example.
  • some functions shown in FIG. 6 may be provided in another information processing apparatus connected to the remote terminal 20.
  • For example, some or all of the functions of the control unit 23 shown in FIG. 6 may be provided in another information processing apparatus connected to the remote terminal 20, and the remote terminal 20 may perform display or sound output under the control of the other information processing apparatus.
  • FIG. 7 is a flowchart showing an operation example of the communication system 1 according to the present embodiment.
  • First, imaging is performed by the imaging device 30 (S102); image processing such as synthesis processing based on the captured images is then performed, and an omnidirectional image is acquired (S104). Subsequently, the omnidirectional image is transmitted from the imaging device 30 to the distribution server 50 (S106). Further, the distribution server 50 distributes (transmits) the omnidirectional image received from the imaging device 30 to the remote terminal 20 (S108).
  • Subsequently, the sensor unit 21 of the remote terminal 20 acquires, by sensing, sensing data including, for example, information on the face orientation of the remote user (S110). Further, the display control unit 235 of the remote terminal 20 cuts out a region corresponding to the remote user's face orientation from the omnidirectional image, generates a display image, and causes the display unit 24 to display it (S112).
  • Subsequently, the remote terminal 20 transmits, to the local terminal 10, user information including remote user identification information, information indicating the status of the remote user, posture information of the remote user, and messages (text data, voice data, etc.) transmitted by the remote user (S114).
  • Subsequently, the relative position acquisition unit 133 of the local terminal 10 acquires relative position information indicating the relative positional relationship between the local terminal 10 and the imaging device 30 existing in the real space (S116). Further, the display control unit 135 of the local terminal 10 displays the remote user's avatar so that it is visually recognized by the local user at a position in the real space corresponding to the position of the imaging device 30, based on the user information and the relative position information (S118).
  • FIG. 7 shows an example, and the present technology is not limited to the example.
  • the processes in steps S102 to S118 shown in FIG. 7 may be repeated.
  • the processing order of step S108 and step S110 may be reversed, and the processing order of step S114 and step S116 may be reversed.
  • FIG. 8 is a schematic diagram showing the field of view V10 of the local user when the display control unit 135 displays an avatar in the vicinity of the imaging device 30.
  • In the example illustrated in FIG. 8, the remote user's avatar A1 is displayed beside the imaging device 30.
  • The position where the avatar is visually recognized is not limited to beside the imaging device 30; it may be above, below, inside, or in front of the imaging device 30.
  • FIG. 9 is a block diagram showing a configuration example of the communication system 1-2 according to this modification.
  • The configuration of the communication system 1-2 illustrated in FIG. 9 is the same as that of the communication system 1 described with reference to FIG. 3, except that two imaging devices 30A and 30B exist in the local environment 1000; overlapping description is therefore omitted as appropriate.
  • the imaging device 30A and the imaging device 30B are omnidirectional cameras that can acquire 360-degree omnidirectional images in all directions in the vertical and horizontal directions by imaging, as with the imaging device 30 described above.
  • The imaging device 30A and the imaging device 30B each transmit an omnidirectional image to the distribution server 50 via the communication network 70.
  • the remote terminal 20 may display an image based on the imaging of one imaging device selected by the remote user from the imaging device 30A or the imaging device 30B.
  • the remote user may perform an input operation related to the selection via, for example, the sensor device 60, the sensor unit 21 of the remote terminal 20, or an input device (not shown).
  • The distribution server 50 may transmit both the omnidirectional image captured by the imaging device 30A and the omnidirectional image captured by the imaging device 30B to the remote terminal 20, or may transmit only the omnidirectional image captured by the imaging device selected by the remote user.
  • the local terminal 10 may have the following functions in addition to the functions described with reference to FIG.
  • The display control unit 135 of the local terminal 10 according to this modification controls display so that the remote user's avatar is visually recognized by the local user at a position in the real space corresponding to the position of the one imaging device selected by the remote user from among the plurality of imaging devices.
  • For example, when the position where the avatar is visually recognized switches according to the remote user's selection, the display control unit 135 according to this modification may display the avatar so that it fades out and then fades in at the new position.
  • Alternatively, when the position where the avatar is visually recognized switches according to the remote user's selection, the display control unit 135 according to this modification may display an animation in which the avatar moves from the position corresponding to the previously selected imaging device to the position corresponding to the currently selected imaging device. A video effect in which the avatar moves three-dimensionally may be added to this animation.
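  • A minimal sketch of such a transition (the easing curve and arc height are assumptions; the patent only calls for an animation between the two positions):

        import math

        def avatar_path(p_from, p_to, t):
            # Position at normalized time t in [0, 1]: eased travel between the
            # two camera positions plus a small vertical arc (peaks mid-flight).
            ease = 0.5 - 0.5 * math.cos(math.pi * t)
            arc = 0.3 * math.sin(math.pi * t)
            return tuple(a + (b - a) * ease + (arc if i == 1 else 0.0)
                         for i, (a, b) in enumerate(zip(p_from, p_to)))

        # Example: sample a 30-frame move between the two device positions.
        frames = [avatar_path((0.0, 1.0, 2.0), (1.5, 1.0, 3.0), i / 29) for i in range(30)]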
  • FIG. 10 is an explanatory diagram showing an example of an animation displayed by the display control unit 135 according to this modification.
  • FIG. 10 shows the state immediately after the remote user RU1 has selected the imaging device 30B while the imaging device 30A had previously been selected.
  • In this case, the remote terminal 20 displays a display image G20 generated by cutting out a region corresponding to the face orientation of the remote user RU1 from the omnidirectional image acquired by imaging of the imaging device 30B. Further, in the field of view V10 of the local user LU1, the avatar A1, which had been visually recognized superimposed on the imaging device 30A, has moved to a position where it is visually recognized superimposed on the imaging device 30B.
  • With such a configuration, the remote user can observe the local environment 1000 from various viewpoints, and the local user can easily grasp which imaging device corresponds to the remote user's current viewpoint.
  • FIG. 11 is a block diagram showing a configuration example of the communication system 1-3 according to this modification.
  • The configuration of the communication system 1-3 illustrated in FIG. 11 is partially the same as that of the communication system 1 described with reference to FIG. 3.
  • However, the communication system 1-3 differs from the communication system 1 shown in FIG. 3 in that it includes two remote terminals 20A and 20B and two sensor devices 60A and 60B. The remote terminal 20A and the sensor device 60A exist in the remote environment 2000A, and the remote terminal 20B and the sensor device 60B exist in the remote environment 2000B. The remote environment 2000A and the remote environment 2000B may be the same (single) environment; in such a case, the sensor device 60A and the sensor device 60B may be the same (single) device.
  • The remote terminal 20A may be an information processing apparatus used by a first remote user, and the remote terminal 20B may be an information processing apparatus used by a second remote user different from the first remote user. Note that the configurations of the remote terminal 20A and the remote terminal 20B according to this modification are substantially the same as the configuration of the remote terminal 20 described with reference to FIG. 6.
  • the distribution server 50 distributes (transmits) the omnidirectional image obtained by the imaging of the imaging device 30 to both the remote terminal 20A and the remote terminal 20B.
  • the local terminal 10 according to this modification may have the following functions in addition to the functions described with reference to FIG.
  • the display control unit 135 of the local terminal 10 according to this modification may display an object (for example, an avatar) indicating a plurality of remote users.
  • FIG. 12 is an explanatory diagram showing an example of an animation in which the display control unit 135 according to this modification displays a plurality of remote user avatars.
  • In the example shown in FIG. 12, the display control unit 135 displays both the avatar A1 of the first remote user RU1 wearing the remote terminal 20A and the avatar A2 of the second remote user RU2 wearing the remote terminal 20B.
  • The display control unit 135 may display each avatar so that it is visually recognized in the vicinity of the imaging device 30 without overlapping the other avatars.
  • Alternatively, by displaying the avatars in a reduced size, the display control unit 135 may control display so that the plurality of avatars are visually recognized superimposed on the imaging device 30 without overlapping one another.
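  • One illustrative layout rule (the ring radius and shrink rule are assumptions; the patent specifies only the visual goal):

        import math

        def layout_avatars(device_pos, n, radius=0.25):
            # Ring the n avatars around the imaging device; shrink them as n
            # grows so every avatar stays visible without occluding the others.
            scale = 1.0 if n <= 2 else 2.0 / n
            out = []
            for i in range(n):
                ang = 2.0 * math.pi * i / max(n, 1)
                out.append({"pos": (device_pos[0] + radius * math.cos(ang),
                                    device_pos[1],
                                    device_pos[2] + radius * math.sin(ang)),
                            "scale": scale})
            return out

        placements = layout_avatars((0.0, 1.0, 2.0), n=3)   # three remote users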
  • the display control unit 135 of the local terminal 10 has a function of displaying an avatar corresponding to the remote user as described above, and in the example illustrated in FIG. 12, the avatar A1 and the avatar A2 are different avatars.
  • the display control unit 135 may control the avatar posture of each remote user in accordance with the posture information of each remote user.
  • In the example shown in FIG. 12, the face A11 of the avatar A1 is displayed facing in a direction corresponding to the face orientation of the remote user RU1, and the face A21 of the avatar A2 is displayed facing in a direction corresponding to the face orientation of the remote user RU2.
  • the remote user RU1 existing in the remote environment 2000A is wearing the remote terminal 20A. Further, on the remote terminal 20A, a display image G21 generated by cutting out an area corresponding to the face orientation of the remote user RU1 from the omnidirectional image obtained by imaging by the imaging device 30 is displayed.
  • the remote user RU2 existing in the remote environment 2000B is wearing the remote terminal 20B.
  • the remote terminal 20B displays a display image G22 generated by cutting out an area corresponding to the face direction of the remote user RU2 from the omnidirectional image obtained by imaging by the imaging device 30.
  • Note that each remote terminal 20 may display the avatars of other remote users. For example, when the remote user RU1 turns to the left, the remote terminal 20A may display, as the display image, an image obtained by combining the avatar of the remote user RU2 with the region cut out from the omnidirectional image according to the face orientation.
  • Further, the remote terminal 20A may display a message in the vicinity of the avatar of the remote user RU2, or may output sound so that a message transmitted by the remote user RU2 is heard from the position of the avatar of the remote user RU2.
  • This modification may be combined with Modification 2 described above.
  • In such a case, each of the plurality of remote users selects one imaging device from among the plurality of imaging devices, and the display control unit 135 of the local terminal 10 may control display so that each remote user's avatar is visually recognized at a position corresponding to the position of the imaging device selected by that remote user.
  • In addition, each remote terminal 20 may display the avatars of the other remote users at positions corresponding to the positions of the imaging devices selected by the remote users other than the remote user wearing that remote terminal 20.
  • Further, the remote terminal 20 may display a message in the vicinity of another remote user's avatar, or may output sound so that a message transmitted by that remote user is heard from the position of that remote user's avatar.
  • In the above description, the local terminal 10 is a glasses-type device having a transmissive display unit, the remote terminal 20 is an immersive HMD, and the imaging device 30 is an omnidirectional camera. However, the present embodiment is not limited to this example and can be realized with various apparatus configurations.
  • the display unit 14 included in the local terminal 10 may not be a transmissive type.
  • In such a case, an avatar (an example of an object indicating a remote user) may be displayed combined with a real-space image acquired by a camera, included in the sensor unit 11 of the local terminal 10, that captures the local user's field of view (real space). Even in such a case, the display control unit 135 of the local terminal 10 can control display so that the avatar is visually recognized by the local user at a position in the real space corresponding to the position of the imaging device 30.
  • The local terminal 10 may also be a smartphone or a tablet terminal. Even in such a case, a real-space image acquired by a camera included in the sensor unit 11 and an avatar may be combined and displayed.
  • the display unit 14 of the local terminal 10 may be a projector.
  • In such a case, the local terminal 10 may project an avatar image at a position in the real space corresponding to the position of the imaging device 30. That is, even when the display unit 14 is a projector, the display control unit 135 of the local terminal 10 can control display so that the avatar is visually recognized by the local user at a position in the real space corresponding to the position of the imaging device 30.
  • the display unit 24 of the remote terminal 20 may be a flat display.
  • In such a case, the region corresponding to the field of view of the remote user may be specified according to an operation via an input device such as a remote controller, or according to a gesture operation acquired by the sensor unit 21.
  • the remote terminal 20 may be a smartphone or a tablet terminal. Even in such a case, the region corresponding to the visual field of the remote user may be specified according to the touch operation or the posture of the remote terminal 20 acquired by the sensor unit 21.
  • The imaging device 30 may not be an omnidirectional camera; it may be, for example, a camera with an imaging angle of view of 180 degrees. In that case, the display control unit 135 of the local terminal 10 may limit the displayed face direction of the avatar according to the imaging angle of view of the imaging device 30. With such a configuration, inconsistency between the avatar's face direction and the field of view the remote user can actually see is suppressed.
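  • A minimal sketch of this limit (a simple clamp; the patent does not specify how the limit is enforced):

        def clamp_face_yaw(yaw_deg: float, camera_fov_deg: float = 180.0) -> float:
            # Keep the displayed face direction inside the camera's angle of
            # view, so the avatar never looks at scenery that the remote user
            # cannot actually see.
            half = camera_fov_deg / 2.0
            return max(-half, min(half, yaw_deg))

        assert clamp_face_yaw(120.0) == 90.0   # 180-degree camera: at most +/-90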
  • FIG. 13 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 900 illustrated in FIG. 13 can implement the local terminal 10, the remote terminal 20, and the distribution server 50, for example.
  • Information processing by the local terminal 10, the remote terminal 20, and the distribution server 50 according to the present embodiment is realized by cooperation of software and hardware described below.
  • the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
  • the information processing apparatus 900 includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, a communication device 913, and a sensor 915.
  • the information processing apparatus 900 may include a processing circuit such as a DSP or an ASIC in place of or in addition to the CPU 901.
  • the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
  • the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
  • the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like. For example, the CPU 901 can form the control unit 13 and the control unit 23.
  • the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
  • The host bus 904a is connected to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 904.
  • the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and these functions may be mounted on one bus.
  • The input device 906 is realized by a device with which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever.
  • the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
  • the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
  • a user of the information processing apparatus 900 can input various data and instruct a processing operation to the information processing apparatus 900 by operating the input device 906.
The output device 907 is formed of a device that can notify the user of acquired information visually or audibly. Examples of such devices include display devices such as CRT displays, liquid crystal displays, plasma displays, EL displays, and lamps; audio output devices such as speakers and headphones; and printer devices. The output device 907 outputs results obtained by various processes performed by the information processing apparatus 900. Specifically, the display device visually presents those results in various formats such as text, images, tables, and graphs, while the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly. The output device 907 can form, for example, the display unit 14, the sound output unit 15, the display unit 24, and the sound output unit 25.
The storage device 908 is a data storage device formed as an example of a storage unit of the information processing apparatus 900. The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. The storage device 908 can form, for example, the storage unit 16 and the storage unit 26.
The drive 909 is a reader/writer for storage media, and is built into or externally attached to the information processing apparatus 900. The drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903. The drive 909 can also write information to a removable storage medium.

The connection port 911 is an interface for connecting to external devices, for example a connection port for a device capable of transmitting data by USB (Universal Serial Bus).
The communication device 913 is a communication interface formed by, for example, a communication device for connecting to the network 920. The communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP. The communication device 913 can form, for example, the communication unit 12 and the communication unit 22.
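By way of a non-limiting illustration only (the function name, the one-JSON-object-per-line framing, and the example host and port are assumptions of this sketch, not part of the present disclosure), an exchange of the kind the communication unit 12 and the communication unit 22 perform over TCP/IP could look as follows in Python:

import json
import socket

def send_state_update(host: str, port: int, state: dict) -> None:
    # Send one JSON-encoded state update (for example, a remote user's
    # pose) over TCP, framed as a single newline-terminated line.
    payload = (json.dumps(state) + "\n").encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Hypothetical usage: report the remote user's head pose to the local terminal.
# send_state_update("local-terminal.example", 9000,
#                   {"user": "remote", "head_pos": [0.0, 1.6, 2.0], "speaking": True})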
The sensor 915 comprises various sensors such as an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance-measuring sensor, and a force sensor. The sensor 915 acquires information on the state of the information processing apparatus 900 itself, such as its attitude and movement speed, and information on the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it. The sensor 915 may also include a GPS sensor that receives GPS signals and measures the latitude, longitude, and altitude of the apparatus. The sensor 915 can form, for example, the sensor unit 11 and the sensor unit 21.
The network 920 is a wired or wireless transmission path for information transmitted from devices connected to it. For example, the network 920 may include a public network such as the Internet, a telephone network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like. The network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
Each of the above components may be realized using a general-purpose member, or by hardware specialized for the function of that component. The hardware configuration to be used can therefore be changed as appropriate according to the technical level at the time this embodiment is carried out.

A computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be produced and installed on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The above computer program may also be distributed via a network, for example, without using a recording medium.
Although the above description has dealt with an example in which the local terminal 10 displays an avatar, that is, a virtual object, as the object indicating the remote user, the present technology is not limited to this example. For example, the local terminal 10 may display, as the object indicating the remote user, an image obtained by photographing the remote user with a camera included in the sensor device 60.
Note that the present technology can also be configured as follows. An information processing apparatus comprising a display control unit that, on the basis of relative position information indicating a relative positional relationship with an imaging device present in a real space, controls display such that an object indicating a remote user is visually recognized by a local user at a position in the real space corresponding to the position of the imaging device.
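As a minimal sketch of this display control (the function names, the use of NumPy, and the 4x4 pose convention are choices made here for illustration, not the claimed implementation), the relative position information can be treated as coordinates in a reference frame that are mapped to a real-space anchor for the object:

import numpy as np

def object_anchor_world(camera_pos_relative, reference_pose_world):
    # Map the imaging device's position, given as relative position
    # information (coordinates in a reference frame such as that of the
    # display device), to the real-space position at which the object
    # indicating the remote user should be anchored.
    p = np.append(np.asarray(camera_pos_relative, dtype=float), 1.0)
    return (reference_pose_world @ p)[:3]

# Example: camera mounted 0.25 m above the reference origin, with the
# reference frame coinciding with the world frame.
print(object_anchor_world([0.0, 0.25, 0.0], np.eye(4)))  # -> [0.   0.25 0.  ]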
The display control unit may display the object such that the imaging apparatus and the object are viewed superimposed on each other. In particular, the display control unit may display the object such that an imaging unit of the imaging apparatus and eyes included in the object are viewed superimposed on each other.
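Continuing the same sketch under the added assumption that the object is an avatar whose eye position is a fixed offset from its root, superimposing the eyes on the imaging unit reduces to solving for the avatar root position:

def avatar_root_for_eye_contact(lens_pos_world, eye_offset_local):
    # Place the avatar so that its eyes coincide with the camera lens:
    # root = lens position minus the eyes' offset from the avatar root.
    return [l - e for l, e in zip(lens_pos_world, eye_offset_local)]

# Eyes sit 1.25 m above the avatar root; the lens is at height 1.5 m, 2 m away.
print(avatar_root_for_eye_contact([0.0, 1.5, 2.0], [0.0, 1.25, 0.0]))  # [0.0, 0.25, 2.0]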
The information processing apparatus may also control the display according to a state of the remote user. For example, the display control unit may display an indicator indicating the state of the remote user in the vicinity of the object, or may control a parameter related to the display of the object according to the state of the remote user.
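One way to picture such control (the states and parameter values below are hypothetical, not taken from the disclosure) is a lookup from the remote user's state to display parameters of the object:

# Hypothetical mapping from a remote user's state to display parameters.
STATE_TO_PARAMS = {
    "speaking":  {"opacity": 1.0, "scale": 1.0},
    "listening": {"opacity": 0.9, "scale": 1.0},
    "away":      {"opacity": 0.4, "scale": 0.8},  # de-emphasize an absent user
}

def display_params(state: str) -> dict:
    # Return display parameters for the object indicating the remote user.
    return STATE_TO_PARAMS.get(state, {"opacity": 0.7, "scale": 1.0})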
The display control unit may control the display such that the object is visually recognized at a position in the real space corresponding to the position of one imaging device selected from among a plurality of imaging devices. When the selection changes, the display control unit moves the display position of the object from the position corresponding to the imaging device selected immediately before to the position corresponding to the currently selected imaging device.
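Such a change could be realized, for example, by interpolating the object's display position between the two camera positions; the linear interpolation below is one plausible sketch, not the disclosed method:

def lerp(a, b, t):
    # Linear interpolation between two 3D points.
    return [x + (y - x) * t for x, y in zip(a, b)]

def transition_positions(prev_cam_pos, curr_cam_pos, steps=30):
    # Yield intermediate display positions so that the object appears to
    # move from the previously selected imaging device to the current one.
    for i in range(1, steps + 1):
        yield lerp(prev_cam_pos, curr_cam_pos, i / steps)

# Example: move the object over 30 frames from one camera position to another.
for pos in transition_positions([0.0, 0.3, 0.0], [1.0, 0.3, 0.5]):
    pass  # hand `pos` to the renderer each frame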
The display control unit may also display objects indicating a plurality of remote users. Furthermore, the information processing apparatus may comprise an acoustic output control unit that acoustically outputs a message transmitted by the remote user, the acoustic output control unit controlling the acoustic output such that the message is heard from the position in the real space at which the object is visually recognized.
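As a final sketch (a constant-power stereo pan is chosen here for brevity; an actual system might instead use HRTF-based spatial rendering), making the message audible from the object's position can be reduced to deriving channel gains from the object's direction relative to the listener:

import math

def stereo_gains(object_pos, listener_pos):
    # Constant-power stereo pan so that a message seems to come from the
    # position where the object is visually recognized. Assumes the
    # listener faces the +z axis; sources behind are clamped to the sides.
    dx = object_pos[0] - listener_pos[0]
    dz = object_pos[2] - listener_pos[2]
    azimuth = math.atan2(dx, dz)                         # 0 = straight ahead
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))   # -1 = left, +1 = right
    left = math.cos((pan + 1.0) * math.pi / 4)
    right = math.sin((pan + 1.0) * math.pi / 4)
    return left, right

print(stereo_gains([1.0, 1.5, 1.0], [0.0, 1.5, 0.0]))  # mostly the right channel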

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The problem addressed by the present invention is to provide an information processing device, an information processing method, and a program. To this end, the invention provides an information processing device equipped with a display control unit that, on the basis of relative position information indicating a relative positional relationship with an imaging device present in a real space, controls display such that an object indicating a remote user is visually recognized by a local user at a position in the real space corresponding to the position of the imaging device.
PCT/JP2018/010433 2017-05-24 2018-03-16 Information processing device, information processing method, and program WO2018216327A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-102488 2017-05-24
JP2017102488 2017-05-24

Publications (1)

Publication Number Publication Date
WO2018216327A1 true WO2018216327A1 (fr) 2018-11-29

Family

ID=64396463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010433 WO2018216327A1 (fr) 2017-05-24 2018-03-16 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2018216327A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000244886A (ja) * 1999-01-20 2000-09-08 Canon Inc Computer conference system, computer processing device, method for holding a computer conference, processing method in a computer processing device, video conference system, method for holding a video conference, and headphones
JP2010092304A (ja) * 2008-10-08 2010-04-22 Sony Computer Entertainment Inc Information processing device and information processing method
JP2010219989A (ja) * 2009-03-18 2010-09-30 Oki Electric Ind Co Ltd Communication support system, display control device, and display control method
JP2015172883A (ja) * 2014-03-12 2015-10-01 株式会社コナミデジタルエンタテインメント Terminal device, information communication method, and information communication program
WO2015185793A1 (fr) * 2014-06-02 2015-12-10 Nokia Technologies Oy Method and apparatus for line-of-sight augmentation during a video conference

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021061492A (ja) * 2019-10-04 2021-04-15 株式会社Ihiインフラ建設 Remote monitoring system

Similar Documents

Publication Publication Date Title
WO2016203792A1 (fr) Information processing device, information processing method, and program
JP7005753B2 (ja) Privacy screen
JP6822410B2 (ja) Information processing system and information processing method
CA2981208A1 (fr) Method and system for implementing a multi-user virtual environment
US11061466B2 (en) Apparatus and associated methods for presenting sensory scenes
US11361497B2 (en) Information processing device and information processing method
US10515481B2 (en) Method for assisting movement in virtual space and system executing the method
KR102684302B1 (ko) Method and apparatus for navigating virtual content displayed by a virtual reality (VR) device
CN111386517A (zh) Apparatus and associated methods for communication between users experiencing virtual reality
US11151804B2 (en) Information processing device, information processing method, and program
CN109791436B (zh) Apparatus and method for providing a virtual scene
JPWO2017191700A1 (ja) Image generation device, image generation method, and program
WO2018216327A1 (fr) Information processing device, information processing method, and program
JP2018094086A (ja) Information processing device and image generation method
JP6563592B2 (ja) Display control device, display control method, and program
JP6159455B1 (ja) Method, program, and recording medium for providing a virtual space
US10940387B2 (en) Synchronized augmented reality gameplay across multiple gaming environments
WO2024004398A1 (fr) Information processing device, program, and information processing system
AU2022386387A1 (en) Systems, methods, and media for controlling shared extended reality presentations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18804986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18804986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP