WO2017159063A1 - Display device and information processing terminal device - Google Patents

Display device and information processing terminal device

Info

Publication number
WO2017159063A1
WO2017159063A1 (PCT/JP2017/002810)
Authority
WO
WIPO (PCT)
Prior art keywords
video
unit
image
display
control unit
Prior art date
Application number
PCT/JP2017/002810
Other languages
English (en)
Japanese (ja)
Inventor
辰志 梨子田
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社
Priority to EP17766071.9A (EP3432590A4)
Priority to US16/075,295 (US10455184B2)
Priority to JP2018505313A (JPWO2017159063A1)
Publication of WO2017159063A1

Classifications

    • H: ELECTRICITY
        • H04: ELECTRIC COMMUNICATION TECHNIQUE
            • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 5/00: Details of television systems
                    • H04N 5/38: Transmitter circuitry for the transmission of television signals according to analogue transmission standards
                    • H04N 5/64: Constructional details of receivers, e.g. cabinets or dust covers
                • H04N 7/00: Television systems
                    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
                        • H04N 7/181: CCTV systems for receiving images from a plurality of remote sources
                • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
                    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; operations thereof
                        • H04N 21/21: Server components or server architectures
                            • H04N 21/218: Source of audio or video content, e.g. local disk arrays
                                • H04N 21/21805: Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
                        • H04N 21/25: Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
                            • H04N 21/266: Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
                                • H04N 21/2668: Creating a channel for a dedicated end-user group, e.g. insertion of targeted commercials based on end-user profiles
                    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; operations thereof
                        • H04N 21/41: Structure of client; structure of client peripherals
                            • H04N 21/4104: Peripherals receiving signals from specially adapted client devices
                                • H04N 21/4131: Peripherals receiving signals from home appliances, e.g. lighting, air conditioning system, metering devices
                        • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; client middleware
                            • H04N 21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                                • H04N 21/44213: Monitoring of end-user related data
                                    • H04N 21/44218: Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
                        • H04N 21/47: End-user applications
                            • H04N 21/472: End-user interface for requesting content, additional data or services, or for interacting with content, e.g. content reservation, setting reminders, requesting event notification, manipulating displayed content
                                • H04N 21/4728: End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region

Definitions

  • the technology disclosed in this specification relates to a display device that displays video and an information processing terminal device that controls transmission of video to the display device.
  • the viewer can freely select the viewpoint position and line-of-sight direction and enjoy the all-sky video.
  • The all-sky video can be viewed using a head-mounted display worn on the head. By changing the viewpoint position and line-of-sight direction based on the detected movement of the viewer's head, the all-sky video display can give viewers an immersive experience.
  • An imaging system has been proposed for head-mounted display systems that captures a wide-angle image wider than the image actually displayed; the image the user should view is cut out based on head-position information detected by a rotation-angle sensor (see, for example, Patent Document 1).
  • A mobile camera device has also been proposed that comprises a head-mounted display worn on the viewer's head, an angular-velocity sensor that detects rotation of the viewer's head, and a direction-calculation device that computes the head's rotation angle and rotates left and right cameras accordingly (see, for example, Patent Document 2).
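The head-tracking loop in such prior-art systems (a sensor detects head rotation, a direction-calculation step turns it into a camera command) can be sketched as follows. This is a hypothetical illustration; the function names, sample rate, and mechanical limits are assumptions, not taken from the patent documents.

```python
def integrate_yaw(yaw_deg, angular_velocity_dps, dt):
    """Accumulate an angular-velocity reading (deg/s) into a yaw angle,
    wrapped to [-180, 180) so the pan command stays bounded."""
    yaw = yaw_deg + angular_velocity_dps * dt
    return (yaw + 180.0) % 360.0 - 180.0

def camera_command(yaw_deg, pitch_deg, pitch_limit=80.0):
    """Clamp pitch to an assumed mechanical range of the camera mount
    and emit a pan/tilt command."""
    pitch = max(-pitch_limit, min(pitch_limit, pitch_deg))
    return {"pan": yaw_deg, "tilt": pitch}

# Ten 100 Hz samples while the viewer turns their head at 30 deg/s:
yaw = 0.0
for _ in range(10):
    yaw = integrate_yaw(yaw, 30.0, 0.01)
print(camera_command(yaw, 95.0))  # tilt is clamped to the 80-degree limit
```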
  • A mobile camera system has likewise been proposed in which a control server manages, in an image database, the positions and acquired images of a plurality of mobile camera devices mounted on moving bodies such as vehicles; on receiving an imaging request from a terminal device, the control server retrieves a matching image from the image database, or acquires one anew from a mobile camera device, and transmits it to the terminal device (see, for example, Patent Document 3).
  • An object of the technology disclosed in this specification is to provide a display device that displays video, and an information processing terminal device that controls transmission of video to the display device.
  • A first aspect of the technology disclosed in this specification is a display device comprising: a spherical display unit; a receiving unit that receives images from external devices; a sensor unit that measures the user's line of sight, head position, or posture; and a control unit. While displaying a first image received from a first external device on the display unit, the control unit uses at least one of the user's line of sight, head position, or posture measured by the sensor unit to identify a second external device included in the first image, receives a second image from that second external device, and displays the second image on the display unit.
  • the display device further includes a transmission unit that transmits information to the external device.
  • the control unit is configured to transmit information measured by the sensor unit to the second external device when displaying the second image.
  • the display device further includes a sound collection unit.
  • the control unit is configured to transmit the sound collected by the sound collection unit to the external device.
  • the display device includes at least one of an air blowing unit, a temperature adjusting unit, a humidity adjusting unit, a tactile control unit, a vibration control unit, or a scent generating unit.
  • the control unit is configured to control the air blowing unit, the temperature adjusting unit, the humidity adjusting unit, the tactile control unit, the vibration control unit, or the scent generating unit according to the content of the image received from the external device and displayed on the display unit.
  • the display device further includes a measurement unit that measures the pressure acting between the user's foot and the ground plane.
  • the control unit is configured to perform display control by switching between the first image and the second image in accordance with a measurement result by the measurement unit.
  • the display unit of the display device includes a plurality of projection units that project an image onto a screen.
  • the control unit is configured to control the projection unit so that no shadow is generated on the screen.
  • A seventh aspect of the technology disclosed in this specification is an information processing terminal device comprising: an imaging unit; a transmission unit that transmits images captured by the imaging unit; a receiving unit that receives a predetermined signal from an external device; and a control unit. The control unit controls transmission of the images captured by the imaging unit to the external device based on line-of-sight information or posture information included in the predetermined signal.
  • the imaging unit of the information processing terminal device captures an all-sky image, and the control unit identifies a predetermined image from the all-sky image based on the line-of-sight information or posture information and controls its transmission.
  • FIG. 1 is a diagram schematically illustrating a configuration example of a video viewing system 100 that views all-sky video.
  • FIG. 2 is a diagram schematically showing a configuration example of the video viewing system 200 for viewing the all-sky video.
  • FIG. 3 is a diagram schematically showing a configuration example of a video viewing system 300 for viewing the all-sky video.
  • FIG. 4 is a diagram schematically illustrating a configuration example of a video viewing system 400 for viewing the all-sky video.
  • FIG. 5 is a diagram showing an external configuration example of a video providing apparatus 500 that can be used in the video viewing systems 100 to 400.
  • FIG. 6 is a diagram illustrating a state in which a plurality of video providing devices are installed in a soccer stadium.
  • FIG. 7 is a diagram exemplifying a mechanism (FIFO method) for limiting the number of video playback devices that transmit video from one video providing device to within the capacity.
  • FIG. 8 is a diagram exemplifying a mechanism (LIFO method) for limiting the number of video playback devices that transmit video from one video providing device to within the capacity.
  • FIG. 9 is a diagram illustrating a mechanism (priority method) for limiting the number of video playback devices that transmit video from one video providing device to within the capacity.
  • FIG. 10 is a diagram illustrating a mechanism for distributing past video to a video playback device outside the capacity.
  • FIG. 11 is a flowchart illustrating a processing procedure for transmitting video captured by the video providing apparatus to a plurality of video playback apparatuses.
  • FIG. 12 is a diagram schematically illustrating a functional configuration of an information processing apparatus 1200 that can function as a video providing apparatus.
  • FIG. 13 is a diagram schematically illustrating a functional configuration of an information processing apparatus 1300 that can function as a video reproduction apparatus.
  • FIG. 14 is a diagram illustrating a display example of a UI for realizing viewpoint movement.
  • FIG. 15 is a diagram illustrating a display example of a UI for realizing viewpoint movement.
  • FIG. 16 is a diagram for explaining a mechanism for detecting a video providing device existing on the screen of the video playback device.
  • FIG. 17 is a diagram illustrating a display example of a UI for realizing viewpoint movement.
  • FIG. 18 is a diagram illustrating a display example of a UI for realizing viewpoint movement.
  • FIG. 19 is a diagram illustrating a video providing apparatus 1900 including an indicator.
  • FIG. 20 is a diagram showing an example in which the viewpoint position distribution of each video playback device is displayed in a heat map format.
  • FIG. 21 is a flowchart showing a processing procedure for displaying a target mark or a heat map.
  • FIG. 22 is a diagram showing a state of switching to an image at the next viewpoint position while maintaining the line-of-sight direction.
  • FIG. 23 is a diagram illustrating a state in which the line-of-sight direction is changed after the viewpoint position is moved.
  • FIG. 24 is a diagram illustrating a state of the video providing apparatus in a video transmission standby state.
  • FIG. 25 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 26 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 27 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 28 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 29 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 30 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 31 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 32 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 33 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 34 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 35 is a diagram for describing an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 36 is a diagram for explaining an example in which video effects such as animation are used when the viewpoint position is switched.
  • FIG. 37 is a flowchart showing a processing procedure for moving the viewpoint of the displayed video in the video playback device.
  • FIG. 38 is a diagram illustrating a state of the video providing apparatus in a state where no image is captured.
  • FIG. 39 is a diagram illustrating a communication sequence example when the viewpoint position is switched in the video reproduction device.
  • FIG. 40 is a diagram illustrating an example in which a target mark is displayed on a ground image viewed from above.
  • FIG. 41 is a diagram illustrating a UI operation example for moving the viewpoint position via the hovering in the sky.
  • FIG. 42 is a diagram illustrating a UI operation example for moving the viewpoint position via the hovering in the sky.
  • FIG. 43 is a diagram illustrating a UI operation example for moving the viewpoint position via the hovering in the sky.
  • FIG. 44 is a diagram illustrating a UI operation example for moving the viewpoint position via the hovering in the sky.
  • FIG. 45 is a flowchart showing a processing procedure for starting video transmission in the video providing apparatus in the standby state.
  • FIG. 46 is a flowchart showing a processing procedure performed during video transmission in the video providing apparatus.
  • FIG. 47 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 48 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 49 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 50 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 51 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 52 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 53 is a diagram showing a configuration example of the dome type display 4700.
  • FIG. 54 is a diagram showing a configuration example of the dome type display 4700.
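The capacity-limiting mechanisms of FIGS. 7 through 9 (FIFO, LIFO, and priority) can be sketched as a single admission function. The policy names come from the figure captions above; everything else here is an illustrative assumption.

```python
def admit(viewers, new_viewer, capacity, policy="fifo"):
    """Admit new_viewer to a provider that serves at most `capacity`
    playback devices. Each viewer is a (name, priority) tuple; higher
    priority wins under the priority policy. Returns (viewers, evicted)."""
    viewers = list(viewers)
    evicted = None
    if len(viewers) >= capacity:
        if policy == "fifo":
            evicted = viewers.pop(0)         # drop the longest-connected viewer
        elif policy == "lifo":
            evicted = viewers.pop()          # drop the most recently connected
        else:                                # priority policy
            evicted = min(viewers, key=lambda v: v[1])
            if evicted[1] >= new_viewer[1]:  # the newcomer loses instead
                return viewers, new_viewer
            viewers.remove(evicted)
    viewers.append(new_viewer)
    return viewers, evicted

v, out = admit([("a", 1), ("b", 2)], ("c", 3), capacity=2, policy="priority")
print(v, out)  # [('b', 2), ('c', 3)] ('a', 1)
```

A viewer turned away this way could then be handed the past-video stream of FIG. 10 rather than being disconnected outright.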
  • FIG. 1 schematically shows a configuration example of a video viewing system 100 for viewing all-sky video.
  • the all-sky image described below does not necessarily have to be 360 degrees, and a part of the visual field may be missing.
  • the all-sky image may be a hemisphere image that does not include a floor surface with little information (the same applies hereinafter).
  • The video viewing system 100 shown in FIG. 1 is composed of one video providing device 101 that provides all-sky video and one video playback device 102 that reproduces the all-sky video, forming a one-to-one network topology.
  • the video providing apparatus 101 and the video reproduction apparatus 102 are interconnected via a wide area network such as a wireless or wired LAN (Local Area Network) or the Internet, for example.
  • the video providing apparatus 101 includes an imaging unit that captures an all-sky video with the installation location as the viewpoint position, and transmits video in an arbitrary line-of-sight direction to the video playback apparatus 102.
  • For example, the imaging unit may be configured as a single omnidirectional camera, and the angle-of-view image for the line-of-sight direction to be displayed on the video playback device 102 may be cut out from the captured omnidirectional video and transmitted to the video playback device 102.
  • Alternatively, an imaging unit composed of a single (wide-angle) camera may be installed on a line-of-sight changing device that can be set to an arbitrary line-of-sight direction; the camera is pointed in the designated line-of-sight direction, and the captured video is transmitted to the video playback device 102.
  • the line-of-sight changing device includes, for example, a three-axis table that supports the imaging unit so as to be rotatable about two axes in the horizontal direction (XY axis) and one axis in the vertical direction (Z axis).
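The cut-out step described above (extracting the angle of view for a requested line-of-sight direction from a captured omnidirectional frame) might, for an equirectangular frame, reduce to computing a pixel rectangle. A minimal sketch under that assumption; the function name and parameters are illustrative, not from the specification:

```python
def view_rect(width, height, yaw_deg, pitch_deg, hfov_deg, vfov_deg):
    """Return (x0, y0, x1, y1) of the crop centred on (yaw, pitch) in an
    equirectangular frame: yaw in [-180, 180) maps linearly to x,
    pitch in [-90, 90] maps linearly to y."""
    cx = (yaw_deg + 180.0) / 360.0 * width
    cy = (90.0 - pitch_deg) / 180.0 * height
    w = hfov_deg / 360.0 * width
    h = vfov_deg / 180.0 * height
    return (int(cx - w / 2) % width,   # x wraps around the panorama seam
            max(0, int(cy - h / 2)),
            int(cx + w / 2) % width,
            min(height, int(cy + h / 2)))

print(view_rect(3840, 1920, 0.0, 0.0, 90.0, 60.0))  # (1440, 640, 2400, 1280)
```

When x0 > x1 the crop straddles the seam and must be assembled from two slices of the source frame.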
  • the video providing device 101 may further include an audio input unit such as a microphone, multiplex the audio collected at the imaging scene with the all-sky video, and transmit the result to the video playback device 102.
  • the video providing apparatus 101 may be installed at a specific place like a fixed point camera.
  • the video providing apparatus 101 (or the imaging unit) may be mounted on a moving body such as a person, an animal, or a car. In the latter case, the video providing apparatus 101 not only switches the line-of-sight direction of the video transmitted to the video playback apparatus 102 but can also change the viewpoint position by moving the moving body.
  • The video playback device 102 includes a display unit that displays the all-sky video received from the video providing device 101.
  • the video playback device 102 is configured, for example, as a head-mounted display that a viewer wears on the head to watch video, and displays the angle-of-view image for a specific line-of-sight direction out of the all-sky video captured by the video providing device 101.
  • the video playback device 102 may be configured as a dome-type display, and may display all the sky images captured at the place where the video providing device 101 is installed. For details of the dome type display, see, for example, Japanese Patent Application No. 2015-245710 already assigned to the present applicant.
  • the video playback device 102 may be a normal (or large screen) monitor display.
  • the video playback device 102 may include an audio output unit such as a speaker or headphones, and may play back and output the audio transmitted from the video providing device 101 in a multiplexed manner with the video.
  • The line-of-sight direction of the video transmitted from the video providing device 101 to the video playback device 102 is basically instructed from the video playback device 102 side.
  • When the video playback device 102 is configured as a head-mounted display, the video providing device 101 is instructed to change the line-of-sight direction based on the result of detecting the motion of the viewer's head.
  • an operation of the system 100 in which an apparatus (not shown) other than the video reproduction apparatus 102 instructs the video providing apparatus 101 about the line-of-sight direction is also conceivable.
  • FIGS. 2 to 4 show modifications of the video viewing system 100 for viewing all-sky video.
  • one video providing apparatus 101 and one video playback apparatus 102 constitute a one-to-one network topology.
  • The video viewing system 200 shown in FIG. 2 has one video providing device 201 and a plurality (N) of video playback devices 202-1, 202-2, ..., 202-N, forming a one-to-N network topology; the all-sky video captured by the single video providing device 201 (the same video, captured in the same line-of-sight direction from the same viewpoint position) is viewed simultaneously on the video playback devices 202-1, 202-2, ..., 202-N.
  • The video viewing system 300 shown in FIG. 3 includes a plurality (N) of video providing devices 301-1, 301-2, ..., 301-N and one video playback device 302, forming an N-to-1 network topology; the single video playback device 302 selectively receives and displays video from any one of the video providing devices 301-1, 301-2, ..., 301-N installed at different locations. The video playback device 302 is assumed to be able to dynamically switch the video transmission source among the video providing devices 301-1, 301-2, ..., 301-N.
  • When the transmission source is switched, the viewpoint position of the video reproduced (viewable) on the video playback device 302 switches accordingly (the viewpoint instantaneously moves to the installation location of the selected video providing device 301). The video playback device 302 is further assumed to be able to instruct the selected video providing device 301 to switch the line-of-sight direction.
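As an illustrative sketch of this N-to-1 arrangement (the class and field names are assumptions, not from the specification), a playback client might track its current provider and preserve the viewer's line-of-sight direction across a viewpoint switch:

```python
class PlaybackClient:
    """Hypothetical N-to-1 playback client that switches its video
    source among several providers while keeping the same gaze."""

    def __init__(self, providers):
        self.providers = providers   # e.g. {"cam-1": endpoint, ...}
        self.current = None
        self.gaze = (0.0, 0.0)       # (yaw, pitch), preserved on switch

    def switch_to(self, provider_id):
        if provider_id not in self.providers:
            raise KeyError(provider_id)
        self.current = provider_id   # viewpoint jumps to the new camera
        return {"provider": provider_id, "gaze": self.gaze}

client = PlaybackClient({"cam-1": None, "cam-2": None})
client.gaze = (35.0, -10.0)
print(client.switch_to("cam-2"))  # {'provider': 'cam-2', 'gaze': (35.0, -10.0)}
```

Carrying `gaze` across the switch corresponds to the behaviour of FIG. 22, where the line-of-sight direction is maintained while switching to the next viewpoint position.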
  • The N-to-N network topology of the video viewing system 400 shown in FIG. 4 subsumes the one-to-one network of FIG. 1, the one-to-N network of FIG. 2, and the N-to-one network of FIG. 3.
  • One conceivable operation of the N-to-N video viewing system 400 is that a plurality of video playback devices form one viewing group, the video providing device serving as the video source is switched per viewing group, and images captured in the same line-of-sight direction from the same video providing device are viewed simultaneously within the group.
  • Another conceivable operation, for both the N-to-1 video viewing system 300 and the N-to-N video viewing system 400, is that a plurality of video providing devices constitute one video group, and a video playback device selects a video group and then views video while sequentially switching among the video providing devices within that group.
  • Each time the video providing device serving as the video transmission source is switched, the viewpoint position of the video viewable on the video playback device instantaneously moves to the installation location of the newly selected video providing device.
  • the video providing device may be switched by a remote control operation similar to channel switching in television broadcasting.
  • For example, one video group may be composed of video providing devices (cameras) installed at multiple locations within a single facility, such as a soccer or baseball stadium, a venue for athletics or other sports competitions, or a concert venue.
  • The video playback device (or its viewer) can then, during a game, competition, or event, watch video shot from a desired viewpoint position while sequentially switching the video providing device from which it receives video within the video group.
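The remote-control-style switching described above could be as simple as cycling through an ordered video group; a hypothetical sketch with illustrative provider names:

```python
def next_provider(group, current):
    """Cycle to the next provider in an ordered video group,
    wrapping back to the first entry at the end (channel-up style)."""
    i = group.index(current)
    return group[(i + 1) % len(group)]

stadium = ["goal-north", "halfway", "goal-south"]
print(next_provider(stadium, "goal-south"))  # wraps back to "goal-north"
```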
  • FIG. 5 shows an external configuration example of a video providing apparatus 500 that can be used in the video viewing systems 100 to 400.
  • the illustrated video providing apparatus 500 includes an imaging unit 501, a support unit 502 that supports the imaging unit 501, and audio input units 503 and 504.
  • the imaging unit 501 is a two-lens stereo camera, but may be a wide-angle camera, a fish-eye camera, or a multi-lens omnidirectional camera.
  • the support unit 502 includes, for example, a line-of-sight changing device including a three-axis table that supports the imaging unit so as to be rotatable about each of two axes in the horizontal direction (XY axis) and one axis in the vertical direction (Z axis).
  • the imaging unit 501 can be set in an arbitrary line-of-sight direction.
  • If the imaging unit 501 is regarded as a head, the support unit 502 also serves as a drive unit that swings that head.
  • Alternatively, rather than changing the camera's line-of-sight direction, the support unit 502 may swing the imaging unit 501 so as to express, as a head posture, the direction of the video currently being transmitted.
  • When the imaging unit 501 is an all-sky camera, the support unit 502 may simply support the imaging unit 501.
  • Microphones 503 and 504 are arranged on the left and right of the imaging unit 501 to form a stereo microphone.
  • With stereo sound collection, the sound field at the time of collection can be reconstructed three-dimensionally on the playback side (that is, by the video playback device).
  • By setting the interval between microphone 503 and microphone 504 to substantially the same as the interval between a person's left and right ears, the sound image that a person standing at the installation location of the video providing device 500 would hear can be acquired.
  • FIG. 6 illustrates a state in which a plurality of video providing devices are installed in a soccer stadium.
  • Each video providing device can take an image of a situation in the stadium (for example, a soccer game) while changing the line-of-sight direction with the installed location as the viewpoint position.
  • The video providing device (or its imaging unit) may also be attached to a player participating in the soccer game, a person on the pitch such as a referee, or another moving body, and such a mobile video providing device may likewise be utilized.
  • A spectator in the stands or a viewer at home can use a video playback device, such as a head-mounted display or a dome-type display, to watch video from any one of the video providing devices installed in the stadium.
  • By changing the line-of-sight direction of a video providing device, it is possible to view video in another line-of-sight direction from the same viewpoint position.
  • When a video providing device captures video while tracking the movement of the ball or of a specific player from the same viewpoint position, the viewer on the video playback device side can watch video whose line-of-sight direction changes so as to follow the ball or that player.
  • In addition, by switching the video providing device serving as the video transmission source, the viewpoint position of the video displayed on the video playback device can be moved instantaneously. For example, when a long pass is made, the display can switch to video from the video providing device closest to the current ball position; when a dribbling player makes a sharp turn, it can switch to video from a video providing device that captures the player from the front.
  • In FIG. 6, the dotted-line arrows exemplify routes along which the video providing device serving as the transmission source was switched (that is, along which the viewpoint position moved back and forth). Details of the viewpoint movement will be described later.
  • The video providing apparatus may limit the video reproducing apparatuses to which it transmits video or audio (in other words, to which it allows viewing), or may limit the range of information provided to a video reproducing apparatus.
  • For example, the video providing device grants viewing authority according to the user attributes of the user of the video playback device (that is, the viewer). When it receives a transmission request from a video playback device, it determines according to those user attributes whether transmission of video and audio is permitted, and limits the range of information to be provided.
  • The user attributes here include personal information such as the user's age, gender, birthplace, occupation, and qualifications; past viewing results such as cumulative viewing time; human relations (family relations, friendships, positions, and the like) with users on the video providing device side (the owner of the video providing device, the manager of the installation location, and the like); evaluations of the viewer made by users on the video providing device side; and the viewer's reputation (posts, voting results, and the like) among other users on the video providing device side.
  • When one viewing group is formed by a plurality of video playback devices (in the case of the 1-to-N video viewing system 200 or the N-to-N video viewing system 400), instead of restricting viewing for each user of a video playback device, viewing authority may be granted for each viewing group based on the attributes of the viewing group, and the same viewing restriction as described above may be performed for each viewing group.
  • Alternatively, the viewing service for all-sky video may be charged for, and the viewing authority may be set according to the fee paid by the user (or viewing group) of the video playback device.
  • Stepwise viewing authority as described below may be set and granted according to the user attributes of the video playback device or viewing group, and the range of information transmitted from the video providing device may be controlled accordingly.
  • Restriction of viewpoint movement can be applied, for example, when the viewpoint position is moved by switching the transmission-source video providing apparatus in the N-to-1 video viewing system 300, in which a plurality of video providing apparatuses form a video group (restriction of switching to a specific video providing device), or when a video providing device mounted on a moving body is used (restriction of movement into a specific area).
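The attribute-based granting of viewing authority described above can be sketched in code. The tier names, thresholds, and attribute fields below are illustrative assumptions, not part of the embodiment; they only show how user attributes might map to a stepwise authority that limits which streams are transmitted.

```python
from dataclasses import dataclass

# Stepwise viewing authority levels (hypothetical names and ordering)
TIER_NONE, TIER_AUDIO_ONLY, TIER_VIDEO_ONLY, TIER_FULL = 0, 1, 2, 3

@dataclass
class UserAttributes:
    cumulative_hours: float  # past viewing results
    is_friend: bool          # human relation with the provider-side user
    rating: int              # reputation given by provider-side users (0-5)

def viewing_authority(attr):
    """Map viewer attributes to a stepwise viewing authority (assumed rules)."""
    if attr.rating <= 1:
        return TIER_NONE           # transmission refused
    if attr.is_friend:
        return TIER_FULL           # full video and audio
    if attr.cumulative_hours >= 10:
        return TIER_VIDEO_ONLY     # video only, no audio
    return TIER_AUDIO_ONLY         # audio only

def allowed_streams(tier):
    """Which streams the video providing device transmits for a given tier."""
    return {"video": tier in (TIER_VIDEO_ONLY, TIER_FULL),
            "audio": tier in (TIER_AUDIO_ONLY, TIER_FULL)}
```

The same mapping could equally be evaluated once per viewing group rather than per individual user.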
  • Conversely, the user (viewer) of the video playback device may not want to play back the video and audio received from the video providing device without limitation. Therefore, the video playback device may limit the playback output of received video and audio according to the attributes of the video providing device.
  • Examples of the attributes of the video providing device include the user attributes of the video providing device and the location where the video providing device is installed.
  • The user attributes include personal information such as the user's age, gender, birthplace, occupation, and qualifications; past imaging results such as cumulative imaging time; the human relationship (family relations, friendships, positions, and the like) with the user (viewer) on the video playback device side; the viewer's own evaluation of the video providing device; and its reputation among other viewers (posts, voting results, and the like).
  • When a video group is configured by a plurality of video providing apparatuses (in the case of the N-to-1 video viewing system 300 or the N-to-N video viewing system 400), playback output of video and audio on the video playback device side may be controlled based on the attributes of the video group rather than of each individual video providing apparatus.
  • The viewing restriction may also be performed not according to the attributes of the video providing device but according to individual circumstances on the video playback device side.
  • For example, the video playback device may display only video without sound while the viewer is talking (including during a phone call), and output only audio while the viewer is visually occupied with other work.
  • Also, the video playback device may perform processing such as applying a mosaic or a mask.
  • Restrictions on video and audio playback output can include: displaying only video (no audio output); outputting only audio (no video display); limiting the line-of-sight direction (the range of the whole sky that can be displayed); adjusting video resolution, including filtering processing such as applying a mosaic or mask when a specified area or specific subject enters the field of view (parental control and the like); and modulating audio (for example, the voice of a specific person, or of everyone other than a specific person, is modulated or silenced).
  • Staged viewing restrictions as described below may be set according to the user attributes of the video providing device or video group to control the information viewed on the video playback device.
  • Restriction of viewpoint movement can likewise be applied here: for example, when the viewpoint position is moved by switching the transmission-source video providing apparatus in the N-to-1 video viewing system 300, in which a plurality of video providing apparatuses form a video group (restriction of switching to a specific video providing device), or when a video providing device mounted on a moving body is used (restriction of movement into a specific area).
  • Viewing restriction can be regarded as a process of filtering video and audio.
  • As methods for realizing the viewing restriction, there is a method in which the video playback device first receives all information from the video providing device and performs filtering at playback output, and a method in which the video providing device side performs the filtering. The latter method reduces the amount of information transmitted from the video providing device, leading to effective use of the communication band.
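As one concrete example of such filtering, the mosaic processing mentioned above can be sketched as block averaging over a rectangular region of a frame. The region format and default block size here are assumptions made for illustration only.

```python
import numpy as np

def mosaic(frame, region, block=8):
    """Pixelate region (x0, y0, x1, y1) of an H x W x C frame by replacing
    each block-by-block tile with its mean color (a simple mosaic filter)."""
    x0, y0, x1, y1 = region
    out = frame.astype(float).copy()
    for y in range(y0, y1, block):
        for x in range(x0, x1, block):
            yb, xb = min(y + block, y1), min(x + block, x1)
            out[y:yb, x:xb] = out[y:yb, x:xb].mean(axis=(0, 1))
    return out
```

Running this on the provider side before encoding, rather than on the playback side, is what saves communication band.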
  • Processing for setting viewing restrictions is required prior to starting transmission of video or audio information to the video reproducing device.
  • A-3 Capacity processing of video playback apparatus
  • In the 1-to-N video viewing system 200, video captured by the video providing apparatus 201 must be simultaneously transmitted in real time to a large number of video playback apparatuses 202-1, 202-2, and so on.
  • As the number of video playback devices increases, a video transmission delay becomes apparent due to the limits of the communication band allowed for the video viewing system 200.
  • Therefore, a capacity (upper limit) may be set on the number of video playback devices to which the video providing device simultaneously transmits real-time video.
  • The capacity here refers to, for example, a number determined according to the capacity limit of the communication band. Alternatively, depending on the circumstances of the user on the video providing device side, it may be the number of transmission destinations that user is willing to accept for video captured at the viewpoint position that is his or her current position (for example, a user who does not like information to spread may set a small capacity).
  • A common capacity (number) may be set for the entire system, or each video providing device may set its capacity individually.
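For illustration, a band-derived capacity could be computed as below; the parameter names and the idea of tightening the bound with a per-user preference are assumptions based on the description above, not part of the embodiment.

```python
def capacity(total_band_mbps, per_stream_mbps, user_limit=None):
    """Upper limit on video playback devices served real-time video at once:
    the communication band divided by the bit rate of one stream, optionally
    tightened by the provider-side user's own preference."""
    cap = int(total_band_mbps // per_stream_mbps)
    if user_limit is not None:
        cap = min(cap, user_limit)  # e.g. a user who dislikes spreading info
    return cap
```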
  • When transmission start requests exceed the capacity, the video providing device evicts video playback devices so that the number of video playback devices to which it simultaneously delivers real-time video fits within the capacity.
  • The following can be listed as processing methods for selecting which video playback device to evict.
  • In the FIFO (first-in, first-out) method, video playback devices are evicted in order starting from the one that requested transmission start earliest, so a new video playback device can always receive the video (see FIG. 7). This has the merit that the opportunity to view the video shot at the viewpoint position is allocated equally among the users (viewers) of the video playback apparatuses.
  • In the LIFO (last-in, first-out) method, the video playback device that requested transmission start later is evicted (see FIG. 8); in other words, once the capacity is reached, a new video playback device is locked out and cannot view the video.
  • This has the merit that a video playback apparatus that has already made a transmission start request never has its video interrupted, so a user (viewer) who has already entered can continue viewing with peace of mind, and no member of a viewing group is expelled partway through.
  • FIG. 9 illustrates a mechanism for limiting, by the priority method, the number of video playback devices that receive video from the video providing device to within the capacity.
  • In the figure, the priority order is expressed in shades (the darker the color, the higher the priority).
  • A priority may be assigned to each video playback device in accordance with the user attributes (described above) of its user (that is, the viewer).
  • When video distribution from the video providing device is a paid service (that is, monetized), a priority may be assigned according to the amount of money paid by the video playback device (or its user).
  • The priority assigned to each video playback device may also be changed dynamically.
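The three selection methods above (FIFO, LIFO, priority) can be sketched as follows. Representing each session as an (arrival_order, priority) pair is an illustrative encoding, not taken from the embodiment.

```python
def select_eviction(sessions, policy):
    """Choose which session to evict when the capacity is exceeded.
    sessions: list of (arrival_order, priority); higher priority = keep longer."""
    if policy == "FIFO":      # evict the earliest entrant; turns rotate fairly
        return min(sessions, key=lambda s: s[0])
    if policy == "LIFO":      # evict the latest requester; entrants keep watching
        return max(sessions, key=lambda s: s[0])
    if policy == "PRIORITY":  # evict the lowest priority (ties: earliest entrant)
        return min(sessions, key=lambda s: (s[1], s[0]))
    raise ValueError(f"unknown policy: {policy}")
```

Note that under LIFO the "evicted" session is the newcomer itself, which matches the lock-out behavior described above.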
  • Real-time video transmission is not performed for video playback devices exceeding the capacity; instead, past video can be transmitted to them. According to this method, it can be presented as if the video reproducing apparatus that overflowed the capacity were not simply expelled but "dismissed into the past".
  • Specifically, the video providing device records, on an external device, the real-time video it distributes to the video playback devices within the capacity. Then, instead of transmitting real-time video directly from the video providing device, the past video recorded on the external device is distributed to the evicted video playback device (see FIG. 10).
  • The external device referred to here is, for example, a recording server that records video and is installed physically independent of the video providing device.
  • By entrusting the recording server with distributing video to the video playback devices driven out of the capacity, the load on the video providing device can be distributed.
  • A video playback device that has been driven out of the capacity cannot view live video captured at the installation location (viewpoint position) of the video providing device, but can re-experience it as long as a time delay is acceptable.
  • The real-time video captured by each video providing device is also transmitted to the recording server. On the recording server, the received video is recorded in association with information identifying the source video providing device, or information specifying the captured viewpoint position (the installation location of the video providing device).
  • For example, video transmission start requests tend to concentrate on the video providing apparatus installed at a viewpoint position from which the player who keeps the ball can be imaged.
  • In such a case, real-time video is not transmitted directly from the video providing device to the evicted video playback device; instead, past video recorded on the external device is distributed to it.
  • It is also assumed that the capacity increases (for example, the capacity limit of the communication band is relaxed) and a video playback device that had been driven out of the capacity now fits within it, so that live video can be transmitted to it. In such a case, if the video distributed to that video playback device were switched directly from past video to real-time video, the video for the intervening time difference would be lost on the video playback device. In addition, when a scene suddenly switches from past video to live video, the viewer may feel uncomfortable.
  • Therefore, so-called "chase playback" or "time-shift playback" processing (see, for example, Patent Document 4) may be performed so that the video viewed on the newly admitted video playback device catches up with the real-time video from the past video. For example, if chase playback is performed at 1.x times speed (x being an arbitrary integer), that is, slightly faster than real time, the video can be switched seamlessly to real-time video, and the viewer can watch the chase-played video without a sense of incongruity.
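The seamless switch can be quantified: a stream lagging live by D seconds and chased at playback speed s > 1 closes the gap at (s − 1) seconds per second, so it catches up after D / (s − 1) seconds of wall-clock time. A minimal sketch:

```python
def catch_up_time(delay_s, speed):
    """Wall-clock seconds of chase playback at `speed` (> 1.0) needed for a
    stream lagging live by `delay_s` seconds to catch up to real time."""
    if speed <= 1.0:
        raise ValueError("chase playback must run faster than real time")
    return delay_s / (speed - 1.0)
```

For example, a 5-minute (300-second) lag chased at 1.5x speed catches up to live in 600 seconds.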
  • FIG. 11 shows, in the form of a flowchart, a processing procedure for transmitting video captured by the video providing apparatus to a plurality of video reproduction apparatuses.
  • Upon receiving a transmission start request from a video playback device (step S1101), the video providing device checks whether the number of video playback devices to which it is transmitting video is still within the capacity (step S1102).
  • If the capacity is not exceeded even including the video playback device that requested transmission start in step S1101 (No in step S1102), the video providing device starts transmission of real-time video to that video playback device (step S1103).
  • If the capacity is exceeded, the video providing device performs a capacity determination process to determine whether the video playback device that requested transmission start in step S1101 fits within the capacity (in other words, whether this video playback device should be evicted) (step S1107). Whether this video playback device should be evicted may be determined by, for example, any of the above-described FIFO, LIFO, and priority methods.
  • If it is determined that the video playback apparatus that requested transmission start in step S1101 is within the capacity (Yes in step S1108), the video providing apparatus starts transmission of real-time video to that video playback apparatus (step S1103).
  • If it is determined that the video playback device that requested transmission start in step S1101 is out of the capacity (that is, should be evicted) (No in step S1108), the recording server transmits past video to this video playback device (step S1109). Since the recording server transmits the past video in place of the video providing device, the video transmission load is distributed (described above).
  • During transmission of past video in step S1109, the number of video playback devices to which the video providing device is transmitting real-time video may decrease, leaving the capacity vacant.
  • In that case, the video providing apparatus performs the capacity determination process again to determine whether the video playback apparatus that requested transmission start in step S1101 should now be included in the capacity (step S1111). Whether the video reproduction apparatus should be included in the capacity may be determined by, for example, any of the above-described FIFO, LIFO, and priority methods.
  • If it is determined that the video playback device that requested transmission start in step S1101 should not be included in the capacity (that is, should remain left out) (No in step S1112), the past video is continuously distributed (from the recording server) to that video playback apparatus (step S1109).
  • If it is determined that the video playback device that requested transmission start in step S1101 should be included in the capacity (Yes in step S1112), the video providing device first performs the above-described chase playback from the recording server for that video playback device (step S1113), and then switches to real-time video transmission (step S1103).
  • It is also assumed that, while real-time video is being transmitted (step S1103), another video playback device requests the same video providing device to start transmission and the capacity is exceeded. In that case, a capacity determination process is performed to determine whether the video playback apparatus that requested transmission start in step S1101 can remain within the capacity (in other words, whether it should be evicted) (step S1105). Whether this video reproduction device can remain within the capacity may be determined by, for example, any of the above-described FIFO, LIFO, and priority methods.
  • If it is determined that the device can remain within the capacity, the video providing device continues transmitting real-time video to that video playback device (step S1103).
  • If it is determined that the device should be evicted, past video is transmitted (for example, from the recording server) to this video playback apparatus (step S1109).
  • A video playback apparatus evicted due to over-capacity can, for example, view past video from the recording server with a time delay of 5 minutes.
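The decision flow of FIG. 11 can be condensed into a sketch. The function and callback names are assumptions; `keep_policy` stands in for the FIFO/LIFO/priority capacity determination and returns the set of device ids that stay within the capacity.

```python
def handle_transmission_request(active, cap, keep_policy, new_id):
    """Sketch of FIG. 11: admit within capacity (S1103); otherwise run the
    capacity determination (S1107/S1111) and either admit via chase playback
    (S1113 -> S1103) or serve past video from the recording server (S1109)."""
    if len(active) < cap:
        active.add(new_id)
        return "realtime"                  # step S1103
    keep = keep_policy(active | {new_id})  # steps S1107 / S1111
    if new_id in keep:
        active.clear()
        active.update(keep)
        return "chase-then-realtime"       # steps S1113 -> S1103
    return "past-from-recording-server"    # step S1109
```

A FIFO-like `keep_policy` would keep the earliest-arrived ids; a priority policy would keep the highest-priority ones.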
  • FIG. 12 schematically shows a functional configuration of an information processing apparatus 1200 that can function as a video providing device in the video viewing systems 100 to 400.
  • The illustrated information processing apparatus 1200 includes an imaging unit 1201, a status display unit 1202, a video encoding unit 1203, an audio input unit 1204, an audio encoding unit 1205, a multiplexing unit (MUX) 1206, a communication unit 1207, and a control unit 1208.
  • the imaging unit 1201 is configured by a multi-lens omnidirectional camera, for example.
  • Alternatively, the imaging unit 1201 may be configured as a single-lens camera (including a wide-angle or fisheye camera) or a two-lens stereo camera mounted on a support portion including an XYZ table, as shown in the figure, so that an arbitrary line-of-sight direction of the whole sky can be imaged by a swing operation.
  • the imaging unit 1201 images the surroundings using the place where the information processing apparatus 1200 is installed as the viewpoint position.
  • The imaging unit 1201 extracts, from the whole sky, the image in the line-of-sight direction designated by the control unit 1208, and outputs it to the video encoding unit 1203.
  • the video encoding unit 1203 performs encoding processing of the video signal captured by the imaging unit 1201.
  • Alternatively, the whole-sky image captured by the imaging unit 1201 may be transmitted to a server as it is, and the server may cut out an image in a predetermined line-of-sight direction from the whole-sky image and distribute it to the video playback device.
  • The audio input unit 1204 is configured by, for example, a small microphone or a stereo microphone and, installed together with the imaging unit 1201, collects the sound at the site where the whole-sky video is captured. If a stereo microphone is used, the sound at the time of collection can be reconstructed three-dimensionally on the playback side (that is, on the video playback device).
  • the audio encoding unit 1205 encodes the audio signal input by the audio input unit 1204.
  • The multiplexing unit 1206 multiplexes the encoded video signal and encoded audio signal produced by the video encoding unit 1203 and the audio encoding unit 1205, respectively, and forms them into a signal format (packet) for transmission to the video reproduction apparatus.
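The packet formation by the multiplexing unit 1206, and its inverse on the playback side, can be illustrated with a toy length-prefixed format. A real system would use a standard container such as MPEG-TS or MP4; the header layout here is purely an assumption for illustration.

```python
import struct

def mux_packet(video_es, audio_es, pts):
    """Bundle one encoded video unit and one encoded audio unit sharing a
    presentation timestamp (pts) into a single transmission packet."""
    header = struct.pack(">QII", pts, len(video_es), len(audio_es))
    return header + video_es + audio_es

def demux_packet(packet):
    """Inverse operation, as performed by a separation unit (DEMUX)."""
    pts, vlen, alen = struct.unpack(">QII", packet[:16])
    body = packet[16:]
    return pts, body[:vlen], body[vlen:vlen + alen]
```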
  • the communication unit 1207 performs mutual communication with the video reproduction device, including transmission of video and audio. Further, communication with the recording server (described above) is performed via the communication unit 1207 as necessary.
  • the communication unit 1207 performs mutual communication with a video reproduction device, a recording server, and other external devices via a wide area network such as a wireless or wired LAN or the Internet.
  • the control unit 1208 comprehensively controls the operations of the above-described units 1201 to 1207.
  • In accordance with line-of-sight information and posture information received from the video playback device (or viewing group) that is the video transmission destination, the control unit 1208 extracts, from the whole-sky image captured by the imaging unit 1201, the image of the region (viewing angle) to be displayed on the video playback device, and transmits it from the communication unit 1207 to the video reproduction device.
  • The control unit 1208 may also control the resolution of the image captured by the imaging unit 1201 (for example, lowering the resolution can suppress the viewer's video sickness and spatial disorientation when the viewpoint position changes).
  • the control unit 1208 may also control zoom (zoom up, zoom down) of the imaging unit 1201.
  • Further, in order to limit the range of information provided according to the attribute information of the video reproduction device (or viewing group) that is the video transmission destination, the control unit 1208 turns the imaging operation and audio input operation on and off, applies mosaic or mask processing to the captured video, and performs modulation processing on the input audio.
  • The status display unit 1202 displays the operation state of the information processing apparatus 1200 as a video providing apparatus to the surroundings.
  • Examples of the operation state here include: a standby state for video transmission (or video transmission stopped); a state in which video is already being transmitted to several video playback devices; a busy state in which video transmission start requests from many video playback devices are concentrated; and a state in which part or all of the operations are stopped and the service cannot be provided (described later).
  • The status display unit 1202 is configured by, for example, a light such as an LED, or a display panel such as a liquid crystal panel.
  • FIG. 13 schematically shows a functional configuration of an information processing device 1300 that can function as a video reproducing device in the video viewing systems 100 to 400.
  • The illustrated information processing apparatus 1300 includes a communication unit 1301, a separation unit (DEMUX) 1302, an audio decoding unit 1303, an audio output unit 1304, a video decoding unit 1305, a display unit 1306, a control unit 1307, a sensor unit 1308, and a sound collection unit 1309. Hereinafter, the units 1301 to 1309 will be described.
  • the communication unit 1301 performs mutual communication with the video providing apparatus, including transmission of video and audio. Further, communication with the recording server (described above) is performed via the communication unit 1301 as necessary.
  • the communication unit 1301 performs mutual communication with a video providing device, a recording server, and other external devices via a wireless or wired LAN or a wide area network such as the Internet.
  • When viewing is started, a video or audio transmission start request is transmitted from the communication unit 1301 to the video providing apparatus installed at the place (that is, the viewpoint position) whose video the user wishes to view.
  • The communication unit 1301 receives the transmission signal formed in the predetermined signal format (packet) from the video providing apparatus.
  • When the line-of-sight direction is changed, a line-of-sight direction change request is transmitted from the communication unit 1301.
  • When the viewpoint position is switched, a transmission stop request is transmitted from the communication unit 1301 to the video providing device from which video and audio are being received, and a transmission start request is transmitted from the communication unit 1301 to the switching-destination video providing device.
  • The demultiplexing unit 1302 demultiplexes the signal multiplexed and transmitted from the video providing apparatus into an encoded video signal and an encoded audio signal, and distributes them to the video decoding unit 1305 and the audio decoding unit 1303, respectively.
  • the audio decoding unit 1303 decodes the encoded audio signal to generate a baseband audio signal, and outputs the audio from the audio output unit 1304.
  • the audio output unit 1304 includes monaural, stereo, multi-channel speakers, and the like.
  • the video decoding unit 1305 decodes the encoded video signal to generate a baseband video signal, and displays the video captured by the transmission source video providing apparatus on the display unit 1306.
  • The display unit 1306 (or the information processing apparatus 1300 itself) is configured by, for example, a head-mounted display, a dome-type display, or a large-screen (or normal) monitor display.
  • the control unit 1307 controls the output of video and audio received from the video providing device.
  • the control unit 1307 controls display of UI and OSD (On-Screen Display) on the screen of the display unit 1306, and performs processing of operations performed by the user (viewer) on the UI and OSD.
  • The sensor unit 1308 measures the line-of-sight direction, head position, or posture of the user (the viewer who views the video displayed on the screen of the display unit 1306).
  • The sensor unit 1308 is configured by combining a plurality of sensor elements such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor (for example, a 9-axis sensor capable of detecting a total of nine axes: a 3-axis gyro sensor, a 3-axis acceleration sensor, and a 3-axis geomagnetic sensor).
  • the sensor unit 1308 may be integrated with the information processing apparatus 1300 main body (head mounted display, etc.), or may be an accessory part attached to the main body.
  • The sensor unit 1308 may measure the user's line-of-sight direction, head position, or posture based on the recognition result of a camera (not shown) that images the user. Alternatively, not only the user's head but also movements of the torso and limbs may be recognized from the captured image and input as gestures. Further, the sensor unit 1308 may measure pressures acting between the user's body and its surroundings, such as the load applied to the chair on which the user is sitting or the pressure applied to the soles of the shoes the user is wearing. The sensor unit 1308 may also detect biological information such as the user's brain waves, myoelectric potential, and body temperature in parallel. By using multiple sensors in combination as the sensor unit 1308, erroneous determination based on its detection results can be prevented.
  • Movements such as the user's line-of-sight direction, head position, or posture detected by the sensor unit 1308 (or gesture movements using not only the head but also the torso and limbs) may mean an operation on the UI or OSD displayed on the display unit 1306, or an indication of the viewing direction of the displayed video. For example, the user's horizontal and vertical head swings (turning to the right or left, looking up, looking down, and the like) can be handled as instructions to change the viewing direction. In addition, a movement in which the user tilts the body forward or backward may be handled as a zoom operation of the camera in the current line-of-sight direction (zoom up if tilted forward, zoom down if tilted backward). The detection result of the sensor unit 1308 is output to the control unit 1307.
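The mapping from sensor readings to viewing commands described above can be sketched as follows. The angle units and the ±15-degree lean threshold are illustrative assumptions, not values taken from the embodiment.

```python
def interpret_pose(yaw_deg, pitch_deg, lean_deg):
    """Turn sensor-unit readings into viewing commands: head swings change the
    line-of-sight direction; leaning forward/backward past a threshold zooms."""
    commands = {"gaze": (yaw_deg, pitch_deg)}  # horizontal / vertical swing
    if lean_deg > 15:
        commands["zoom"] = "up"                # tilt forward -> zoom up
    elif lean_deg < -15:
        commands["zoom"] = "down"              # tilt backward -> zoom down
    return commands
```

The resulting commands would be passed to the control unit 1307, which forwards line-of-sight changes to the video providing device.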
  • the sound collection unit 1309 is configured by a microphone or the like, collects a user's voice and the like, and outputs the collected sound to the control unit 1307.
  • The user's voice may be an impression of or exclamation about the video displayed on the display unit 1306, or may be a voice instruction to the control unit 1307 (or the video playback device) (for example, to change the line-of-sight direction of the whole-sky video).
  • Based on the user's line-of-sight direction, horizontal and vertical head swings (turning right or left, looking up, looking down, and the like), or changes in posture, the control unit 1307 transmits, via the communication unit 1301, an instruction to change the line-of-sight direction in which the whole-sky video is viewed to the video providing apparatus from which video is being received.
  • The control unit 1307 transmits the user's voice instruction collected by the sound collection unit 1309 to the video providing apparatus via the communication unit 1301, either as it is in audio form or after converting it into text information or command information.
  • When the movement of the user's line-of-sight direction, head, or posture (or a gesture using not only the head but also the torso and limbs) means an operation on the UI or OSD on the screen, the control unit 1307 performs processing on the display image of the display unit 1306 in accordance with that operation. For example, when the user performs a UI operation instructing switching of the viewpoint position, the control unit 1307 transmits, via the communication unit 1301, a transmission stop request to the video providing apparatus from which video and audio are being received, and a transmission start request to the switching-destination video providing apparatus. The switching of the viewpoint position is instructed by, for example, a "jump operation" by the user, the details of which will be described later.
  • The information processing apparatus 1300 may further include a known input device such as a keyboard, a mouse, a touch panel, a joystick, or a game controller (none of which is shown).
  • This type of input device may be used for input operations on the UI and OSD on the screen of the display unit 1306, and for instructions for changing the line of sight or switching the viewpoint position.
  • the information processing apparatus 1300 may further include a multimodal interface (MIF) 1310 as means for freely controlling the environment of the space where the video is viewed.
  • The multimodal interface 1310 may include one or more drive units such as: a blower that applies wind (a light breeze, a headwind, an air blast) or water splashes (a water blast) to the user; a temperature adjustment unit and a humidity adjustment unit that adjust the temperature and humidity around the user; a scent generation unit that presents smells and fragrances to the user; a tactile control unit that applies tactile sensations to the user's body (such as the sensation of being poked in the back, or of something touching the neck or feet); and a vibration control unit that applies vibrations to the user's body.
  • For example, by driving some of the drive units included in the multimodal interface 1310 according to the content of the video received from a video providing device at a remote location, the control unit 1307 can recreate (or feed back) the sensations experienced at the shooting site where the video providing device is installed, giving the user a realistic experience close to being at the shooting site.
  • FIGS. 47 and 48 show a configuration example of a dome type display 4700 that can be applied as the display unit 1306 of the information processing apparatus 1300 as a video reproduction apparatus. The viewer enters the dome and can observe the projected image.
  • FIG. 47 shows a cross section of the dome type screen 4701 cut by the frontal plane, and FIG. 48 shows a cross section of the dome type screen 4701 cut by the sagittal plane.
  • The illustrated dome type display 4700 includes a dome type screen 4701, a support 4702 that supports the dome type screen 4701, and two projectors 4703 and 4704. Each of the projectors 4703 and 4704 projects an image onto the dome-shaped screen 4701 based on the baseband video signal decoded by the video decoding unit 1305.
  • a chair 4706 on which a viewer who observes the projected image is seated is installed in a space formed by the dome-shaped screen 4701.
  • the inner circumference of the dome-shaped screen 4701 is a display surface for a projected image.
  • The dome-shaped screen 4701 is made of, for example, a lightweight resin such as FRP (Fiber Reinforced Plastics), metal, glass, or the like.
  • the inner peripheral surface of the dome type screen 4701 is preferably subjected to painting, coating, or other surface treatment for preventing irregular reflection of light (projected image).
  • the inner periphery of the dome-shaped screen 4701 has a spherical or hemispherical shape. When a dome-shaped screen 4701 having a spherical or hemispherical shape is used, a realistic image with a wide viewing angle in the horizontal and vertical directions can be projected. Note that the outer shape of the dome-shaped screen 4701 is not particularly limited.
  • the projected image on the dome-shaped screen 4701 makes it easier for the viewer to feel the scale of the subject than when observing a magnified virtual image with a head-mounted display.
  • When the inner diameter of the dome-shaped screen 4701 is set to about 1.5 to 2 meters, an image of a subject (a person or the like) that the viewer perceives as life-size can be displayed, increasing the sense of reality.
  • The dome type display 4700 offers a greater feeling of openness than an HMD, while immersion increases when a 360-degree all-around video is presented in the horizontal direction.
  • the support 4702 includes a pair of shaft portions 4702A and 4702B whose rotational axes coincide with each other, and the pair of shaft portions 4702A and 4702B supports the dome-shaped screen 4701 so as to be rotatable around the horizontal axis within the sagittal plane.
  • The structure is not limited to support by the pair of shaft portions 4702A and 4702B, as long as the support 4702 can rotatably support the dome-shaped screen 4701 around the horizontal axis within the sagittal plane.
  • The support 4702 may also include a mechanism that supports the dome-shaped screen 4701 rotatably about a vertical axis. Further, the support 4702 may have a structure that gives the dome-shaped screen 4701 degrees of freedom other than rotation, such as vertical movement.
  • the two projectors 4703 and 4704 respectively project the video signal (wide viewing angle video signal) supplied from the video decoding unit 1305 onto the inner periphery of the dome-shaped screen 4701.
  • Each projector 4703 and 4704 can project an image with high saturation and good color reproducibility onto a dome-shaped screen 4701 using a laser or LED as a light source.
  • Each of the projectors 4703 and 4704 is fixed near the edge of the dome-shaped screen 4701, with its position and posture relative to the screen set so that the two projected images together can cover the entire display surface on the inner periphery of the dome-shaped screen 4701.
  • Each of the projectors 4703 and 4704 is fixed to the dome-shaped screen 4701 via a table (not shown) having six degrees of freedom (translation along and rotation around three axes), so that each optical axis (projection direction) can be finely adjusted.
  • When the dome-shaped screen 4701 is rotated around the horizontal axis (described later), the projectors 4703 and 4704 move together with it.
  • A wide viewing angle image can be presented on the dome-shaped screen 4701 by performing stitching processing on the joint between the images projected from the projectors 4703 and 4704. Any stitching algorithm can be applied. It is assumed that the projected images from the projectors 4703 and 4704 each have a resolution of 4K (4000 pixels wide × 2000 pixels high). Further, distortion of the wide viewing angle video caused by optical distortion of the projectors 4703 and 4704 or by deformation of the inner periphery of the dome-shaped screen 4701 (including changes over time) may be corrected by image processing.
  • a test pattern having a known shape may be projected from the projectors 4703 and 4704 onto the dome-shaped screen 4701, and image processing may be performed to cancel the distortion of the projected image of the test pattern. Further, distortion of the projected image caused by positioning errors when the projectors 4703 and 4704 are fixed to the dome type screen 4701 may be corrected by image processing. Further, a GUI including menus, buttons, and the like may be superimposed and displayed on the omnidirectional video projected from each projector 4703 and 4704.
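The test-pattern correction mentioned above can be sketched as a first-order pre-distortion: each pattern point's observed error is applied in reverse before projection so that the error cancels out. This is a simplified, point-wise illustration under assumed names, not the patent's method; a real system would fit a dense warp mesh or homography over the whole image.

```python
# Illustrative sketch of test-pattern-based distortion correction: for each
# known pattern point, compare where it was expected to land with where it
# was actually observed, and pre-offset the source pixel in the opposite
# direction so the projected result lands on target.

def build_predistortion_map(expected_pts, observed_pts):
    """Map each expected point to the source point the projector should emit."""
    correction = {}
    for (ex, ey), (ox, oy) in zip(expected_pts, observed_pts):
        # Shift by the negative of the observed error: src = expected - (observed - expected)
        correction[(ex, ey)] = (2 * ex - ox, 2 * ey - oy)
    return correction
```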
  • The dome type display 4700 is assumed to be installed and used indoors, but it may of course be installed and used outdoors. A moving part such as a caster may be attached to the lower end of the support 4702 so that the installation location can be moved easily. It is also assumed that one dome type display 4700 is used not only by a single person but also by a plurality of persons, including B2B (Business to Business) use.
  • The dome type screen 4701 is rotatably supported. As shown in FIGS. 47 and 48, when the dome-shaped screen 4701 is supported substantially horizontally, a 360-degree all-around image can be presented in the horizontal direction on its display surface. On the other hand, as shown in FIGS. 49 and 50, when the dome-shaped screen 4701 is rotated by 90 degrees around the horizontal axis in the sagittal plane about the rotation axis of the shaft portions 4702A and 4702B, a 360-degree all-around image can be presented in the vertical direction. For example, when observing a wide viewing angle image of the sky, high-rise buildings, and the like, rotating the dome-shaped screen 4701 by 90 degrees as shown in FIGS. 49 and 50 also allows an image below (for example, the ground) to be presented. Further, the dome type screen 4701 is not limited to the horizontal and vertical orientations shown in FIGS. 47 to 50: the dome type display 4700 can also be used with the dome type screen 4701 tilted in the sagittal plane at an arbitrary angle between 0 and 90 degrees around the horizontal axis.
  • the dome-type display 4700 includes two projectors 4703 and 4704, but three or more projectors may be installed.
  • FIG. 53 shows a configuration example of a dome type display 4700 in which two projectors 4708 and 4709 are attached to a dome type screen 4701 in addition to the projectors 4703 and 4704.
  • FIG. 54 shows a state where a large number of pico projectors are installed on a dome-shaped screen 4701. Increasing the number of projectors installed can improve the brightness, contrast, and resolution of projected images.
  • A projected image from one projector may be blocked by, for example, a viewer's outstretched hand, but the blocked region can be supplemented by the projected image from another projector.
  • The control unit 1307 may control the projectors so that only some of them are driven according to the viewer's body posture, hand position, and the like. For example, a camera, distance sensor, or the like may be installed for each projector to detect whether there is an obstacle between the projector and the surface of the screen 4701, or whether a shadow falls on the projected image; a projector that cannot project its image well may then be turned off and an adjacent projector turned on instead.
  • In FIG. 54, the pico projectors shown in white are on, and those shown in gray are off.
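The occlusion-driven failover described above might be sketched as follows; the projector IDs and the neighbor table are illustrative assumptions.

```python
# Sketch of projector failover: projectors whose light path is detected as
# blocked (e.g. by a per-projector camera or distance sensor) are turned off
# and the first available unblocked neighbor is lit instead.

def reassign_projectors(base_on, blocked, neighbors):
    """Return the list of projectors to light, given the blocked ones."""
    lit = []
    for p in base_on:
        if p in blocked:
            # Replace the shadowed projector with an unblocked neighbor.
            for n in neighbors.get(p, []):
                if n not in blocked and n not in base_on and n not in lit:
                    lit.append(n)
                    break
        else:
            lit.append(p)
    return lit
```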
  • C. UI for viewpoint movement (video switching)
  • When the video providing device serving as the video transmission source is switched, the viewpoint position of the video viewed on the video playback device can be moved instantaneously to the installation location of the newly selected video providing device (see, for example, FIG. 6).
  • Conventionally, display content is switched in such a way that viewable moving image contents are represented by a plurality of thumbnails and a clicked moving image content is played back (see, for example, Patent Document 5).
  • In contrast, the user (viewer) of the video playback device switches viewpoints while virtually experiencing the all-sky video from the viewpoint of the installation location of the current transmission-source video providing device, rather than by selecting from a plurality of thumbnails.
  • FIG. 14 shows a display example of a screen where a video playback device is viewing a viewpoint video from a video providing device.
  • In this video, a video providing device 1401 other than the one transmitting the video also appears. Another video providing device 1401 appearing on the display screen of the video playback device is a candidate destination for the next viewpoint position. Therefore, when another video providing device 1401 is found in the display screen, a target mark 1501 indicating a candidate destination for the viewpoint position is superimposed and displayed on that video providing device, as shown in FIG. 15.
  • In the illustrated example, another video providing apparatus 1401 is installed at a position close to the landing point of a long pass.
  • Suppose the user wants to move the viewpoint position to the installation location of the video providing device 1401, from which the ball can be seen well. In that case, the user (viewer) instructs movement of the viewpoint position by performing a "jump operation" that lightly moves the head or torso up and down.
  • When the user's jump operation is detected based on the measurement result of the sensor unit 1308 (see FIG. 13) of the video playback device, it is determined that movement of the viewpoint position to the place where the target mark 1501 is displayed has been instructed.
  • a transmission stop request is transmitted to the video providing apparatus that is the transmission source of the currently displayed video, and a transmission start request is transmitted to the destination video providing apparatus 1401.
  • When video transmission is started from the video providing device 1401, the viewpoint movement is complete, and the video playback device can display the video captured from the new viewpoint position.
  • A position management server that manages the installation locations of all the video providing apparatuses accessible by the video playback apparatus may be provided, and the video playback apparatus may query the server, every time the viewpoint position or line-of-sight direction changes, for the video providing apparatuses existing within the display angle of view and their installation locations.
  • For example, a video providing apparatus mounted on a moving body may notify the position management server of its position information every time it moves (or periodically).
  • Alternatively, each video providing apparatus may intermittently transmit a beacon signal describing its own position information, and the video playback apparatus may determine, based on the received beacon signals, which video providing apparatuses exist within the display angle of view.
  • Alternatively, the video playback device may intermittently transmit a position information notification request to surrounding video providing devices, and a video providing device that receives this request may return its position information to the video playback device.
  • FIG. 16 shows a bird's-eye view of a space in which a plurality of video providing devices accessible from the video playback device (that is, devices to which a video transmission start request can be issued) are scattered. In FIG. 16, reference numeral 1601 denotes the video providing apparatus currently transmitting video to the video playback apparatus, and reference numerals 1611 to 1616 denote video providing apparatuses that are accessible but not transmitting video (that is, waiting to transmit video to the video playback apparatus).
  • The position information of each of the video providing devices 1601 and 1611 to 1616 is acquired by an arbitrary method, for example by querying the position management server that manages them, by receiving beacon signals from the devices, or by the video playback device inquiring of the devices 1611 to 1616 directly. Then, when the displayed angle of view 1622 is calculated based on the position information of the video providing device 1601, the line-of-sight direction of its imaging unit, the zoom setting, and the like, it can be determined that the video providing devices 1611 and 1612 appear on the screen.
  • Such an indexing process is performed, for example, every time the video playback device starts displaying video, changes the line-of-sight direction, or switches the viewpoint position.
  • the indexing process may be executed by the video playback device, or may be executed by the location management server and notified to the video playback device.
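A minimal sketch of this indexing process, assuming flat 2D positions and a horizontal angle of view only (a real system would also account for elevation and zoom, as the text notes):

```python
import math

# Determine which video providing devices fall inside the displayed field of
# view, given the current viewpoint, the gaze bearing, and the horizontal
# display angle of view. Coordinates and angles are illustrative.

def providers_in_view(viewpoint, gaze_deg, fov_deg, providers):
    visible = []
    for name, (x, y) in providers.items():
        bearing = math.degrees(math.atan2(y - viewpoint[1], x - viewpoint[0]))
        # Signed angular difference from the gaze, folded into [-180, 180).
        diff = (bearing - gaze_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= fov_deg / 2.0:
            visible.append(name)
    return visible
```

With the transmitting device 1601 at the origin gazing along the x-axis and a 90-degree field of view, devices ahead of the viewpoint are reported visible while those off to the side are not.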
  • Alternatively, the video providing devices 1611 and 1612 may be found on the screen by recognizing an image of each video providing device (or a two-dimensional barcode or visual marker provided on it) included in the video displayed on the screen.
  • FIGS. 14 and 15 show an example in which, for the sake of simplicity, only one video providing device exists on the screen and a target mark is displayed on that single device. In practice, two or more video providing devices may exist on the display screen of the video playback device, in which case a target mark is superimposed on each of them.
  • FIG. 17 shows a state in which target marks 1701, 1702, and 1703 are superimposed and displayed on a plurality of video providing apparatuses existing on the display screen, respectively.
  • The user can select a target mark by gazing at the target mark at the desired destination viewpoint position (or by continuing to watch it for a certain period of time).
  • In the illustrated example, the video providing apparatus on which the target mark 1703 is superimposed appears at the center of the display screen.
  • The selected target mark 1703 may be highlighted, or the other target marks 1701 and 1702 may be grayed out, to visually indicate that the target mark 1703 is selected.
  • In the examples described so far, the target marks are arranged on video captured in the horizontal direction.
  • The target mark may instead be displayed on an image viewed from above rather than in the horizontal direction. For example, an image viewed from above can be synthesized from a plurality of images captured in the horizontal direction. Alternatively, a map, a floor plan of a building, or the like may be used instead of a bird's-eye view image from the sky.
  • FIG. 40 shows an example in which target marks 4001, 4002, and 4003 indicating the installation locations of the respective video providing devices are displayed on the ground video viewed from above.
  • Using such a screen, the user (viewer) can select the next viewpoint position with the image of hovering in the sky, like a helicopter, and then landing.
  • When the user keeps looking up for a certain time (for example, about several seconds), a target mark 4201 appears in the sky (in the direction of the user's line of sight), as shown in FIG. 42. The view then switches, as shown in FIG. 43, to an image hovering in the sky and looking down on the ground. Although illustration is omitted, it is assumed that target marks are displayed on the ground as in FIG. 40, and the user can look down at them.
  • The selected target mark 4401 switches to highlighted display as shown in FIG. 44.
  • When the user performs a "jump operation" that moves the torso up and down again while the target mark 4401 is selected, the user descends to the point where the target mark 4401 is displayed, that point becomes the new viewpoint position, and the image switches to a ground-level image (not shown).
  • Processing for switching the viewpoint position includes transmitting a transmission stop request to the video providing device that is the transmission source of the currently displayed video, and transmitting a transmission start request to the destination video providing device.
  • FIG. 39 shows a communication sequence example when the viewpoint position is switched in the video reproduction apparatus.
  • When the user (viewer) performs a jump operation or the like and a viewpoint position switching instruction is issued (SEQ 3901), the video playback device 3901 transmits a transmission stop request to the video providing device 3902, the transmission source of the currently displayed video (SEQ 3902).
  • the video playback device 3901 transmits a transmission start request to the destination video providing device 3903 (SEQ 3903).
  • Information regarding the current line-of-sight direction of the video reproduction device 3901 is added to the transmission start request and transmitted.
  • Thereby, the video transmitted from the video providing device 3903 can maintain the same line-of-sight direction as before the viewpoint movement, and spatial disorientation of the user (viewer) can be suppressed (described later).
  • In the illustrated sequence, the transmission stop request is transmitted first and the transmission start request afterwards; however, the video playback device may transmit the two requests simultaneously, or may transmit the transmission start request first.
  • It takes time from when video transmission stops from the current transmission-source video providing device until the video playback device switches to the video from the destination video providing device. Therefore, the video playback device may buffer the video from the current transmission source and, based on the buffered video, display a transition video until the switch to the destination video is completed (SEQ 3904).
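The sequence SEQ 3901 to SEQ 3904 can be modeled in a few lines; the class and method names below are assumptions for illustration, not the patent's implementation.

```python
# Simplified model of the communication sequence of FIG. 39: the playback
# device sends a stop request to the current source, a start request carrying
# the line-of-sight direction to the destination, and renders a transition
# from buffered frames while waiting for the new stream.

class Provider:
    def __init__(self, transmitting=False):
        self.transmitting = transmitting
        self.gaze_deg = None

    def receive(self, request, gaze_deg=None):
        if request == "stop":
            self.transmitting = False
        elif request == "start":
            self.transmitting = True
            self.gaze_deg = gaze_deg   # camera turned to the requested gaze

class PlaybackDevice:
    def __init__(self):
        self.frame_buffer = []

    def buffer_frame(self, frame, keep=30):
        # Keep roughly the last second of frames at 30 fps.
        self.frame_buffer = (self.frame_buffer + [frame])[-keep:]

    def switch_viewpoint(self, source, destination, gaze_deg):
        source.receive("stop")                  # SEQ 3902
        destination.receive("start", gaze_deg)  # SEQ 3903, gaze attached
        return list(self.frame_buffer)          # frames for the transition (SEQ 3904)
```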
  • The transition video is preferably one that has the effect of suppressing spatial disorientation of the user (viewer) before and after the video is switched. Details of this point will be described later.
  • It is assumed that, during video transmission, the video playback device 3901 sequentially transmits line-of-sight direction change requests to the video providing device 3902 or 3903.
  • The target mark indicating the status of a video providing device can thus also serve as an indicator that shows the user (viewer) of the video playback device a viewpoint position to which movement is possible.
  • The video providing device 1900 may also light an indicator lamp 1901 such as an LED to signal its presence.
  • The light emitted by the indicator 1901 makes the presence of the video providing device 1900 conspicuous within the screen of the video playback device, announces its presence at the installation location, and can also be used to display the operating state of the video providing device 1900.
  • the operation state may be displayed on a display panel such as a liquid crystal display.
  • the light 1901 corresponds to the status display unit 1209 equipped in the information processing apparatus 1200 illustrated in FIG.
  • The lighting color and intensity of the indicator 1901 may be switched according to the video transmission state, for example: when the video providing apparatus 1900 is in a standby state for video transmission (or video transmission is stopped); when video is already being transmitted to some video playback apparatuses; when the apparatus is busy with video transmission start requests from a large number of video playback apparatuses; or when the service cannot be provided because part or all of its operation is stopped. Further, the indicator 1901 may blink, and the blinking pattern may be switched according to the video transmission state.
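One possible state-to-indicator mapping is sketched below. The specific colors and blink patterns are assumptions, since the text only states that color, intensity, and blinking may vary with the transmission state (and that a standby device may stay unlit to reassure bystanders).

```python
# Illustrative mapping from video transmission state to the indicator 1901's
# lighting. The entries are hypothetical examples, not specified behavior.

INDICATOR_STATES = {
    "standby":      ("off",   "none"),   # not transmitting; unlit
    "transmitting": ("green", "solid"),
    "busy":         ("amber", "blink"),  # flooded with start requests
    "fault":        ("red",   "blink"),  # part or all of operation stopped
}

def indicator_for(state):
    return INDICATOR_STATES.get(state, ("off", "none"))
```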
  • The video providing apparatus 1900 is a useful information source for remote video playback apparatuses, but it risks giving people near its installation location the impression of being monitored or filmed covertly. Therefore, the indicator 1901 of a video providing apparatus 1900 that is not currently transmitting video may be turned off, so that nearby people can feel reassured even when they see the apparatus.
  • When the support unit 502 is configured to support a twin-lens stereo camera rotatably about the pitch axis, the robot head may be lowered in the standby state as shown in FIG. 24.
  • Even in the standby state, the video providing apparatus 1900 preferably keeps the camera and the actuators active so that it can respond to a transmission start request quickly.
  • When the video providing apparatus in the standby state shown in FIG. 24 receives a transmission start request from one of the video playback apparatuses and starts imaging, driving of the pitch axis is resumed and the robot head returns to its raised posture. When returning, the yaw axis may also be driven so that the stereo camera faces the line-of-sight direction designated by the video playback device that issued the transmission start request. By maintaining the line-of-sight direction before and after the viewpoint position moves, spatial disorientation of the user (viewer) can be suppressed (described later).
  • Alternatively, the stereo camera lens may be provided with an openable and closable lid 3801; keeping the lid 3801 closed indicates that no image is being captured, giving surrounding people a sense of security.
  • When the video playback device is configured as a head-mounted display, the user's field of view is blocked by the displayed video, so operating an input device may be difficult and the operation is not intuitive.
  • With gesture input, the hands must switch between functions, and the input operation becomes discontinuous.
  • In contrast, the "jump operation" of lightly moving the head or torso up and down does not conflict with other input operations, and is considered intuitive and easy to understand.
  • The sensor unit 1308 (see FIG. 13) of the video playback device is configured, for example, by combining a plurality of sensor elements such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor, and detects the user's jump operation based on measured head movement. Alternatively, the sensor unit 1308 may detect a jump operation based on the recognition result of video (not shown) capturing the user. Further, the sensor unit 1308 may measure pressure acting between the user's feet and the ground, such as the load applied to a chair on which the user is sitting or the load (pressure) applied to the soles of the user's shoes.
  • The sensor unit 1308 may also detect biological information such as the user's brain waves, myoelectric potential, and body temperature in parallel. By using multiple sensors in combination as the sensor unit 1308, erroneous determination based on its detection results can be prevented.
  • The movements people picture vary greatly between individuals. Even when the motion data values detected by sensors or from analysis of camera images are the same, one person may intend an upward head movement while another intends no such action. Individual differences are likewise large for actions such as turning the head up or down and lightly moving the torso up and down in a sitting posture. Therefore, in order to realize a comfortable user interface, it is preferable to have the viewer perform a predetermined set of actions in advance, such as turning the head up, turning it down, and lightly moving the torso up and down while seated, and to match the intended motions against the measured motion data, that is, to perform machine learning beforehand.
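As a minimal stand-in for this per-user calibration, one could place a decision threshold between the sensor peaks of motions the user labels as intended jumps and those of ordinary head movements. This is only a sketch; a production system would train a proper classifier over multiple sensor channels.

```python
# Hypothetical per-user calibration for jump detection: record peak vertical
# acceleration values from labeled intended jumps versus ordinary head bobs,
# and place the threshold midway between the two groups.

def calibrate_threshold(intended_peaks, unintended_peaks):
    """Threshold midway between the weakest jump and the strongest non-jump."""
    return (min(intended_peaks) + max(unintended_peaks)) / 2.0

def is_jump(peak_accel, threshold):
    return peak_accel >= threshold
```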
  • The target mark that guides the user to a viewpoint position is a UI element displayed superimposed on a video providing device appearing on the screen of the video playback device (described above), and serves as an indicator of viewpoint positions to which the user (viewer) can move.
  • In addition, the viewpoint positions indicated by other video playback devices may be aggregated and their distribution displayed in heat map format.
  • The target mark and the heat map are alike in that both serve to guide the viewpoint position.
  • the heat map has the effect of visually representing popular viewpoint positions.
  • The tendencies of other video playback devices can serve as reference information or clues for deciding the next viewpoint position.
  • FIG. 20 shows a state in which heat maps 2001 to 2004 are superimposed on a display image of a soccer stadium.
  • Through the screen shown in the figure, the user (viewer) of the video playback device can see at a glance which locations are indicated (or favored) by more video playback devices, and can use this to decide which viewpoint position to move to.
  • In other words, the heat map can be called a collective pointer display. Note that the heat map is mapped not to pixel positions on the display screen but to the actual three-dimensional space. Therefore, even if the user changes the line-of-sight direction, the heat map remains superimposed on the same places in the three-dimensional space.
  • FIG. 21 shows an example of a processing procedure for displaying a target mark or a heat map in the video reproducing apparatus in the form of a flowchart.
  • First, in step S2101, it is checked whether the number of other video playback devices viewing video shot in the same space as this video playback device exceeds a predetermined number.
  • If the number does not exceed the predetermined number, the video playback device executes normal target mark display processing (step S2102) and ends this processing routine. This is because, when the number of other video playback devices is below the predetermined number, taking their distribution is not considered to yield information powerful enough to guide the viewer's viewpoint position.
  • In the normal target mark display process, a target mark is superimposed and displayed on each video providing device appearing on the current display screen, as shown in FIG. 17, for example.
  • If the number exceeds the predetermined number, the video playback device displays a heat map. Specifically, the distribution of the locations indicated by each of the other video playback devices within the three-dimensional space of the field of view being viewed by this video playback device is taken (step S2103), and a heat map plotting that distribution on a color scale or gray scale is created (step S2104).
  • the video reproduction apparatus displays the created heat map superimposed on the screen (step S2105).
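Steps S2103 and S2104 can be sketched as a simple binning and normalization pass, simplified here to 2D ground coordinates (the actual mapping is to three-dimensional space, as noted above).

```python
# Sketch of heat map creation: bin the locations indicated by other playback
# devices into grid cells (step S2103) and normalize counts to a 0-255
# intensity for a gray-scale heat map (step S2104). Cell size is illustrative.

def build_heat_map(points, cell_size=1.0):
    counts = {}
    for x, y in points:
        key = (int(x // cell_size), int(y // cell_size))
        counts[key] = counts.get(key, 0) + 1
    peak = max(counts.values())
    return {k: round(255 * v / peak) for k, v in counts.items()}
```

The resulting per-cell intensities would then be rendered superimposed on the video (step S2105), anchored to world positions rather than screen pixels.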
  • The following measures may be taken to prevent spatial disorientation when the viewpoint position is moved by a jump operation.
  • FIG. 22 is a bird's-eye view illustrating measure (1) described above, in which the display switches to the image of the next viewpoint position while the line-of-sight direction is maintained.
  • In FIG. 22, reference numeral 2201 indicates the viewpoint position before movement, and reference numeral 2211 indicates the line-of-sight direction before movement at this viewpoint position 2201.
  • Before the movement, the video providing device installed at the viewpoint position 2201 captures the line-of-sight direction 2211, and the video playback device displays the captured video.
  • Reference numeral 2202 indicates the viewpoint position after movement (that is, the position selected by the target mark), and reference numeral 2212 indicates the line-of-sight direction after movement at this viewpoint position 2202.
  • the video displayed on the video playback device is switched to the video obtained by photographing the line-of-sight direction 2212 by the video providing device installed at the viewpoint position 2202.
  • The next viewpoint position 2202 lies ahead along the line of sight viewed so far. When the video is switched to the video at the next viewpoint position 2202 while the line-of-sight direction is kept, the user (viewer) of the video playback device can easily understand that only the position has moved forward and the line-of-sight direction is unchanged, so loss of orientation can be prevented.
  • When the video playback device attaches information about its current line-of-sight direction to the request for the video providing device installed at the viewpoint position 2202 to start transmission, the line-of-sight direction can be maintained in the video after the viewpoint moves, as shown in FIG. 22, and spatial disorientation of the user (viewer) can be suppressed.
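The line-of-sight preservation amounts to keeping the absolute (world) gaze bearing constant across the move: the destination camera's local pan is the world gaze minus the destination device's mounting heading. The heading convention below is an assumption for illustration.

```python
# Sketch of gaze preservation across a viewpoint move, with all angles in
# degrees measured in a shared world frame.

def world_gaze(src_heading_deg, src_pan_deg):
    """Absolute gaze bearing of the source camera."""
    return (src_heading_deg + src_pan_deg) % 360.0

def pan_after_move(world_gaze_deg, dest_heading_deg):
    """Local pan the destination camera must adopt to keep the same gaze."""
    return (world_gaze_deg - dest_heading_deg) % 360.0
```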
  • For example, when the viewpoint position is moved across the pitch in a soccer stadium, switching to the video of the next viewpoint position while maintaining the line-of-sight direction keeps the user from losing direction, but the image of the stand is displayed and the game can no longer be watched. Therefore, as post-processing after the viewpoint position is switched, an operation may be performed that turns the line of sight toward the pitch at the new viewpoint position.
  • Alternatively, the line-of-sight direction immediately after the viewpoint position moves may be controlled according to the state of the video playback device. For example, the degree to which spatial disorientation occurs differs depending on whether the user of the video playback device is standing while watching the video, sitting while watching, or walking (or running) while watching.
  • Spatial disorientation may also be suppressed by controlling the video resolution instead of (or in conjunction with) controlling the line-of-sight direction immediately after the viewpoint position moves. If the resolution is reduced, viewers become less prone to motion sickness. Moreover, while the viewer is running it is difficult to discern fine detail in the video, so lowering the resolution causes no adverse effect.
  • The effect lines (2) and motion blur (3) mentioned above are realized by adding animated video effects to the video before and after the viewpoint position moves.
  • Taking a baseball game as an example, FIGS. 27 to 36 illustrate the video effects inserted when the viewpoint position is switched from the video shown in FIG. 25, in which the pitcher is imaged from behind by a video providing device installed behind center field (the back screen), to the video shown in FIG. 26, in which the pitcher is imaged from the front by a video providing device installed behind the catcher.
  • By inserting video effects before and after the viewpoint switch, such as a blur animation that mimics the motion of jumping toward the target viewpoint position, time is gained until the next video can be displayed, and spatial disorientation is prevented.
  • a cylindrical focus 2701 is displayed at the destination viewpoint position.
  • the target mark is switched to the display of the focus 2701.
  • the focus 2701 may be an animation indicating an elapsed time from the start of the viewpoint position switching process (or the remaining time until the viewpoint position switching process is completed).
  • an effect line 3001 is displayed from the point where the focus 2701 appears.
  • the illustrated effect line 3001 is composed of innumerable concentric circles whose intervals gradually decrease toward the destination viewpoint position (or the center of the focus 2701), and visually represents the direction toward the center of the circle.
  • other effect lines such as a “concentrated line” that draws innumerable lines from the periphery toward the viewpoint position of the movement destination may be used.
  • the effect line 3001 has a visual effect of guiding the viewer's line of sight to the destination viewpoint position (the place where the focus 2701 is arranged), and can suppress spatial disorientation and video sickness.
  • an animation that shrinks the radius of each circle composing the effect line 3001 may be displayed to visually express the progress of movement toward the center of the circles.
  • the motion sickness of the user can be suppressed by synchronizing the animation of the effect line 3001 with the acceleration of the movement to the viewpoint position.
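The shrinking-circle animation synchronized with movement acceleration might be sketched as follows. This is a hypothetical illustration; the frame rate, shrink-rate formula, and radii are assumptions not found in the specification.

```python
# Hypothetical sketch of the effect-line animation: concentric circles whose
# radii shrink each frame, with the shrink rate tied to the (assumed) movement
# acceleration, visually pulling the eye toward the destination focus.

def effect_line_radii(radii, acceleration, dt=1.0 / 60):
    """Advance one animation frame; circles shrink faster under harder
    acceleration, synchronizing the animation with the viewpoint motion."""
    rate = 1.0 - min(0.9, acceleration * dt)
    # Drop circles that have collapsed to (near) zero radius.
    return [r * rate for r in radii if r * rate > 1.0]

frame = [300.0, 200.0, 100.0]
for _ in range(3):
    frame = effect_line_radii(frame, acceleration=6.0)
print([round(r) for r in frame])
```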
  • as shown in FIGS. 33 to 35, after a blur animation that follows the motion of jumping toward the center of the circles (the focus 2701) is inserted, the display is switched to the video captured at the destination viewpoint position, as shown in FIG. 36.
  • a transition image using an effect line or blur is inserted as shown in FIGS.
  • blur also has the effect of reducing the resolution of the original video, and can suppress spatial disorientation and video sickness in the user (viewer).
  • the display time of the transition video covers the time required to complete the process of switching the transmission-source video providing device.
  • while displaying a transition video using animation such as an effect line or blur, the video playback device transmits a transmission stop request to the video providing device that is the transmission source of the currently displayed video, and transmits a transmission start request to the destination video providing device.
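The stop/start handshake described above can be sketched as a minimal message exchange. The class and message names below are illustrative assumptions, not interfaces from the specification.

```python
# Minimal sketch (hypothetical message names) of the handshake the passage
# describes: when the viewpoint moves, the playback device asks the current
# providing device to stop and the destination device to start transmitting.

class ProvidingDevice:
    def __init__(self, name):
        self.name = name
        self.transmitting = False

    def handle(self, request):
        if request == "TRANSMISSION_START":
            self.transmitting = True
        elif request == "TRANSMISSION_STOP":
            self.transmitting = False
        return self.transmitting

class PlaybackDevice:
    def __init__(self, source):
        self.source = source  # device currently supplying video

    def move_viewpoint(self, destination):
        # Stop the current source, then start the destination; a transition
        # video would cover the switch-over delay on the display side.
        self.source.handle("TRANSMISSION_STOP")
        destination.handle("TRANSMISSION_START")
        self.source = destination

center = ProvidingDevice("behind center field")
catcher = ProvidingDevice("behind catcher")
center.transmitting = True

player = PlaybackDevice(center)
player.move_viewpoint(catcher)
print(center.transmitting, catcher.transmitting)  # False True
```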
  • FIG. 37 shows a processing procedure for moving the viewpoint of the displayed video in the video playback apparatus in the form of a flowchart.
  • the illustrated processing procedure can be realized, for example, in a form in which a predetermined program code is executed in the control unit 1307 in the information processing apparatus 1300 illustrated in FIG. 13.
  • every time the display video is switched by the video playback device, such as when the viewpoint position is switched or when the line-of-sight direction is switched at the same viewpoint position (Yes in step S3701), the video providing devices present on the current display screen are detected (step S3702).
  • to detect them, the video playback device may recognize the display video and detect the video providing devices included in the screen, may query an external location management server, or may acquire position information through direct communication with each video providing device and determine whether that device is present in the screen (as described above).
  • in step S3703, it is checked whether or not a video providing device has been detected in the current display screen.
  • the location with the target mark (that is, the installation location of the corresponding video providing device) is a candidate for the destination of the viewpoint position.
  • when the user stares at one target mark as the movement destination (or gazes at it for a certain time), that viewpoint position (the video providing device installed at that viewpoint position) is selected.
  • when the user performs a jump operation to instruct movement of the viewpoint position to the place where the target mark is attached (Yes in step S3707), the video playback device starts the process of moving the viewpoint to the video captured from the destination viewpoint position (step S3708).
  • the user's jump operation can be detected by, for example, the sensor unit 1308 in the video playback device.
  • the sensor unit 1308 includes a plurality of sensor elements such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor, a camera, and the like.
  • the jump motion may be detected based on the outputs of these sensors.
  • in parallel, the jump motion may be detected from the user's brain waves or the like.
  • the user may instruct the movement of the viewpoint position via a normal input device such as a keyboard or a mouse.
  • the video playback device transmits a transmission stop request to the video providing device that is the transmission source of the currently displayed video, and also transmits a transmission start request to the destination video providing device.
  • the viewpoint position moving process in step S3708 includes a display transition process for switching the display to the video imaged at the destination viewpoint position.
  • the video playback device displays a transition video using animation such as an effect line and blur as shown in FIGS.
  • the transition video has the effect of reducing the user's spatial disorientation while gaining display time until the viewpoint-switching process is completed.
  • in step S3708, an effect in which wind blows toward the user from the movement destination, or in which objects such as leaves or pieces of paper are blown toward the user, may be presented to suppress spatial disorientation.
  • the movement of the viewpoint position in step S3708 is completed when the video signal from the destination video providing device reaches the video playback device and display of the video captured at the new viewpoint position begins. Thereafter, the process returns to step S3701, and the same processing is repeated every time the display video is switched by the video playback device.
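The FIG. 37 flow described above can be summarized as an event loop: detect providing devices whenever the display switches, let gaze dwell select a target mark, and let a jump gesture trigger the viewpoint move. The sketch below is a hypothetical simplification; event names and the step mapping in comments follow the text, everything else is assumed.

```python
# Hypothetical sketch of the FIG. 37 flow. Each time the display switches,
# detect providing devices in the frame (S3702); a gaze dwell selects a
# target mark; a jump gesture (S3707) triggers the viewpoint move (S3708).

def viewpoint_loop(events):
    """events: list of (kind, payload) tuples simulating sensor input."""
    selected = None
    log = []
    for kind, payload in events:
        if kind == "display_switched":               # S3701
            log.append(("detect_devices", payload))  # S3702
            selected = None                          # old selection is stale
        elif kind == "gaze_dwell":                   # stare at one target mark
            selected = payload                       # select that viewpoint
        elif kind == "jump" and selected:            # S3707
            log.append(("move_viewpoint", selected)) # S3708
    return log

log = viewpoint_loop([
    ("display_switched", ["camA", "camB"]),
    ("gaze_dwell", "camB"),
    ("jump", None),
])
print(log)  # [('detect_devices', ['camA', 'camB']), ('move_viewpoint', 'camB')]
```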
  • FIG. 45 shows a processing procedure for starting video transmission in the video providing apparatus in the standby state in the form of a flowchart.
  • the illustrated processing procedure can be realized, for example, by executing a predetermined program code in the control unit 1208 of the information processing apparatus 1200 shown in FIG. 12.
  • when the video providing device in the standby state receives a transmission start request from any video playback device (Yes in step S4501), it performs confirmation processing on the requesting video playback device, such as checking user attributes (step S4502), and checks whether or not transmission of video to that video playback device is restricted (step S4503).
  • the viewing restriction of the video playback device is as already described in the above section A-2.
  • if transmission of the video is not permitted to the requesting video playback device (No in step S4503), the process returns to step S4501 and again waits until a transmission start request is received.
  • if transmission of the video is permitted to the requesting video playback device (Yes in step S4503), the line-of-sight direction is initialized to the direction specified in the transmission start request received in step S4501 (step S4504), and transmission of the captured video is started (step S4505).
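The FIG. 45 standby behavior reduces to: accept a start request only after the viewing-restriction check, then initialize the line of sight from the request. The function and field names below are illustrative assumptions.

```python
# Hedged sketch of FIG. 45: a providing device in standby accepts a start
# request only after checking viewing restrictions (S4502-S4503), then
# initializes the line of sight specified in the request (S4504) and starts
# transmitting (S4505). Names here are illustrative, not from the source.

def handle_start_request(request, is_restricted):
    """Return the new device state, or None if the request is refused."""
    if is_restricted(request["requester"]):      # S4502-S4503
        return None                              # stay in standby
    return {
        "gaze": request["gaze"],                 # S4504: initialize direction
        "transmitting": True,                    # S4505: start sending video
    }

state = handle_start_request(
    {"requester": "viewer1", "gaze": (0.0, 90.0)},
    is_restricted=lambda user: user == "minor",
)
print(state)  # {'gaze': (0.0, 90.0), 'transmitting': True}
```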
  • FIG. 46 shows a processing procedure executed during video transmission in the video providing apparatus in the form of a flowchart.
  • the illustrated processing procedure can be realized, for example, by executing a predetermined program code in the control unit 1208 of the information processing apparatus 1200 shown in FIG. 12.
  • the video providing apparatus performs imaging with the imaging unit 1201 (step S4601), encodes the video signal, and transmits the encoded video signal to the requesting video reproduction apparatus (step S4602).
  • when instructed by the video playback device, the gaze direction is changed (step S4604).
  • the change of the line-of-sight direction can be realized either by physically changing the line-of-sight direction of the imaging unit 1201 itself, or by image processing that changes the position at which the transmission video is cut out from the whole-sky video captured by the imaging unit 1201.
  • until a video transmission stop request is received from the video playback device at the video transmission destination (No in step S4605), the process returns to step S4601 and repeats imaging and transmission of the captured video.
  • when a video transmission stop request is received from the video playback device at the video transmission destination (Yes in step S4605), this processing routine ends and video transmission stops.
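The FIG. 46 transmission loop can be sketched as capture, send, and react to requests until a stop arrives. The request tuple format and function names below are illustrative assumptions; a real device would drive camera and codec hardware here.

```python
# Minimal sketch of the FIG. 46 loop: capture (S4601), encode and transmit
# (S4602), apply any gaze-change instruction (S4604), and exit on a stop
# request (S4605). All names are illustrative, not from the specification.

def transmission_loop(frames, requests):
    gaze = 0.0
    sent = []
    for frame in frames:                       # S4601: capture one frame
        sent.append((frame, gaze))             # S4602: encode + transmit
        req = requests.pop(0) if requests else None
        if req and req[0] == "gaze":
            gaze = req[1]                      # S4604: change line of sight
        elif req == ("stop",):                 # S4605: stop request received
            break
    return sent

sent = transmission_loop(
    frames=["f0", "f1", "f2", "f3"],
    requests=[None, ("gaze", 45.0), ("stop",)],
)
print(sent)  # [('f0', 0.0), ('f1', 0.0), ('f2', 45.0)]
```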
  • (1) A display device comprising: a spherical display unit; a receiving unit that receives an image from an external device; a sensor unit that measures the user's line of sight, head position, or posture; and a control unit, wherein, while displaying a first image received from a first external device on the display unit, the control unit receives, based on at least one of the user's line of sight, head position, or posture measured by the sensor unit, a second image from a second external device included in the first image, and displays the second image on the display unit.
  • (2) The display device according to (1), further comprising a transmission unit that transmits information to the external device, wherein the control unit transmits the information measured by the sensor unit to the second external device when displaying the second image.
  • (3) The display device according to (1) or (2) above, further comprising a sound collecting unit, wherein the control unit transmits the sound collected by the sound collecting unit to the external device.
  • (4) The display device according to any one of (1) to (3) above, further comprising at least one of a blower, a temperature adjustment unit, a humidity adjustment unit, a tactile control unit, a vibration control unit, or a scent generating unit, wherein the control unit controls the blower, temperature adjustment unit, humidity adjustment unit, tactile control unit, vibration control unit, or scent generating unit according to the content of the image received from the external device and displayed on the display unit.
  • (5) The display device according to any one of (1) to (4) above, wherein the control unit performs display control by switching between the first image and the second image according to a measurement result from the sensor unit.
  • (6) The display device according to any one of (1) to (5) above, wherein the display unit includes a plurality of projection units that project an image onto a screen, and the control unit controls the projection units so that no shadow is generated on the screen.
  • An information processing terminal device comprising: an imaging unit; a transmission unit that transmits an image captured by the imaging unit; a receiving unit that receives a predetermined signal from an external device; and a control unit, wherein the control unit controls transmission of the image captured by the imaging unit to the external device based on line-of-sight information or posture information included in the predetermined signal.
  • The information processing terminal device described above, wherein the imaging unit captures a whole-sky image, and the control unit specifies and controls transmission of a predetermined image from the whole-sky image based on the line-of-sight information or the posture information.
  • the technology disclosed in the present specification can be configured as follows.
  • (11) An information processing apparatus comprising: a communication unit that communicates with a playback device; a line-of-sight direction changing unit that changes the line-of-sight direction of the video to be transmitted to the playback device; and a control unit that controls video transmission to the playback device.
  • (12) The information processing apparatus according to (11), further comprising a state display unit that displays the state of the information processing apparatus, wherein the state display unit displays whether or not video is being transmitted to the playback device.
  • (13) The information processing apparatus further comprising an imaging unit.
  • (14) The information processing apparatus according to any one of (11) to (13), wherein the line-of-sight direction changing unit changes the line-of-sight direction of the imaging unit, or changes the area for cutting out the transmission video from the whole-sky video captured by the imaging unit.
  • (15) The information processing apparatus according to (14) above, further comprising a support that supports the imaging unit, wherein the imaging unit is supported by the support so as to assume a predetermined posture in a standby state of video transmission.
  • (16) The information processing apparatus according to any one of (11) to (15), wherein the control unit starts video transmission in response to a transmission start request from the playback device, and stops video transmission in response to a transmission stop request from the playback device.
  • the line-of-sight direction changing unit sets the line-of-sight direction of the video according to the designation by the transmission start request.
  • the line-of-sight direction changing unit changes the line-of-sight direction of the video according to an instruction from the playback device.
  • (19) An information processing method comprising: a line-of-sight direction changing step of changing the line-of-sight direction of the video to be transmitted to the playback device; and a control step of controlling video transmission to the playback device.
  • (20) An information processing apparatus comprising: a communication unit that communicates with an imaging device; and a control unit that controls display of the video received from the imaging device and instructs the line-of-sight direction to be captured by the imaging device, wherein the control unit detects a second viewpoint position included in a first video captured at a first viewpoint position, and displays the first video with a mark indicating the second viewpoint position superimposed thereon.
  • (21) The information processing apparatus according to (20), further comprising a detection unit that detects an operation on the mark, wherein the control unit controls switching to display of a second video captured at the second viewpoint position according to the detection result of the detection unit.
  • (22) The information processing apparatus according to (21), wherein the detection unit detects the operation based on a motion of the user viewing the displayed video.
  • (23) The information processing apparatus according to (22) above, wherein the detection unit detects the operation based on at least one of, or a combination of two or more of: an acceleration sensor that detects acceleration of the user's head, an image of the user, a load applied to a chair on which the user sits, or a pressure applied to the sole of a shoe worn by the user.
  • (24) The information processing apparatus according to any one of (21) to (23), wherein, according to the detection result of the detection unit, the control unit causes a video transmission stop request to be sent to the first imaging device capturing video at the first viewpoint position, and causes a video transmission start request to be sent to the second imaging device capturing video at the second viewpoint position.
  • (25) The information processing apparatus according to (24) above, wherein the control unit designates to the second imaging device the same line-of-sight direction as the first imaging device.
  • (26) The information processing apparatus according to any one of (21) to (25) above, wherein the control unit displays a transition video while switching from the first video to the second video.
  • (27) The information processing apparatus according to (26) above, wherein the control unit displays the transition video including at least one of an effect line that guides the line of sight to the second viewpoint position in the first video, or a blur that follows a motion of jumping in the direction of the second viewpoint position.
  • (28) The information processing apparatus according to any one of (20) to (27), further comprising a detection unit that detects movement of the head of the user viewing the displayed video, wherein the control unit instructs the imaging device in a line-of-sight direction according to the detection result of the detection unit.
  • (29) An information processing method comprising: a communication step of communicating with an imaging device; and a control step of controlling display of the video received from the imaging device and instructing the line-of-sight direction to be captured by the imaging device, wherein, in the control step, a second viewpoint position included in a first video captured at a first viewpoint position is detected, and the first video with a mark indicating the second viewpoint position superimposed thereon is displayed.
  • An information communication system comprising: a plurality of transmission devices that transmit video captured at different viewpoint positions; and a receiving device that displays video received from any of the plurality of transmission devices, wherein, when the receiving device receives a first video captured at a first viewpoint position, a second viewpoint position included in the first video is detected, and the first video with a mark indicating the second viewpoint position superimposed thereon is displayed.
  • DESCRIPTION OF SYMBOLS 100 ... Video viewing system 101 ... Video provision apparatus, 102 ... Video reproduction apparatus 200 ... Video viewing system 201 ... Video provision apparatus, 202 ... Video reproduction apparatus 300 ... Video viewing system 301 ... Video provision apparatus, 302 ... Video reproduction apparatus 400 ... Video viewing system 401 ... Video providing device, 402 ... Video playback device 500 ... Video providing device, 501 ... Imaging unit, 502 ... Supporting unit 503, 504 ... Microphone (stereo microphone) 1200 ... Information processing device (video providing device) DESCRIPTION OF SYMBOLS 1201 ... Imaging part 1202 ... Status display part 1203 ... Video encoding part 1204 ... Audio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention concerns a display device that displays video and an information processing terminal device that controls transmission of video to the display device. When a jump motion by a user is detected, a video playback device determines that there has been an instruction to move the viewpoint position to a location where a target mark is indicated, transmits a transmission stop request to the video providing device that is the transmission source of the video currently being displayed, and transmits a transmission start request to the video providing device at the movement destination. When video begins to be transmitted from the video providing device at the movement destination, the viewpoint movement is carried out, and video captured from the new viewpoint position can be viewed on the video playback device.
PCT/JP2017/002810 2016-03-14 2017-01-26 Dispositif d'affichage et dispositif terminal de traitement d'informations WO2017159063A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP17766071.9A EP3432590A4 (fr) 2016-03-14 2017-01-26 Dispositif d'affichage et dispositif terminal de traitement d'informations
US16/075,295 US10455184B2 (en) 2016-03-14 2017-01-26 Display device and information processing terminal device
JP2018505313A JPWO2017159063A1 (ja) 2016-03-14 2017-01-26 表示装置並びに情報処理端末装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016050052 2016-03-14
JP2016-050052 2016-03-14

Publications (1)

Publication Number Publication Date
WO2017159063A1 true WO2017159063A1 (fr) 2017-09-21

Family

ID=59850669

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/002810 WO2017159063A1 (fr) 2016-03-14 2017-01-26 Dispositif d'affichage et dispositif terminal de traitement d'informations

Country Status (4)

Country Link
US (1) US10455184B2 (fr)
EP (1) EP3432590A4 (fr)
JP (1) JPWO2017159063A1 (fr)
WO (1) WO2017159063A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019124158A1 (fr) * 2017-12-19 2019-06-27 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, système d'affichage et corps mobile
WO2019150675A1 (fr) * 2018-02-02 2019-08-08 株式会社Nttドコモ Dispositif de traitement d'informations
WO2019235106A1 (fr) * 2018-06-06 2019-12-12 株式会社アルファコード Dispositif de présentation de carte thermique et programme de présentation de carte thermique
CN111201796A (zh) * 2017-10-04 2020-05-26 Vid拓展公司 定制的360度媒体观看
JP2021520726A (ja) * 2018-04-05 2021-08-19 ヴィド スケール インコーポレイテッド 全方位ビデオに対する視点メタデータ
WO2021261099A1 (fr) * 2020-06-23 2021-12-30 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de détermination de plage d'affichage et programme
JP2022525906A (ja) * 2019-03-20 2022-05-20 北京小米移動軟件有限公司 Vr360において視点切り替え能力を伝送する方法および装置
JP7398854B1 (ja) 2023-06-30 2023-12-15 ヴィアゲート株式会社 ウェブページ閲覧解析システム、ウェブページ閲覧解析方法およびウェブページ閲覧解析プログラム
JP7398853B1 (ja) 2023-06-30 2023-12-15 ヴィアゲート株式会社 動画視聴解析システム、動画視聴解析方法および動画視聴解析プログラム

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018079166A1 (fr) * 2016-10-26 2018-05-03 ソニー株式会社 Dispositif de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations, et programme
WO2019155904A1 (fr) * 2018-02-08 2019-08-15 ソニー株式会社 Dispositif de traitement d'image, procédé de traitement d'image, programme, et système de projection
CN110248115B (zh) * 2019-06-21 2020-11-24 上海摩象网络科技有限公司 图像处理方法、装置及存储介质
JP6655751B1 (ja) * 2019-07-25 2020-02-26 エヌ・ティ・ティ・コミュニケーションズ株式会社 映像表示制御装置、方法およびプログラム
US11166050B2 (en) * 2019-12-11 2021-11-02 At&T Intellectual Property I, L.P. Methods, systems, and devices for identifying viewed action of a live event and adjusting a group of resources to augment presentation of the action of the live event
JP7154609B2 (ja) * 2019-12-27 2022-10-18 株式会社コナミデジタルエンタテインメント 観戦システム、観戦システム用のコンピュータプログラム、及び観戦システムの制御方法
IL295170A (en) * 2020-01-30 2022-09-01 Amatelus Inc Apparatus, system, method and software for video distribution
US11412310B2 (en) * 2020-05-18 2022-08-09 Qualcomm Incorporated Performing and evaluating split rendering over 5G networks
JPWO2021256326A1 (fr) * 2020-06-19 2021-12-23
WO2022014369A1 (fr) * 2020-07-17 2022-01-20 ソニーグループ株式会社 Dispositif de traitement d'image, procédé de traitement d'image et programme
WO2022091215A1 (fr) * 2020-10-27 2022-05-05 Amatelus株式会社 Dispositif de distribution de vidéo, système de distribution de vidéo, procédé de distribution de vidéo et programme
US20220321944A1 (en) * 2021-04-02 2022-10-06 Comcast Cable Communications, Llc Augmenting Content Items
KR20230123219A (ko) * 2022-02-16 2023-08-23 한국전자통신연구원 공간정보를 활용한 촉감 실감 콘텐츠 제공 시스템 및 방법

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0937137A (ja) * 1995-07-25 1997-02-07 Minolta Co Ltd 移動型立体カメラ装置
JP2001189927A (ja) * 1999-12-28 2001-07-10 Tokyo Gas Co Ltd 移動ステーションおよび制御ステーションならびに擬似体験システム
JP2003153263A (ja) * 2001-11-19 2003-05-23 Hitachi Ltd 映像表示システムおよびその方法
JP2004078736A (ja) * 2002-08-21 2004-03-11 Yamatosokki Joho Center Co Ltd 音声データまたは動画像データ等の再生を制御するための再生制御システム
JP2010068059A (ja) * 2008-09-08 2010-03-25 Sp Forum Inc 映像データ生成プログラム
JP2012194579A (ja) * 2012-06-21 2012-10-11 Ohira Giken:Kk 複合プラネタリウムシステム
JP2013141272A (ja) * 2006-11-22 2013-07-18 Sony Corp 表示装置、表示方法、画像表示システム
JP2013169234A (ja) * 2012-02-17 2013-09-02 Tokyo Univ Of Agriculture & Technology 匂いの空間分布制御方法、匂いの空間分布制御装置、視聴覚システム及び顧客誘導システム
WO2014077046A1 (fr) * 2012-11-13 2014-05-22 ソニー株式会社 Dispositif d'affichage d'image et procédé d'affichage d'image, dispositif formant corps en mouvement, système d'affichage d'image, et programme informatique

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08191419A (ja) 1995-01-10 1996-07-23 Yamaha Corp 頭部装着型表示システム
US5905525A (en) 1995-07-13 1999-05-18 Minolta Co., Ltd. Image display apparatus having a display controlled by user's head movement
JP2005302103A (ja) 2004-04-08 2005-10-27 Sony Corp 追っかけ再生装置,テレビジョン受像機,および追っかけ再生方法
JP4926400B2 (ja) 2004-12-27 2012-05-09 京セラ株式会社 移動カメラシステム
JP2007208340A (ja) 2006-01-30 2007-08-16 Makani Networks Co Ltd コンテンツ管理システム、管理サーバ、コンテンツ管理方法、およびプログラム
JP5245257B2 (ja) 2006-11-22 2013-07-24 ソニー株式会社 画像表示システム、表示装置、表示方法
KR101167246B1 (ko) * 2007-07-23 2012-07-23 삼성전자주식회사 3차원 콘텐츠 재생 장치 및 그 제어 방법
US8270767B2 (en) * 2008-04-16 2012-09-18 Johnson Controls Technology Company Systems and methods for providing immersive displays of video camera information from a plurality of cameras
CN102013222A (zh) * 2010-10-29 2011-04-13 鸿富锦精密工业(深圳)有限公司 球状显示屏
US9117389B2 (en) * 2011-10-18 2015-08-25 Shenzhen YuanWang cocotop Technology Co., Ltd. Dome-screen device, dome-screen playing system and image generation method thereof
WO2014014963A1 (fr) * 2012-07-16 2014-01-23 Questionmine, LLC Appareil et procédé pour synchroniser un contenu interactif avec un élément multimédia
US10474228B2 (en) * 2014-11-17 2019-11-12 Yanmar Co., Ltd. Display system for remote control of working machine
US9846968B2 (en) * 2015-01-20 2017-12-19 Microsoft Technology Licensing, Llc Holographic bird's eye view camera
WO2016129549A1 (fr) * 2015-02-12 2016-08-18 株式会社コロプラ Dispositif et système pour visualiser un contenu à l'aide d'un visiocasque
US20170168303A1 (en) * 2015-12-09 2017-06-15 Facebook, Inc. Head-Mounted Display Systems with Nose Piece
US9838657B2 (en) * 2016-01-07 2017-12-05 International Business Machines Coporation Projection enhancement system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0937137A (ja) * 1995-07-25 1997-02-07 Minolta Co Ltd 移動型立体カメラ装置
JP2001189927A (ja) * 1999-12-28 2001-07-10 Tokyo Gas Co Ltd 移動ステーションおよび制御ステーションならびに擬似体験システム
JP2003153263A (ja) * 2001-11-19 2003-05-23 Hitachi Ltd 映像表示システムおよびその方法
JP2004078736A (ja) * 2002-08-21 2004-03-11 Yamatosokki Joho Center Co Ltd 音声データまたは動画像データ等の再生を制御するための再生制御システム
JP2013141272A (ja) * 2006-11-22 2013-07-18 Sony Corp 表示装置、表示方法、画像表示システム
JP2010068059A (ja) * 2008-09-08 2010-03-25 Sp Forum Inc 映像データ生成プログラム
JP2013169234A (ja) * 2012-02-17 2013-09-02 Tokyo Univ Of Agriculture & Technology 匂いの空間分布制御方法、匂いの空間分布制御装置、視聴覚システム及び顧客誘導システム
JP2012194579A (ja) * 2012-06-21 2012-10-11 Ohira Giken:Kk 複合プラネタリウムシステム
WO2014077046A1 (fr) * 2012-11-13 2014-05-22 ソニー株式会社 Dispositif d'affichage d'image et procédé d'affichage d'image, dispositif formant corps en mouvement, système d'affichage d'image, et programme informatique

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ERIKA OISHI ET AL.: "Haptic Force Feedback System by Pulling Clothes", ENTERTAINMENT COMPUTING SYMPOSIUM, 27 October 2015 (2015-10-27), pages 95 - 99, XP009509540 *
See also references of EP3432590A4 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11895365B2 (en) 2017-10-04 2024-02-06 Interdigital Madison Patent Holdings, Sas Customized 360-degree media viewing
US11622153B2 (en) 2017-10-04 2023-04-04 Interdigital Madison Patent Holdings, Sas Customized 360-degree media viewing
CN111201796A (zh) * 2017-10-04 2020-05-26 Vid拓展公司 定制的360度媒体观看
WO2019124158A1 (fr) * 2017-12-19 2019-06-27 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, système d'affichage et corps mobile
US11410634B2 (en) 2017-12-19 2022-08-09 Sony Corporation Information processing apparatus, information processing method, display system, and mobile object
JPWO2019124158A1 (ja) * 2017-12-19 2021-01-21 ソニー株式会社 情報処理装置、情報処理方法、プログラム、表示システム、及び移動体
US10949159B2 (en) 2018-02-02 2021-03-16 Ntt Docomo, Inc. Information processing apparatus
JPWO2019150675A1 (ja) * 2018-02-02 2020-02-06 株式会社Nttドコモ 情報処理装置
WO2019150675A1 (fr) * 2018-02-02 2019-08-08 株式会社Nttドコモ Dispositif de traitement d'informations
JP2021520726A (ja) * 2018-04-05 2021-08-19 ヴィド スケール インコーポレイテッド 全方位ビデオに対する視点メタデータ
US11736675B2 (en) 2018-04-05 2023-08-22 Interdigital Madison Patent Holdings, Sas Viewpoint metadata for omnidirectional video
JP7401453B2 (ja) 2018-04-05 2023-12-19 ヴィド スケール インコーポレイテッド 全方位ビデオに対する視点メタデータ
WO2019235106A1 (fr) * 2018-06-06 2019-12-12 株式会社アルファコード Dispositif de présentation de carte thermique et programme de présentation de carte thermique
JP7286791B2 (ja) 2019-03-20 2023-06-05 北京小米移動軟件有限公司 Vr360において視点切り替え能力を伝送する方法および装置
JP2022525906A (ja) * 2019-03-20 2022-05-20 北京小米移動軟件有限公司 Vr360において視点切り替え能力を伝送する方法および装置
WO2021261099A1 (fr) * 2020-06-23 2021-12-30 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de détermination de plage d'affichage et programme
JP7398854B1 (ja) 2023-06-30 2023-12-15 ヴィアゲート株式会社 ウェブページ閲覧解析システム、ウェブページ閲覧解析方法およびウェブページ閲覧解析プログラム
JP7398853B1 (ja) 2023-06-30 2023-12-15 ヴィアゲート株式会社 動画視聴解析システム、動画視聴解析方法および動画視聴解析プログラム

Also Published As

Publication number Publication date
US20190075269A1 (en) 2019-03-07
EP3432590A1 (fr) 2019-01-23
JPWO2017159063A1 (ja) 2019-01-17
EP3432590A4 (fr) 2019-02-27
US10455184B2 (en) 2019-10-22

Similar Documents

Publication Publication Date Title
WO2017159063A1 (fr) Dispositif d'affichage et dispositif terminal de traitement d'informations
US11436803B2 (en) Insertion of VR spectator in live video of a live event
JP6992845B2 (ja) 情報処理装置、情報処理方法、プログラム、および情報処理システム
US9615177B2 (en) Wireless immersive experience capture and viewing
JP6759451B2 (ja) 人による追跡装置のオクルージョンの影響を低減するシステム及び方法
JP6988980B2 (ja) 画像表示装置
JP6346131B2 (ja) 情報処理装置および画像生成方法
US20190073830A1 (en) Program for providing virtual space by head mount display, method and information processing apparatus for executing the program
WO2017043399A1 (fr) Dispositif de traitement d'informations et procédé de génération d'image
US20180225537A1 (en) Methods and apparatus relating to camera switching and/or making a decision to switch between cameras
US20180249189A1 (en) Methods and apparatus for use in a system or device where switching between cameras may occur
US20180278995A1 (en) Information processing apparatus, information processing method, and program
WO2017064926A1 (fr) Dispositif de traitement d'information et procédé de traitement d'information
JP6556295B2 (ja) 情報処理装置および画像生成方法
JPWO2017187764A1 (ja) 情報処理端末装置並びに配信装置
JP6523038B2 (ja) 感覚提示装置
WO2017068926A1 (fr) Dispositif de traitement d'informations, procédé de commande associé, et programme informatique
US20240205513A1 (en) Video display system, information processing device, information processing method, and recording medium
KR102407516B1 (ko) 돔형 경기장 스포츠 및 이벤트 촬영 및 중계 시스템
JP2021186215A (ja) パフォーマンスイベント実施方法及び該パフォーマンスイベント実施方法において用いられる中継装置
WO2022190917A1 (fr) Dispositif de traitement d'information, terminal de traitement d'information, procédé de traitement d'information et programme
KR20220005942A (ko) 현장 서비스 제공 방법 및 시스템
GB2566734A (en) Wearable device, system and method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018505313

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017766071

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017766071

Country of ref document: EP

Effective date: 20181015

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17766071

Country of ref document: EP

Kind code of ref document: A1