WO2019158000A1 - Method and apparatus for viewing angle synchronization in virtual reality (VR) live broadcast - Google Patents

Method and apparatus for viewing angle synchronization in virtual reality (VR) live broadcast

Info

Publication number
WO2019158000A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewing angle
information
image frame
displayed
content
Prior art date
Application number
PCT/CN2019/074529
Other languages
English (en)
French (fr)
Inventor
张哲
Original Assignee
Alibaba Group Holding Limited (阿里巴巴集团控股有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Limited (阿里巴巴集团控股有限公司)
Priority to JP2020541660A priority Critical patent/JP7294757B2/ja
Priority to EP19753985.1A priority patent/EP3754980A4/en
Priority to US16/965,734 priority patent/US11290573B2/en
Publication of WO2019158000A1 publication Critical patent/WO2019158000A1/zh

Classifications

    • H04N 21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T 11/00 2D [Two Dimensional] image generation
    • H04L 65/613 Network streaming of media packets for supporting one-way streaming services, for the control of the source by the destination
    • H04L 65/80 Responding to QoS
    • H04N 21/2187 Live feed
    • H04N 21/41407 Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/43076 Synchronising the rendering of the same content streams on multiple devices
    • H04N 21/4402 Reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/44213 Monitoring of end-user related data
    • H04N 21/4508 Management of client data or end-user data
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • The present application relates to the field of virtual reality (VR) live broadcast technology, and in particular to a method and apparatus for viewing angle synchronization in VR live broadcast.
  • Virtual reality uses a computer to generate a simulated environment: a computer-generated, real-time, dynamic and realistic image that a user can view on the screen of a mobile terminal device such as a mobile phone, or be immersed in through a specially made head-mounted device, and the like.
  • An obvious difference between VR content and ordinary video content is that each frame of VR video is usually captured as a 360-degree panorama, which reproduces the shooting scene more fully and accurately.
  • During video playback, since the screen of the playback device is usually flat and cannot display a 360-degree panorama all at once, the playback device first needs to determine the viewing angle required by the user, and then play each image frame according to that viewing angle.
  • The VR content may have a default display viewing angle, and during playback the viewer may change the viewing angle by rotating the terminal device, or by turning the head or eyes when wearing a head-mounted device, so as to view the image content of each frame from more angles.
  • VR live broadcast is a new type of application that combines VR content with live broadcast technology.
  • In VR live broadcast, the VR content can usually be pre-made VR video and the like; the VR sending device obtains the VR content in advance, and then plays it back synchronously in real time to one or more VR receiving devices.
  • VR live broadcast can deliver purely presentational content such as movies, or exploratory content such as games, to the VR receiving devices, and the like.
  • The present application provides a method and apparatus for viewing angle synchronization in virtual reality VR live broadcast, which can solve the problem of viewing angle synchronization during the VR live broadcast process.
  • A viewing angle synchronization method in virtual reality VR live broadcast comprises: during playback of the VR content on the sending device side, determining the sender user viewing angle information corresponding to each image frame; and providing the image frames in the VR content and the corresponding sender user viewing angle information to the VR receiving device, so that the VR receiving device can determine its display angle for the current image frame to be displayed according to the sender user viewing angle information corresponding to the current frame and a preset number of preceding image frames.
  • A viewing angle synchronization method in virtual reality VR live broadcast comprises: obtaining the VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing angle information; determining the display angle of the VR receiving device for the current image frame to be displayed; and displaying the image frame to be displayed according to the determined display angle.
  • A viewing angle synchronization method in virtual reality VR live broadcast comprises: during playback of the VR content on the sending device side, determining the sender user viewing angle information corresponding to each image frame; and providing the image frames and the corresponding sender user viewing angle information to the VR receiving device, so that the VR receiving device can provide prompt information for finding the sender user's viewing angle according to the disparity between the receiver user's viewing angle and the sender user's viewing angle.
  • A viewing angle synchronization method in virtual reality VR live broadcast comprises: obtaining the VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing angle information; determining the receiver user viewing angle information corresponding to the current image frame to be displayed; generating prompt information for finding the sender user's viewing angle; and providing the prompt information when the image frame to be displayed is displayed.
  • A viewing angle synchronization apparatus for virtual reality VR live broadcast comprises: a first sender user view information determining unit configured to determine, during playback of the VR content on the sending device side, the sender user viewing angle information corresponding to each image frame; and a first VR content providing unit configured to provide the image frames in the VR content and the corresponding sender user viewing angle information to the VR receiving device, so that the VR receiving device can determine its display angle for the current image frame to be displayed according to the sender user viewing angle information corresponding to the current frame and a preset number of preceding image frames.
  • A viewing angle synchronization apparatus for virtual reality VR live broadcast comprises: a VR content obtaining unit configured to obtain the VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing angle information; a display angle determining unit configured to determine the display angle of the VR receiving device for the current image frame to be displayed according to the sender user viewing angle information corresponding to the current frame and a preset number of preceding image frames; and a display unit configured to display the image frame to be displayed according to the determined display angle.
  • A viewing angle synchronization apparatus for virtual reality VR live broadcast comprises: a second sender user view information determining unit configured to determine, during playback of the VR content on the sending device side, the sender user viewing angle information corresponding to each image frame; and a second VR content providing unit configured to provide the image frames in the VR content and the corresponding sender user viewing angle information to the VR receiving device, so that the VR receiving device can provide prompt information for finding the sender user's viewing angle according to the disparity between the receiver user's viewing angle and the sender user's viewing angle.
  • A viewing angle synchronization apparatus for virtual reality VR live broadcast comprises: a VR content obtaining unit configured to obtain the VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing angle information; a receiver user viewing angle information determining unit configured to determine the receiver user viewing angle information corresponding to the current image frame to be displayed; a prompt information generating unit configured to generate prompt information for finding the sender user's viewing angle according to the disparity between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed; and a prompt information providing unit configured to provide the prompt information when displaying the image frame to be displayed.
  • A viewing angle synchronization method in augmented reality AR live broadcast comprises: during playback of the AR content on the sending device side, determining the sender user viewing angle information corresponding to each image frame; and providing the image frames and the corresponding sender user viewing angle information to the AR receiving device, so that the AR receiving device can determine its display angle for the current image frame to be displayed according to the sender user viewing angle information corresponding to the current frame and a preset number of preceding image frames.
  • A viewing angle synchronization method in augmented reality AR live broadcast comprises: obtaining the AR content information provided by the AR sending device, where the AR content information includes image frames and the corresponding sender user viewing angle information; determining the display angle of the AR receiving device for the current image frame to be displayed; and displaying the image frame to be displayed according to the determined display angle.
  • A viewing angle synchronization method in augmented reality AR live broadcast comprises: during playback of the AR content on the sending device side, determining the sender user viewing angle information corresponding to each image frame; and providing the image frames and the corresponding sender user viewing angle information to the AR receiving device, so that the AR receiving device can provide prompt information for finding the sender user's viewing angle.
  • A viewing angle synchronization method in augmented reality AR live broadcast comprises: obtaining the AR content information provided by the AR sending device, where the AR content information includes image frames and the corresponding sender user viewing angle information; determining the receiver user viewing angle information corresponding to the current image frame to be displayed; generating prompt information for finding the sender user's viewing angle; and providing the prompt information when the image frame to be displayed is displayed.
  • The present application includes the following advantages:
  • First, the VR sending device can send both the image frames in the VR content and the corresponding sender user viewing angle information to the VR receiving device. The VR receiving device can smooth the received viewing angle information and use the result as its display angle, so that the display angle of each image frame changes more gradually. Even if the sender user's viewing angle changes abruptly on the sending device side, the probability that the receiver user experiences dizziness or similar discomfort due to such a sudden change is controlled or reduced.
  • Second, the receiver user can be prompted to perform operations such as rotating the VR receiving device in order to synchronize with the sender user's viewing angle; by adjusting the viewing angle according to the prompt information, the receiver user can view on the VR receiving device the same VR content as the sender user.
  • FIG. 1 is a schematic diagram of a scenario provided by an embodiment of the present application.
  • FIG. 3 is a flowchart of a second method provided by an embodiment of the present application.
  • FIG. 5 is a flowchart of a fourth method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a first device provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a second device provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a third device provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a fourth apparatus provided by an embodiment of the present application.
  • The inventor found that, for different types of VR live broadcast content, the interaction mode between the sending device and the receiving device may differ, but the prior art exhibits viewing angle synchronization problems to varying degrees in each case.
  • For purely presentational content such as movies, the main viewing angle is usually selected by the sender user, and the information sent by the sending device to the receiving device includes, in addition to the image data of each frame, the sender's viewing angle corresponding to each frame. The receiving device determines its display angle for each image frame from the frame and the corresponding sender viewing angle, and plays the received VR content accordingly. That is, the receiver user passively follows the sender user's viewing angle. For the receiving user, even rotating a terminal device such as a mobile phone, or turning the head or eyes while wearing a head-mounted display, does not change the viewing angle of the VR content: the viewing angle always stays synchronized with the sender user's, which is equivalent to the sender user leading the receiver users through the VR content.
  • The sender user usually selects or changes the viewing angle by rotating a mobile terminal device such as a mobile phone, or by turning the head or eyes while wearing a head-mounted display.
  • The sender user may rotate the terminal device or turn the head by a large amount. For example, the sender user may suddenly look up at content at the top of the panoramic data, or suddenly look down to perform some operation, causing the sender user's viewing angle of the VR content to change abruptly. Correspondingly, the display angle of the VR video data played by the VR receiving device also changes abruptly, so the receiving user sees video data whose content suddenly shifts by a large amount, and may even feel dizzy because of the inability to adapt to such sudden image movement.
  • For exploratory content such as games, the sender user can also select or change his or her viewing angle, but in the prior art the sender user's viewing angle information is not provided to the receiving device. The receiving user can change his or her own viewing angle by rotating a terminal device such as a mobile phone, or by turning the head or eyes while wearing a head-mounted display, for example to explore an interface such as a map of a specific game scene. That is, the receiver user can actively change the viewing angle, and the receiving device determines the display angle of the VR content from the receiver user's viewing angle when playing it. As a result, during the same live broadcast, the presentation angle of the same VR content differs between the sending device and each receiving device.
  • In that case, the recipient user may need to know the sender user's viewing angle in order to interact better in the game, and so on.
  • In the prior art, the sender user can tell the recipient user the viewing direction by voice or the like, and the recipient user then searches for the sender user's viewing direction according to the voice prompt.
  • However, since each frame is a 360-degree panorama, the area of a single frame of image can be very large. On one hand, the sender user may be unable to describe the viewing direction clearly in words; on the other hand, even with a clear description, the recipient user may still have difficulty finding the other party's viewing direction because the frame contains too much content, and so on. In such cases this prior-art approach becomes inapplicable, so recipient users in the same game often cannot determine the direction of the sender user's main viewing angle.
  • For these problems, the embodiments of the present application provide corresponding solutions.
  • First, during the live broadcast, the sending device provides the image of each frame together with the corresponding sender user viewing angle information to the receiving device.
  • On this basis, the receiving device can perform different processing for live broadcasts of different types of VR content.
  • For presentational content such as movies, the VR receiving device side does not directly display the VR content according to the sender user's viewing angle for each frame of data. Instead, it first smooths the display angle of each frame according to the sender user's current viewing angles, and then plays the VR video data to the user watching the live broadcast according to the smoothed display angle. This buffers sudden changes in viewing direction and reduces the dizziness felt by the viewing user.
  • For exploratory content such as games, the VR receiving device can still use the receiver user's viewing angle as the display angle for each image frame. At the same time, from the sender user's viewing angle corresponding to each frame and the receiver user's viewing angle, it can calculate the disparity direction information between the two, and then, according to that disparity direction information, provide prompt information while the receiving device plays the VR content. For example, a prompt such as an "arrow" can be displayed in the playback interface of the VR content, or a voice prompt can be generated, and the like.
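The disparity-direction computation described above can be sketched as follows. This is a minimal illustration, assuming viewing angles are represented as (yaw, pitch) pairs in degrees and that a simple textual hint stands in for the arrow or voice prompt; these representational choices are assumptions, not taken from the patent itself.

```python
def angle_diff(a, b):
    """Smallest signed yaw difference a - b, wrapped into (-180, 180]."""
    return (a - b + 180.0) % 360.0 - 180.0

def view_prompt(sender, receiver, threshold=10.0):
    """Return a textual hint guiding the receiver toward the sender's view.

    sender / receiver are (yaw, pitch) tuples in degrees.
    Returns None when the two views are already close enough.
    """
    dyaw = angle_diff(sender[0], receiver[0])
    dpitch = sender[1] - receiver[1]
    if abs(dyaw) < threshold and abs(dpitch) < threshold:
        return None  # views roughly aligned, no prompt needed
    parts = []
    if abs(dyaw) >= threshold:
        parts.append("turn right" if dyaw > 0 else "turn left")
    if abs(dpitch) >= threshold:
        parts.append("look up" if dpitch > 0 else "look down")
    return ", ".join(parts)

print(view_prompt((90.0, 0.0), (0.0, -20.0)))  # prints "turn right, look up"
```

In a real client the returned direction would drive an on-screen arrow or a synthesized voice prompt rather than text.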
  • The first embodiment is mainly directed to a viewing angle synchronization solution for the live broadcast mode of movie-like VR content. That is, in this solution, the display angle of the VR sending device when displaying the VR content is related to the motion of the VR sending device, but the display angle of the VR receiving device for the current image frame to be displayed is independent of the motion of the VR receiving device.
  • FIG. 1 is a schematic diagram of a scenario of an exemplary practical application of the present application. In FIG. 1, the information provided by the VR sending device 101 may be forwarded to the VR receiving device 103 through the server 102, or the VR sending device may establish a point-to-point connection with the VR receiving device 103 and send the information directly, and the like.
  • The VR video data of each frame sent by the VR sending device may be 360-degree panoramic data.
  • The VR receiving device smooths the viewing angle information; for example, the viewing angle information corresponding to the previous N frames is averaged and used as the display angle of the current image frame, and the VR receiving device displays the current image frame to the receiver user according to that display angle. Other image frames can be processed similarly. Only one VR receiving device is shown in FIG. 1; it can be understood that in practical applications there may be multiple VR receiving devices.
  • In other words, instead of directly using the sender user's viewing angle as the display angle on the VR receiving device, the embodiment of the present application uses the smoothed viewing angle information as the display angle on the VR receiving device side. This avoids sudden movement or rotation of the image played on the receiving device caused by a sudden change in the sender user's viewing angle, which reduces the probability that the receiver user feels dizzy.
  • The second embodiment provides a viewing angle synchronization method in virtual reality VR live broadcast from the perspective of the VR sending device. The method may specifically include:
  • S201: During playback of the VR content on the sending device side, determine the sender user viewing angle information corresponding to each image frame.
  • The sender user uses the VR sending device to view the VR video; for example, a live broadcast user wears a VR headset to watch a movie. The direction of the sender user's viewing angle may change at any time. Therefore, the VR sending device not only needs to obtain the panoramic data of each frame of the VR video, but also needs to obtain the viewing angle information corresponding to each frame of video data as the sender views it; this viewing angle information indicates the line of sight along which the sender user on the sending device side views the VR content.
  • Specifically, the VR sending device may be provided with a sensor for detecting viewing angle changes caused by the motion of the VR sending device. Therefore, during playback of the VR content on the VR sending device side, the sender user viewing angle information corresponding to each image frame can be determined from the viewing angle change information reported by the sensor.
  • Each image frame may have its own corresponding sender user viewing angle information; that is, a corresponding sender viewing angle may be determined for every image frame. Alternatively, in practical applications, a corresponding sender viewing angle may be provided only once every several frames, and the like.
  • Next, the image frames in the VR content and the corresponding sender user viewing angle information are provided to the VR receiving device, which determines its display angle for the current image frame to be displayed according to the sender user viewing angle information corresponding to the current frame and a preset number of preceding image frames.
  • Specifically, the VR sending device can provide the image frames in the VR content and the corresponding sender user viewing angle information to the VR receiving device through technologies such as remote data synchronization. The information may be forwarded to the VR receiving device through a server, or sent directly to the VR receiving device point-to-point, and so on. In practical applications, each image frame and the corresponding sender user viewing angle information may be sent frame by frame, or several image frames may be synchronized to the VR receiving device as a video stream in which the sender viewing angle information corresponding to each frame of the VR content is provided. Whatever the transmission mode, the VR sending device and the VR receiving device only need to agree on it, so that the VR receiving device can match each frame with the corresponding sender user viewing angle.
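As a concrete illustration of pairing each frame with its sender viewing angle, the sketch below uses a hypothetical length-prefixed JSON header followed by the raw image payload. The patent only requires that sender and receiver agree on some correspondence; this particular wire format is an assumption for illustration, not the patent's protocol.

```python
import json

def pack_frame(frame_index, image_bytes, yaw, pitch):
    """Bundle one image frame with the sender's viewing angle.

    Layout: 4-byte big-endian header length, JSON header, raw payload.
    """
    header = json.dumps({"frame": frame_index, "yaw": yaw, "pitch": pitch}).encode()
    return len(header).to_bytes(4, "big") + header + image_bytes

def unpack_frame(packet):
    """Split a packet back into its angle header and image payload."""
    hlen = int.from_bytes(packet[:4], "big")
    header = json.loads(packet[4:4 + hlen])
    return header, packet[4 + hlen:]

msg = pack_frame(42, b"\x00\x01", yaw=33.0, pitch=-5.0)
hdr, payload = unpack_frame(msg)
print(hdr)  # prints {'frame': 42, 'yaw': 33.0, 'pitch': -5.0}
```

Whether angles travel per frame or once every N frames, the receiver only needs the agreed mapping to pair each frame with a sender viewing angle.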
  • After receiving the information, the VR receiving device may extract each image frame and the corresponding sender user viewing angle information. The display angle of the current image frame to be displayed may be related to the sender user viewing angles of a preset number (for example, 5) of preceding frames, so the VR receiving device may save at least the preset number of most recently received sender user viewing angles.
  • Specifically, a sequence of sender viewing angle information may be saved according to the preset number, for example (... xn-1, xn, xn+1, xn+2 ...), where xn may be a vector representing the sender user viewing angle corresponding to the n-th frame. The sequence can be maintained with a sliding window whose length equals the preset number: each time the sender user viewing angle information corresponding to a new image frame is received, the corresponding vector is appended to the sequence and the sliding window slides forward once.
  • the image frame to be displayed and the previous preset number of adjacent image frames may be determined according to the playing order of each frame image in the VR content, and then the current image to be displayed is calculated. And an average value of the viewing angle information of the sender user corresponding to the image frame and the pre-preset number of image frames, and determining the average value as a display angle of view of the current image frame to be displayed by the VR receiving device.
  • the length of the sliding window is preset to be m+1, that is, the current image frame to be displayed is related to the sender user viewing angle information corresponding to the previous m adjacent image frames, where m may be an integer greater than 1.
  • the smoothed view information After the smoothed view information is obtained, it can be used as a display angle in the VR receiving device, and the current image frame to be displayed is displayed according to the display angle. Other frames can be similarly processed.
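As an illustration of the sliding-window averaging described above, the following is a minimal sketch, assuming that viewing angles are represented as 3-D direction vectors and that the window length is 5; the class and function names are hypothetical, not from the patent:

```python
from collections import deque

WINDOW = 5  # m + 1: the current frame plus the m = 4 preceding frames

class ViewSmoother:
    """Keeps a sliding window of sender viewing-angle vectors and returns
    their average as the display angle for the current frame."""

    def __init__(self, window=WINDOW):
        # deque with maxlen drops the oldest entry automatically,
        # which is exactly the "slide forward by one" behavior
        self.angles = deque(maxlen=window)

    def display_angle(self, sender_angle):
        # sender_angle: (x, y, z) direction vector for the newest frame
        self.angles.append(sender_angle)
        n = len(self.angles)
        # component-wise average over the window
        return tuple(sum(a[i] for a in self.angles) / n for i in range(3))

smoother = ViewSmoother()
smoother.display_angle((1.0, 0.0, 0.0))
smoother.display_angle((0.0, 1.0, 0.0))
print(smoother.display_angle((0.0, 1.0, 0.0)))  # average over the 3 frames received so far
```

Note that if unit direction vectors are averaged, the result is generally shorter than unit length and would be renormalized before use as a viewing direction; the sketch omits that step for brevity.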
  • In this way, the display angle of each frame shown on the VR receiving side remains relatively stable during playback, abrupt changes in the user's perspective are reduced, and the probability that the recipient user experiences dizziness or similar discomfort is lowered. On the other hand, smoothing means the VR content viewed by the recipient user cannot be completely identical to the sender user's viewing angle, and some delay may occur.
  • Moreover, smoothing every frame before display may waste a certain amount of computing resources, and when the sender's viewing angle is already stable, the delay this method introduces into the content viewed by the recipient user may not be worthwhile.
  • Therefore, in a preferred implementation, a determination may first be made as to whether the sender viewing angle of the current image frame to be displayed has changed abruptly relative to the preceding frames. If such an abrupt change has occurred, smoothing is performed in the manner described above; otherwise, the sender user viewing angle corresponding to the current frame may be used directly as the display angle on the receiving device for showing the frame.
  • Specifically, degree-of-change information of the current frame's sender viewing-angle information relative to that of the adjacent preceding preset number of image frames may be determined. If the degree-of-change information reaches a preset threshold, the step of calculating the average of the sender user viewing-angle information corresponding to the current frame and its preceding preset number of frames is triggered; otherwise, if the degree-of-change information does not reach the preset threshold, the sender user viewing angle corresponding to the current frame is determined as the display angle of the current frame on the VR receiving device.
  • Because the viewing-angle information is represented as vectors, the degree of change can be measured by the distance between vectors, with the length of that distance indicating how much the viewing angle has changed; accordingly, the preset threshold corresponding to the degree-of-change information may be a length value for the distance between vectors, and the like.
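The abrupt-change check can be sketched as follows, again assuming 3-D direction vectors; the Euclidean distance is used as the inter-vector distance, and the threshold value and function name are illustrative assumptions only:

```python
import math

THRESHOLD = 0.3  # illustrative length threshold for the inter-vector distance

def needs_smoothing(current, previous, threshold=THRESHOLD):
    """Return True when the sender viewing angle of the current frame has
    changed abruptly relative to any of the preceding frames' angles."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return any(dist(current, p) >= threshold for p in previous)

prev = [(1.0, 0.0, 0.0), (0.99, 0.05, 0.0)]
print(needs_smoothing((0.98, 0.1, 0.0), prev))  # small drift -> False
print(needs_smoothing((0.0, 1.0, 0.0), prev))   # abrupt change -> True
```

When this check returns True, the averaging step above is triggered; when it returns False, the sender's viewing angle is used directly as the display angle.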
  • With the above approach, on-demand smoothing can be realized: the average is calculated only when smoothing is actually needed, that is, when the sender user's viewing angle changes abruptly. This saves computing resources while keeping the VR content viewed by the recipient user more closely consistent with the sender user's viewing angle.
  • In summary, by having the VR sending device provide both the image frames in the VR content and the corresponding sender user viewing-angle information to the VR receiving device, the VR receiving side can pre-process the sender viewing-angle information before using it as the receiving device's display angle for the frames to be shown. The VR receiving device can therefore make the display-angle changes between image frames smoother; even if the sender user's viewing angle changes abruptly on the sending side, the probability that the recipient user becomes dizzy as a result is controlled or reduced.
  • Embodiment Two corresponds to Embodiment One and provides a viewing-angle synchronization method for virtual reality (VR) live broadcast from the perspective of the VR receiving device. Referring to FIG. 3, the method may specifically include:
  • S301: Obtain VR content information provided by a VR sending device, where the VR content information includes image frames and the corresponding sender user viewing-angle information;
  • S302: Determine the display angle of the VR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames;
  • S303: Display the current image frame to be displayed according to the determined display angle.
  • In specific implementation, the display angle may be determined by calculating the average of the sender user viewing-angle information corresponding to the current image frame to be displayed and its preceding preset number of image frames, and determining this average as the display angle of the current frame on the VR receiving device.
  • Alternatively, degree-of-change information of the current frame's sender viewing-angle information relative to that of the adjacent preceding preset number of image frames may first be determined. If the degree-of-change information reaches a preset threshold, the step of calculating the average of the sender user viewing-angle information corresponding to the current frame and its preceding preset number of frames is triggered. If the degree-of-change information does not reach the preset threshold, the sender user viewing angle corresponding to the current frame may be determined as the display angle of the current frame on the VR receiving device.
  • In this embodiment, the display angle of the VR receiving device for the current image frame to be displayed is independent of the motion of the VR receiving device.
  • Embodiment Three mainly introduces viewing-angle synchronization for exploration-oriented VR content (such as games) during live broadcast. That is, in this case, the display angle of the VR content on the VR sending device is related to the motion of the VR sending device, and the display angle of the VR content on the VR receiving device is related to the motion of the VR receiving device.
  • Embodiment Three first provides a viewing-angle synchronization method for virtual reality (VR) live broadcast from the perspective of the VR sending device. Specifically, referring to FIG. 4, the method may include:
  • S401: Determine the sender user viewing-angle information corresponding to the image frames while the VR content is played on the sending-device side.
  • This step may be the same as step S201 in Embodiment One. That is, even though the specific display angle at which the VR receiving device displays the VR content changes with the receiving user's viewing angle, the sender-side viewing-angle information can still be provided to the VR receiving device, so that the VR receiving device can give the receiving user prompt information about the direction of the sender's viewing angle.
  • S402: Provide the image frames in the VR content and the corresponding sender user viewing-angle information to the VR receiving device, so that when displaying the VR content, the VR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
  • When the VR content is provided to the VR receiving device, both the specific image frames in the VR content and the sender viewing-angle information corresponding to each image frame can be provided. However, unlike in Embodiment One, the display angle of a specific image frame on the VR receiving device is determined according to the receiving user's viewing angle. For example, in a game scenario, the sender user may have moved to the upper right of the scene picture while the receiver user is still at the lower left. In this case, the VR receiving device does not play the current image frame at the display angle of the upper right of the scene picture; instead, it plays the current image frame at the display angle of the lower left of the scene picture, where the recipient user is.
  • Meanwhile, because the VR receiving device can acquire the sender user's viewing-angle information, it can also calculate the disparity between the receiving user's viewing angle and the sender user's viewing angle, and then provide prompt information to the receiving user according to this disparity, helping the receiving user find the direction of the sender's viewing angle by rotating the VR receiving device.
  • The disparity is the angle between the sender user's viewing direction and the receiving user's viewing direction, and it can also be represented by a vector or the like.
  • After the disparity is obtained, prompt information can be generated according to it, telling the receiving user how to rotate the VR receiving device to find the position of the sender user's viewing angle.
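As a sketch of the disparity calculation, assuming both viewing angles are given as 3-D direction vectors, the angle between them can be computed from their dot product (the function name is hypothetical, not from the patent):

```python
import math

def disparity_deg(sender_dir, receiver_dir):
    """Angle in degrees between the sender's and receiver's viewing
    directions, each given as a 3-D direction vector."""
    dot = sum(s * r for s, r in zip(sender_dir, receiver_dir))
    norm = (math.sqrt(sum(s * s for s in sender_dir))
            * math.sqrt(sum(r * r for r in receiver_dir)))
    # clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# receiver looks along +x, sender along +y: a 90-degree disparity
print(disparity_deg((0.0, 1.0, 0.0), (1.0, 0.0, 0.0)))  # 90.0
```

The signed rotation direction (left/right, up/down) needed for an arrow prompt would additionally come from the cross product or from separate yaw/pitch components; the sketch only gives the magnitude.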
  • The prompt information may be implemented in multiple ways.
  • For example, visual direction-indication information may be generated according to the disparity information; when the prompt information is provided to the receiving user, the visual direction indication is superimposed onto the image frame being displayed.
  • Specifically, the visual direction indication may be implemented as an arrow, and the arrow may be distorted to some degree to suggest how the recipient user should rotate the VR receiving device: the larger the disparity between the recipient user's viewing angle and the sender user's viewing angle, the more pronounced the arrow's distortion.
  • In addition, user identification information, such as the sender user's name, may be displayed on the visual indication such as the arrow.
  • Alternatively, the prompt may be given by voice or other means.
  • That is, audio prompt information may be generated according to the disparity information and then played when the current image frame to be displayed is shown.
  • In specific implementation, a corpus template and a speech-playback model may be provided in advance. After the specific disparity information is calculated, the corresponding prompt text can be generated from the template and then converted into speech by the speech-playback model for playback, and so on. For example, if the calculation finds that the sender user's viewing angle is to the right rear of the current recipient user's viewing angle, the generated text may be: "To find the sender user's viewing angle, please rotate your phone or turn your head to the right rear," and so on.
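A corpus-template step of this kind might be sketched as follows; the bucket boundaries, wording, and sign convention (positive disparity meaning the sender's view is to the receiver's right, and more than 90 degrees meaning behind the receiver) are assumptions for illustration, not taken from the patent:

```python
def prompt_text(horizontal_disparity_deg):
    """Map a signed horizontal disparity (degrees) to a prompt sentence
    via a simple corpus template; bucket limits are illustrative."""
    d = horizontal_disparity_deg
    if abs(d) < 10:
        return "Your view is already aligned with the sender's."
    side = "right" if d > 0 else "left"
    depth = "rear" if abs(d) > 90 else "front"  # beyond 90 degrees is behind the receiver
    template = ("To find the sender's viewing angle, "
                "please rotate your phone or head to the {side} {depth}.")
    return template.format(side=side, depth=depth)

print(prompt_text(120))  # -> "...to the right rear."
```

The resulting text would then be passed to a text-to-speech component for playback.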
  • After obtaining the prompt information, the receiver user can adjust his or her viewing angle by rotating the VR receiving device according to the prompt, so that the viewing angle becomes consistent with the sender user's, thereby achieving synchronization with the sender user's perspective.
  • During this process, the VR receiving device displays the VR content to the receiver user according to the receiver user's adjusted viewing direction. Continuing the example above, once the receiver user's line of sight also moves to the upper right of the scene picture, the VR image at the upper right of the scene picture is played on the VR receiving device so that the recipient user can view that part of the image.
  • In summary, by calculating the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed, and providing prompt information accordingly, the receiving user can be told how to rotate the VR receiving device and perform similar operations so as to synchronize with the sender user's viewing angle; by adjusting the viewing angle according to the prompt information, the receiving user can view on the VR receiving device the same VR content that the sender user views.
  • In addition, when there are multiple VR receiving devices, disparity information between the VR receiving devices can also be determined in the manner of this embodiment, and information prompting the viewing angles of the users of other VR receiving devices can be provided. For example, if the viewing direction of receiving user A on VR receiving device A differs from that of receiving user B on VR receiving device B, information about user B's viewing angle may be shown on the display screen of VR receiving device A so that user A can find user B's viewing-angle position, and so on.
  • Embodiment Four corresponds to Embodiment Three and provides, from the perspective of the VR receiving device, a viewing-angle synchronization method for virtual reality (VR) live broadcast. Referring to FIG. 5, the method may specifically include:
  • S501: Obtain VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing-angle information;
  • S502: Determine the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
  • S503: Generate prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed;
  • S504: Provide the prompt information when displaying the current image frame to be displayed.
  • In this embodiment, the display angle of the VR sending device for the VR content is related to the motion of the VR sending device, and the display angle of the VR receiving device for the VR content is related to the motion of the VR receiving device.
  • In specific implementation, when generating the prompt information about the sender user's viewing angle, visual direction-indication information may be generated according to the disparity information; in this case, when the current image frame to be displayed is shown, the visual direction indication is superimposed onto the image frame for display.
  • In addition, user identification information of the sender user may also be added to the visual indication.
  • Another way of generating the prompt information about the sender user's viewing angle is to generate audio prompt information according to the disparity information; in this case, the audio prompt information is played when the current image frame to be displayed is shown.
  • Corresponding to Embodiment One, an embodiment of the present application further provides a viewing-angle synchronization apparatus for virtual reality (VR) live broadcast. The apparatus may include:
  • a first sender user viewing-angle information determining unit 601, configured to determine the sender user viewing-angle information corresponding to the image frames while the VR content is played on the sending-device side;
  • a first VR content providing unit 602, configured to provide the image frames in the VR content and the corresponding sender user viewing-angle information to the VR receiving device, so that when displaying the VR content, the VR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames.
  • The display angle at which the VR sending device displays the VR content is related to the motion of the VR sending device.
  • The display angle of the VR receiving device for the current image frame to be displayed is independent of the motion of the VR receiving device.
  • In specific implementation, the VR sending device may be provided with a sensor, configured to detect viewing-angle change information caused by the motion of the VR sending device. In this case, the first sender user viewing-angle information determining unit may be specifically configured to: determine the sender user viewing-angle information corresponding to each image frame according to the viewing-angle change information uploaded by the sensor.
  • Corresponding to Embodiment Two, an embodiment of the present application further provides a viewing-angle synchronization apparatus for virtual reality (VR) live broadcast. The apparatus may include:
  • a VR content obtaining unit 701, configured to obtain VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing-angle information;
  • a display angle determining unit 702, configured to determine the display angle of the VR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames;
  • a displaying unit 703, configured to display the current image frame to be displayed according to the determined display angle.
  • In specific implementation, the display angle determining unit may specifically include:
  • a calculation subunit, configured to calculate the average of the sender user viewing-angle information corresponding to the current image frame to be displayed and its preceding preset number of image frames;
  • a display angle determining subunit, configured to determine the average as the display angle of the VR receiving device for the current image frame to be displayed.
  • In addition, the apparatus may further include:
  • a viewing-angle change degree determining unit, configured to determine degree-of-change information of the current frame's sender viewing-angle information relative to that of the adjacent preceding preset number of image frames;
  • a triggering unit, configured to trigger, when the degree-of-change information reaches a preset threshold, the step of calculating the average of the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames;
  • a direct determining unit, configured to determine, when the degree-of-change information does not reach the preset threshold, the sender user viewing angle corresponding to the current frame as the display angle of the VR receiving device for the current image frame to be displayed.
  • The display angle of the VR receiving device for the current image frame to be displayed is independent of motion of the VR receiving device triggered by the receiver user.
  • Corresponding to Embodiment Three, an embodiment of the present application further provides a viewing-angle synchronization apparatus for virtual reality (VR) live broadcast. The apparatus may include:
  • a second sender user viewing-angle information determining unit 801, configured to determine the sender user viewing-angle information corresponding to the image frames while the VR content is played on the sending-device side;
  • a second VR content providing unit 802, configured to provide the image frames in the VR content and the corresponding sender user viewing-angle information to the VR receiving device, so that when displaying the VR content, the VR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
  • The display angle of the VR sending device for the VR content is related to the motion of the VR sending device; the display angle of the VR receiving device for the VR content is related to the motion of the VR receiving device.
  • Corresponding to Embodiment Four, an embodiment of the present application further provides a viewing-angle synchronization apparatus for virtual reality (VR) live broadcast. The apparatus may include:
  • a VR content obtaining unit 901, configured to obtain VR content information provided by the VR sending device, where the VR content information includes image frames and the corresponding sender user viewing-angle information;
  • a receiver user viewing-angle information determining unit 902, configured to determine the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
  • a prompt information generating unit 903, configured to generate prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed;
  • a prompt information providing unit 904, configured to provide the prompt information when displaying the current image frame to be displayed.
  • The display angle of the VR sending device for the VR content is related to the motion of the VR sending device, and the display angle of the VR receiving device for the VR content is related to the motion of the VR receiving device.
  • In specific implementation, the prompt information generating unit may be specifically configured to: generate visual direction-indication information according to the disparity information. In this case, the prompt information providing unit may be specifically configured to: superimpose the visual direction-indication information onto the image frame for display when the current image frame to be displayed is shown.
  • The prompt information generating unit may be further configured to: add user identification information of the sender user to the visual indication.
  • Alternatively, the prompt information generating unit may be specifically configured to: generate audio prompt information according to the disparity information. In this case, the prompt information providing unit may be specifically configured to: play the audio prompt information when the current image frame to be displayed is shown.
  • In addition to VR live broadcast, there may also be augmented reality (AR)-based live-broadcast applications. AR live broadcast differs from VR live broadcast in that, during an AR live broadcast, the sender user and the receiver user may be required to perform image acquisition on the same physical object, after which the associated AR content is displayed based on the display position of the physical object on the screen.
  • The solution provided by the embodiments of the present application can also be used in AR-based live broadcast. That is, if each image frame in the AR content is likewise a 360-degree panoramic image and the user can change the viewing angle by rotating his or her AR device, the solution can similarly synchronize the perspectives of the sender user and the receiver user and prevent the receiver user from becoming dizzy.
  • In AR live broadcast, there may likewise be live broadcasts of pure display content and live broadcasts of exploration content, and the specific synchronization problems and the corresponding processing may differ accordingly.
  • Accordingly, an embodiment of the present application further provides a viewing-angle synchronization method for augmented reality (AR) live broadcast, and the method may include:
  • Step 1: Determine the sender user viewing-angle information corresponding to the image frames while the AR content is played on the sending-device side;
  • Step 2: Provide the image frames in the AR content and the corresponding sender user viewing-angle information to the AR receiving device, so that when displaying the AR content, the AR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames.
  • An embodiment of the present application further provides a viewing-angle synchronization method for augmented reality (AR) live broadcast, and the method may specifically include:
  • Step 1: Obtain AR content information provided by the AR sending device, where the AR content information includes image frames and the corresponding sender user viewing-angle information;
  • Step 2: Determine the display angle of the AR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames;
  • Step 3: Display the current image frame to be displayed according to the determined display angle.
  • An embodiment of the present application further provides a viewing-angle synchronization method for augmented reality (AR) live broadcast, and the method may specifically include:
  • Step 1: Determine the sender user viewing-angle information corresponding to the image frames while the AR content is played on the sending-device side;
  • Step 2: Provide the image frames in the AR content and the corresponding sender user viewing-angle information to the AR receiving device, so that when displaying the AR content, the AR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
  • An embodiment of the present application further provides a viewing-angle synchronization method for augmented reality (AR) live broadcast, and the method may specifically include:
  • Step 1: Obtain AR content information provided by the AR sending device, where the AR content information includes image frames and the corresponding sender user viewing-angle information;
  • Step 2: Determine the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
  • Step 3: Generate prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed;
  • Step 4: Provide the prompt information when displaying the current image frame to be displayed.


Abstract

The present application provides viewing-angle synchronization methods and apparatuses for virtual reality (VR) live broadcast. A method includes: determining the sender user viewing-angle information corresponding to image frames while the VR content is played on the sending-device side; and providing the image frames in the VR content and the corresponding sender user viewing-angle information to a VR receiving device, so that when displaying the VR content, the VR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames. With the embodiments of the present application, the VR live content can be played smoothly for the receiver user on the VR receiving device side, preventing the receiver user from feeling dizzy due to abrupt changes in viewing angle.

Description

Viewing-angle synchronization method and apparatus for virtual reality (VR) live broadcast
This application claims priority to Chinese patent application No. 201810151922.0, filed on February 14, 2018 and entitled "Viewing-angle synchronization method and apparatus for virtual reality (VR) live broadcast," the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of virtual reality (VR) live broadcast, and in particular to viewing-angle synchronization methods and apparatuses for VR live broadcast.
Background
Virtual reality (VR) uses a computer to generate a simulated environment consisting of computer-generated, real-time, dynamic, life-like images. The user views it through the screen of a mobile terminal device such as a phone, or is immersed in the simulated environment through a dedicated head-mounted display, and so on. One notable difference between VR content and ordinary video content is that each frame of a VR video can usually be shot as a 360-degree panorama, which restores the filming scene more clearly and accurately. During playback, because the screen of the playback device is usually flat and cannot present a 360-degree panorama all at once, the playback device must first determine the viewing angle the user wants, and then play each image frame using that viewing angle as the display angle. In the initial state, the VR content may have a default display angle; during playback, the viewer can change the viewing angle by rotating the terminal device or, when wearing a head-mounted display, by turning the head or the eyes, so as to view the image content of each frame from more angles.
VR live broadcast is a new type of application that combines VR content with live-broadcast technology. In VR live broadcast, the VR content is usually pre-produced content such as a VR video; the VR sending device obtains this VR content in advance and then plays it synchronously in real time to one or more VR receiving devices. VR live broadcast has many application scenarios. For example, in online sales scenarios, for products whose characteristics can only be introduced clearly with a large set (such as home furnishings or renovation products), the product can be arranged in such a set in advance, the VR content can be produced, and the VR content can then be played to buyer users via VR live broadcast, so that users obtain more accurate information about the product. In specific implementations, VR live broadcast can be used to play pure display content such as movies, or exploration content such as games, to the VR receiving devices.
Summary
The present application provides viewing-angle synchronization methods and apparatuses for VR live broadcast, which can solve the problem of viewing-angle synchronization during VR live broadcast.
To solve the above problem, embodiments of the present application disclose the following:
A viewing-angle synchronization method for virtual reality (VR) live broadcast, including: determining the sender user viewing-angle information corresponding to image frames while the VR content is played on the sending-device side;
and providing the image frames in the VR content and the corresponding sender user viewing-angle information to a VR receiving device, so that when displaying the VR content, the VR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames.
A viewing-angle synchronization method for virtual reality (VR) live broadcast, including:
obtaining VR content information provided by a VR sending device, the VR content information including image frames and the corresponding sender user viewing-angle information;
determining the display angle of the VR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames; and
displaying the current image frame to be displayed according to the determined display angle.
A viewing-angle synchronization method for virtual reality (VR) live broadcast, including:
determining the sender user viewing-angle information corresponding to image frames while the VR content is played on the sending-device side; and
providing the image frames in the VR content and the corresponding sender user viewing-angle information to a VR receiving device, so that when displaying the VR content, the VR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
A viewing-angle synchronization method for virtual reality (VR) live broadcast, including:
obtaining VR content information provided by a VR sending device, the VR content information including image frames and the corresponding sender user viewing-angle information;
determining the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
generating prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed; and
providing the prompt information when displaying the current image frame to be displayed.
A viewing-angle synchronization apparatus for virtual reality (VR) live broadcast, including:
a first sender user viewing-angle information determining unit, configured to determine the sender user viewing-angle information corresponding to image frames while the VR content is played on the sending-device side; and
a first VR content providing unit, configured to provide the image frames in the VR content and the corresponding sender user viewing-angle information to a VR receiving device, so that when displaying the VR content, the VR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames.
A viewing-angle synchronization apparatus for virtual reality (VR) live broadcast, including:
a VR content obtaining unit, configured to obtain VR content information provided by a VR sending device, the VR content information including image frames and the corresponding sender user viewing-angle information;
a display angle determining unit, configured to determine the display angle of the VR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames; and
a displaying unit, configured to display the current image frame to be displayed according to the determined display angle.
A viewing-angle synchronization apparatus for virtual reality (VR) live broadcast, including:
a second sender user viewing-angle information determining unit, configured to determine the sender user viewing-angle information corresponding to image frames while the VR content is played on the sending-device side; and
a second VR content providing unit, configured to provide the image frames in the VR content and the corresponding sender user viewing-angle information to a VR receiving device, so that when displaying the VR content, the VR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
A viewing-angle synchronization apparatus for virtual reality (VR) live broadcast, including:
a VR content obtaining unit, configured to obtain VR content information provided by a VR sending device, the VR content information including image frames and the corresponding sender user viewing-angle information;
a receiver user viewing-angle information determining unit, configured to determine the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
a prompt information generating unit, configured to generate prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed; and
a prompt information providing unit, configured to provide the prompt information when displaying the current image frame to be displayed.
A viewing-angle synchronization method for augmented reality (AR) live broadcast, including:
determining the sender user viewing-angle information corresponding to image frames while the AR content is played on the sending-device side; and
providing the image frames in the AR content and the corresponding sender user viewing-angle information to an AR receiving device, so that when displaying the AR content, the AR receiving device determines its display angle for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames.
A viewing-angle synchronization method for augmented reality (AR) live broadcast, including:
obtaining AR content information provided by an AR sending device, the AR content information including image frames and the corresponding sender user viewing-angle information;
determining the display angle of the AR receiving device for the current image frame to be displayed according to the sender user viewing-angle information corresponding to the current frame and its preceding preset number of image frames; and
displaying the current image frame to be displayed according to the determined display angle.
A viewing-angle synchronization method for augmented reality (AR) live broadcast, including:
determining the sender user viewing-angle information corresponding to image frames while the AR content is played on the sending-device side; and
providing the image frames in the AR content and the corresponding sender user viewing-angle information to an AR receiving device, so that when displaying the AR content, the AR receiving device provides prompt information about the sender user's viewing angle according to the disparity between the receiving user's viewing angle and the sender user's viewing angle.
A viewing-angle synchronization method for augmented reality (AR) live broadcast, including:
obtaining AR content information provided by an AR sending device, the AR content information including image frames and the corresponding sender user viewing-angle information;
determining the receiver user viewing-angle information corresponding to the image frame currently to be displayed;
generating prompt information about the sender user's viewing angle according to the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed; and
providing the prompt information when displaying the current image frame to be displayed.
Compared with the prior art, the present application has the following advantages:
In the embodiments of the present application, for the live broadcast of pure display VR content such as movies, the VR sending device sends both the image frames in the VR content and the corresponding sender user viewing-angle information to the VR receiving device, so that the VR receiving side can pre-process the sender user viewing-angle information before using it as the receiving device's display angle for the frames to be shown. The display-angle changes between image frames on the VR receiving device therefore become smoother; even if the sender user's viewing angle changes abruptly on the sending side, the probability that the receiver user becomes dizzy because of such an abrupt change is controlled or reduced.
For the live broadcast of exploration VR content such as games, by calculating the disparity information between the receiver user's viewing angle and the sender user's viewing angle corresponding to the current image frame to be displayed and providing prompt information accordingly, the receiver user can be prompted on how to rotate the VR receiving device and perform similar operations so as to synchronize with the sender user's viewing angle; by adjusting the viewing angle according to the prompt information, the receiver user can view on the VR receiving device the same VR content that the sender user sees.
Of course, any product implementing the present application does not necessarily need to achieve all of the above advantages at the same time.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those of ordinary skill in the art can derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of a scenario provided by an embodiment of the present application;
FIG. 2 is a flowchart of a first method provided by an embodiment of the present application;
FIG. 3 is a flowchart of a second method provided by an embodiment of the present application;
FIG. 4 is a flowchart of a third method provided by an embodiment of the present application;
FIG. 5 is a flowchart of a fourth method provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a first apparatus provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a second apparatus provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a third apparatus provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a fourth apparatus provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
本申请发明人在研究过程中发现,对于不同类型的VR直播内容,发送设备与接收设备之间的互动方式会有不同,但现有技术中,都会在不同程度上存在一定的视角同步问题。
其中,对于电影类的纯展现内容,通常是由发送方用户来选择观看的主视角,发送设备发送到接收设备的信息除了每一帧图像的图像数据,还包括每一帧图像对应的发送方视角信息,接收设备通过具体的图像帧以及对应的发送方视角,确定接收设备对对应图像帧的展示视角,并对接收到的VR内容进行播放。也就是说,接收方用户被动地跟随发送方用户的视角。此时,对于接收方用户而言,即使其转动其手机等终端设备,或者在佩戴头显设备的情况下转动头部或者眼球,都不会引起VR内容的视角变化,其观看视角始终与发送方用户的视角保持同步,相当于是由发送方用户引领各接收方用户进行VR内容的观看。
但是,在上述直播类型中,发送方用户选择或者改变观看视角的方式,通常也是通过转动手机等移动终端设备,或者在佩戴头显设备的情况下,转动头部或者眼球等方式来实现的。而在直播过程中,发送方用户可能会发生大幅度转动其终端设备,或者大幅度转动头部等情况。例如,在直播过程中,发送方用户可能突然抬头观看全景数据中上方的内容,或者突然低头执行某种操作等,都会导致发送方用户对VR内容的观看视角发生突然的大幅度改变,相应的,VR接收设备播放的VR视频数据的展示视角也会因为发送方用户观看视角的突变,而发生剧烈的图像移动,这可能会导致接收方用户看到突然发生了较大的内容变化的视频数据,甚至使接收方用户由于不适应这种突然的图像移动,而产生晕眩的感觉。
而对于游戏类的探索类内容而言,发送方用户同样可以选择或改变自己的观看视角,但是,现有技术中,发送方用户的观看视角信息不会提供给接收设备。并且,接收方用户可以通过转动其手机等终端设备,或者在佩戴头显设备的情况下转动头部或者眼球等方式,改变自己的观看视角,以用于在具体游戏场景的地图等界面中进行探索,等等。也就是说,接收方用户能够主动改变其观看视角,并且接收设备在播放VR内容时,是根据接收方用户的观看视角来确定VR内容的展示视角。这就使得同一直播过程中,发送设备以及各个接收设备对同一VR内容的展示视角可能都是不同的。
但是,毕竟是在同一游戏等活动中进行直播互动,因此,在有些情况下,接收方用户可能需要知晓发送方用户的观看视角,以便更好地进行游戏互动等。在现有技术中,可以由发送方用户通过语音等方式通知接收方用户其观看视角的方向,接收方用户自行按照发送方用户的语音提示寻找发送方用户的观看视角方向。但是,对于一些大型的游戏场景等VR内容,一帧图像的面积可能会非常大,一方面,发送方用户可能无法清晰地用语言描述自己的观看视角方向,另一方面,即使发送方用户进行了清晰的描述,接收方用户也可能会由于画面内容过大而导致难以找到对方观看视角方向,等等。因此,在上述情况下,这种现有技术的方式可能变得不再适用,以至于同一游戏中的各接收方用户可能经常出现无法确认发送方用户主视角方向的情况。
基于上述两种情况,本申请实施例提供了相应的解决方案。在该解决方案中,无论是偏向于电影的纯展示类内容的直播,还是偏向于游戏的探索类内容的直播,直播的发送设备都可以将各帧图像的图像数据以及对应的发送方用户的观看视角信息提供给接收设备。之后,针对不同类VR内容的直播,接收设备可以进行不同的处理。
首先,对于偏向于电影的纯展示类内容的直播,VR接收设备侧在进行VR内容的展示时,并不是直接根据各帧数据对应的发送方用户观看视角进行展示,而是首先根据当前待展示的图像帧,以及该图像帧之前的若干个图像帧对应的发送方用户观看视角,进行取平均值等处理,然后将得到的处理后的视角信息,确定为接收设备中当前图像帧的展示视角,以此实现对各帧图像的展示视角进行平滑处理,并在VR接收设备侧按照平滑处理后的展示视角,来向观看直播的用户播放VR视频数据,从而起到对直播用户观看时产生的方向的突变的缓冲作用,以减少观看用户的晕眩的感觉。
而对于偏向于游戏的探索类VR内容的直播,VR接收设备仍然可以将接收方用户的观看视角作为对应图像帧的展示视角,但与此同时,还可以根据各帧图像对应的发送方用户的观看视角,以及接收方用户的观看视角,计算出两者之间的视差方向信息,然后,可以根据该视差方向信息,在接收设备播放VR内容的过程中提供提示信息,例如,可以在VR内容播放界面中显示“箭头”等提示信息,或者,还可以生成语音类的提示信息,等等。
下面分别针对各种不同类型的VR内容直播方式对应的视角同步方案进行详细介绍。
实施例一
该实施例一主要针对偏向于电影的纯展示类VR内容的直播方式对应的视角同步解决方案进行介绍,也即,在该方案中,VR发送设备对所述VR内容进行展示时的展示视角,与VR发送设备的运动相关,但是,VR接收设备对所述当前待展示图像帧的展示视角,与VR接收设备的运动无关。参考图1所示,为本申请在实际应用中一个示例性实施例的场景示意图。图1示出了一个VR视频直播应用场景中的视角同步场景,VR发送设备101的使用用户为直播用户,也称发送方用户,发送方用户使用VR发送设备101例如VR头显,或者移动终端设备等,观看VR内容,VR发送设备101可以将发送方用户观看到的VR内容,以及VR内容中具体图像帧对应的发送方观看视角信息也提供给VR接收设备103。其中,VR发送设备101提供的上述信息可以通过服务器102转发至VR接收设备103,或者,还可以与VR接收设备103之间建立点对点的连接,直接发送给VR接收设备103,等等。
其中,VR发送设备所发送的各帧VR视频数据都可以为360度的全景数据,在VR接收设备接收到VR视频数据和对应的发送方用户视角信息后,对视角信息进行平滑处理,例如,将前N帧图像对应的所述视角信息求得平均值作为当前图像帧的展示视角,VR接收设备再按照当前图像帧的展示视角向接收方用户展示当前图像帧,其他图像帧也均可以做类似处理。在图1中仅示出了一个VR接收设备,可以理解的是,在实际应用中,VR接收设备可以有多个。
与现有技术相比,本申请实施例并不是将发送方用户观看视角直接作为VR接收设备中的展示视角,而是将平滑处理后的视角信息作为VR接收设备侧的展示视角,因此可以避免由于发送方用户突然的观看视角变动而导致接收设备中播放的图像画面突然移动或者转动等情况,降低接收方用户晕眩的发生概率。
具体的,该实施例一首先从VR发送设备的角度,提供了一种虚拟现实VR直播中的视角同步方法,参考图2,该方法具体可以包括:
S201:确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息。
在本实施例中,发送方用户使用VR发送设备观看VR视频,例如,直播用户头戴VR头显观看电影等,在发送方用户观看过程中,发送方用户的观看视角方向可能会随时发生变化,例如抬头操作导致的视线方向突然上移,或者低头操作导致的视线方向突然下移,等等。因此,VR发送设备不仅要获取到VR视频各帧的全景数据,还需要获取到发送方用户观看VR视频时各帧视频数据对应的观看视角信息,该观看视角信息用于表示VR发送设备侧的发送方用户观看VR内容的视线方向。具体实现时,VR发送设备上可以设有传感器,用于检测由于VR发送设备的运动所导致的观看视角变化信息。因此,具体在确定VR内容在VR发送设备侧进行播放的过程中,各帧图像对应的发送方用户观看视角信息时,可以根据所述传感器上传的视角变化信息,确定所述各帧图像对应的发送方用户观看视角信息。
具体的,每一个图像帧都可以对应各自的发送方用户观看视角信息,因此,所述确定图像帧对应的发送方用户观看视角信息,可以是分别为每一图像帧确定出对应的发送方观看视角。或者,在实际应用中,还可以是每隔数帧提供一次对应的发送方观看视角信息,等等。
S202:将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角。
VR发送设备可以通过远程数据同步等技术,将VR内容中的图像帧及对应的发送方用户观看视角信息提供给VR接收设备。具体的,可以通过服务器进行转发至VR接收设备,或者,还可以直接点对点发送到VR接收设备,等等。在实际应用中,可以以帧为单位发送VR内容中的各图像帧和对应的发送方用户观看视角信息,还可以将若干图像帧作为一个视频流同步至VR接收设备,并在该视频流中提供VR内容中各帧图像对应的发送方观看视角信息。无论哪一种发送方式,只需要在VR发送设备和VR接收设备之间约定好,以便VR接收设备能够将每一帧图像与发送方用户观看视角进行对应即可。
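上述"逐帧携带发送方观看视角信息"的发送方式,可以用如下Python示意代码表示其消息打包过程。其中的字段名、函数名均为示例假设,实际的消息格式可由VR发送设备与VR接收设备双方自行约定:

```python
import json

def pack_frame(frame_index, image_bytes_len, view_angle):
    """将图像帧标识与对应的发送方观看视角打包为一条同步消息。

    view_angle 以三维向量 (x, y, z) 示意表示视线方向;
    各字段名仅为示意假设,并非本申请限定的协议格式。
    """
    return json.dumps({
        "frame": frame_index,            # 帧序号,用于接收端将视角与图像帧建立对应关系
        "payload_len": image_bytes_len,  # 图像数据长度(图像数据本身可经视频流另行传输)
        "view_angle": list(view_angle),  # 该帧对应的发送方用户观看视角
    })

msg = pack_frame(42, 65536, (0.0, 0.0, 1.0))
print(msg)
```

接收端解析该消息后,即可将其中的视角向量与对应帧序号的图像数据关联起来,供后续平滑处理或视差计算使用。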
VR接收设备在接收到具体的VR内容后,可以从中提取出具体的图像帧以及对应的发送方用户观看角度信息,其中,由于当前图像帧在VR接收设备中的展示视角与VR内容中的前预置数目(例如,5帧等)帧图像对应的发送方用户观看视角都可能有关,因此,VR接收设备可以对接收到的至少所述预置数目的图像帧对应的发送方用户观看视角进行保存。具体实现时,可以根据具体的预置数目,保存一系列的发送方用户观看视角信息,例如,(…, x_{n-1}, x_n, x_{n+1}, x_{n+2}, …),其中,x_n 可以为向量,代表第n帧图像对应的发送方用户观看视角信息。其中,在所述预置数目已知的情况下,滑动窗口的长度可以是与该预置数目相等,每次接收到新的图像帧对应的发送方用户观看视角信息时,可以将该新的图像帧对应的发送方用户观看视角信息对应的向量添加到前述序列中,并且该滑动窗口可以向前滑动一次。
具体在对接收到的发送方用户观看视角信息进行处理时,可以有多种方式,只要能够实现视角变化信息的平滑过渡即可。例如,在一种实现方式下,可以按照VR内容中各帧图像的播放顺序,确定出当前待展示的图像帧,及其之前的预置数目的相邻图像帧,然后,计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值,将所述平均值确定为VR接收设备对所述当前待展示图像帧的展示视角。例如,预先设置滑动窗口的长度为m+1,也即,当前待展示图像帧与其前m个相邻图像帧对应的发送方用户观看视角信息相关,其中,m可以为大于1的整数。则对于当前待展示的图像帧,可以采用如下方式计算其展示视角:Y_n = (X_{n-m} + X_{n-m+1} + … + X_n) / (m+1)。即,第n帧图像在VR接收设备中的展示视角,是第n帧及其前m帧图像对应的发送方用户观看视角的平均值。
当然,可以理解的是,m的取值越大,则计算得到的展示视角的变化就会越平稳,也即,VR接收设备中各帧图像的展示视角变化曲线越平滑。
在得到平滑处理后的视角信息后,便可以将其作为VR接收设备中的展示视角,并按照该展示视角对当前待展示图像帧进行展示。其他帧也可以进行类似的处理。
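上述滑动窗口求平均的过程,可以用如下Python示意代码表示。其中视角以三维向量示意,类名与实现细节均为示例假设,并非本申请限定的实现方式:

```python
from collections import deque

class ViewAngleSmoother:
    """按 Y_n = (X_{n-m} + … + X_n) / (m+1) 对视角向量做滑动窗口平均。

    窗口长度为 m+1;deque 的 maxlen 使窗口在新帧到来时自动向前滑动。
    """
    def __init__(self, m):
        self.window = deque(maxlen=m + 1)

    def push(self, x):
        # x 为当前图像帧对应的发送方观看视角向量
        self.window.append(x)
        n = len(self.window)
        # 对窗口内各向量逐分量求平均,得到当前帧在接收设备中的展示视角
        return tuple(sum(v[i] for v in self.window) / n for i in range(len(x)))

smoother = ViewAngleSmoother(m=2)
smoother.push((0.0, 0.0, 0.0))
smoother.push((0.0, 0.0, 3.0))
print(smoother.push((0.0, 0.0, 6.0)))  # 三帧视角的平均值: (0.0, 0.0, 3.0)
```

m 取得越大,输出的展示视角变化曲线越平滑,但对发送方视角变化的跟随也会越滞后。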
需要说明的是,在本申请实施例中,通过对视角的平滑处理,可以使得VR接收方的各帧图像在展示过程中的展示视角变化比较平稳,降低由于发送方用户视角的突变而导致接收方用户晕眩等情况的发生概率。但是,这种平滑处理可能会使得接收方用户观看到的VR内容与发送方用户的观看视角并不完全一致,出现延迟等情况。另外,如果发送方用户的观看视角并没有出现突变等情况,则每一帧图像都进行平滑处理之后再进行展示,可能会对计算资源等造成一定程度的浪费,并且这种方式造成的接收方用户观看内容出现延迟的情况,也会显得不值得。
为此,在本申请的可选实施方式中,在针对当前待展示图像对应的视角信息进行平滑处理之前,还可以首先进行判断,以确定出当前待展示图像相对于前几帧图像的发送方观看视角是否发生了突变等情况,如果发生了突变,再按照前述方式进行平滑处理,否则,可以直接按照当前待展示图像帧对应的发送方用户观看视角作为接收设备中的展示视角,对该待展示图像帧进行展示即可。具体的,可以确定所述当前待展示的图像帧相对于相邻的前预置数目的图像帧对应的发送方用户观看视角信息的变化程度信息,如果所述变化程度信息达到预置阈值,则触发所述计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值的步骤。否则,如果所述变化程度信息未达到预置阈值,则将所述当前图像帧对应的发送方用户观看视角,确定为VR接收设备对所述当前待展示图像帧的展示视角。其中,具体在确定所述变化程度信息时,由于每帧图像对应的发送方用户观看视角对应着一个向量,因此,可以通过计算向量之间的距离,并判断距离长短等方式,来确定出所述变化程度的高低。也即,变化程度信息对应的预置阈值可以是代表向量与向量之间距离的长度值,等等。
这样,通过上述方式可以实现按需的平滑处理,也即,在需要进行平滑处理时,例如,发现发送方用户观看视角发生突变时,再进行所述计算平均值等处理。这样,可以节省计算资源,并且,可以更大限度的保证接收方用户观看的VR内容与发送方用户的观看视角保持一致。
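上述"通过向量间距离判断视角是否突变"的过程,可以用如下Python示意代码表示。其中采用欧氏距离度量,函数名与阈值取值均为示例假设:

```python
import math

def needs_smoothing(current, previous, threshold):
    """通过向量间欧氏距离,判断当前帧视角相对前若干帧是否发生突变。

    current 为当前待展示图像帧对应的发送方观看视角向量;
    previous 为相邻的前预置数目图像帧对应的视角向量序列;
    threshold 即文中所述"预置阈值"(代表向量间距离的长度值)。
    """
    dists = [math.dist(current, p) for p in previous]
    # 只要与任一相邻前帧的视角距离达到阈值,即认为发生突变,需要触发平滑处理
    return max(dists) >= threshold

prev = [(0.0, 0.0, 1.0), (0.0, 0.05, 1.0)]
print(needs_smoothing((0.0, 0.9, 0.4), prev, threshold=0.5))  # 突变 -> True
print(needs_smoothing((0.0, 0.1, 1.0), prev, threshold=0.5))  # 平稳 -> False
```

返回 True 时再执行前述求平均的平滑步骤,返回 False 时直接以当前帧对应的发送方观看视角作为展示视角,从而实现按需的平滑处理。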
总之,在本申请实施例中,通过由VR发送设备向VR接收设备同时发送VR内容中的图像帧以及对应的发送方用户观看视角信息的方式,可以在VR接收设备侧对发送方用户观看视角信息进行预处理之后,再作为接收设备的展示视角对待展示图像帧进行展示,因此,能够使得VR接收设备对各图像帧的展示视角变化更为平滑,即使发送设备侧出现发送方用户观看视角突变的情况,接收方用户由于这种突变而发生晕眩等情况的概率也会得到控制或者降低。
实施例二
该实施例二是与实施例一相对应的,从VR接收设备的角度,提供了一种虚拟现实VR直播中的视角同步方法,参考图3,该方法具体可以包括:
S301:获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
S302:根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角;
其中,具体在确定VR接收设备对所述当前待展示图像帧的展示视角时,可以计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值,将所述平均值确定为VR接收设备对所述当前待展示图像帧的展示视角。
另外,在进行上述平均值计算之前,还可以首先确定所述当前待展示的图像帧相对于相邻的前预置数目的图像帧对应的发送方用户观看视角信息的变化程度信息,如果所述变化程度信息达到预置阈值,则触发所述计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值的步骤。如果所述变化程度信息未达到预置阈值,则可以将所述当前图像帧对应的发送方用户观看视角,确定为VR接收设备对所述当前待展示图像帧的展示视角。
S303:根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
其中,所述VR接收设备对所述当前待展示图像帧的展示视角,与VR接收设备的运动无关。
由于该实施例二是与实施例一相对应的,因此,相关的具体实现可以参见前述实施例一中的记载,这里不再赘述。
实施例三
该实施例三主要是针对偏向于游戏的探索类VR内容在直播过程中的视角同步问题进行介绍。也即,在该情况下,VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关,VR接收设备对所述VR内容的展示视角,则与VR接收设备的运动相关。具体的,该实施例三首先从VR发送设备角度,提供了一种虚拟现实VR直播中的视角同步方法,具体的,参见图4,该方法可以包括:
S401:确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
该步骤与实施例一中的步骤S201可以是相同的,也即,即使VR接收设备在对VR内容进行展示时,具体的展示视角,是跟随接收方用户观看视角而变化的,也可以将发送设备侧发送方用户的观看视角信息提供给VR接收设备,使得VR接收设备可以据此向接收方用户提供关于发送方观看视角方向的提示信息。
S402:将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
在向VR接收设备提供VR内容时,同样可以既提供VR内容中的具体图像帧,还可以提供与图像帧对应的发送方用户观看视角信息。对于VR接收设备而言,是根据接收方用户观看视角,确定具体图像帧在VR接收设备中的展示视角。例如,在一个游戏场景中,发送方用户已经进入到场景画面的右上方,而接收方用户仍然在场景画面的左下方,则这种情况下VR接收设备不会按照场景画面的右上方的展示视角来播放当前图像帧,而是按照接收方用户所处的场景画面的左下方的展示视角来播放该当前图像帧。
但是,在本申请实施例中,由于VR接收设备能够获取到发送方用户观看视角信息,因此,还可以计算出接收方用户观看视角与发送方用户观看视角之间的视差,然后,可以根据这种视差,向接收方用户提供提示信息,以帮助接收方用户通过自行旋转VR接收设备等方式,找到发送方用户观看视角的方向。其中,所述视差为发送方用户的观看视角方向与接收方用户的观看视角方向之间的角度值,同样可以通过一个向量等方式进行表示。
在计算得到接收方用户观看视角与发送方用户观看视角之间的视差后,可以根据该视差生成一提示信息,以提示接收方用户,如何转动其VR接收设备能够找到发送方用户观看视角所在的位置。
具体的,所述提示信息可以有多种实现方式,例如,其中一种方式下,可以是根据所述视差信息,生成可视化方向指示信息,这样,在向接收方用户提供该提示信息时,可以在展示所述当前待展示的图像帧时,将所述可视化方向指示信息叠加到所述图像帧中进行展示。例如,具体的,所述可视化方向指示信息可以通过箭头的方式来实现,箭头可以具有一定扭曲程度,用以提示接收方用户如何转动其VR接收设备。其中,接收方用户观看视角与发送方用户观看视角之间的视差越大,则箭头扭曲越明显。另外,在具体实现时,还可以在箭头等可视化信息上展示发送方用户的名字等用户标识信息。
除了通过可视化信息的方式提供所述提示信息之外,还可以通过语音等方式来进行提示。例如,可以根据所述视差信息,生成音频提示信息,然后在展示所述当前待展示的图像帧时,播放所述音频提示信息。其中,具体实现时,可以预先提供语料模板以及语音播放模型,在计算出具体的视差信息后,可以按照该模板生成对应的语料,然后,通过语音播放模型转化为语音进行播放,等等。例如,如果通过计算发现,发送方用户的观看视角位于当前接收方用户的观看视角的右后方,则生成的语料可以是“如果需要找发送方用户的视角,请向右后方旋转手机或头显”,等等。
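上述根据视差生成方向提示语料的过程,可以用如下Python示意代码表示。此处将观看视角简化为水平偏航角(度),角度区间的划分与提示语模板均为示例假设:

```python
def direction_hint(sender_yaw_deg, receiver_yaw_deg):
    """根据收发双方观看视角的水平夹角(视差),生成文字提示语料。

    参数为双方视线方向的水平偏航角(度);
    生成的语料可进一步交由语音播放模型转化为音频提示。
    """
    # 将视差归一化到 (-180, 180],正值表示发送方视角在接收方右侧
    diff = (sender_yaw_deg - receiver_yaw_deg + 180) % 360 - 180
    if abs(diff) < 15:
        return "视角已基本一致"
    if abs(diff) > 150:
        return "请向后转"
    side = "右" if diff > 0 else "左"
    kind = "前方" if abs(diff) <= 90 else "后方"
    return f"请向{side}{kind}旋转设备约{abs(diff):.0f}度"

# 发送方视角位于接收方视角的右后方的情形
print(direction_hint(120, -30))
```

同样的视差计算结果也可以用于驱动"箭头"等可视化方向指示信息的朝向与扭曲程度。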
在提供了上述提示信息后,如果需要与发送方用户观看视角同步,则接收方用户可以依据提示信息,来通过旋转其VR接收设备等方式,实现对其观看视角的调整,使得其观看视角与发送方用户的观看视角一致,以此实现与发送方用户的视角同步。
也即,在接收方用户获得具体的提示信息后,如果按照提示信息调整了自己的观看视角方向,则VR接收设备可以根据接收方用户调整后的观看视角方向,来向接收方用户显示VR内容。仍以上述例子进行说明,如果接收方用户的视线也进入了场景画面的右上方,则在VR接收设备上则需要按照场景画面的右上方的对应的展示视角,播放场景画面右上方的VR图像,使得接收方用户能够观看到这部分图像。
可见,通过本申请实施例对接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息的计算,以及据此提供的提示信息,可以提示接收方用户如何对其VR接收设备进行旋转等操作,以与发送方用户的观看视角同步;接收方用户按照提示信息调整自己的观看视角后,便可以在VR接收设备中观看到与发送方用户观看到的同样的VR内容。
可以理解的是,在VR接收设备有多个的情况下,在各个VR接收设备之间,也可以按照本实施例的方式确定各个VR接收设备之间的视差信息,提供关于其他VR接收设备对应的用户观看视角的提示信息。例如,VR接收设备A对应的接收方用户A和VR接收设备B对应的接收方用户B的观看视角方向不同,则可以在VR接收设备A的显示屏幕上,对用户B的观看视角信息进行提示,从而方便用户A来找到用户B的观看视角位置,等等。
实施例四
该实施例四是与实施例三相对应的,从VR接收设备的角度,提供了一种虚拟现实VR直播中的视角同步方法,参见图5,该方法具体可以包括:
S501:获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
S502:确定当前待展示的图像帧对应的接收方用户观看视角信息;
S503:根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
S504:在展示所述当前待展示的图像帧时,提供所述提示信息。
其中,在该实施例中,所述VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关;所述VR接收设备对所述VR内容的展示视角,与VR接收设备的运动相关。
具体在生成对所述发送方用户观看视角的提示信息时,可以根据所述视差信息,生成可视化方向指示信息;此时,可以在展示所述当前待展示的图像帧时,将所述可视化方向指示信息叠加到所述图像帧中进行展示。另外,还可以将所述发送方用户的用户标识信息添加到所述可视化信息中。
另一种生成对所述发送方用户观看视角的提示信息的方式可以是,根据所述视差信息,生成音频提示信息;此时,可以在展示所述当前待展示的图像帧时,播放所述音频提示信息。
关于该实施例四中的未详述部分内容,可以参见前述实施例三中的记载,这里不再赘述。
对于前述的方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请所必须的。
与实施例一相对应,本申请实施例还提供了一种虚拟现实VR直播中的视角同步装置,参见图6,该装置可以包括:
第一发送方用户视角信息确定单元601,用于确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
第一VR内容提供单元602,用于将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角。
其中,所述VR发送设备对所述VR内容进行展示时的展示视角,与VR发送设备的运动相关。所述VR接收设备对所述当前待展示图像帧的展示视角,与VR接收设备的运动无关。
所述VR发送设备上可以设有传感器,用于检测VR发送方设备的运动所导致的观看视角变化信息;此时,所述第一发送方用户视角信息确定单元具体可以用于:
根据所述传感器上传的视角变化信息,确定所述各帧图像对应的发送方用户观看视角信息。
与实施例二相对应,本申请实施例还提供了一种虚拟现实VR直播中的视角同步装置,参见图7,该装置可以包括:
VR内容获得单元701,用于获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
展示视角确定单元702,用于根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角;
展示单元703,用于根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
具体的,所述展示视角确定单元具体可以包括:
计算子单元,用于计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值;
展示视角确定子单元,用于将所述平均值确定为VR接收设备对所述当前待展示图像帧的展示视角。
另外,该装置还可以包括:
视角变化程度确定单元,用于确定所述当前待展示的图像帧相对于相邻的前预置数目的图像帧对应的发送方用户观看视角信息的变化程度信息;
触发单元,用于如果所述变化程度信息达到预置阈值,则触发所述计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值的步骤。
直接确定单元,用于如果所述变化程度信息未达到预置阈值,则将所述当前图像帧对应的发送方用户观看视角,确定为VR接收设备对所述当前待展示图像帧的展示视角。
其中,所述VR接收设备对所述当前待展示图像帧的展示视角,与接收方用户所触发的VR接收设备的运动无关。
与实施例三相对应,本申请实施例还提供了一种虚拟现实VR直播中的视角同步装置,参见图8,该装置可以包括:
第二发送方用户视角信息确定单元801,用于确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
第二VR内容提供单元802,用于将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
其中,所述VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关;所述VR接收设备对所述VR内容的展示视角,与VR接收设备的运动相关。
与实施例四相对应,本申请实施例还提供了一种虚拟现实VR直播中的视角同步装置,参见图9,该装置可以包括:
VR内容获得单元901,用于获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
接收方用户观看视角信息确定单元902,用于确定当前待展示的图像帧对应的接收方用户观看视角信息;
提示信息生成单元903,用于根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
提示信息提供单元904,用于在展示所述当前待展示的图像帧时,提供所述提示信息。
其中,所述VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关;
所述VR接收设备对所述VR内容的展示视角,与VR接收设备的运动相关。
其中,所述提示信息生成单元具体可以用于:
根据所述视差信息,生成可视化方向指示信息;
此时,所述提示信息提供单元具体可以用于:在展示所述当前待展示的图像帧时,将所述可视化方向指示信息叠加到所述图像帧中进行展示。
另外,所述提示信息生成单元还可以用于:
将所述发送方用户的用户标识信息添加到所述可视化信息中。
或者,所述提示信息生成单元具体可以用于:
根据所述视差信息,生成音频提示信息;
此时,所述提示信息提供单元具体可以用于:在展示所述当前待展示的图像帧时,播放所述音频提示信息。
需要说明的是,在实际应用中,还可以包括基于增强现实(AR)的直播应用,与VR直播的不同之处在于,AR直播过程中,可能需要发送方用户与接收方用户都能够对着同一实体物体进行图像采集,然后,基于该实体物体在屏幕中的展示位置,展示出相关联的AR内容。本申请实施例所提供的方案也可以用于基于AR的直播中,也即,如果AR内容中每一帧图像也是360度的全景图像,并且用户可以通过旋转其AR设备的方式来改变观看视角,则也可以通过本申请实施例提供的方案实现发送方用户与接收方用户的视角同步,并防止接收方用户出现晕眩等情况。
其中,关于AR直播,同样可以存在纯展示类内容的直播,以及探索类内容的直播,相应的,针对不同类型的AR内容,具体存在的同步问题以及对应的处理方式可以有所不同。
其中,与实施例一类似,本申请实施例还提供了一种增强现实AR直播中的视角同步方法,该方法可以包括:
步骤一:确定AR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
步骤二:将所述AR内容中所述图像帧及其对应的发送方用户观看视角信息提供给AR接收设备,以用于AR接收设备在对AR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定AR接收设备对所述当前待展示图像帧的展示视角。
与实施例二类似,本申请实施例还提供了一种增强现实AR直播中的视角同步方法,该方法具体可以包括:
步骤一:获得AR发送设备提供的AR内容信息,所述AR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
步骤二:根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定AR接收设备对所述当前待展示图像帧的展示视角;
步骤三:根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
与实施例三类似,本申请实施例还提供了一种增强现实AR直播中的视角同步方法,该方法具体可以包括:
步骤一:确定AR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
步骤二:将所述AR内容中所述图像帧及其对应的发送方用户观看视角信息提供给AR接收设备,以用于AR接收设备在对AR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
与实施例四类似,本申请实施例还提供了一种增强现实AR直播中的视角同步方法,该方法具体可以包括:
步骤一:获得AR发送设备提供的AR内容信息,所述AR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
步骤二:确定当前待展示的图像帧对应的接收方用户观看视角信息;
步骤三:根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
步骤四:在展示所述当前待展示的图像帧时,提供所述提示信息。
其中,关于与AR相关的具体实现,可以参见前述各对应的实施例中关于VR的介绍,这里不再赘述。
需要说明的是,本说明书中的各个实施例均采用递进的方式描述,每个实施例重点说明的都是与其他实施例的不同之处,各个实施例之间相同相似的部分互相参见即可。对于装置类实施例而言,由于其与方法实施例基本相似,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。
最后,还需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
以上对本申请所提供的虚拟现实VR直播中的视角同步方法及装置进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的一般技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (24)

  1. 一种虚拟现实VR直播中的视角同步方法,其特征在于,包括:确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角。
  2. 根据权利要求1所述的方法,其特征在于,
    所述VR发送设备对所述VR内容进行展示时的展示视角,与VR发送设备的运动相关。
  3. 根据权利要求2所述的方法,其特征在于,所述VR发送设备上设有传感器,用于检测由于VR发送方设备的运动所导致的观看视角变化信息;
    所述确定VR内容在VR发送设备侧进行播放的过程中,各帧图像对应的发送方用户观看视角信息,包括:
    根据所述传感器上传的视角变化信息,确定所述各帧图像对应的发送方用户观看视角信息。
  4. 根据权利要求1所述的方法,其特征在于,
    所述VR接收设备对所述当前待展示图像帧的展示视角,与VR接收设备的运动无关。
  5. 一种虚拟现实VR直播中的视角同步方法,其特征在于,包括:
    获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角;
    根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
  6. 根据权利要求5所述的方法,其特征在于,
    所述确定VR接收设备对所述当前待展示图像帧的展示视角,包括:
    计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值;
    将所述平均值确定为VR接收设备对所述当前待展示图像帧的展示视角。
  7. 根据权利要求6所述的方法,其特征在于,
    所述计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值之前,还包括:
    确定所述当前待展示的图像帧相对于相邻的前预置数目的图像帧对应的发送方用户观看视角信息的变化程度信息;
    如果所述变化程度信息达到预置阈值,则触发所述计算当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息的平均值的步骤。
  8. 根据权利要求7所述的方法,其特征在于,还包括:
    如果所述变化程度信息未达到预置阈值,则将所述当前图像帧对应的发送方用户观看视角,确定为VR接收设备对所述当前待展示图像帧的展示视角。
  9. 根据权利要求5所述的方法,其特征在于,
    所述VR接收设备对所述当前待展示图像帧的展示视角,与接收方用户所触发的VR接收设备的运动无关。
  10. 一种虚拟现实VR直播中的视角同步方法,其特征在于,包括:
    确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
  11. 根据权利要求10所述的方法,其特征在于,
    所述VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关;
    所述VR接收设备对所述VR内容的展示视角,与VR接收设备的运动相关。
  12. 一种虚拟现实VR直播中的视角同步方法,其特征在于,包括:
    获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    确定当前待展示的图像帧对应的接收方用户观看视角信息;
    根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
    在展示所述当前待展示的图像帧时,提供所述提示信息。
  13. 根据权利要求12所述的方法,其特征在于,
    所述VR发送设备对所述VR内容的展示视角,与VR发送设备的运动相关;
    所述VR接收设备对所述VR内容的展示视角,与VR接收设备的运动相关。
  14. 根据权利要求12所述的方法,其特征在于,
    所述生成对所述发送方用户观看视角的提示信息,包括:
    根据所述视差信息,生成可视化方向指示信息;
    所述在展示所述当前待展示的图像帧时,提供所述提示信息,包括:
    在展示所述当前待展示的图像帧时,将所述可视化方向指示信息叠加到所述图像帧中进行展示。
  15. 根据权利要求14所述的方法,其特征在于,
    所述生成对所述发送方用户观看视角的提示信息,还包括:
    将所述发送方用户的用户标识信息添加到所述可视化信息中。
  16. 根据权利要求12所述的方法,其特征在于,
    所述生成对所述发送方用户观看视角的提示信息,包括:
    根据所述视差信息,生成音频提示信息;
    所述在展示所述当前待展示的图像帧时,提供所述提示信息,包括:
    在展示所述当前待展示的图像帧时,播放所述音频提示信息。
  17. 一种虚拟现实VR直播中的视角同步装置,其特征在于,包括:
    第一发送方用户视角信息确定单元,用于确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    第一VR内容提供单元,用于将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角。
  18. 一种虚拟现实VR直播中的视角同步装置,其特征在于,包括:
    VR内容获得单元,用于获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    展示视角确定单元,用于根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定VR接收设备对所述当前待展示图像帧的展示视角;
    展示单元,用于根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
  19. 一种虚拟现实VR直播中的视角同步装置,其特征在于,包括:
    第二发送方用户视角信息确定单元,用于确定VR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    第二VR内容提供单元,用于将所述VR内容中所述图像帧及其对应的发送方用户观看视角信息提供给VR接收设备,以用于VR接收设备在对VR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
  20. 一种虚拟现实VR直播中的视角同步装置,其特征在于,包括:
    VR内容获得单元,用于获得VR发送设备提供的VR内容信息,所述VR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    接收方用户观看视角信息确定单元,用于确定当前待展示的图像帧对应的接收方用户观看视角信息;
    提示信息生成单元,用于根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
    提示信息提供单元,用于在展示所述当前待展示的图像帧时,提供所述提示信息。
  21. 一种增强现实AR直播中的视角同步方法,其特征在于,包括:
    确定AR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    将所述AR内容中所述图像帧及其对应的发送方用户观看视角信息提供给AR接收设备,以用于AR接收设备在对AR内容进行展示时,根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定AR接收设备对所述当前待展示图像帧的展示视角。
  22. 一种增强现实AR直播中的视角同步方法,其特征在于,包括:
    获得AR发送设备提供的AR内容信息,所述AR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    根据当前待展示的图像帧及其前预置数目图像帧对应的发送方用户观看视角信息,确定AR接收设备对所述当前待展示图像帧的展示视角;
    根据所确定出的展示视角对所述当前待展示的图像帧进行展示。
  23. 一种增强现实AR直播中的视角同步方法,其特征在于,包括:
    确定AR内容在发送设备侧进行播放的过程中,图像帧对应的发送方用户观看视角信息;
    将所述AR内容中所述图像帧及其对应的发送方用户观看视角信息提供给AR接收设备,以用于AR接收设备在对AR内容进行展示时,根据接收方用户观看视角与发送方用户观看视角之间的视差,提供对所述发送方用户观看视角的提示信息。
  24. 一种增强现实AR直播中的视角同步方法,其特征在于,包括:
    获得AR发送设备提供的AR内容信息,所述AR内容信息中包括图像帧及其对应的发送方用户观看视角信息;
    确定当前待展示的图像帧对应的接收方用户观看视角信息;
    根据接收方用户观看视角与所述当前待展示图像帧对应的发送方用户观看视角之间的视差信息,生成对所述发送方用户观看视角的提示信息;
    在展示所述当前待展示的图像帧时,提供所述提示信息。
PCT/CN2019/074529 2018-02-14 2019-02-02 虚拟现实vr直播中的视角同步方法及装置 WO2019158000A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020541660A JP7294757B2 (ja) 2018-02-14 2019-02-02 仮想現実ライブストリーミングにおいて視野角を同期させるための方法および装置
EP19753985.1A EP3754980A4 (en) 2018-02-14 2019-02-02 VIEWING ANGLE SYNCHRONIZATION METHOD AND DEVICE IN VIRTUAL REALITY (VR) LIVE BROADCASTING
US16/965,734 US11290573B2 (en) 2018-02-14 2019-02-02 Method and apparatus for synchronizing viewing angles in virtual reality live streaming

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810151922.0 2018-02-14
CN201810151922.0A CN110166764B (zh) 2018-02-14 2018-02-14 虚拟现实vr直播中的视角同步方法及装置

Publications (1)

Publication Number Publication Date
WO2019158000A1 true WO2019158000A1 (zh) 2019-08-22

Family

ID=67619734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/074529 WO2019158000A1 (zh) 2018-02-14 2019-02-02 虚拟现实vr直播中的视角同步方法及装置

Country Status (6)

Country Link
US (1) US11290573B2 (zh)
EP (1) EP3754980A4 (zh)
JP (1) JP7294757B2 (zh)
CN (1) CN110166764B (zh)
TW (1) TW201935924A (zh)
WO (1) WO2019158000A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111131852A (zh) * 2019-12-31 2020-05-08 歌尔科技有限公司 视频直播方法、***及计算机可读存储介质

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110913278B (zh) * 2019-12-06 2022-04-08 深圳创维新世界科技有限公司 视频播放方法、显示终端及存储介质
CN113014961A (zh) * 2019-12-19 2021-06-22 中兴通讯股份有限公司 视频推送及传输方法、视角同步方法及装置、存储介质
CN111343475B (zh) * 2020-03-04 2022-04-15 广州虎牙科技有限公司 数据处理方法和装置、直播服务器及存储介质
CN113453083B (zh) * 2020-03-24 2022-06-28 腾讯科技(深圳)有限公司 多自由度场景下的沉浸式媒体获取方法、设备及存储介质
CN112882674B (zh) * 2021-03-04 2022-11-08 腾讯科技(深圳)有限公司 虚拟现实图像数据的显示方法和设备
CN114449162B (zh) * 2021-12-22 2024-04-30 天翼云科技有限公司 一种播放全景视频的方法、装置、计算机设备及存储介质
CN114630100A (zh) * 2022-01-28 2022-06-14 北京威尔文教科技有限责任公司 数据同步显示方法和***
CN114900506B (zh) * 2022-07-12 2022-09-30 中国科学技术大学 面向用户体验质量的360度视频视口预测方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2490179A1 (en) * 2011-02-18 2012-08-22 Alcatel Lucent Method and apparatus for transmitting and receiving a panoramic video stream
CN106125930A (zh) * 2016-06-27 2016-11-16 上海乐相科技有限公司 一种虚拟现实设备及主视角画面校准的方法
CN106331732A (zh) * 2016-09-26 2017-01-11 北京疯景科技有限公司 生成、展现全景内容的方法及装置
CN106385587A (zh) * 2016-09-14 2017-02-08 三星电子(中国)研发中心 分享虚拟现实视角的方法、装置及***
CN107635152A (zh) * 2017-09-28 2018-01-26 深圳晶恒数码科技有限公司 一种共享vr视频的方法及装置

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7796162B2 (en) 2000-10-26 2010-09-14 Front Row Technologies, Llc Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
US8954596B2 (en) 2010-04-02 2015-02-10 Netflix, Inc. Dynamic virtual chunking of streaming media content
US9143729B2 (en) 2010-05-12 2015-09-22 Blue Jeans Networks, Inc. Systems and methods for real-time virtual-reality immersive multimedia communications
US9088714B2 (en) * 2011-05-17 2015-07-21 Apple Inc. Intelligent image blending for panoramic photography
US8600194B2 (en) * 2011-05-17 2013-12-03 Apple Inc. Positional sensor-assisted image registration for panoramic photography
US9389677B2 (en) 2011-10-24 2016-07-12 Kenleigh C. Hobby Smart helmet
US20130141526A1 (en) 2011-12-02 2013-06-06 Stealth HD Corp. Apparatus and Method for Video Image Stitching
US9380327B2 (en) 2011-12-15 2016-06-28 Comcast Cable Communications, Llc System and method for synchronizing timing across multiple streams
DE112013001869T5 (de) * 2012-04-02 2014-12-24 Panasonic Corporation Bilderzeugungsvorrichtung, Kameravorrichtung, Bilddarstellungsvorrichtung und Bilderzeugungsverfahren
US8803916B1 (en) 2012-05-03 2014-08-12 Sprint Communications Company L.P. Methods and systems for an augmented reality service delivery platform
US9452354B2 (en) 2013-06-07 2016-09-27 Sony Interactive Entertainment Inc. Sharing three-dimensional gameplay
CN104766274A (zh) * 2014-03-11 2015-07-08 北京博锐尚格节能技术股份有限公司 一种3d能耗展示模型的旋转方法及装置
WO2016002445A1 (ja) 2014-07-03 2016-01-07 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
EP3178233A1 (en) 2014-08-07 2017-06-14 ARRIS Enterprises LLC Systems and methods for multicast delivery of a managed bundle in service provider networks
WO2016048983A1 (en) 2014-09-22 2016-03-31 Arris Enterprises, Inc. Video quality of experience based on video quality estimation
US9818225B2 (en) * 2014-09-30 2017-11-14 Sony Interactive Entertainment Inc. Synchronizing multiple head-mounted displays to a unified space and correlating movement of objects in the unified space
US10108256B2 (en) 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
WO2016077262A1 (en) * 2014-11-10 2016-05-19 Swarms Ventures, Llc Method and system for programmable loop recording
US9804257B2 (en) 2014-11-13 2017-10-31 WorldViz LLC Methods and systems for an immersive virtual reality system using multiple active markers
US10102674B2 (en) 2015-03-09 2018-10-16 Google Llc Virtual reality headset connected to a mobile computing device
US10360729B2 (en) * 2015-04-06 2019-07-23 Scope Technologies Us Inc. Methods and apparatus for augmented reality applications
KR101670939B1 (ko) * 2015-04-14 2016-10-31 (주)일렉콤 지형에 따른 경사각 구현기능을 갖는 가상현실 모션 플렛폼을 이용한 구동 방법
US20160314624A1 (en) 2015-04-24 2016-10-27 Eon Reality, Inc. Systems and methods for transition between augmented reality and virtual reality
US10015370B2 (en) * 2015-08-27 2018-07-03 Htc Corporation Method for synchronizing video and audio in virtual reality system
CN105704468B (zh) * 2015-08-31 2017-07-18 深圳超多维光电子有限公司 用于虚拟和现实场景的立体显示方法、装置及电子设备
US9298283B1 (en) * 2015-09-10 2016-03-29 Connectivity Labs Inc. Sedentary virtual reality method and systems
US10112111B2 (en) * 2016-03-18 2018-10-30 Sony Interactive Entertainment Inc. Spectator view perspectives in VR environments
KR101788452B1 (ko) * 2016-03-30 2017-11-15 연세대학교 산학협력단 시선 인식을 이용하는 콘텐츠 재생 장치 및 방법
WO2017205642A1 (en) 2016-05-25 2017-11-30 Livit Media Inc. Methods and systems for live sharing 360-degree video streams on a mobile device
CN107678715A (zh) * 2016-08-02 2018-02-09 北京康得新创科技股份有限公司 虚拟信息的共享方法,装置和***
CN106210861B (zh) * 2016-08-23 2020-08-07 上海幻电信息科技有限公司 显示弹幕的方法及***
CN106358036B (zh) * 2016-08-31 2018-05-08 杭州当虹科技有限公司 一种以预设视角观看虚拟现实视频的方法
CN106170094B (zh) * 2016-09-07 2020-07-28 阿里巴巴(中国)有限公司 全景视频的直播方法及装置
CN106791769A (zh) * 2016-12-16 2017-05-31 广东威创视讯科技股份有限公司 虚拟现实实现方法及***
US20180288557A1 (en) * 2017-03-29 2018-10-04 Samsung Electronics Co., Ltd. Use of earcons for roi identification in 360-degree video
CN108933920B (zh) * 2017-05-25 2023-02-17 中兴通讯股份有限公司 一种视频画面的输出、查看方法及装置
JP6873830B2 (ja) * 2017-06-05 2021-05-19 キヤノン株式会社 表示制御装置、その制御方法及びプログラム
CN107274472A (zh) * 2017-06-16 2017-10-20 福州瑞芯微电子股份有限公司 一种提高vr播放帧率的方法和装置
US10639557B2 (en) * 2017-06-22 2020-05-05 Jntvr Llc Synchronized motion simulation for virtual reality
US11024078B2 (en) * 2017-08-07 2021-06-01 Verizon Patent And Licensing Inc. Systems and methods compression, transfer, and reconstruction of three-dimensional (3D) data meshes
CN108107578B (zh) * 2017-12-14 2019-07-26 腾讯科技(深圳)有限公司 虚拟现实的视角调节方法、装置、计算设备及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2490179A1 (en) * 2011-02-18 2012-08-22 Alcatel Lucent Method and apparatus for transmitting and receiving a panoramic video stream
CN106125930A (zh) * 2016-06-27 2016-11-16 上海乐相科技有限公司 一种虚拟现实设备及主视角画面校准的方法
CN106385587A (zh) * 2016-09-14 2017-02-08 三星电子(中国)研发中心 分享虚拟现实视角的方法、装置及***
CN106331732A (zh) * 2016-09-26 2017-01-11 北京疯景科技有限公司 生成、展现全景内容的方法及装置
CN107635152A (zh) * 2017-09-28 2018-01-26 深圳晶恒数码科技有限公司 一种共享vr视频的方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3754980A4 *


Also Published As

Publication number Publication date
TW201935924A (zh) 2019-09-01
EP3754980A4 (en) 2021-11-17
EP3754980A1 (en) 2020-12-23
US11290573B2 (en) 2022-03-29
US20210037116A1 (en) 2021-02-04
CN110166764B (zh) 2022-03-01
JP2021513773A (ja) 2021-05-27
JP7294757B2 (ja) 2023-06-20
CN110166764A (zh) 2019-08-23

Similar Documents

Publication Publication Date Title
WO2019158000A1 (zh) 虚拟现实vr直播中的视角同步方法及装置
US11490132B2 (en) Dynamic viewpoints of live event
US8514275B2 (en) Three-dimensional (3D) display method and system
CN113347405B (zh) 缩放相关的方法和装置
US10560724B2 (en) Video content distribution system and content management server
US20170237941A1 (en) Realistic viewing and interaction with remote objects or persons during telepresence videoconferencing
CN112272817B (zh) 用于在沉浸式现实中提供音频内容的方法和装置
US20120087571A1 (en) Method and apparatus for synchronizing 3-dimensional image
JP5599063B2 (ja) 表示制御装置、表示制御方法及びプログラム
CN110691231A (zh) 一种虚拟现实播放***及其同步播放方法
TWI491244B (zh) 調整物件三維深度的方法與裝置、以及偵測物件三維深度的方法與裝置
US9667951B2 (en) Three-dimensional television calibration
JP2012186652A (ja) 電子機器、画像処理方法及び画像処理プログラム
EP2590419A2 (en) Multi-depth adaptation for video content
US20220007078A1 (en) An apparatus and associated methods for presentation of comments
US20220286658A1 (en) Stereo image generation method and electronic apparatus using the same
US20190149811A1 (en) Information processing apparatus, information processing method, and program
Johanson The turing test for telepresence
TWM626646U (zh) 電子裝置
JP2023002032A (ja) 表示制御装置、表示制御方法および表示制御プログラム
TW202213995A (zh) 三維直播影像的橫向縱向控制裝置
TW202335494A (zh) 用於在裸視立體顯示裝置上顯示之三維內容的縮放
JP2019103099A (ja) 映像処理装置及びプログラム
JP2012023415A (ja) 立体映像表示装置及び立体映像音声信号記録再生装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19753985

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020541660

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019753985

Country of ref document: EP

Effective date: 20200914