WO2019056904A1 - Video transmission method, server, VR playback terminal and computer-readable storage medium

Video transmission method, server, VR playback terminal and computer-readable storage medium

Info

Publication number
WO2019056904A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
location information
playback terminal
terminal
server
Prior art date
Application number
PCT/CN2018/101574
Other languages
English (en)
French (fr)
Inventor
胡佳
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Priority to JP2019571954A priority Critical patent/JP2020527300A/ja
Priority to US16/626,357 priority patent/US20200120380A1/en
Priority to EP18859378.4A priority patent/EP3691280B1/en
Publication of WO2019056904A1 publication Critical patent/WO2019056904A1/zh

Classifications

    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • H04N21/43637 Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N21/2187 Live feed
    • H04N21/2343 Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/41407 Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/4223 Cameras
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present disclosure relates to the field of virtual reality technologies, and in particular, to a video transmission method, a server, a VR playback terminal, and a computer readable storage medium.
  • A VR/AR simulation system uses a computer to generate a simulated environment, fuses multi-source information, and involves the user in an interactive simulation of three-dimensional dynamic vision and physical behaviour, immersing the user in that environment; this is the basis of VR (Virtual Reality) and AR (Augmented Reality).
  • VR technology mainly involves a simulated environment, perception, natural skills and sensing devices. The simulated environment is a computer-generated, real-time, dynamic, three-dimensional realistic image. Perception means that an ideal VR system should provide all human perceptions: besides the visual perception generated by computer graphics, it should also cover hearing, touch, force feedback and motion, and even smell and taste, which is also called multi-perception. Natural skills are the rotations of the user's head and eyes, gestures and/or other human behaviours; the computer processes the data corresponding to the participant's actions, responds to the user's input in real time and feeds the result back to the user's senses. The sensing device is a three-dimensional interactive device.
  • AR technology seamlessly integrates real-world information with virtual-world information. It takes physical information (such as visual information, sound, taste and touch) that would be difficult to experience within a certain time and space of the real world, simulates it by computer and superimposes the virtual information onto the real world, where it is perceived by the human senses, achieving a sensory experience that transcends reality. The real environment and virtual objects are superimposed in real time onto the same picture or space.
  • VR playback devices are generally immersive virtual-reality products, including: external head-mounted devices, which have their own screen and can be connected to a personal computer, smartphone and/or game console acting as the computing and storage device; all-in-one headsets, which integrate the screen and the computing and storage devices and can operate independently without an external device; and mobile-terminal headsets, also known as glasses boxes, into which a mobile terminal (e.g., a mobile phone) is placed to serve as the screen and the computing and storage device.
  • VR playback devices mainly have the following problems: although a VR playback device gives the user a private space protected from outside disturbance, the VR or AR video the user is watching cannot be shared with other users in real time; the VR playback device is connected to its input device through a cable, so the moving user is easily wrapped or tripped by the cable, or pulls the cable out of the input device, affecting the user's experience; and the mobile terminal in a mobile-terminal headset slows down as it heats up, degrading the user's VR or AR viewing experience.
  • The technical problem to be solved by the present disclosure is to provide a video transmission method, a server, a VR playback terminal and a computer-readable storage medium, overcoming the prior-art defect that the VR or AR video a user is viewing cannot be shared with other users in real time.
  • the video transmission method is applied to a server, and the method includes:
  • sending the compressed video of the view corresponding to the second location information to the VR playback terminal; and sending the compressed video to another video playback terminal, for the other video playback terminal to play the compressed video.
  • Before the compressed video of the view corresponding to the second location information is sent to the VR playback terminal, the method further includes:
  • The video of the view corresponding to the second location information includes: the VR video of the view corresponding to the second location information, and/or the augmented reality (AR) video of that view.
  • Before the video of the view corresponding to the second location information is acquired from the preset video source according to the second location information, the method further includes: superimposing set visual information onto the video source.
  • The manner of acquiring the first location information includes: acquiring image information of the VR playback terminal through a camera preset around the VR playback terminal, and locating the VR playback terminal in the image information to obtain the first location information of the VR playback terminal.
  • the present disclosure also provides a video transmission method, which is applied to a VR playback terminal, and the method includes:
  • the playing the compressed video comprises:
  • the compressed video is decompressed to obtain a first decompressed video, and the first decompressed video is played.
  • the playing the compressed video comprises:
  • adjusting, by the GPU based on a preset view-angle algorithm and according to currently acquired third location information of the VR playback terminal, the second decompressed video into a third decompressed video of the view corresponding to the third location information;
  • Before the compressed video of the view corresponding to the second location information sent by the server is received, the method further includes:
  • Before the first location information of the VR playback terminal acquired in real time is sent to the server in real time, the method further includes:
  • the present disclosure also provides a server, the server including a processor, a memory, and a communication bus;
  • the communication bus is used to implement connection communication between a processor and a memory
  • the processor is configured to execute a video transmission program stored in the memory to implement the steps of the video transmission method described above.
  • the present disclosure also provides a VR playback terminal, the VR playback terminal including a processor, a memory, and a communication bus;
  • the communication bus is used to implement connection communication between a processor and a memory
  • the processor is configured to execute a video transmission program stored in the memory to implement the steps of the video transmission method described above.
  • The present disclosure also provides a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the video transmission method described above.
  • the present disclosure has at least the following advantages:
  • The video transmission method, server, VR playback terminal and computer-readable storage medium of the present disclosure realize wireless transmission of VR and/or AR video between the VR playback terminal and the server, effectively avoiding the binding of the user by the cables that wired transmission brings; they share the VR and/or AR video played by the VR playback terminal with other video playback terminals for playback, effectively improving the user experience; and the GPU of the VR playback terminal fine-tunes the VR and/or AR video, effectively improving the visual effect of the VR and/or AR video.
  • FIG. 1 is a flowchart of a video transmission method according to a first embodiment of the present disclosure
  • FIG. 2 is a flowchart of a video transmission method according to a second embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a server according to a third embodiment of the present disclosure.
  • FIG. 4 is a flowchart of a video transmission method according to a fifth embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of a VR playback terminal according to a sixth embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of receiving, by a server, a first location information sent by a VR playback terminal in real time according to an eighth embodiment of the present disclosure
  • FIG. 7 is a schematic diagram of obtaining, by an infrared camera, a first location information of a VR playback terminal according to an eighth embodiment of the present disclosure
  • FIG. 8 is a schematic diagram of obtaining, by a server, a first location information of a VR playback terminal by using a camera according to an eighth embodiment of the present disclosure
  • FIG. 9 is a schematic diagram of sharing a VR video through a large-screen video playing terminal according to a tenth embodiment of the present disclosure.
  • a first embodiment of the present disclosure is applied to a server. As shown in FIG. 1, the video transmission method includes the following specific steps:
  • Step S101: predicting, according to the acquired first location information of the VR playback terminal, the second location information of the VR playback terminal within a set duration.
  • the VR playback terminal is worn on the user's head to play VR video and/or AR video.
  • step S101 includes:
  • the position prediction model is a model for predicting the position of the user's head in the future set time based on the Kalman filter and/or the measurement data preprocessing technique.
  • the position prediction model includes, but is not limited to, a random acceleration model based on a maneuvering target state estimation.
  • The position prediction model can predict the position of the user's head over the future set duration, so accurate position data for that duration can be obtained. This effectively compensates for the delay introduced into VR and/or AR video by compression, decompression and/or wireless transmission, and effectively improves the playback precision of VR and/or AR video.
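The prediction step can be sketched with a small Kalman filter. The following is an illustrative constant-velocity model for a single axis, not the patent's actual predictor (which may use a random acceleration model and measurement-data preprocessing); the class, parameter values and method names are all assumptions.

```python
class HeadPositionPredictor:
    """1-D constant-velocity Kalman filter; state = [position, velocity]."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x = [0.0, 0.0]                 # state estimate [pos, vel]
        self.p = [[1.0, 0.0], [0.0, 1.0]]   # estimate covariance
        self.q = q                          # process noise per step
        self.r = r                          # measurement noise variance

    def update(self, z, dt):
        # Predict step with transition F = [[1, dt], [0, 1]].
        x0 = self.x[0] + dt * self.x[1]
        x1 = self.x[1]
        p00 = (self.p[0][0] + dt * (self.p[1][0] + self.p[0][1])
               + dt * dt * self.p[1][1] + self.q)
        p01 = self.p[0][1] + dt * self.p[1][1]
        p10 = self.p[1][0] + dt * self.p[1][1]
        p11 = self.p[1][1] + self.q
        # Update step with a measurement z of the position only (H = [1, 0]).
        s = p00 + self.r
        k0, k1 = p00 / s, p10 / s
        y = z - x0
        self.x = [x0 + k0 * y, x1 + k1 * y]
        self.p = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]

    def predict(self, horizon):
        # "Second location information": extrapolate `horizon` seconds ahead.
        return self.x[0] + horizon * self.x[1]
```

Feeding the filter the first location information sampled every few milliseconds and calling `predict` with the set duration yields the predicted second location information, which is what lets the server fetch the matching view despite compression and transmission delay.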
  • The manner of acquiring the first location information of the virtual reality (VR) playback terminal includes, but is not limited to: acquiring infrared light information of the VR playback terminal through one or more infrared cameras preset around the VR playback terminal, and locating the infrared light information to obtain the first position information of the VR playback terminal, where each infrared camera is connected to the server by cable or through a wireless transmission network; or acquiring image information of the VR playback terminal through one or more cameras preset around the VR playback terminal, and locating the VR playback terminal in the image information to obtain its first position information, where each camera is connected to the server by cable or through a wireless transmission network.
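As a toy illustration of the camera-based variant, the centroid of above-threshold (e.g. infrared-bright) pixels in a single frame gives a 2-D position for the terminal; a real system would calibrate and triangulate across the one or more cameras. The function below is an assumption for illustration, not the patent's positioning algorithm.

```python
def locate_marker(frame, threshold=200):
    """frame: 2-D list of grayscale values. Returns the (row, col) centroid
    of pixels at or above `threshold`, or None if no marker pixel is visible."""
    rows, cols, count = 0.0, 0.0, 0
    for r, line in enumerate(frame):
        for c, v in enumerate(line):
            if v >= threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None  # terminal not visible to this camera
    return rows / count, cols / count
```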
  • Step S102: in the case that the compressed video of the view corresponding to the second location information is sent to the VR playback terminal, when a sharing instruction sent by the VR playback terminal is received, sending the compressed video to other video playback terminals for those terminals to play the compressed video.
  • The other video playback terminals include at least one of the following video players for playing VR and/or AR video: VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and the like.
  • the number of other video playback terminals is one or more.
  • step S102 includes:
  • sending the compressed video of the view corresponding to the second location information to the VR playback terminal through the wireless transmission network; and sending the compressed video to the other video playback terminals for the other video playback terminals to play the compressed video.
  • The manner of sending the compressed video of the view corresponding to the second location information to the VR playback terminal through the wireless transmission network includes at least one of the following: sending the compressed video through the wireless transmission network by way of wireless stream processing.
  • the AirPlay protocol is a wireless video transmission protocol.
  • the Miracast protocol is a wireless video transmission protocol.
  • The manner of sending the compressed video to the other video playback terminals likewise includes, but is not limited to, sending through the wireless transmission network.
  • This video transmission method implements wireless transmission of VR and/or AR video between the VR playback terminal and the server, effectively avoiding the binding of the user by cables caused by wired transmission, and shares the VR and/or AR video played by the VR playback terminal with other video playback terminals for playback, effectively improving the user experience.
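The flow of steps S101–S102 can be sketched with in-memory stand-ins for the terminals; real transport would run over the wireless network (e.g. AirPlay or Miracast streams), and every class and method name here is an assumption for illustration.

```python
class PlaybackTerminal:
    """Stand-in for a VR playback terminal or another video player."""

    def __init__(self, name):
        self.name = name
        self.received = []

    def play(self, compressed_video):
        self.received.append(compressed_video)


class VideoServer:
    def __init__(self, vr_terminal, other_terminals):
        self.vr_terminal = vr_terminal
        self.other_terminals = other_terminals
        self.current_video = None

    def send_to_vr(self, compressed_video):
        # Normal path: only the VR playback terminal receives the stream.
        self.current_video = compressed_video
        self.vr_terminal.play(compressed_video)

    def on_share_instruction(self):
        # Sharing path (step S102): fan the same compressed video out to the
        # other video playback terminals so they can play it as well.
        for terminal in self.other_terminals:
            terminal.play(self.current_video)
```

The server keeps the last compressed video it sent, so a later sharing instruction only triggers a fan-out rather than re-encoding.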
  • a second embodiment of the present disclosure is a video transmission method applied to a server. As shown in FIG. 2, the video transmission method includes the following specific steps:
  • Step S201: predicting, according to the acquired first location information of the virtual reality (VR) playback terminal, the second location information of the VR playback terminal within the set duration.
  • the VR playback terminal is worn on the user's head to play VR video and/or AR video.
  • step S201 includes:
  • the position prediction model is a model for predicting the position of the user's head in the future set time based on the Kalman filter and/or the measurement data preprocessing technique.
  • the position prediction model includes, but is not limited to, a random acceleration model based on a maneuvering target state estimation.
  • The position prediction model can predict the position of the user's head over the future set duration, so accurate position data for that duration can be obtained. This effectively compensates for the delay introduced into VR and/or AR video by compression, decompression and/or wireless transmission, and effectively improves the playback precision of VR and/or AR video.
  • The manner of acquiring the first location information of the virtual reality (VR) playback terminal includes, but is not limited to: acquiring infrared light information of the VR playback terminal through one or more infrared cameras preset around the VR playback terminal, and locating the infrared light information to obtain the first position information of the VR playback terminal, where each infrared camera is connected to the server by cable or through a wireless transmission network; or acquiring image information of the VR playback terminal through one or more cameras preset around the VR playback terminal, and locating the VR playback terminal in the image information to obtain its first position information, where each camera is connected to the server by cable or through a wireless transmission network.
  • Step S202: acquiring, according to the second location information, the video of the view corresponding to the second location information from the preset video source.
  • the video corresponding to the view of the second location information includes: a VR video corresponding to the view of the second location information, and/or an augmented reality AR video corresponding to the view of the second location information.
  • Step S202 includes: acquiring the VR video of the view corresponding to the second location information, and/or acquiring the AR video of that view. The AR video source is captured through a wireless camera or a 360° panoramic acquisition device provided for the server.
  • Step S203: compressing the video of the view corresponding to the second location information according to the transmission rate of the preset wireless transmission network that transmits the compressed video, to obtain the compressed video of the view corresponding to the second location information.
  • Step S204: in the case that the compressed video of the view corresponding to the second location information is sent to the VR playback terminal, when a sharing instruction sent by the VR playback terminal is received, sending the compressed video to other video playback terminals for those terminals to play the compressed video.
  • step S204 includes:
  • sending the compressed video of the view corresponding to the second location information to the VR playback terminal through the wireless transmission network; and sending the compressed video to the other video playback terminals for the other video playback terminals to play the compressed video.
  • The other video playback terminals include at least one of the following video players for playing VR and/or AR video: VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and the like.
  • the number of other video playback terminals is one or more.
  • The manner of sending the compressed video of the view corresponding to the second location information to the VR playback terminal through the wireless transmission network includes at least one of the following: sending the compressed video through the wireless transmission network by way of wireless stream processing.
  • the AirPlay protocol is a wireless video transmission protocol.
  • the Miracast protocol is a wireless video transmission protocol.
  • This video transmission method implements wireless transmission of VR and/or AR video between the VR playback terminal and the server, effectively avoiding the binding of the user by cables caused by wired transmission, and shares the VR and/or AR video played by the VR playback terminal with other video playback terminals for playback, effectively improving the user experience.
  • a third embodiment of the present disclosure includes the following components:
  • the processor 110, the memory 109, and the communication bus 201.
  • communication bus 201 is used to implement connection communication between processor 110 and memory 109.
  • the processor 110 may be a general-purpose processor, such as a central processing unit (CPU), or may be a digital signal processor (DSP), an application specific integrated circuit (ASIC), or One or more integrated circuits configured to implement embodiments of the present disclosure.
  • the memory 109 is configured to store executable instructions of the processor 110;
  • the memory 109 is configured to store the program code and transmit the program code to the processor 110.
  • The memory 109 may include a volatile memory, such as a random access memory (RAM); the memory 109 may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory 109 may also include a combination of the above types of memories.
  • the processor 110 is configured to invoke the video transmission program code stored in the memory 109 to perform some or all of the steps in any one of the first embodiment to the second embodiment of the present disclosure.
  • a server implements wireless transmission of VR video and/or AR video with a VR playback terminal, thereby effectively avoiding cable-to-user restraint caused by wired transmission;
  • the VR video and/or AR video played by the VR playback terminal are shared to other video playback terminals for playback, which effectively improves the user experience.
  • a fourth embodiment of the present disclosure is a video transmission method applied to a VR playback terminal, where the video transmission method includes the following specific steps:
  • Step S401: when the compressed video of the view corresponding to the second location information sent by the server is received, playing the compressed video; and when a sharing instruction triggered by the user is received, sending the sharing instruction to the server, so that the server sends the compressed video to other video playback terminals under the control of the sharing instruction.
  • The manner of receiving the compressed video of the view corresponding to the second location information sent by the server includes at least one of the following:
  • the compressed video corresponding to the view of the second location information sent by the server is received by the wireless transmission network in a wireless stream processing manner.
  • The other video playback terminals include at least one of the following video players for playing VR and/or AR video: VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and the like.
  • the number of other video playback terminals is one or more.
  • This video transmission method implements wireless transmission of VR and/or AR video between the VR playback terminal and the server, effectively avoiding the binding of the user by cables caused by wired transmission, and shares the VR and/or AR video played by the VR playback terminal with other video playback terminals for playback, effectively improving the user experience.
  • a fifth embodiment of the present disclosure is a video transmission method applied to a VR playback terminal. As shown in FIG. 4, the video transmission method includes the following specific steps:
  • Step S501: acquiring the first position information of the VR playback terminal in real time according to the head motion track of the user wearing the VR playback terminal.
  • the first location information of the VR play terminal includes, but is not limited to, motion track information of the current user's head.
  • Step S502: sending the first location information of the VR playback terminal acquired in real time to the server in real time, so that the server predicts, according to the first location information, the second location information of the VR playback terminal within the set duration.
  • step S502 includes:
  • the first location information of the VR playback terminal acquired in real time is sent to the server in real time through the wireless transmission network, so that the server predicts the second location information of the VR playback terminal within the set duration according to the first location information.
  • Step S503: when the compressed video of the view corresponding to the second location information sent by the server is received, playing the compressed video; and when a sharing instruction triggered by the user is received, sending the sharing instruction to the server, so that the server sends the compressed video to other video playback terminals under the control of the sharing instruction.
  • the manner in which the second location information sent by the receiving server corresponds to the compressed video of the perspective includes at least one of the following methods:
  • the compressed video corresponding to the view of the second location information sent by the server is received by the wireless transmission network in a wireless stream processing manner.
  • other video playback terminals include at least one of the following video players:
  • other VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and other devices for playing VR video and/or AR video.
  • the number of other video playback terminals is one or more.
  • the manner of playing the compressed video includes mode one or mode two.
  • Mode one: when no GPU (Graphics Processing Unit) is provided in the VR playback terminal, decompress the compressed video to obtain a first decompressed video, and play the first decompressed video.
  • Mode two: when a GPU is provided in the VR playback terminal, decompress the compressed video to obtain a second decompressed video; according to the currently acquired third location information of the VR playback terminal, adjust the second decompressed video, through the GPU and based on a preset viewing-angle algorithm, into a third decompressed video corresponding to the viewing angle of the third location information; and play the third decompressed video.
  • the viewing-angle algorithm includes, but is not limited to, algorithms such as ATW (Asynchronous TimeWarp).
  • the asynchronous time-warp algorithm runs in its own thread (the ATW thread), in parallel with (or asynchronously to) the rendering thread. Before each vertical sync, the ATW thread generates a new frame from the last frame produced by the rendering thread; in this way, when the current frame is not ready in time, the previous frame can be reused for rendering, reducing latency.
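The ATW fallback described above can be illustrated with a small sketch. This is not the patent's implementation: the "pose" is reduced to a single yaw value measured in pixels, and `np.roll` stands in for the per-pixel homography warp a real ATW shader would apply on the GPU.

```python
import numpy as np

def atw_select_frame(new_frame, prev_frame, prev_pose, current_pose):
    """Pick the frame to display at this vsync.

    If the rendering thread delivered a new frame in time, display it.
    Otherwise re-project (time-warp) the previous frame to the current
    head pose so motion still tracks the head, reducing perceived delay.
    """
    if new_frame is not None:
        return new_frame
    # Hypothetical re-projection: shift the image by the yaw delta in
    # pixels, standing in for the rotational warp of a real ATW pass.
    yaw_delta_px = int(round(current_pose - prev_pose))
    return np.roll(prev_frame, -yaw_delta_px, axis=1)
```

When the renderer keeps up, the warp never runs; when it misses a vsync, the viewer still sees a frame consistent with the latest head pose.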
  • this video transmission method implements wireless transmission of VR video and/or AR video between a VR playback terminal and a server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
  • the sixth embodiment of the present disclosure is a VR playback terminal. As shown in FIG. 5, it includes the following components:
  • a processor 210, a memory 209, and a communication bus 202.
  • the communication bus 202 is used to implement connection and communication between the processor 210 and the memory 209.
  • the processor 210 may be a general-purpose processor, such as a central processing unit (CPU), or may be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present disclosure.
  • the memory 209 is configured to store executable instructions of the processor 210;
  • the memory 209 is configured to store the program code and transmit the program code to the processor 210.
  • the memory 209 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 209 may also include a combination of the above types of memory.
  • the processor 210 is configured to invoke the video transmission program code stored in the memory 209 to perform some or all of the steps in any of the fourth embodiment to the fifth embodiment of the present disclosure.
  • this VR playback terminal implements wireless transmission of VR video and/or AR video between the VR playback terminal and a server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
  • a seventh embodiment of the present disclosure is a computer readable storage medium.
  • the computer storage medium may be a RAM, flash memory, ROM, EPROM, EEPROM, a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • the computer-readable storage medium stores one or more programs, which are executable by one or more processors to implement some or all of the steps in any one of the first to second embodiments of the present disclosure, or some or all of the steps in any one of the fifth to sixth embodiments of the present disclosure.
  • the computer-readable storage medium of the seventh embodiment stores one or more programs executable by one or more processors; it implements wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
  • the eighth embodiment of the present disclosure builds on the foregoing embodiments and describes an application example of the present disclosure, taking a VR video transmission method as an example and referring to FIGS. 6-8.
  • Step S801: in a multiplayer game, the server acquires the first location information of each VR playback terminal.
  • the manner in which the server acquires the first location information of each VR playback terminal includes, but is not limited to:
  • receiving, over the wireless transmission network, the first location information sent in real time by each VR playback terminal, as shown in FIG. 6;
  • or acquiring the infrared light information of each VR playback terminal through infrared cameras preset around the VR playback terminals and performing a positioning calculation on the infrared light information to obtain the first location information of each VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network, as shown in FIG. 7;
  • or acquiring the image information of each VR playback terminal through cameras preset around the VR playback terminals and performing a positioning calculation on the image information to obtain the first location information of each VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network, as shown in FIG. 8.
  • the number of VR playback terminals is four: a first, a second, a third, and a fourth VR playback terminal.
  • Step S802: the server superimposes the second, third, and fourth VR playback terminals onto the current game scene and, according to the first location information x of the first VR playback terminal at the current time t, predicts the second location information X of the first VR playback terminal at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
  • Step S803: the server uses the second location information X as an input to generate a VR video corresponding to the location of the first VR playback terminal.
  • Step S804: the server compresses the VR video for the location of the first VR playback terminal and streams it to the first VR playback terminal over Wi-Fi (Wireless Fidelity).
  • Step S805: the first VR playback terminal acquires the real-time position of the head of its user, fine-tunes the decompressed VR video through the GPU preset in the terminal using the ATW algorithm, and plays the VR video adjusted to the real-time head position of the user.
  • Step S806: when the first VR playback terminal receives a sharing instruction issued by the user to share the VR video with the second, third, and fourth VR playback terminals, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends the VR video corresponding to the real-time head position of the user to the second, third, and fourth VR playback terminals for playback.
  • alternatively, when the second VR playback terminal receives a sharing instruction issued by the user for the VR video corresponding to the real-time head position of the user of the first VR playback terminal, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends that VR video to the second VR playback terminal for playback.
  • the first and second VR playback terminals may be connected to the server through the same wireless local area network or through different wireless local area networks.
  • the VR video transmission method described in the eighth embodiment realizes wireless transmission of VR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video.
  • the ninth embodiment of the present disclosure builds on the foregoing embodiments and introduces an application example of the present disclosure, taking an AR video transmission method as an example.
  • Step S901: the server acquires the first location information of the VR playback terminal.
  • the manner in which the server acquires the first location information of each VR playback terminal includes, but is not limited to:
  • receiving, over the wireless transmission network, the first location information sent in real time by each VR playback terminal;
  • or acquiring the infrared light information of each VR playback terminal through infrared cameras preset around the VR playback terminal and performing a positioning calculation on the infrared light information to obtain the first location information of each VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network;
  • or acquiring the image information of each VR playback terminal through cameras preset around the VR playback terminal and performing a positioning calculation on the image information to obtain the first location information of the VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network.
  • Step S902: the server generates an AR video of the real scene from the video acquired by one or more preset wireless cameras.
  • Step S903: the server identifies one or more set objects in the AR video of the real scene and performs image processing on them.
  • for example, the server identifies one or more set persons in the AR video of the real scene and dresses them in virtual clothing.
  • Step S904: according to the first location information x of the VR playback terminal at the current time t, the server predicts the second location information X of the VR playback terminal at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
  • Step S905: the server uses the second location information X as an input, generates an AR video corresponding to the location of the VR playback terminal, and superimposes one or more set images onto the AR video.
  • Step S906: after compressing the AR video, the server streams it to the VR playback terminal over Wi-Fi.
  • Step S907: the VR playback terminal decompresses the compressed AR video.
  • Step S908: the VR playback terminal acquires the real-time position of the head of its user, fine-tunes the decompressed AR video through the GPU preset in the terminal using the ATW algorithm, and plays the AR video adjusted to the real-time head position of the user.
  • if no GPU is preset in the VR playback terminal, the decompressed AR video is played directly.
  • Step S909: when the VR playback terminal receives a sharing instruction issued by the user, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends the AR video streamed to the VR playback terminal to other video playback terminals for the other video playback terminals to play.
  • the VR playback terminal and the other video playback terminals may be connected to the server through the same wireless local area network or through different wireless local area networks.
  • the AR video transmission method described in the ninth embodiment of the present disclosure realizes wireless transmission of AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the AR video.
  • the tenth embodiment of the present disclosure builds on the foregoing embodiments and introduces an application example of the present disclosure, taking a VR video transmission method as an example, as shown in FIG. 9.
  • Step S1001: during a game, the server acquires the first location information of the VR playback terminal.
  • the manner in which the server acquires the first location information of the VR playback terminal includes, but is not limited to:
  • receiving, over the wireless transmission network, the first location information sent in real time by the VR playback terminal;
  • or acquiring the infrared light information of the VR playback terminal through infrared cameras preset around the VR playback terminal and performing a positioning calculation on the infrared light information to obtain the first location information of the VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network;
  • or acquiring the image information of the VR playback terminal through cameras preset around the VR playback terminal and performing a positioning calculation on the image information to obtain the first location information of the VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network.
  • Step S1002: according to the first location information x of the VR playback terminal at the current time t, the server predicts the second location information X of the VR playback terminal at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
  • Step S1003: the server uses the second location information X as an input to generate a VR video corresponding to the location of the VR playback terminal.
  • Step S1004: the server compresses the VR video for the location of the VR playback terminal and streams it to the VR playback terminal over Wi-Fi.
  • Step S1005: the VR playback terminal decompresses the compressed VR video.
  • Step S1006: the VR playback terminal acquires the real-time position of the head of its user, fine-tunes the decompressed VR video through the GPU preset in the terminal using the ATW algorithm, and plays the VR video adjusted to the real-time head position of the user.
  • Step S1007: when the VR playback terminal receives a sharing instruction issued by the user, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends the VR video streamed to the VR playback terminal to a large-screen video playback terminal for the large-screen video playback terminal to play, as shown in FIG. 9.
  • the large-screen video playback terminal is connected to the server through an HDMI (High-Definition Multimedia Interface) cable.
  • Connecting to the server via an HDMI cable can dramatically reduce video transmission delays.
  • the VR video transmission method described in the tenth embodiment of the present disclosure realizes wireless transmission of VR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video.
  • the present disclosure relates to the field of virtual reality technology. The technical solution of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.


Abstract

A video transmission method, a server, a VR player, and a computer-readable storage medium. The method includes: according to the acquired first location information of a virtual reality (VR) playback terminal, predicting second location information of the VR playback terminal within a set duration; and, when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal, upon receiving a sharing instruction sent by the VR playback terminal, sending the compressed video to other video playback terminals for the other video playback terminals to play. This realizes wireless transmission of VR video and/or AR video, effectively avoiding the cable tethering of the user caused by wired transmission, and enables the VR video and/or AR video played by the VR player to be shared with other video players for playback, effectively improving the user experience.

Description

Video transmission method, server, VR playback terminal, and computer-readable storage medium
Technical field
The present disclosure relates to the field of virtual reality technology, and in particular to a video transmission method, a server, a VR playback terminal, and a computer-readable storage medium.
Background
With the development of computer graphics, human-machine interface technology, multimedia technology, sensing technology, network technology, and other technologies, computer simulation systems for creating and experiencing virtual worlds have emerged. Such a system uses a computer to generate a simulated environment, integrates the fusion of multi-source information, and lets the user take part in it; it is an interactive, system-level simulation of three-dimensional dynamic scenes and entity behavior that immerses the user in the simulated environment. This is VR (Virtual Reality) and AR (Augmented Reality).
VR technology mainly covers the simulated environment, perception, natural skills, and sensing devices. The simulated environment consists of computer-generated, real-time, dynamic, three-dimensional, lifelike images. Perception means that an ideal VR system should provide all the perceptions a human has: besides the visual perception generated by computer graphics, it also includes hearing, touch, force feedback, motion, and even smell and taste, also called multi-perception. Natural skills refer to the user's head rotation, eye movements, hand gestures, and/or other human actions; the computer processes data matched to the participant's actions, responds to the user's input in real time, and feeds the responses back to the user's senses. Sensing devices are three-dimensional interaction devices.
AR technology seamlessly integrates real-world information and virtual-world information. Entity information that is hard to experience within a certain time and space in the real world (for example, visual information, sound, taste, and touch) is simulated by computers and other technology and then superimposed, applying virtual information to the real world where it is perceived by the human senses, achieving a sensory experience beyond reality. The real environment and virtual objects are superimposed in real time onto the same picture or space and coexist.
Current VR playback devices are mainly immersive virtual reality products, including:
tethered headsets, which connect to external devices such as personal computers, smartphones, and/or game consoles as computing and storage devices and have their own screen; all-in-one headsets, which have screen, computing, and storage built in and can run independently without external devices; and mobile-terminal headsets, also called glasses boxes, into which a mobile terminal (for example, a phone) is placed to serve as the screen and the computing and storage device.
Current VR playback devices mainly have the following problems:
although a VR playback device gives the user a private space protected from outside disturbance, it cannot share the VR or AR video the user is watching with other users in real time; VR playback devices are connected to their input devices by cables, so a moving user is easily tangled or tripped by the cable, or pulls the cable out of the input device while moving, hurting the user experience; and the mobile terminal in a mobile-terminal headset suffers frame-rate drops due to heating, degrading the user's VR or AR video viewing experience.
Summary
The technical problem to be solved by the present disclosure is to provide a video transmission method, a server, a VR playback terminal, and a computer-readable storage medium, overcoming the defect in the prior art that the VR or AR video watched by a user cannot be shared with other users in real time.
The technical solution adopted by the present disclosure is a video transmission method applied to a server, the method including:
according to the acquired first location information of a virtual reality (VR) playback terminal, predicting second location information of the VR playback terminal within a set duration;
when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal, upon receiving a sharing instruction sent by the VR playback terminal, sending the compressed video to other video playback terminals for the other video playback terminals to play.
According to an exemplary embodiment, before sending the compressed video corresponding to the viewing angle of the second location information to the VR playback terminal, the method further includes:
according to the second location information, obtaining, from a preset video source, the video corresponding to the viewing angle of the second location information;
according to the transmission rate of the preset wireless transmission network that carries the compressed video, compressing the video corresponding to the viewing angle of the second location information to obtain the compressed video corresponding to that viewing angle.
According to an exemplary embodiment, the video corresponding to the viewing angle of the second location information includes: a VR video corresponding to that viewing angle, and/or an augmented reality (AR) video corresponding to that viewing angle.
According to an exemplary embodiment, before obtaining, from the preset video source, the video corresponding to the viewing angle of the second location information, the method further includes:
when the video corresponding to the viewing angle of the second location information is an AR video corresponding to that viewing angle, superimposing set visual information onto the video source.
According to an exemplary embodiment, the manner of acquiring the first location information includes:
receiving the first location information sent in real time by the VR playback terminal;
or acquiring infrared light information of the VR playback terminal through infrared cameras preset around the VR playback terminal and locating the infrared light information to obtain the first location information of the VR playback terminal;
or acquiring image information of the VR playback terminal through cameras preset around the VR playback terminal and locating the image information to obtain the first location information of the VR playback terminal.
The present disclosure further provides a video transmission method applied to a VR playback terminal, the method including:
when a compressed video corresponding to the viewing angle of second location information sent by a server is received, playing the compressed video, and when a sharing instruction triggered by the user is received, sending the sharing instruction to the server so that the server, under the control of the sharing instruction, sends the compressed video to other video playback terminals.
According to an exemplary embodiment, playing the compressed video includes:
when no graphics processing unit (GPU) is provided in the VR playback terminal, decompressing the compressed video to obtain a first decompressed video and playing the first decompressed video.
According to an exemplary embodiment, playing the compressed video includes:
when a GPU is provided in the VR playback terminal, decompressing the compressed video to obtain a second decompressed video;
according to the currently acquired third location information of the VR playback terminal, adjusting, through the GPU and based on a preset viewing-angle algorithm, the second decompressed video into a third decompressed video corresponding to the viewing angle of the third location information;
playing the third decompressed video.
According to an exemplary embodiment, before receiving the compressed video corresponding to the viewing angle of the second location information sent by the server, the method further includes:
sending the first location information of the VR playback terminal, acquired in real time, to the server in real time, so that the server predicts, according to the first location information, the second location information of the VR playback terminal within the set duration.
According to an exemplary embodiment, before sending the first location information of the VR playback terminal, acquired in real time, to the server in real time, the method further includes:
acquiring the first location information of the VR playback terminal in real time according to the head motion trajectory of the user wearing the VR playback terminal.
The present disclosure further provides a server, including a processor, a memory, and a communication bus;
the communication bus is used to implement connection and communication between the processor and the memory;
the processor is used to execute a video transmission program stored in the memory to implement the steps of the video transmission method described above.
The present disclosure further provides a VR playback terminal, including a processor, a memory, and a communication bus;
the communication bus is used to implement connection and communication between the processor and the memory;
the processor is used to execute a video transmission program stored in the memory to implement the steps of the video transmission method described above.
The present disclosure further provides a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the video transmission method described above.
With the above technical solution, the present disclosure has at least the following advantages:
the video transmission method, server, VR playback terminal, and computer-readable storage medium of the present disclosure realize wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enable the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tune the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
Brief description of the drawings
FIG. 1 is a flowchart of the video transmission method of the first embodiment of the present disclosure;
FIG. 2 is a flowchart of the video transmission method of the second embodiment of the present disclosure;
FIG. 3 is a schematic diagram of the server structure of the third embodiment of the present disclosure;
FIG. 4 is a flowchart of the video transmission method of the fifth embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the VR playback terminal structure of the sixth embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the server of the eighth embodiment of the present disclosure receiving first location information sent in real time by the VR playback terminals;
FIG. 7 is a schematic diagram of the server of the eighth embodiment of the present disclosure acquiring first location information of the VR playback terminals through infrared cameras;
FIG. 8 is a schematic diagram of the server of the eighth embodiment of the present disclosure acquiring first location information of the VR playback terminals through cameras;
FIG. 9 is a schematic diagram of sharing VR video through a large-screen video playback terminal in the tenth embodiment of the present disclosure.
Detailed description
To further explain the technical means adopted by the present disclosure to achieve its intended purposes and their effects, the present disclosure is described in detail below with reference to the accompanying drawings and preferred embodiments.
The first embodiment of the present disclosure is a video transmission method applied to a server. As shown in FIG. 1, the method includes the following steps:
Step S101: according to the acquired first location information of the VR playback terminal, predict the second location information of the VR playback terminal within a set duration.
The VR playback terminal is worn on the user's head and plays VR video and/or AR video.
Optionally, step S101 includes:
based on a preset position prediction model, predicting, according to the acquired first location information of the VR playback terminal, the second location information of the VR playback terminal within the set duration;
the position prediction model is a model that predicts the position of the user's head within a future set duration based on Kalman filtering and/or measurement-data preprocessing techniques.
The position prediction model includes, but is not limited to, a random acceleration model based on maneuvering-target state estimation.
Since the user's head motion is a random process with Gauss-Markov properties, predicting the head position within the future set duration through the position prediction model can accurately obtain the position data of the user's head within that duration, effectively overcoming the delays caused by compression, decompression, and/or wireless transmission of the VR video and/or AR video, and effectively improving the playback accuracy of the VR video and/or AR video.
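A minimal sketch of such a predictor, assuming a one-dimensional constant-velocity state (position, velocity) and a white-noise-acceleration process model; the noise scale q and the state layout are illustrative choices, not taken from the disclosure:

```python
import numpy as np

def predict_position(x, P, dt, q=1e-3):
    """One Kalman predict step over the look-ahead dt.

    x: state vector [position, velocity]; P: 2x2 state covariance;
    dt: look-ahead duration (e.g. compression + transmission +
    decompression time); q: assumed white-noise acceleration scale.
    Returns the predicted state and covariance at time t + dt.
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])           # constant-velocity transition
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])  # process noise for that model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred
```

A full filter would alternate this predict step with update steps driven by the incoming head-tracking measurements; a random-acceleration model of the kind named above would add acceleration to the state vector in the same fashion.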
The acquired first location information of the virtual reality (VR) playback terminal includes, but is not limited to:
the first location information sent in real time by the VR playback terminal;
or the infrared light information of the VR playback terminal acquired through infrared cameras preset around the VR playback terminal and located to obtain the first location information of the VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network;
or the image information of the VR playback terminal acquired through cameras preset around the VR playback terminal and located to obtain the first location information of the VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network.
Step S102: when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal, upon receiving a sharing instruction sent by the VR playback terminal, send the compressed video to other video playback terminals for the other video playback terminals to play.
The other video playback terminals include at least one of the following video players:
other VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and other devices for playing VR video and/or AR video.
The number of other video playback terminals is one or more.
Optionally, step S102 includes:
when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal through the wireless transmission network, upon receiving a sharing instruction sent by the VR playback terminal, sending the compressed video to other video playback terminals for the other video playback terminals to play;
the manner of sending the compressed video corresponding to the viewing angle of the second location information to the VR playback terminal through the wireless transmission network includes at least one of the following:
sending the compressed video corresponding to the viewing angle of the second location information to the VR playback terminal via the DLNA (Digital Living Network Alliance) protocol;
or via the Miracast protocol;
or via the AirPlay protocol;
or via the WiDi (Intel Wireless Display) protocol;
or via the IGRS protocol;
or by wireless streaming.
The AirPlay protocol is a wireless video transmission protocol.
The Miracast protocol is a wireless video transmission protocol.
The manner of sending the compressed video to other video playback terminals includes, but is not limited to:
sending the compressed video to the other video playback terminals through a wireless transmission network, or sending it through a transmission cable.
The video transmission method of the first embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission, and enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience.
The second embodiment of the present disclosure is a video transmission method applied to a server. As shown in FIG. 2, the method includes the following steps:
Step S201: according to the acquired first location information of the virtual reality (VR) playback terminal, predict the second location information of the VR playback terminal within a set duration.
The VR playback terminal is worn on the user's head and plays VR video and/or AR video.
Optionally, step S201 includes:
based on a preset position prediction model, predicting, according to the acquired first location information of the VR playback terminal, the second location information of the VR playback terminal within the set duration;
the position prediction model is a model that predicts the position of the user's head within a future set duration based on Kalman filtering and/or measurement-data preprocessing techniques.
The position prediction model includes, but is not limited to, a random acceleration model based on maneuvering-target state estimation.
Since the user's head motion is a random process with Gauss-Markov properties, predicting the head position within the future set duration through the position prediction model can accurately obtain the position data of the user's head within that duration, effectively overcoming the delays caused by compression, decompression, and/or wireless transmission of the VR video and/or AR video, and effectively improving the playback accuracy of the VR video and/or AR video.
The acquired first location information of the virtual reality VR playback terminal includes, but is not limited to:
the first location information sent in real time by the VR playback terminal;
or the infrared light information of the VR playback terminal acquired through infrared cameras preset around the VR playback terminal and located to obtain the first location information of the VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network;
or the image information of the VR playback terminal acquired through cameras preset around the VR playback terminal and located to obtain the first location information of the VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network.
Step S202: according to the second location information, obtain, from a preset video source, the video corresponding to the viewing angle of the second location information.
The video corresponding to the viewing angle of the second location information includes: a VR video corresponding to that viewing angle, and/or an augmented reality (AR) video corresponding to that viewing angle.
Optionally, step S202 includes:
according to the second location information, obtaining, from a preset VR video source, the VR video corresponding to the viewing angle of the second location information.
Optionally, step S202 includes:
superimposing set visual information onto the video source;
according to the second location information, obtaining, from the preset video source, the video corresponding to the viewing angle of the second location information.
Optionally, step S202 includes:
superimposing set visual information onto an AR video source;
according to the second location information, obtaining, from the preset AR video source, the AR video corresponding to the viewing angle of the second location information.
The AR video source is captured through wireless cameras or a 360° panorama capture device provided in the server.
Step S203: according to the transmission rate of the preset wireless transmission network that carries the compressed video, compress the video corresponding to the viewing angle of the second location information to obtain the compressed video corresponding to that viewing angle.
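The rate matching in step S203 can be sketched as picking an encoder bitrate that fits the measured wireless link rate; the bitrate ladder and the 0.7 headroom factor below are illustrative assumptions, not values from the disclosure:

```python
def choose_target_bitrate(link_rate_mbps, headroom=0.7,
                          ladder=(10, 20, 40, 80)):
    """Return the highest encoder bitrate (Mbps) from the ladder that
    fits within the wireless link's measured rate, leaving headroom
    for protocol overhead and rate fluctuation; fall back to the
    lowest rung if even that does not fit."""
    budget = link_rate_mbps * headroom
    usable = [rung for rung in ladder if rung <= budget]
    return usable[-1] if usable else ladder[0]
```

A faster link thus yields lighter compression (higher visual quality), while a congested link forces a lower bitrate so the stream still arrives within the prediction window.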
Step S204: when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal, upon receiving a sharing instruction sent by the VR playback terminal, send the compressed video to other video playback terminals for the other video playback terminals to play.
Optionally, step S204 includes:
when the compressed video corresponding to the viewing angle of the second location information has been sent to the VR playback terminal through the wireless transmission network, upon receiving a sharing instruction sent by the VR playback terminal, sending the compressed video to other video playback terminals for the other video playback terminals to play.
The other video playback terminals include at least one of the following video players:
other VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and other devices for playing VR video and/or AR video.
The number of other video playback terminals is one or more.
The manner of sending the compressed video corresponding to the viewing angle of the second location information to the VR playback terminal through the wireless transmission network includes at least one of the following:
sending the compressed video corresponding to the viewing angle of the second location information to the VR playback terminal via the DLNA (Digital Living Network Alliance) protocol;
or via the Miracast protocol;
or via the AirPlay protocol;
or via the WiDi (Intel Wireless Display) protocol;
or via the IGRS protocol;
or by wireless streaming.
The AirPlay protocol is a wireless video transmission protocol.
The Miracast protocol is a wireless video transmission protocol.
The video transmission method of the second embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission, and enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience.
The third embodiment of the present disclosure is a server. As shown in FIG. 3, it includes the following components:
a processor 110, a memory 109, and a communication bus 201. In some embodiments of the present disclosure, the communication bus 201 is used to implement connection and communication between the processor 110 and the memory 109.
The processor 110 may be a general-purpose processor, such as a central processing unit (CPU), or a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure. The memory 109 is used to store executable instructions of the processor 110.
The memory 109 stores program code and transmits the program code to the processor 110. The memory 109 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 109 may also include a combination of the above types of memory.
The processor 110 is used to invoke the video transmission program code stored in the memory 109 to perform some or all of the steps in any one of the first to second embodiments of the present disclosure.
The server of the third embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video with the VR playback terminal, effectively avoiding the cable tethering of the user caused by wired transmission, and enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience.
The fourth embodiment of the present disclosure is a video transmission method applied to a VR playback terminal, including the following steps:
Step S401: when the compressed video corresponding to the viewing angle of the second location information sent by the server is received, play the compressed video, and when a sharing instruction triggered by the user is received, send the sharing instruction to the server so that the server, under the control of the sharing instruction, sends the compressed video to other video playback terminals.
The manner of receiving the compressed video corresponding to the viewing angle of the second location information sent by the server includes at least one of the following:
receiving it over the wireless transmission network via the DLNA protocol;
or via the Miracast protocol;
or via the AirPlay protocol;
or via the WiDi protocol;
or via the IGRS protocol;
or by wireless streaming.
The other video playback terminals include at least one of the following video players:
other VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and other devices for playing VR video and/or AR video.
The number of other video playback terminals is one or more.
The video transmission method of the fourth embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission, and enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience.
The fifth embodiment of the present disclosure is a video transmission method applied to a VR playback terminal. As shown in FIG. 4, the method includes the following steps:
Step S501: acquire the first location information of the VR playback terminal in real time according to the head motion trajectory of the user wearing the VR playback terminal.
The first location information of the VR playback terminal includes, but is not limited to, the motion trajectory information of the current user's head.
Step S502: send the first location information of the VR playback terminal, acquired in real time, to the server in real time, so that the server predicts, according to the first location information, the second location information of the VR playback terminal within the set duration.
Optionally, step S502 includes:
sending the first location information of the VR playback terminal, acquired in real time, to the server in real time through the wireless transmission network, so that the server predicts, according to the first location information, the second location information of the VR playback terminal within the set duration.
Step S503: when the compressed video corresponding to the viewing angle of the second location information sent by the server is received, play the compressed video, and when a sharing instruction triggered by the user is received, send the sharing instruction to the server so that the server, under the control of the sharing instruction, sends the compressed video to other video playback terminals.
The manner of receiving the compressed video corresponding to the viewing angle of the second location information sent by the server includes at least one of the following:
receiving it over the wireless transmission network via the DLNA protocol;
or via the Miracast protocol;
or via the AirPlay protocol;
or via the WiDi protocol;
or via the IGRS protocol;
or by wireless streaming.
The other video playback terminals include at least one of the following video players:
other VR playback terminals, flat display devices, curved display devices, spherical display devices, projection devices, and other devices for playing VR video and/or AR video.
The number of other video playback terminals is one or more.
The manner of playing the compressed video includes mode one or mode two.
Mode one: when no GPU (Graphics Processing Unit) is provided in the VR playback terminal, decompress the compressed video to obtain a first decompressed video, and play the first decompressed video.
Mode two: when a GPU is provided in the VR playback terminal, decompress the compressed video to obtain a second decompressed video;
according to the currently acquired third location information of the VR playback terminal, adjust the second decompressed video, through the GPU and based on a preset viewing-angle algorithm, into a third decompressed video corresponding to the viewing angle of the third location information;
play the third decompressed video.
The viewing-angle algorithm includes, but is not limited to, algorithms such as ATW (Asynchronous TimeWarp).
The asynchronous time-warp algorithm runs in its own thread (the ATW thread), in parallel with (or asynchronously to) the rendering thread. Before each vertical sync, the ATW thread generates a new frame from the last frame produced by the rendering thread; in this way, when the current frame is not ready in time, the previous frame can be reused for rendering, reducing latency.
The video transmission method of the fifth embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
The sixth embodiment of the present disclosure is a VR playback terminal. As shown in FIG. 5, it includes the following components:
a processor 210, a memory 209, and a communication bus 202. In some embodiments of the present disclosure, the communication bus 202 is used to implement connection and communication between the processor 210 and the memory 209.
The processor 210 may be a general-purpose processor, such as a central processing unit (CPU), or a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure. The memory 209 is used to store executable instructions of the processor 210.
The memory 209 stores program code and transmits the program code to the processor 210. The memory 209 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 209 may also include a combination of the above types of memory.
The processor 210 is used to invoke the video transmission program code stored in the memory 209 to perform some or all of the steps in any one of the fourth to fifth embodiments of the present disclosure.
The VR playback terminal of the sixth embodiment of the present disclosure realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
The seventh embodiment of the present disclosure is a computer-readable storage medium.
The computer storage medium may be a RAM, flash memory, ROM, EPROM, EEPROM, a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
The computer-readable storage medium stores one or more programs, which are executable by one or more processors to implement some or all of the steps in any one of the first to second embodiments of the present disclosure, or some or all of the steps in any one of the fifth to sixth embodiments of the present disclosure.
The computer-readable storage medium of the seventh embodiment of the present disclosure stores one or more programs executable by one or more processors; it realizes wireless transmission of VR video and/or AR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video and/or AR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video and/or AR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video and/or AR video.
The eighth embodiment of the present disclosure builds on the foregoing embodiments and describes an application example of the present disclosure, taking a VR video transmission method as an example and referring to FIGS. 6-8.
Step S801: in a multiplayer game, the server acquires the first location information of each VR playback terminal.
The manner in which the server acquires the first location information of each VR playback terminal includes, but is not limited to:
receiving, over the wireless transmission network, the first location information sent in real time by each VR playback terminal, as shown in FIG. 6;
or acquiring the infrared light information of each VR playback terminal through infrared cameras preset around the VR playback terminals and performing a positioning calculation on the infrared light information to obtain the first location information of each VR playback terminal, where the number of infrared cameras is one or more and the infrared cameras are connected to the server by cable or through a wireless transmission network, as shown in FIG. 7;
or acquiring the image information of each VR playback terminal through cameras preset around the VR playback terminals and performing a positioning calculation on the image information to obtain the first location information of each VR playback terminal, where the number of cameras is one or more and the cameras are connected to the server by cable or through a wireless transmission network, as shown in FIG. 8.
The number of VR playback terminals is four: a first, a second, a third, and a fourth VR playback terminal.
Step S802: the server superimposes the second, third, and fourth VR playback terminals onto the current game scene and, according to the first location information x of the first VR playback terminal at the current time t, predicts the second location information X of the first VR playback terminal at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
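The look-ahead Δt = compression time + network transmission time + decompression time used above follows directly from the pipeline stages; a trivial sketch:

```python
def prediction_horizon(compress_s, network_s, decompress_s):
    """Delta-t: total pipeline latency between sampling the pose and
    displaying the matching frame, i.e. how far ahead to predict."""
    return compress_s + network_s + decompress_s

def predicted_display_time(t, compress_s, network_s, decompress_s):
    """The future time t + delta-t at which position X will be shown."""
    return t + prediction_horizon(compress_s, network_s, decompress_s)
```

The stage durations would in practice be measured running averages rather than constants, so the horizon tracks current network conditions.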
Step S803: the server uses the second location information X as an input to generate a VR video corresponding to the location of the first VR playback terminal.
Step S804: the server compresses the VR video for the location of the first VR playback terminal and streams it to the first VR playback terminal over Wi-Fi (Wireless Fidelity).
Step S805: the first VR playback terminal acquires the real-time position of the head of its user, fine-tunes the decompressed VR video through the GPU preset in the first VR playback terminal using the ATW algorithm, and plays the VR video adjusted to the real-time head position of the user.
Step S806: when the first VR playback terminal receives a sharing instruction issued by the user to share the VR video with the second, third, and fourth VR playback terminals, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends the VR video corresponding to the real-time head position of the user of the first VR playback terminal to the second, third, and fourth VR playback terminals for playback.
Alternatively, when the second VR playback terminal receives a sharing instruction issued by the user for the VR video corresponding to the real-time head position of the user of the first VR playback terminal, it sends the sharing instruction to the server; under the control of the sharing instruction, the server sends that VR video to the second VR playback terminal for playback.
The first and second VR playback terminals may be connected to the server through the same wireless local area network or through different wireless local area networks.
The VR video transmission method of the eighth embodiment of the present disclosure realizes wireless transmission of VR video between the VR playback terminal and the server, effectively avoiding the cable tethering of the user caused by wired transmission; enables the VR video played by the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and fine-tunes the VR video through the GPU in the VR playback terminal, effectively improving the visual effect of the VR video.
本公开第九实施例,本实施例是在上述实施例的基础上,以AR视频的传输方法为例,介绍一个本公开的应用实例。
步骤S901,服务器获取VR播放终端的第一位置信息。
其中,服务器获取每个VR播放终端的第一位置信息的方式包括但不限于:
通过无线传输网络,接收每个VR播放终端实时发送的第一位置信息;
或者,通过在VR播放终端周围预置的红外摄像头获取到的每个VR播放终端的红外光信息,并对红外光信息进行定位计算,得到每个VR播放终端的第一位置信息;其中,红外摄像头的数量为一个或多个;红外摄像头与服务器通过电缆连接或通过无线传输网络连接;
或者,通过在VR播放终端周围预置的摄像头获取到的每个VR播放终端的图像 信息,并对图像信息进行定位计算,得到VR播放终端的第一位置信息;其中,摄像头的数量为一个或多个;摄像头与服务器通过电缆连接或通过无线传输网络连接。
Step S902: the server generates AR video of the real scene from video captured by one or more preset wireless cameras.
Step S903: the server recognizes one or more set objects in the real-scene AR video and performs image processing on the set objects.
For example, the server recognizes one or more set persons in the real-scene AR video and dresses the set persons in virtual clothing.
Step S904: from the first location information x of the VR playback terminal at the current time t, the server predicts its second location information X at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
Step S905: the server uses the second location information X as input to generate AR video corresponding to the position of the VR playback terminal, and superimposes one or more set images onto the AR video.
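The superimposition in step S905 is, at its core, per-pixel compositing of a set image onto the rendered AR frame. A minimal alpha-blend sketch is shown below; the grayscale frames, the blending model, and all names are illustrative assumptions, not the patent's pipeline.

```python
import numpy as np

def overlay(base: np.ndarray, sprite: np.ndarray, alpha: np.ndarray,
            top: int, left: int) -> np.ndarray:
    """Composite a set image (sprite) onto the AR frame.

    base:   grayscale AR frame, uint8.
    sprite: set image to superimpose, uint8, same shape as alpha.
    alpha:  per-pixel opacity in [0, 1].
    """
    out = base.astype(np.float32).copy()
    h, w = sprite.shape[:2]
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = alpha * sprite + (1 - alpha) * region
    return out.astype(np.uint8)

base = np.zeros((4, 4), dtype=np.uint8)            # empty AR frame
sprite = np.full((2, 2), 200, dtype=np.uint8)      # the set image
alpha = np.full((2, 2), 0.5, dtype=np.float32)     # 50% opacity
result = overlay(base, sprite, alpha, top=1, left=1)
```

The same blend would serve step S903's virtual-clothing example, with the sprite placed at the coordinates produced by the person-recognition step.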
Step S906: the server compresses the AR video and streams it to the VR playback terminal over Wi-Fi.
Step S907: the VR playback terminal decompresses the compressed AR video.
Step S908: the VR playback terminal acquires the current real-time head position of its user, fine-tunes the decompressed AR video with the ATW algorithm on the GPU preset in the terminal, and plays the AR video adjusted to the user's current real-time head position;
if no GPU is preset in the VR playback terminal, it plays the decompressed AR video as-is.
Step S909: when the VR playback terminal receives a share instruction from the user, it sends the share instruction to the server; under the control of the share instruction, the server sends the AR video streamed to the VR playback terminal to other video playback terminals, so that the other video playback terminals can play the AR video.
The VR playback terminal and the other video playback terminals may connect to the server through the same wireless local area network or through different wireless local area networks.
The AR video transmission method of the ninth embodiment realizes wireless transmission of AR video between the VR playback terminal and the server, effectively freeing the user from the cable tether of wired transmission; it enables the AR video played on the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and it fine-tunes the AR video through the GPU in the VR playback terminal, effectively improving its visual quality.
The tenth embodiment of the present disclosure builds on the above embodiments and, taking a VR video transmission method as an example, describes an application example of the present disclosure, as shown in Figure 9.
Step S1001: during a game, the server acquires the first location information of the VR playback terminal.
The ways in which the server acquires the first location information of the VR playback terminal include, but are not limited to:
receiving, over a wireless transmission network, the first location information sent by the VR playback terminal in real time;
or capturing infrared light information of the VR playback terminal with one or more infrared cameras preset around the terminal, and performing localization calculations on the infrared light information to obtain the first location information; the infrared cameras are connected to the server by cable or over a wireless transmission network;
or capturing image information of the VR playback terminal with one or more cameras preset around the terminal, and performing localization calculations on the image information to obtain the first location information; the cameras are connected to the server by cable or over a wireless transmission network.
Step S1002: from the first location information x of the VR playback terminal at the current time t, the server predicts its second location information X at a future time t+Δt, where Δt = compression time + network transmission time + decompression time.
Step S1003: the server uses the second location information X as input to generate VR video corresponding to the position of the VR playback terminal.
Step S1004: the server compresses the VR video for the position of the VR playback terminal and streams it to the terminal over Wi-Fi.
Step S1005: the VR playback terminal decompresses the compressed VR video.
Step S1006: the VR playback terminal acquires the current real-time head position of its user, fine-tunes the decompressed VR video with the ATW algorithm on the GPU preset in the terminal, and plays the VR video adjusted to the user's current real-time head position.
Step S1007: when the VR playback terminal receives a share instruction from the user, it sends the share instruction to the server; under the control of the share instruction, the server sends the VR video streamed to the VR playback terminal to a large-screen video playback terminal, so that the large-screen terminal can play the VR video, as shown in Figure 9.
The large-screen video playback terminal is connected to the server by an HDMI (High-Definition Multimedia Interface) cable.
Connecting to the server over an HDMI cable greatly reduces video transmission latency.
The VR video transmission method of the tenth embodiment realizes wireless transmission of VR video between the VR playback terminal and the server, effectively freeing the user from the cable tether of wired transmission; it enables the VR video played on the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and it fine-tunes the VR video through the GPU in the VR playback terminal, effectively improving its visual quality.
The above description of specific implementations should give a deeper and more concrete understanding of the technical means the present disclosure adopts to achieve its intended purposes, and of their effects; the accompanying drawings are provided for reference and illustration only and are not intended to limit the present disclosure.
Industrial applicability
The present disclosure relates to the field of virtual reality technology. The technical solution of the present disclosure realizes wireless transmission of VR and/or AR video between a VR playback terminal and a server, effectively freeing the user from the cable tether of wired transmission; it enables the VR and/or AR video played on the VR playback terminal to be shared with other video playback terminals for playback, effectively improving the user experience; and it fine-tunes the VR and/or AR video through the GPU in the VR playback terminal, effectively improving its visual quality.

Claims (13)

  1. A video transmission method, applied to a server, the method comprising:
    predicting, from acquired first location information of a virtual reality (VR) playback terminal, second location information of the VR playback terminal within a set duration;
    in the case where compressed video for the viewing angle corresponding to the second location information is sent to the VR playback terminal, sending, when a share instruction sent by the VR playback terminal is received, the compressed video to another video playback terminal, so that the other video playback terminal can play the compressed video.
  2. The method according to claim 1, wherein, before sending the compressed video for the viewing angle corresponding to the second location information to the VR playback terminal, the method further comprises:
    acquiring, from a preset video source and according to the second location information, video for the viewing angle corresponding to the second location information;
    compressing the video for the viewing angle corresponding to the second location information according to the transmission rate of the preset wireless transmission network carrying the compressed video, to obtain the compressed video for that viewing angle.
  3. The method according to claim 2, wherein the video for the viewing angle corresponding to the second location information comprises: VR video for the viewing angle corresponding to the second location information, and/or augmented reality (AR) video for the viewing angle corresponding to the second location information.
  4. The method according to claim 3, wherein, before acquiring from the preset video source the video for the viewing angle corresponding to the second location information, the method further comprises:
    in the case where the video for the viewing angle corresponding to the second location information is AR video for that viewing angle, superimposing set visual information onto the video source.
  5. The method according to any one of claims 1 to 4, wherein the first location information is acquired by:
    receiving the first location information sent by the VR playback terminal in real time;
    or capturing infrared light information of the VR playback terminal with an infrared camera preset around the VR playback terminal, and localizing the infrared light information to obtain the first location information of the VR playback terminal;
    or capturing image information of the VR playback terminal with a camera preset around the VR playback terminal, and localizing the image information to obtain the first location information of the VR playback terminal.
  6. A video transmission method, applied to a VR playback terminal, the method comprising:
    in the case where compressed video for the viewing angle corresponding to second location information, sent by a server, is received, playing the compressed video, and, when a share instruction triggered by a user is received, sending the share instruction to the server, so that the server, under the control of the share instruction, sends the compressed video to another video playback terminal.
  7. The method according to claim 6, wherein playing the compressed video comprises:
    in the case where no graphics processing unit (GPU) is provided in the VR playback terminal, decompressing the compressed video to obtain a first decompressed video, and playing the first decompressed video.
  8. The method according to claim 6, wherein playing the compressed video comprises:
    in the case where a GPU is provided in the VR playback terminal, decompressing the compressed video to obtain a second decompressed video;
    adjusting, through the GPU and based on a preset viewing-angle algorithm, the second decompressed video into a third decompressed video for the viewing angle corresponding to currently acquired third location information of the VR playback terminal;
    playing the third decompressed video.
  9. The method according to claim 6, wherein, before receiving the compressed video for the viewing angle corresponding to the second location information sent by the server, the method further comprises:
    sending first location information of the VR playback terminal, acquired in real time, to the server in real time, so that the server predicts, from the first location information, the second location information of the VR playback terminal within a set duration.
  10. The method according to claim 9, wherein, before sending the first location information of the VR playback terminal, acquired in real time, to the server in real time, the method further comprises:
    acquiring the first location information of the VR playback terminal in real time from the head motion trajectory of the user wearing the VR playback terminal.
  11. A server, comprising a processor, a memory and a communication bus, wherein:
    the communication bus is configured to implement connection and communication between the processor and the memory;
    the processor is configured to execute a video transmission program stored in the memory, to implement the steps of the video transmission method according to any one of claims 1 to 5.
  12. A VR playback terminal, comprising a processor, a memory and a communication bus, wherein:
    the communication bus is configured to implement connection and communication between the processor and the memory;
    the processor is configured to execute a video transmission program stored in the memory, to implement the steps of the video transmission method according to any one of claims 6 to 10.
  13. A computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the video transmission method according to any one of claims 1 to 5, or to implement the steps of the video transmission method according to any one of claims 6 to 10.
PCT/CN2018/101574 2017-09-25 2018-08-21 视频传输方法、服务器、vr播放终端及计算机可读存储介质 WO2019056904A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019571954A JP2020527300A (ja) 2017-09-25 2018-08-21 ビデオ伝送方法、サーバ、vr再生端末及びコンピュータ読み取り可能な記憶媒体
US16/626,357 US20200120380A1 (en) 2017-09-25 2018-08-21 Video transmission method, server and vr playback terminal
EP18859378.4A EP3691280B1 (en) 2017-09-25 2018-08-21 Video transmission method, server, vr playback terminal and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710874249.9 2017-09-25
CN201710874249.9A CN107613338A (zh) 2017-09-25 2017-09-25 视频传输方法、服务器、vr播放终端及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2019056904A1 true WO2019056904A1 (zh) 2019-03-28

Family

ID=61057855

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/101574 WO2019056904A1 (zh) 2017-09-25 2018-08-21 视频传输方法、服务器、vr播放终端及计算机可读存储介质

Country Status (5)

Country Link
US (1) US20200120380A1 (zh)
EP (1) EP3691280B1 (zh)
JP (1) JP2020527300A (zh)
CN (1) CN107613338A (zh)
WO (1) WO2019056904A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI699103B (zh) * 2019-04-22 2020-07-11 圓展科技股份有限公司 無線攝影機與影像串流方法
JP2021136683A (ja) * 2020-02-21 2021-09-13 聯好▲娯▼樂股▲分▼有限公司 全景リアリティシミュレーションシステム及びその使用方法

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107613338A (zh) * 2017-09-25 2018-01-19 中兴通讯股份有限公司 视频传输方法、服务器、vr播放终端及计算机可读存储介质
EP3588470A1 (en) * 2018-06-26 2020-01-01 Siemens Aktiengesellschaft Method and system for sharing automatically procedural knowledge
CN113727143A (zh) * 2018-08-30 2021-11-30 华为技术有限公司 视频投屏方法、装置、计算机设备及存储介质
CN110798708A (zh) * 2019-10-12 2020-02-14 重庆爱奇艺智能科技有限公司 一种vr设备的显示内容的投屏方法、装置与***
CN110958325B (zh) * 2019-12-11 2021-08-17 联想(北京)有限公司 一种控制方法、装置、服务器及终端
WO2021134618A1 (zh) * 2019-12-31 2021-07-08 华为技术有限公司 通信方法及装置
CN113110234B (zh) * 2021-05-11 2023-03-31 武汉灏存科技有限公司 联动控制***及方法
CN113542849B (zh) * 2021-07-06 2023-06-30 腾讯科技(深圳)有限公司 视频数据处理方法及装置、电子设备、存储介质
CN116665509B (zh) * 2023-06-02 2024-02-09 广东精天防务科技有限公司 伞降模拟训练信息处理***及伞降模拟训练***

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892683A (zh) * 2016-04-29 2016-08-24 上海乐相科技有限公司 一种显示方法及目标设备
KR20170081456A (ko) * 2016-01-04 2017-07-12 한국전자통신연구원 가상현실 기반 개인형 체험요소 공유 시스템 및 그 방법
CN107024995A (zh) * 2017-06-05 2017-08-08 河北玛雅影视有限公司 多人虚拟现实交互***及其控制方法
CN107613338A (zh) * 2017-09-25 2018-01-19 中兴通讯股份有限公司 视频传输方法、服务器、vr播放终端及计算机可读存储介质

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9319741B2 (en) * 2006-09-07 2016-04-19 Rateze Remote Mgmt Llc Finding devices in an entertainment system
US9378582B2 (en) * 2012-07-31 2016-06-28 Siemens Product Lifecycle Management Software Inc. Rendering of design data
JP6353214B2 (ja) * 2013-11-11 2018-07-04 株式会社ソニー・インタラクティブエンタテインメント 画像生成装置および画像生成方法
JP6610546B2 (ja) * 2014-07-03 2019-11-27 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
US10204658B2 (en) * 2014-07-14 2019-02-12 Sony Interactive Entertainment Inc. System and method for use in playing back panorama video content
CN204465755U (zh) * 2015-01-20 2015-07-08 刘宛平 可交互立体混合现实的装置及应用该装置的虚拟现实头盔
JPWO2016158001A1 (ja) * 2015-03-30 2018-01-25 ソニー株式会社 情報処理装置、情報処理方法、プログラム及び記録媒体
US10112111B2 (en) * 2016-03-18 2018-10-30 Sony Interactive Entertainment Inc. Spectator view perspectives in VR environments
US9924238B2 (en) * 2016-03-21 2018-03-20 Screenovate Technologies Ltd. Method and a system for using a computerized source device within the virtual environment of a head mounted device
CN107147824A (zh) * 2016-06-22 2017-09-08 深圳市量子视觉科技有限公司 多视角视频的输出方法和装置
CN106385587B (zh) * 2016-09-14 2019-08-02 三星电子(中国)研发中心 分享虚拟现实视角的方法、装置及***
CN106534892A (zh) * 2016-11-23 2017-03-22 上海沙塔信息科技有限公司 基于可视角度再编码的虚拟现实直播***和方法
CN106502427B (zh) * 2016-12-15 2023-12-01 北京国承万通信息科技有限公司 虚拟现实***及其场景呈现方法
CN106875431B (zh) * 2017-02-10 2020-03-17 成都弥知科技有限公司 具有移动预测的图像追踪方法及扩增实境实现方法
US10979663B2 (en) * 2017-03-30 2021-04-13 Yerba Buena Vr, Inc. Methods and apparatuses for image processing to optimize image resolution and for optimizing video streaming bandwidth for VR videos


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3691280A4 *


Also Published As

Publication number Publication date
EP3691280A4 (en) 2021-01-27
EP3691280A1 (en) 2020-08-05
JP2020527300A (ja) 2020-09-03
EP3691280B1 (en) 2023-12-13
CN107613338A (zh) 2018-01-19
US20200120380A1 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
EP3691280B1 (en) Video transmission method, server, vr playback terminal and computer-readable storage medium
JP6724110B2 (ja) 仮想空間中のアバター表示システム、仮想空間中のアバター表示方法、コンピュータプログラム
CN109874021B (zh) 直播互动方法、装置及***
CN107735152B (zh) 用于虚拟现实(vr)观看的扩展视野重新渲染
JP7135141B2 (ja) 情報処理システム、情報処理方法、および情報処理プログラム
US10477179B2 (en) Immersive video
US10650590B1 (en) Method and system for fully immersive virtual reality
US20210001216A1 (en) Method and device for generating video frames
US9940898B2 (en) Variable refresh rate video capture and playback
Kämäräinen et al. CloudVR: Cloud accelerated interactive mobile virtual reality
EP3573026B1 (en) Information processing apparatus, information processing method, and program
WO2018000609A1 (zh) 一种虚拟现实***中分享3d影像的方法和电子设备
JP2015184689A (ja) 動画生成装置及びプログラム
CN113766958A (zh) 用于通过使用分布式游戏引擎来预测状态的***和方法
US20190230317A1 (en) Immersive mixed reality snapshot and video clip
JP2021513773A (ja) 仮想現実ライブストリーミングにおいて視野角を同期させるための方法および装置
US20180336069A1 (en) Systems and methods for a hardware agnostic virtual experience
US20240163528A1 (en) Video data generation method and apparatus, electronic device, and readable storage medium
CN111298427A (zh) 一种在虚拟现实云游戏***中降低画面抖动的方法
JP7496558B2 (ja) コンピュータプログラム、サーバ装置、端末装置、及び方法
US20190295324A1 (en) Optimized content sharing interaction using a mixed reality environment
JP2020187706A (ja) 画像処理装置、画像処理システム、画像処理方法およびプログラム
US11100716B2 (en) Image generating apparatus and image generation method for augmented reality
KR20160136160A (ko) 가상현실 공연시스템 및 공연방법
US11294615B2 (en) Incorporating external guests into a virtual reality environment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18859378; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019571954; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2018859378; Country of ref document: EP; Effective date: 20200428)