CN113438495A - VR live broadcast method, device, system, equipment and storage medium - Google Patents

VR live broadcast method, device, system, equipment and storage medium


Publication number
CN113438495A
Authority
CN
China
Prior art keywords
virtual reality
slice
user
resolution
stream
Prior art date
Legal status
Pending
Application number
CN202110701479.1A
Other languages
Chinese (zh)
Inventor
魏鸿斌
陈浩源
莫俊彬
潘桂新
徐舒
胡翔
成静静
林少泽
王笃炎
罗文杰
Current Assignee
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd
Priority to CN202110701479.1A
Publication of CN113438495A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N 21/2343 Processing involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N 21/234381 Reformatting by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application provides a VR live broadcast method, device, system, equipment and storage medium. In the method, an edge cloud platform performs FOV panoramic slice processing on the VR panoramic video stream pushed by a VR panoramic camera to obtain a VR slice stream, reducing the transmission bandwidth actually used and hence the network bandwidth cost. Based on the VR slice stream, the user terminal obtains a 360-degree panoramic rendering and a better live broadcast effect, solving the problem that existing platforms lower the transmission bit rate of the VR live video to reduce bandwidth cost, leaving users with blurry, dizzying pictures. In addition, the edge cloud platform can determine the main viewing angle of the user corresponding to the user terminal and adjust the camera parameters of the VR panoramic camera according to that angle, so that the three-dimensional live scene is restored realistically, the user better experiences the 3D effect and sense of immersion, and the user's freedom in watching the live video and the fluency of direction switching are improved.

Description

VR live broadcast method, device, system, equipment and storage medium
Technical Field
The present application relates to the field of live video technologies, and in particular, to a Virtual Reality (VR) live broadcasting method, apparatus, system, device, and storage medium.
Background
Live streaming is a recently emerged mode of interaction. During a live broadcast, the anchor's handheld terminal device collects images and transmits them synchronously in real time over the network to multiple viewing terminal devices, where many viewers watch them. Meanwhile, a viewer can input text, images, emoticons, gifts and other information on the viewing terminal device; this information is presented on the anchor's terminal device and the other viewing terminal devices through the network, enabling online interaction with the anchor and the other viewers.
VR live broadcast, which combines virtual reality with live streaming, is a new live broadcast mode. It has so far been rolled out at scale for sports events, breaking news, concerts and product launches. Compared with traditional flat 2D video live broadcast, VR live broadcast can capture pictures during video acquisition with a panoramic camera or cameras at several different positions, and then stitch and encode the pictures in a local or cloud encoder.
However, as the picture content increases, so does the network bandwidth required for VR live broadcast. To reduce the cost of network bandwidth, existing VR live broadcast platforms lower the transmission bit rate of the video, but the resulting VR broadcast looks blurry and induces dizziness, greatly degrading the user experience.
Disclosure of Invention
In order to solve the problems in the prior art, the application provides a VR live broadcast method, device, system, equipment and storage medium.
In a first aspect, an embodiment of the present application provides a VR live broadcast method, where the VR live broadcast method is applied to an edge cloud platform, and the VR live broadcast method includes the following steps:
receiving, through a fifth-generation mobile communication technology (5G) access device, a compression-coded VR panoramic video stream pushed by a VR panoramic camera;
performing field-of-view (FOV) panoramic slice processing on the compression-coded VR panoramic video stream to obtain a VR slice stream;
after receiving an acquisition request sent by a user terminal through the 5G access device, sending the VR slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the VR slice stream and obtains a 360-degree panoramic rendering based on the decoded VR slice stream;
and determining a main viewing angle of the user corresponding to the user terminal, and sending an adjustment instruction to the VR panoramic camera according to the main viewing angle, so that the VR panoramic camera adjusts its camera parameters according to the instruction.
In a possible implementation manner, the performing FOV panorama slice processing on the compression-encoded VR panorama video stream to obtain a VR slice stream includes:
performing, according to the user's different viewing angles, FOV panoramic slice processing on the compression-coded VR panoramic video stream to obtain the VR slice stream, such that the VR slice corresponding to the user's main viewing angle has a first resolution, the VR slices corresponding to the user's left and right viewing angles have a second resolution, and the VR slices corresponding to the remaining viewing angles have a third resolution, where the first resolution is greater than the second resolution and the second resolution is greater than the third resolution.
In one possible implementation, the method further includes:
receiving, through the 5G access device, the decoded VR slice stream sent by the user terminal;
and performing composite rendering on the decoded VR slice stream and returning the composite rendering result to the user terminal, so that the user terminal obtains the 360-degree panoramic rendering according to the result.
In a possible implementation manner, the receiving a compressed and encoded VR panorama video stream pushed by a VR panorama camera through a 5G access device includes:
and receiving the compression-coded VR panoramic video stream pushed by a user-plane breakout gateway, where the user-plane breakout gateway splits the data of the compression-coded VR panoramic video stream pushed through the 5G access device and offloads the traffic locally.
In a second aspect, an embodiment of the present application provides a VR live broadcast device, where the device is applied to an edge cloud platform, and the device includes:
the receiving module is used for receiving the compressed and coded VR panoramic video stream pushed by the VR panoramic camera through the 5G access device;
the cutting module is used for carrying out FOV panorama slicing processing on the VR panoramic video stream subjected to compression coding to obtain a VR slice stream;
a sending module, configured to, after receiving an acquisition request sent by the user terminal through the 5G access device, send the VR slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the VR slice stream and obtains a 360-degree panoramic rendering based on the decoded VR slice stream;
and an adjustment module, configured to determine a main viewing angle of the user corresponding to the user terminal and send an adjustment instruction to the VR panoramic camera according to the main viewing angle, so that the VR panoramic camera adjusts its camera parameters according to the instruction.
In a possible implementation manner, the cutting module is specifically configured to:
performing, according to the user's different viewing angles, FOV panoramic slice processing on the compression-coded VR panoramic video stream to obtain the VR slice stream, such that the VR slice corresponding to the user's main viewing angle has a first resolution, the VR slices corresponding to the user's left and right viewing angles have a second resolution, and the VR slices corresponding to the remaining viewing angles have a third resolution, where the first resolution is greater than the second resolution and the second resolution is greater than the third resolution.
In one possible implementation, the device further includes a rendering module configured to:
receive, through the 5G access device, the decoded VR slice stream sent by the user terminal;
and perform composite rendering on the decoded VR slice stream and return the composite rendering result to the user terminal, so that the user terminal obtains the 360-degree panoramic rendering according to the result.
In a possible implementation manner, the receiving module is specifically configured to:
receive the compression-coded VR panoramic video stream pushed by the user-plane breakout gateway, where the user-plane breakout gateway splits the data of the compression-coded VR panoramic video stream pushed through the 5G access device and offloads the traffic locally.
In a third aspect, an embodiment of the present application provides a VR live broadcast system, including:
the VR panoramic camera is used for pushing the compressed and coded VR panoramic video stream to the edge cloud platform through the 5G access equipment;
the edge cloud platform is used for performing FOV panorama slice processing on the VR panoramic video stream subjected to compression coding to obtain a VR slice stream;
the user terminal is used for sending an acquisition request to the edge cloud platform through the 5G access equipment;
the edge cloud platform is used for sending the VR slice stream to the user terminal according to the acquisition request;
the user terminal is configured to decode the VR slice stream and obtain a 360-degree panoramic rendering based on the decoded VR slice stream;
the edge cloud platform is used for determining a main view angle of a user corresponding to the user terminal and sending an adjusting instruction to the VR panoramic camera according to the main view angle of the user;
and the VR panoramic camera is used for adjusting camera parameters according to the adjusting instruction.
In a fourth aspect, an embodiment of the present application provides a VR live broadcast device, including:
a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program causes a server to execute the method according to the first aspect.
In a sixth aspect, the present application provides a computer program product, which includes computer instructions for executing the method of the first aspect by a processor.
The present application provides a VR live broadcast method, device, system, equipment and storage medium. In the method, an edge cloud platform receives the VR panoramic video stream pushed by a VR panoramic camera and performs FOV panoramic slice processing on the compression-coded stream to obtain a VR slice stream, which reduces the transmission bandwidth actually used and the network bandwidth cost. The user terminal then obtains a 360-degree panoramic rendering based on the VR slice stream and achieves a better live broadcast effect, solving the problem that existing platforms lower the transmission bit rate of the VR live video to reduce bandwidth cost, leaving users with blurry, dizzying pictures. Moreover, the edge cloud platform also determines the main viewing angle of the user corresponding to the user terminal and adjusts the camera parameters of the VR panoramic camera according to that angle, so that the three-dimensional live scene is restored realistically, the user better experiences the 3D effect and sense of immersion, and the user's freedom in watching the live video and the fluency of direction switching are improved, making the method well suited for practical application.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of a VR live broadcast system architecture provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of a VR live broadcast method provided in an embodiment of the present application;
fig. 3 is a schematic flow chart of another VR live broadcasting method provided in the embodiment of the present application;
fig. 4 is a schematic structural diagram of a VR live broadcasting device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of another VR live broadcasting device according to an embodiment of the present application;
fig. 6A is a schematic diagram of a basic hardware architecture of a VR live broadcast device provided in the present application;
fig. 6B is a schematic diagram of a basic hardware architecture of another VR live device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," "third," and "fourth," if any, in the description and claims of this application and the above-described figures are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Virtual reality (VR) presents a 3D virtual world generated through a series of image renderings; using a specific user terminal (for example, VR glasses), the user's senses such as vision are simulated to produce an immersive experience. As VR technology has developed and matured, it has been widely used in healthcare, education and training, culture and entertainment, live video broadcast and other fields. VR live broadcast is a new live broadcast mode; because it requires large network bandwidth, existing VR live broadcast platforms lower the video transmission bit rate to reduce bandwidth cost. The resulting broadcast, however, looks blurry and induces dizziness, and cannot satisfy practical applications.
Therefore, in order to solve the above problems, an embodiment of the present application provides a VR live broadcast method in which an edge cloud platform performs FOV panoramic slice processing on the VR panoramic video stream to obtain a VR slice stream, reducing the transmission bandwidth actually used and the network bandwidth cost. In addition, the edge cloud platform can determine the main viewing angle of the user corresponding to the user terminal and adjust the camera parameters of the VR panoramic camera according to that angle, so that the three-dimensional live scene is restored realistically, the user better experiences the 3D effect and sense of immersion, and the user's freedom in watching the live video and the fluency of direction switching are improved.
Optionally, the VR live broadcasting method provided by the present application may be applied to the VR live broadcasting system architecture diagram shown in fig. 1, and as shown in fig. 1, the system may include a VR panoramic camera 101, a 5G access device 102, an edge cloud platform 103, and a user terminal 104. The user terminal 104 may be a mobile phone, a tablet computer, or VR glasses.
It is understood that the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the architecture of the VR live broadcast system. In other possible embodiments of the present application, the architecture may include more or fewer components than shown, combine some components, split some components, or arrange components differently, as determined by the actual application scenario, which is not limited here. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
In a specific implementation, the VR panoramic camera 101 generates a VR panoramic video stream and implements lossless compression coding and stream pushing. The 5G access device 102 supports 5G access and carries the VR panoramic video stream to the edge cloud platform 103. The edge cloud platform 103 performs FOV panoramic slice processing on the uplink VR panoramic video stream from the live end to obtain a VR slice stream. The user terminal 104 obtains and plays the VR slice stream: for example, it sends an acquisition request through the 5G access device 102 to the edge cloud platform 103, the edge cloud platform 103 returns the VR slice stream, and the user terminal 104 decodes it to obtain a 360-degree panoramic rendering. Because the edge cloud platform 103 slices the panoramic video stream by FOV, the transmission bandwidth actually used and hence the network bandwidth cost are reduced, avoiding the existing practice of lowering the transmission bit rate to save bandwidth, which leaves users with blurry, dizzying pictures.
Here, the edge cloud platform 103 may further determine the main viewing angle of the user corresponding to the user terminal 104 and adjust the camera parameters of the VR panoramic camera 101 according to that angle, for example the angle at which the camera captures the live stream, so that the three-dimensional live scene is restored more realistically, the user better experiences the 3D effect and sense of immersion, and the user's freedom in watching the live video and fluency of direction switching are improved, making the scheme well suited for practical application.
In addition, the system may further include a user-plane breakout gateway, which splits the data of the VR panoramic video stream pushed by the VR panoramic camera 101 and offloads the traffic locally.
When the 5G access device performs uplink stream pushing and downlink stream pulling, 5G network slices may be used, for example slice 1 and slice 2 carrying the video streams, so that the latency and bandwidth of the uplink push and the downlink pull are guaranteed.
The system may further include a 5G core network. The VR panoramic video stream is handled by a local edge cloud platform selected close to the area where the VR panoramic camera 101 is located, so that the latency of stream pushing, viewing-angle switching and resolution changes is reduced to the point where the user does not perceive it. 5G network slice selection is also applied both when the VR panoramic video stream is uploaded and when the user terminal plays the VR video content, so that a dedicated network is used for VR uploads and downloads and smooth VR live broadcasting is guaranteed.
In addition, the system architecture and the service scenario described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not constitute a limitation to the technical solution provided in the embodiment of the present application, and it can be known by a person skilled in the art that the technical solution provided in the embodiment of the present application is also applicable to similar technical problems along with the evolution of the network architecture and the appearance of a new service scenario.
The technical solutions of the present application are described below with several embodiments as examples, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 2 is a schematic flow diagram of a VR live broadcast method provided in an embodiment of the present application, where an execution subject of the embodiment may be an edge cloud platform in the embodiment shown in fig. 1, and as shown in fig. 2, the method may include:
s201: and receiving the compressed and coded VR panoramic video stream pushed by the VR panoramic camera through the 5G access equipment.
Here, the VR panoramic camera may collect the live stream in real time, generate a VR panoramic video stream, compress and encode it, and then push the compression-coded stream to the edge cloud platform through the 5G access device.
In addition, the VR panoramic camera may also push the compression-coded VR panoramic video stream through the 5G access device to a user-plane breakout gateway, which splits the data of the stream, offloads the traffic locally, and then pushes the compression-coded stream on to the edge cloud platform, meeting the application requirements.
S202: and performing FOV panorama slice processing on the VR panorama video stream subjected to compression coding to obtain a VR slice stream.
For example, after receiving the compression-coded VR panoramic video stream pushed by the VR panoramic camera, the edge cloud platform may perform FOV panoramic slice processing on it according to the user's different viewing angles to obtain the VR slice stream, such that the slice for the user's main viewing angle has a first resolution, the slices for the user's left and right viewing angles a second resolution, and the slices for the remaining viewing angles a third resolution, with the first resolution greater than the second and the second greater than the third. For instance, the edge cloud platform cuts the video by viewing angle so that the main-view slice the user actually watches is at 4K resolution, the left and right slices the user habitually turns to are at 2K, and the other slices are at 1080p or lower, which later allows the three-dimensional live scene to be restored more realistically and lets the user better experience the 3D effect and sense of immersion.
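As an illustrative sketch of this three-tier slicing scheme (the tile size, angle thresholds, and function names below are assumptions for illustration, not taken from the patent), the resolution of each slice can be chosen from its angular offset to the user's main viewing angle:

```python
from dataclasses import dataclass


@dataclass
class SliceTile:
    """One FOV slice of the panorama (illustrative structure)."""
    yaw_deg: float      # horizontal centre of the tile within the panorama
    resolution: str     # resolution tier assigned to this tile


def resolution_for_tile(offset_deg: float) -> str:
    """Pick a resolution tier from the tile's angular offset to the main view.

    Mirrors the three-tier scheme described above: the main-view slice gets
    the highest resolution, the left/right slices a middle one, the rest low.
    The 45/110-degree thresholds are assumptions for illustration.
    """
    offset = abs(offset_deg) % 360
    if offset > 180:                 # wrap around: 350 degrees is 10 degrees away
        offset = 360 - offset
    if offset <= 45:
        return "4K"                  # main viewing angle
    if offset <= 110:
        return "2K"                  # habitual left/right viewing angles
    return "1080p"                   # remaining viewing angles


def slice_panorama(main_view_deg: float, tile_step: int = 30) -> list:
    """Split the 360-degree panorama into tiles with a per-tile resolution."""
    return [SliceTile(yaw, resolution_for_tile(yaw - main_view_deg))
            for yaw in range(0, 360, tile_step)]
```

With `main_view_deg=0`, the front tiles come out at 4K while the rear tiles fall to 1080p, so the aggregate bit rate drops without lowering the quality of the picture the user is actually looking at.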
In addition, when the 5G access device performs uplink stream pushing and downlink stream pulling, 5G network slices may be used, for example slice 1 and slice 2 carrying the video streams, so that the latency and bandwidth of the uplink push and the downlink pull are guaranteed.
In the embodiment of the present application, the FOV slicing technology is deployed on the edge cloud. Compared with deployment on a public cloud, the latency is lower, so that when the user turns to switch viewing angles, the picture switches to high definition quickly, improving the user's actual experience.
Moreover, because the embodiment of the present application adopts a slicing technology on top of the 5G + VR live broadcast system, the bandwidth occupied by the VR live video source can be reduced when network congestion occurs, guaranteeing a better live broadcast effect.
S203: after receiving an acquisition request sent by the user terminal through the 5G access device, sending the VR slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the VR slice stream and obtains a 360-degree panoramic rendering based on the decoded VR slice stream.
S204: determining a main viewing angle of the user corresponding to the user terminal, and sending an adjustment instruction to the VR panoramic camera according to the main viewing angle, so that the VR panoramic camera adjusts its camera parameters according to the instruction.
Here, the edge cloud platform can also schedule the switching of viewing angles as the user turns while watching the VR live broadcast: it judges the user's main viewing angle and feeds it back to the VR panoramic camera in time so that the camera parameters are adjusted, guaranteeing the quality of the video stream the user watches.
In addition, the edge cloud platform may further build an adjustment model from the user's habits and preferences, and send adjustment instructions to the VR panoramic camera based on this model to adjust its camera parameters, for example the angle at which the VR panoramic camera captures the live stream.
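A minimal sketch of this main-view judgement and feedback loop follows; the circular-mean estimator, drift threshold, and command format are illustrative assumptions, since the patent does not specify how the main view is judged or how the instruction is encoded:

```python
import math


def estimate_main_view(yaw_samples_deg: list) -> float:
    """Estimate the user's main viewing angle as the circular mean of recent
    head-yaw samples reported by the terminal (an assumed estimator)."""
    s = sum(math.sin(math.radians(a)) for a in yaw_samples_deg)
    c = sum(math.cos(math.radians(a)) for a in yaw_samples_deg)
    return math.degrees(math.atan2(s, c)) % 360


def build_adjust_instruction(main_view_deg: float,
                             camera_yaw_deg: float = 0.0,
                             threshold_deg: float = 15.0):
    """Return an adjustment command for the VR panoramic camera when the main
    view has drifted past a threshold from the camera's current orientation,
    otherwise None (the command dict is a hypothetical format)."""
    # Signed shortest angular distance in (-180, 180].
    drift = (main_view_deg - camera_yaw_deg + 180) % 360 - 180
    if abs(drift) < threshold_deg:
        return None
    return {"cmd": "adjust", "target_yaw_deg": round(main_view_deg, 1)}
```

The threshold keeps the platform from re-aiming the camera on every small head movement; only a sustained turn past it triggers an instruction downlink to the camera.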
In the embodiment of the present application, the edge cloud platform receives the compression-coded VR panoramic video stream pushed by the VR panoramic camera and performs FOV panoramic slice processing on it to obtain a VR slice stream, which reduces the transmission bandwidth actually used and the network bandwidth cost, solving the problem that the transmission bit rate of the VR live video is lowered to reduce bandwidth cost, leaving users with blurry, dizzying pictures. After receiving an acquisition request sent by the user terminal, the edge cloud platform sends the VR slice stream to the user terminal according to the request, so that the user terminal decodes it and obtains a 360-degree panoramic rendering based on the decoded stream. The edge cloud platform also determines the main viewing angle of the user corresponding to the user terminal and adjusts the camera parameters of the VR panoramic camera according to that angle, so that the three-dimensional live scene is restored realistically, the user better experiences the 3D effect and sense of immersion, and the user's freedom in watching the live video and the fluency of direction switching are improved, making the method well suited for practical application. In addition, the embodiment can satisfy different bandwidth and latency requirements and can be accessed at any time in scenarios such as large exhibitions without dedicated-line guarantees, saving time and labor.
In addition, the edge cloud platform in this embodiment may further deploy video processing and rendering to achieve slicing and composite rendering of the VR terminal's video stream at different bit rates, and achieve low-delay rendering by exploiting the localized deployment of the edge cloud and dedicated hardware such as a Graphics Processing Unit (GPU). Fig. 3 is a schematic flow chart of another VR live broadcast method provided in an embodiment of the present application.
As shown in fig. 3, the method includes:
s301: and receiving the compressed and coded VR panoramic video stream pushed by the VR panoramic camera through the 5G access equipment.
S302: and performing FOV panorama slice processing on the VR panorama video stream subjected to compression coding to obtain a VR slice stream.
In steps S301 to S302, refer to the related description of steps S201 to S202, which is not described herein again.
S303: and after receiving an acquisition request sent by the user terminal through the 5G access equipment, sending the VR slice stream to the user terminal according to the acquisition request so that the user terminal decodes the VR slice stream.
S304: and receiving the decoded VR slice stream sent by the user terminal through the 5G access equipment.
S305: and performing synthesis rendering on the decoded VR slice stream, and returning a synthesis rendering result to the user terminal so that the user terminal obtains 360-degree panoramic drawing according to the synthesis rendering result.
By deploying video processing and rendering, the edge cloud platform achieves slicing and composite rendering of the VR terminal's video stream at different bit rates, and achieves low-delay rendering by means of localized edge cloud deployment and dedicated hardware such as a GPU.
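Steps S301 to S305 amount to a slice-then-composite round trip between the edge platform and the terminal. The toy Python sketch below illustrates that round trip with strings standing in for compressed frames; the function names and the equal-width slicing are illustrative assumptions, not the patent's actual codec pipeline.

```python
def fov_slice(frame, n_slices=4):
    """S302 (toy version): split one panorama frame into equal FOV slices."""
    step = len(frame) // n_slices
    return [frame[i * step:(i + 1) * step] for i in range(n_slices)]

def composite_render(decoded_slices):
    """S305 (toy version): composite decoded slices back into one frame."""
    return "".join(decoded_slices)

# End to end: slicing on the edge and compositing after terminal-side
# decoding reconstructs the original frame, so each slice can be sent
# (and encoded) independently without losing the full panorama.
frame = "ABCDEFGH"
assert composite_render(fov_slice(frame)) == frame
```

In the real system each slice would be a separately encoded video tile rather than a substring, but the invariant is the same: slicing must be lossless under composition.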
S306: and determining a main visual angle of a user corresponding to the user terminal, and sending an adjusting instruction to the VR panoramic camera according to the main visual angle of the user, so that the VR panoramic camera adjusts the parameters of the camera according to the adjusting instruction.
Step S306 refers to the related description of step S204, and is not described herein again.
In this embodiment of the application, the edge cloud platform can also deploy video processing and rendering, achieving slicing and composite rendering of the VR terminal's video stream at different bit rates, and achieving low-delay rendering through localized edge cloud deployment and dedicated hardware such as a GPU. Moreover, the edge cloud platform performs FOV panoramic slice processing on the VR panoramic video stream to obtain the VR slice stream, which reduces the transmission bandwidth actually used and the network bandwidth cost, and so addresses the problem that existing approaches lower the transmission bit rate of the VR live video to save bandwidth, leaving the picture unclear and making users dizzy. In addition, the edge cloud platform can determine the main viewing angle of the user corresponding to the user terminal and adjust the camera parameters of the VR panoramic camera accordingly, so that the three-dimensional live scene is faithfully restored, the user better experiences the 3D effect and sense of immersion, and the freedom of viewing live video and the smoothness of direction switching are improved, making the method well suited to practical application.
Corresponding to the VR live broadcast method of the foregoing embodiments, fig. 4 is a schematic structural diagram of a VR live broadcast apparatus provided in an embodiment of the present application. For convenience of explanation, only the portions related to the embodiments of the present application are shown. The VR live broadcast apparatus 40 includes: a receiving module 401, a cutting module 402, a sending module 403 and an adjusting module 404. The VR live broadcast apparatus may be the edge cloud platform itself, or a chip or integrated circuit that implements the functions of the edge cloud platform. It should be noted that the division into receiving, cutting, sending and adjusting modules is only a division of logical functions; physically, these modules may be integrated or independent.
The receiving module 401 is configured to receive the compression-encoded VR panoramic video stream pushed by the VR panoramic camera through a 5G access device.

The cutting module 402 is configured to perform FOV panoramic slice processing on the compression-encoded VR panoramic video stream to obtain a VR slice stream.

The sending module 403 is configured to, after receiving an acquisition request sent by the user terminal through the 5G access device, send the VR slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the VR slice stream and obtains a 360-degree panoramic drawing based on the decoded VR slice stream.

The adjusting module 404 is configured to determine the main viewing angle of the user corresponding to the user terminal, and send an adjustment instruction to the VR panoramic camera according to that viewing angle, so that the VR panoramic camera adjusts its camera parameters according to the instruction.
In a possible implementation manner, the cutting module 402 is specifically configured to:
perform FOV panoramic slice processing on the compression-encoded VR panoramic video stream according to the user's different viewing angles to obtain the VR slice stream, so that the resolution of the VR slice corresponding to the user's main viewing angle is a first resolution, the resolution of the VR slices corresponding to the user's left and right viewing angles is a second resolution, and the resolution of the VR slices corresponding to the remaining viewing angles other than the main and left/right viewing angles is a third resolution, where the first resolution is greater than the second resolution and the second resolution is greater than the third resolution.
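One way to realize this three-tier scheme is to grade each slice by its angular distance from the user's main viewing direction. The Python sketch below is an illustration only: the 45° and 110° thresholds and the 3840/1920/960 resolution values are assumed example numbers, not values specified by the patent.

```python
def slice_resolution(view_deg, main_view_deg, first=3840, second=1920, third=960):
    """Pick the encoding resolution for one FOV slice.

    view_deg: centre of this slice's viewing direction (degrees).
    main_view_deg: the user's current main viewing direction.
    Slices near the main view get the first (highest) resolution,
    slices to the left/right get the second, and the remaining views
    (e.g. behind the user) get the third.
    """
    # Smallest angular distance on the 360-degree circle.
    delta = abs((view_deg - main_view_deg + 180) % 360 - 180)
    if delta <= 45:       # main viewing angle
        return first
    if delta <= 110:      # left/right viewing angles
        return second
    return third          # remaining viewing angles
```

Because only the main-view slice carries the full resolution, the total bit rate actually transmitted is far below that of a uniformly high-resolution panorama, which is the bandwidth saving the text describes.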
In a possible implementation manner, the receiving module 401 is specifically configured to:
receive the compression-encoded VR panoramic video stream pushed by a user-plane offload gateway, where the user-plane offload gateway offloads the data of the compression-encoded VR panoramic video stream pushed through the 5G access device, diverting traffic close to the user.
The apparatus provided in the embodiment of the present application may be used to implement the technical solution of the method embodiment in fig. 2, which has similar implementation principles and technical effects, and is not described herein again in the embodiment of the present application.
Fig. 5 is a schematic structural diagram of another VR live broadcast device provided in an embodiment of the present application. On the basis of fig. 4, the VR live broadcasting device 40 further includes: a rendering module 405.
The sending module 403 is configured to, after receiving an acquisition request sent by the user terminal through the 5G access device, send the VR slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the VR slice stream.

The rendering module 405 is configured to receive the decoded VR slice stream sent by the user terminal through the 5G access device, and to perform composite rendering on the decoded VR slice stream.

The sending module 403 is further configured to return the composite rendering result to the user terminal, so that the user terminal obtains the 360-degree panoramic drawing according to the result.
The apparatus provided in the embodiment of the present application may be used to implement the technical solution of the method embodiment in fig. 3, which has similar implementation principles and technical effects, and is not described herein again in the embodiment of the present application.
Optionally, fig. 6A and 6B each schematically show a basic hardware architecture of a VR live broadcast device described in the present application.
Referring to fig. 6A and 6B, a VR live device includes at least one processor 601 and a communication interface 603. Further optionally, a memory 602 and a bus 604 may also be included.
The VR live broadcast device may have one or more processors 601; fig. 6A and 6B illustrate only one of them. Optionally, the processor 601 may be a Central Processing Unit (CPU), a GPU, or a Digital Signal Processor (DSP). If the VR live broadcast device has multiple processors 601, they may be of different types or of the same type. Optionally, the multiple processors 601 may also be integrated into a multi-core processor.

The memory 602 stores computer instructions and data, including those necessary to implement the VR live broadcast method provided herein; for example, the memory 602 stores instructions for implementing the steps of that method. The memory 602 may be any one or combination of the following storage media: non-volatile memory (e.g., Read-Only Memory (ROM), Solid State Disk (SSD), hard disk (HDD), optical disc) and volatile memory.
The communication interface 603 provides information input/output for the at least one processor, and may further include any one or combination of devices with network access functions, such as a network interface (e.g., an Ethernet interface) or a wireless network card.
Optionally, the communication interface 603 may also be used for data communication between the VR live device and other computing devices or terminals.
Further optionally, the bus 604 is shown as a thick line in fig. 6A and 6B. The bus 604 connects the processor 601 with the memory 602 and the communication interface 603, so that via the bus 604 the processor 601 can access the memory 602 and interact with other computing devices or terminals through the communication interface 603.
In this application, the VR live broadcast device executes the computer instructions in the memory 602, thereby implementing the VR live broadcast method provided by this application, or deploying the VR live broadcast apparatus described above.
From the viewpoint of logical functional division, as shown in fig. 6A, the memory 602 may include the receiving module 401, the cutting module 402, the sending module 403 and the adjusting module 404. Here, "include" merely means that, when executed, the instructions stored in the memory can implement the functions of these modules; it places no limitation on the physical structure.

In one possible design, as shown in fig. 6B, the memory 602 may further include the rendering module 405; again, "include" only means that the stored instructions can implement the rendering module's function when executed, without limiting the physical structure.
In addition, the VR live broadcast device can be implemented by software as shown in fig. 6A and 6B, or can be implemented by hardware as a hardware module or as a circuit unit.
An embodiment of the present application provides a computer-readable storage medium storing computer instructions that instruct a computing device to execute the VR live broadcast method provided herein.
An embodiment of the present application provides a computer program product, which includes computer instructions, where the computer instructions are executed by a processor to perform the VR live broadcasting method provided in the present application.
The present application provides a chip comprising at least one processor and a communication interface providing information input and/or output for the at least one processor. Further, the chip may also include at least one memory for storing computer instructions. The at least one processor is used for calling and executing the computer instructions to execute the VR live broadcast method provided by the application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

Claims (10)

1. A virtual reality live broadcast method, applied to an edge cloud platform, the method comprising:
receiving a compressed and coded virtual reality panoramic video stream pushed by a virtual reality panoramic camera through a fifth generation mobile communication technology 5G access device;
performing field-of-view panoramic slice processing on the compressed and coded virtual reality panoramic video stream to obtain a virtual reality slice stream;
after receiving an acquisition request sent by a user terminal through the 5G access equipment, sending the virtual reality slice stream to the user terminal according to the acquisition request so that the user terminal decodes the virtual reality slice stream, and obtaining 360-degree panoramic drawing based on the decoded virtual reality slice stream;
and determining a main visual angle of a user corresponding to the user terminal, and sending an adjusting instruction to the virtual reality panoramic camera according to the main visual angle of the user, so that the virtual reality panoramic camera adjusts camera parameters according to the adjusting instruction.
2. The method according to claim 1, wherein the performing field-of-view panorama slice processing on the compressed and encoded virtual reality panorama video stream to obtain a virtual reality slice stream comprises:
according to different viewing angles of the user, performing field-of-view panoramic slice processing on the compressed and coded virtual reality panoramic video stream to obtain the virtual reality slice stream, so that the resolution of a virtual reality slice corresponding to a main viewing angle of the user is a first resolution, the resolution of virtual reality slices corresponding to left and right viewing angles of the user is a second resolution, and the resolution of virtual reality slices corresponding to the remaining viewing angles other than the main viewing angle and the left and right viewing angles of the user is a third resolution, wherein the first resolution is greater than the second resolution, and the second resolution is greater than the third resolution.
3. The method of claim 1, further comprising:
receiving the decoded virtual reality slice stream sent by the user terminal through the 5G access device;
and performing synthesis rendering on the decoded virtual reality slice stream, and returning a synthesis rendering result to the user terminal, so that the user terminal obtains the 360-degree panoramic drawing according to the synthesis rendering result.
4. The method of claim 1, wherein the receiving the compressed and encoded virtual reality panorama video stream pushed by the virtual reality panorama camera through the 5G access device comprises:
receiving the compressed and coded virtual reality panoramic video stream pushed by a user-plane offload gateway, wherein the user-plane offload gateway is used for offloading the data of the compressed and coded virtual reality panoramic video stream pushed by the 5G access equipment, diverting traffic close to the user.
5. A virtual reality live broadcast device, characterized in that the device is applied to an edge cloud platform, the device comprising:
the receiving module is used for receiving a compressed and coded virtual reality panoramic video stream pushed by the virtual reality panoramic camera through the 5G access equipment;
the cutting module is used for performing field-of-view panoramic slice processing on the compressed and coded virtual reality panoramic video stream to obtain a virtual reality slice stream;
a sending module, configured to, after receiving an acquisition request sent by the user terminal through the 5G access device, send the virtual reality slice stream to the user terminal according to the acquisition request, so that the user terminal decodes the virtual reality slice stream and obtains a 360-degree panoramic drawing based on the decoded virtual reality slice stream;
and the adjusting module is used for determining a main visual angle of a user corresponding to the user terminal and sending an adjusting instruction to the virtual reality panoramic camera according to the main visual angle of the user so that the virtual reality panoramic camera adjusts the parameters of the camera according to the adjusting instruction.
6. The device according to claim 5, characterized in that the cutting module is specifically configured to:
according to different viewing angles of the user, performing field-of-view panoramic slice processing on the compressed and coded virtual reality panoramic video stream to obtain the virtual reality slice stream, so that the resolution of a virtual reality slice corresponding to a main viewing angle of the user is a first resolution, the resolution of virtual reality slices corresponding to left and right viewing angles of the user is a second resolution, and the resolution of virtual reality slices corresponding to the remaining viewing angles other than the main viewing angle and the left and right viewing angles of the user is a third resolution, wherein the first resolution is greater than the second resolution, and the second resolution is greater than the third resolution.
7. A virtual reality live broadcast system, comprising:
the virtual reality panoramic camera is used for pushing the compressed and coded virtual reality panoramic video stream to the edge cloud platform through the 5G access equipment;
the edge cloud platform is used for performing field-of-view panoramic slice processing on the compressed and coded virtual reality panoramic video stream to obtain a virtual reality slice stream;
the user terminal is used for sending an acquisition request to the edge cloud platform through the 5G access equipment;
the edge cloud platform is used for sending the virtual reality slice stream to the user terminal according to the acquisition request;
the user terminal is used for decoding the virtual reality slice stream and obtaining 360-degree panoramic drawing based on the decoded virtual reality slice stream;
the edge cloud platform is used for determining a main view angle of a user corresponding to the user terminal and sending an adjusting instruction to the virtual reality panoramic camera according to the main view angle of the user;
and the virtual reality panoramic camera is used for adjusting the parameters of the camera according to the adjusting instruction.
8. A virtual reality live device, comprising:
a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1-4.
9. A computer-readable storage medium, characterized in that it stores a computer program that causes a server to execute the method of any one of claims 1-4.
10. A computer program product, comprising computer instructions which, when executed by a processor, perform the method of any one of claims 1-4.
CN202110701479.1A 2021-06-23 2021-06-23 VR live broadcast method, device, system, equipment and storage medium Pending CN113438495A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110701479.1A CN113438495A (en) 2021-06-23 2021-06-23 VR live broadcast method, device, system, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN113438495A true CN113438495A (en) 2021-09-24

Family

ID=77753711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110701479.1A Pending CN113438495A (en) 2021-06-23 2021-06-23 VR live broadcast method, device, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113438495A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101262597A (en) * 2007-03-09 2008-09-10 索尼株式会社 Image display system, device and method, image transmission apparatus and method, and program
CN110351607A (en) * 2018-04-04 2019-10-18 优酷网络技术(北京)有限公司 A kind of method, computer storage medium and the client of panoramic video scene switching
CN111739141A (en) * 2020-08-12 2020-10-02 绿漫科技有限公司 3D cloud rendering method for light terminal
CN112188303A (en) * 2020-09-03 2021-01-05 北京火眼目测科技有限公司 VR (virtual reality) streaming media playing method and device based on visual angle
CN112702522A (en) * 2020-12-25 2021-04-23 李灯 Self-adaptive control playing method based on VR live broadcast system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
成静静 等: "基于5G边缘云的新媒体解决方案", 《数据通信》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116266868A (en) * 2021-12-17 2023-06-20 聚好看科技股份有限公司 Display equipment and viewing angle switching method
CN114727126A (en) * 2022-04-12 2022-07-08 杭州当虹科技股份有限公司 Implementation method for applying image stitching to multi-machine-position VR (virtual reality) broadcasting-directing station
CN114727126B (en) * 2022-04-12 2023-09-19 杭州当虹科技股份有限公司 Implementation method for applying image stitching to multi-machine-position VR (virtual reality) guide broadcasting station
CN115314730A (en) * 2022-08-10 2022-11-08 中国电信股份有限公司 Video streaming transmission method and device applied to virtual reality VR scene

Similar Documents

Publication Publication Date Title
US11381801B2 (en) Methods and apparatus for receiving and/or using reduced resolution images
CN110149542B (en) Transmission control method
CN113347405B (en) Scaling related method and apparatus
CN113438495A (en) VR live broadcast method, device, system, equipment and storage medium
US10897646B2 (en) Video stream transmission method and related device and system
CN106303289B (en) Method, device and system for fusion display of real object and virtual scene
US11153615B2 (en) Method and apparatus for streaming panoramic video
CN108632674A (en) A kind of playback method and client of panoramic video
CN101002471A (en) Method and apparatus to encode image, and method and apparatus to decode image data
US10958950B2 (en) Method, apparatus and stream of formatting an immersive video for legacy and immersive rendering devices
CN111800653B (en) Video decoding method, system, device and computer readable storage medium
US11457053B2 (en) Method and system for transmitting video
KR101922970B1 (en) Live streaming method for virtual reality contents and system thereof
CN115174942A (en) Free visual angle switching method and interactive free visual angle playing system
CN114666565B (en) Multi-view video playing method, device and storage medium
CN115567756A (en) View-angle-based VR video system and processing method
CN112203101B (en) Remote video live broadcast method and device and electronic equipment
CN117596373B (en) Method for information display based on dynamic digital human image and electronic equipment
JP7443536B2 (en) Rank information in immersive media processing
CN115883811A (en) Posture correction method, device and storage medium
CN115706793A (en) Image transmission method, image processing device and image generation system suitable for virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210924