CN110602517A - Live broadcast method, device and system based on virtual environment - Google Patents


Info

Publication number
CN110602517A
Authority
CN
China
Prior art keywords: live, site, virtual, audience, interactive behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910877247.4A
Other languages
Chinese (zh)
Other versions
CN110602517B (en)
Inventor
卓伟彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910877247.4A priority Critical patent/CN110602517B/en
Publication of CN110602517A publication Critical patent/CN110602517A/en
Application granted granted Critical
Publication of CN110602517B publication Critical patent/CN110602517B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a live broadcast method, device, and system based on a virtual environment. In the method, a virtual reality device displays a live virtual environment of a live broadcast site to an off-site audience and captures a first interactive behavior of the off-site audience in the live virtual environment; a projection device projects a live virtual avatar of the off-site audience into the live broadcast site, where the live virtual avatar has a second interactive behavior generated in accordance with the first interactive behavior of the off-site audience. With this scheme, an off-site audience can view a realistic live virtual environment of the live broadcast site while fully and effectively interacting with the live audience and the environment of the live broadcast site.

Description

Live broadcast method, device and system based on virtual environment
Technical Field
The present application relates to the field of virtual environments, and in particular, to a live broadcast method, apparatus, system, computer-readable storage medium, and computer device based on a virtual environment.
Background
With the development of the field of virtual environments, live broadcast technology based on a virtual environment has emerged. This technology mainly uses Virtual Reality (VR) technology to construct a virtual environment of the live site; a user wearing a virtual reality device feels as if placed at the site and can freely switch viewing angles to view the on-site picture from different perspectives, so that the user's viewing effect is the same as being in the on-site environment.
However, with current virtual-environment-based live broadcast technology, a user only views the live picture unilaterally and cannot effectively interact with other users at the scene.
Therefore, the current live broadcast mode has the problem that off-site users cannot effectively interact with on-site users.
Disclosure of Invention
Based on this, it is necessary to provide a live broadcast method, apparatus, system, computer-readable storage medium and computer device based on a virtual environment for the problem that an off-site user cannot effectively interact with an on-site user.
A virtual environment based live method, comprising:
the virtual reality equipment displays a live virtual environment of a live broadcast site to off-site audiences and captures first interactive behaviors of the off-site audiences in the live virtual environment;
the projection equipment projects the live virtual avatar of the off-site audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A virtual environment based live method, comprising:
receiving a first interaction behavior sent by virtual reality equipment; the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences; the first interactive behavior is obtained by capturing the interactive behavior of the off-site audience in the on-site virtual environment by the virtual reality equipment;
generating a live virtual avatar of the off-site audience, and generating a second interactive behavior of the live virtual avatar according to the first interactive behavior of the off-site audience;
and controlling a projection device to project the live virtual avatar with the second interactive behavior to the live broadcast site.
A virtual environment based live method, comprising:
displaying a live virtual environment of a live broadcast site to an off-site audience;
capturing a first interactive behavior of the off-site audience in the live virtual environment, and causing a projection device to project a live virtual avatar with a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A virtual environment based live method, comprising:
receiving projection control information of a server; the projection control information is generated according to a first interaction behavior sent by the virtual reality device received by the server; the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences; the first interactive behavior is obtained by capturing the interactive behavior of the off-site audience in the on-site virtual environment by the virtual reality equipment;
projecting the live virtual avatar of the off-site audience to the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A live system, comprising:
a virtual reality device and a projection device;
the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences and capturing first interactive behaviors of the off-site audiences in the live virtual environment;
the projection equipment is used for projecting the live virtual avatar of the off-site audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A live device, comprising:
the receiving module is used for receiving a first interaction behavior sent by the virtual reality equipment; the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences; the first interactive behavior is obtained by capturing the interactive behavior of the off-site audience in the on-site virtual environment by the virtual reality equipment;
the generation module is used for generating a live virtual avatar of the off-site audience and generating a second interactive behavior of the live virtual avatar according to the first interactive behavior of the off-site audience;
and the control module is used for controlling projection equipment to project the live virtual avatar with the second interactive behavior to the live broadcast site.
A live device, comprising:
the display module is used for displaying a live virtual environment of a live broadcast site to off-site audiences;
the behavior capturing module is used for capturing a first interactive behavior of the off-site audience in the live virtual environment and causing a projection device to project a live virtual avatar with a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A live device, comprising:
the information receiving module is used for receiving the projection control information of the server; the projection control information is generated according to a first interaction behavior sent by the virtual reality device received by the server; the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences; the first interactive behavior is obtained by capturing the interactive behavior of the off-site audience in the on-site virtual environment by the virtual reality equipment;
the avatar projection module is used for projecting the live virtual avatar of the off-site audience to the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of:
displaying a live virtual environment of a live broadcast site to an off-site audience, and capturing a first interactive behavior of the off-site audience in the live virtual environment;
projecting a live virtual avatar of the offsite audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of:
displaying a live virtual environment of a live broadcast site to an off-site audience, and capturing a first interactive behavior of the off-site audience in the live virtual environment;
projecting a live virtual avatar of the offsite audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
According to the above live broadcast method, apparatus, system, computer-readable storage medium, and computer device based on a virtual environment, the virtual reality device displays the live virtual environment of the live broadcast site and captures the first interactive behavior of the off-site audience in the live virtual environment, and the projection device projects a live virtual avatar with the second interactive behavior to the live broadcast site. The off-site audience can thus fully and effectively interact with the live audience and the environment of the live broadcast site while watching a realistic live virtual environment of the site, which solves the problem in traditional live broadcast methods that the off-site audience can only watch the picture of the live broadcast site and cannot interact with it.
Moreover, through the second interactive behavior of the live virtual avatar, the live broadcast method also enables live audiences in the live broadcast site to interact with off-site audiences beyond mere text and voice, which solves the problem in traditional live broadcast methods that live audiences cannot fully interact with off-site audiences.
Likewise, through the second interactive behaviors of their live virtual avatars in the live broadcast site, off-site audiences can interact with one another beyond mere text and voice, which solves the problem in traditional live broadcast methods that off-site audiences cannot fully interact within the live broadcast site.
Drawings
FIG. 1 is a diagram of an application environment of a virtual environment based live method in one embodiment;
FIG. 2 is a flow diagram illustrating a virtual environment-based live broadcast method, according to an embodiment;
FIG. 3 is a diagram illustrating a scene of a live broadcast site in accordance with an embodiment;
FIG. 4 is a diagram of a live virtual environment, in one embodiment;
FIG. 5A is a diagram illustrating the interactive behavior of a live virtual avatar, in accordance with one embodiment;
FIG. 5B is a second schematic diagram illustrating the interactive behavior of a live virtual avatar in one embodiment;
FIG. 6A is a diagram illustrating interaction in a live broadcast site, in accordance with one embodiment;
FIG. 6B is a second schematic diagram illustrating interaction in a live broadcast site according to an embodiment;
FIG. 6C is a third diagram illustrating interaction in a live broadcast site in accordance with an embodiment;
FIG. 7A is a diagram illustrating an interaction of an off-site viewer in a live scene, in accordance with one embodiment;
FIG. 7B is a second illustration of an off-site viewer interacting in a live scene in one embodiment;
FIG. 8A is a diagram illustrating one embodiment of a virtual avatar position-based projection;
FIG. 8B is a second illustration of a virtual avatar position based projection in one embodiment;
FIG. 8C is a third diagram illustrating projection based on a virtual avatar position in one embodiment;
FIG. 9 is a schematic diagram of a mobile live virtual avatar in one embodiment;
FIG. 10 is a diagram of a display interface of a virtual reality device, under an embodiment;
FIG. 11 is a flow diagram of another virtual environment-based live method in one embodiment;
FIG. 12 is a flowchart illustrating a virtual environment-based live broadcasting method according to an embodiment;
FIG. 13 is a flowchart illustrating a virtual environment-based live broadcasting method according to an embodiment;
fig. 14A is a block diagram of a live system in one embodiment;
FIG. 14B is a block diagram of an alternate embodiment of a live system;
fig. 14C is a diagram illustrating an application scenario of a live broadcast system in an embodiment;
FIG. 15 is a timing diagram of a particular live process in one embodiment;
FIG. 16 is a block diagram of a live device in one embodiment;
FIG. 17 is a block diagram of an alternate embodiment of a live device;
FIG. 18 is a block diagram of an alternate embodiment of a live device;
FIG. 19 is a block diagram showing the construction of a computer device according to one embodiment;
FIG. 20A is a block diagram of a data sharing system, according to an embodiment;
FIG. 20B is a diagram illustrating a data structure of a blockchain according to one embodiment;
FIG. 20C is a diagram of a new block generation process, according to one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is an application environment diagram of a live broadcast method based on a virtual environment in an embodiment. Referring to fig. 1, the live broadcasting method may be applied to a live broadcasting system. The live system may include a virtual reality device 110 and a projection device 120.
The virtual reality device 110 may have a display function for a virtual environment, that is, a function of displaying virtual environment pictures based on virtual reality technology. Virtual reality technology may acquire three-dimensional data of a real environment and construct a virtual environment based on that data. Through the virtual reality device 110, a user can view pictures of a real environment within a virtual environment, and can adjust the viewing angle in the virtual environment to see the real environment from different angles. The virtual reality device 110 may specifically be a wearable device such as VR glasses or a VR helmet.
Further, the virtual reality device 110 may also have functionality to capture user behavior in various ways.
For example, the virtual reality device 110 may be worn on the user's head; a built-in camera can capture the user's face, and the captured facial pictures can be recognized to obtain the user's real-time facial expression.
For another example, the virtual reality device 110 may include a sensing handle. The user holds the handle while performing body motions such as waving a hand; the handle detects a moving direction and moving speed through a built-in gyroscope and treats them as the moving direction and speed of the user's hand, thereby capturing the user's body motion in real time.
For another example, the virtual reality device 110 may capture body movements through an optical sensor. More specifically, a laser emitter scans laser light across the specific space in which the user is located, a laser receiver in the optical sensor senses the received light, and the position at which the laser reaches the receiver is used to locate the user's body, thereby capturing the user's body movements in real time.
For another example, the virtual reality device 110 may capture the user's voice through a built-in voice capture device, thereby capturing the user's real-time voice.
Of course, the virtual reality device 110 may capture the behavior of the user in other manners, and the specific implementation manner of capturing the behavior of the user is not limited in the embodiment of the present application.
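The sensing-handle example above, which derives hand motion from a moving direction and moving speed, can be sketched as simple dead reckoning. The following Python sketch is illustrative only; the sampling format, the fixed sampling interval, and the function name are assumptions, not part of the patent.

```python
import math

def integrate_handle_motion(samples, dt):
    """Dead-reckon a 2-D hand trajectory from (heading_degrees, speed) samples
    reported by the handle's gyroscope at a fixed interval dt (seconds).
    Heading and speed are assumed units for illustration."""
    x = y = 0.0
    path = []
    for heading_deg, speed in samples:
        rad = math.radians(heading_deg)
        x += speed * math.cos(rad) * dt  # displacement along the reported heading
        y += speed * math.sin(rad) * dt
        path.append((x, y))
    return path

# Ten samples moving along heading 0 degrees at 1 m/s, 0.1 s apart -> about 1 m of travel.
path = integrate_handle_motion([(0.0, 1.0)] * 10, dt=0.1)
```

A real handle would of course fuse gyroscope data with other sensors to limit drift; this sketch only illustrates the direction-plus-speed idea described above.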
The projection device 120 may be a device having a projection function. The projection device 120 may project a planar image on a planar projection panel.
The projection device 120 may also project three-dimensional stereoscopic images on a designated space through a stereoscopic projection technique. The stereoscopic projection technique is a technique for recording and reproducing a real three-dimensional image of an object using the principles of interference and diffraction. The stereoscopic image is actually a planar image in space, and a user can see a three-dimensional stereoscopic effect with naked eyes, so that the stereoscopic image is also called a pseudo-holographic image.
The projection device 120 may also project a three-dimensional hologram on a designated space through a holographic projection technique. The holographic projection technique is a technique of forming a hologram image on a holographic medium in a space using laser emission. The hologram is a three-dimensional stereoscopic image in space.
As shown in fig. 2, in one embodiment, a virtual environment based live method is provided. The present embodiment is mainly illustrated by applying the method to the live broadcast system in fig. 1. Referring to fig. 2, the live broadcasting method specifically includes the following steps:
s202, displaying a live virtual environment of a live broadcast site to an off-site audience by virtual reality equipment, and capturing a first interactive behavior of the off-site audience in the live virtual environment.
The live virtual environment may be a virtual environment that reproduces the real scene of the live broadcast site.
The first interactive behavior may be a behavior of a user interacting with the live virtual environment.
In one embodiment, the first interactive behavior may comprise at least one of a limb movement, a facial expression, a sound. The first interactive behavior may be a limb action of the off-site audience, for example, a hand waving, a palm waving, etc. of the off-site audience, which may be the limb action. The first interactive behavior may be a facial expression of the off-site viewer, for example, an expression of smiling, surprise, etc. of the off-site viewer's face, which may be a facial expression. The first interactive action may also be a sound made by an off-board audience, e.g., a voice spoken by the off-board audience, a clapping sound.
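The three behavior categories above can be modeled with a small data structure. The following Python sketch is illustrative only; the class names, payload fields, and timestamp convention are assumptions rather than anything specified by the patent.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
import time

class BehaviorKind(Enum):
    """Categories of first interactive behavior named in the embodiment."""
    LIMB_MOVEMENT = auto()      # e.g. waving a hand, clapping
    FACIAL_EXPRESSION = auto()  # e.g. smiling, surprise
    SOUND = auto()              # e.g. speech, applause

@dataclass
class InteractiveBehavior:
    kind: BehaviorKind
    payload: dict  # sensor-specific data: gesture name, expression label, audio chunk, ...
    timestamp: float = field(default_factory=time.time)

# An off-site viewer waving a hand might be captured as:
wave = InteractiveBehavior(BehaviorKind.LIMB_MOVEMENT, {"gesture": "wave", "hand": "right"})
```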
It should be noted that the live broadcast method of this embodiment may be applied to scenes that require live broadcasting, such as concerts, meetings, and classrooms. Such scenes generally involve live viewers at the live broadcast site and off-site viewers who watch the site over a network. In traditional live broadcasting, off-site viewers are limited to interacting with live viewers through text and voice, and sufficient interaction is difficult. Applying the live broadcast method of this embodiment to such scenes lets off-site viewers view a realistic live virtual environment of the live broadcast site while fully and effectively interacting with the live viewers and the real environment of the site.
Fig. 3 is a schematic view of a live broadcast site according to an embodiment, and referring to fig. 3, a plurality of live viewers 303 are included in a live broadcast site, and the live viewers 303 watch a program performance in the live broadcast site. In addition, off-site viewers 301 outside the live scene can view the live virtual environment of the live scene through the virtual reality device 110.
In a specific implementation, a plurality of panoramic camera devices may be arranged in a live broadcast site, the panoramic camera devices shoot the live broadcast site at different viewing angles to obtain live broadcast site pictures at a plurality of viewing angles, and the shot live broadcast site pictures are sent to a server for generating a live virtual environment, and for the purpose of distinguishing descriptions, the server is named as a virtual environment synthesis server.
The virtual environment synthesis server may receive the live broadcast site pictures of multiple viewing angles shot by each panoramic camera device and synthesize them, through a three-dimensional modeling algorithm, into a live virtual environment composed of pictures from multiple different viewing angles. The server may then generate the picture viewed in the live virtual environment according to the viewing angle selected by the off-site viewer and send it, as the above-mentioned live virtual environment, to the virtual reality device 110.
The virtual reality device 110 may receive the live virtual environment transmitted by the virtual environment composition server and display the live virtual environment to the off-site viewer 301, so that the off-site viewer 301 may view the live virtual environment of the live broadcast through the virtual reality device 110.
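One small part of the synthesis server's work described above, serving the feed that best matches the viewing angle selected by the off-site viewer, can be sketched as a nearest-angle lookup. This is a hypothetical simplification: the patent's server synthesizes views with a full three-dimensional modeling algorithm rather than picking a single camera.

```python
def nearest_camera(camera_angles, requested_angle):
    """Return the index of the panoramic camera whose mounting angle (degrees)
    is closest to the requested viewing angle, measured on a 360-degree circle."""
    def circular_distance(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(range(len(camera_angles)),
               key=lambda i: circular_distance(camera_angles[i], requested_angle))

# Four cameras mounted at 0/90/180/270 degrees; a viewer looking toward
# 350 degrees is served by the camera at 0 degrees (10 degrees away).
idx = nearest_camera([0.0, 90.0, 180.0, 270.0], 350.0)
```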
On the basis of fig. 3, fig. 4 shows a schematic diagram of a live virtual environment of an embodiment. Referring to fig. 4, an off-site audience 301 views the environment of a live scene and a plurality of live audiences 303 in the live scene in a live virtual environment displayed by the virtual reality device 110.
The virtual reality device 110 can detect operations by which the off-site audience 301 adjusts the viewing angle, such as raising or turning the head, and correspondingly display the live virtual environment from the new viewing angle. The off-site audience 301 can thus freely change the angle from which the live virtual environment is watched, so that the viewing effect is consistent with watching in the live broadcast site.
The virtual reality device 110 may also capture a first interactive behavior of the offsite audience 301. When watching the live virtual environment of the live broadcast site, the off-site audience 301 may make various interactive behaviors such as body movements and facial expressions according to people or objects in the live virtual environment, and the virtual reality device 110 may capture the interactive behavior made by the off-site audience 301 as the first interactive behavior.
For example, the off-site audience 301 may smile when watching a show through the virtual reality device 110, and the virtual reality device 110 may capture this smiling facial expression. For another example, the off-site audience 301 may feel the hot atmosphere of the live broadcast and clap, and the virtual reality device 110 may capture this clapping limb movement. For another example, the off-site audience 301 may see a live audience member 303 in the live scene and wave to communicate with them, and the virtual reality device 110 may capture this hand-waving limb movement. For another example, the off-site audience 301 may shout "wonderful", clap, and laugh when seeing a wonderful program performance.
S204, the projection device projects the live virtual avatar of the off-site audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated based on the first interactive behavior of the off-site audience.
The live virtual avatar may be an image of the avatar representing the off-site audience projected to the live broadcast site through the projection device 120. The image form of the live virtual avatar can be specifically a stereoscopic image or a holographic image.
In a specific implementation, after capturing a first interactive behavior of the off-site audience 301, the virtual reality device 110 may generate corresponding interactive behavior data according to the first interactive behavior, and send the interactive behavior data to a server for controlling the projection device 120 to perform projection, where the server is named as a virtual avatar server for distinguishing and explaining. The virtual avatar server generates a live virtual avatar of the off-site audience and a second interactive behavior of the live virtual avatar according to the interactive behavior data, and controls the projection device 120 to project the live virtual avatar performing the second interactive behavior in the live broadcast site.
In practical applications, the virtual reality device 110 may also directly send the interactive behavior data to the projection device 120, and the projection device 120 generates a live virtual avatar of the off-site audience and a second interactive behavior of the live virtual avatar according to the interactive behavior data.
For example, referring to fig. 3, the projection device 120 projects a live virtual avatar 302 of the off-site audience 301 in the live scene, and the live virtual avatar 302 is visible to a live audience 303 in the live scene.
In practice, the second interactive behavior of the live virtual avatar 302 projected by the projection device 120 may be consistent with the first interactive behavior of the off-site audience 301 captured by the virtual reality device 110.
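This capture-then-mirror path can be sketched roughly as below: a headset sends behavior data, and an avatar server turns it into a projection command whose second behavior simply mirrors the first. All class, field, and method names here are hypothetical illustrations; the patent does not define a concrete data format or API.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical message format for the captured first interactive behavior.
@dataclass
class InteractiveBehaviorData:
    viewer_id: str
    facial_expression: str = ""   # e.g. "smile"
    limb_movement: str = ""       # e.g. "wave", "clap"

class VirtualAvatarServer:
    """Turns a captured first interactive behavior into the avatar's
    second interactive behavior and queues it for the projection device."""

    def __init__(self) -> None:
        self.projection_queue: List[Dict[str, str]] = []

    def on_behavior(self, data: InteractiveBehaviorData) -> Dict[str, str]:
        # Per the text, the second behavior may be consistent with the first,
        # so the command simply mirrors the captured expression and movement.
        command = {
            "viewer_id": data.viewer_id,
            "avatar_expression": data.facial_expression,
            "avatar_movement": data.limb_movement,
        }
        self.projection_queue.append(command)  # later sent to the projector
        return command

server = VirtualAvatarServer()
cmd = server.on_behavior(InteractiveBehaviorData("viewer-301", "smile", "wave"))
print(cmd["avatar_expression"], len(server.projection_queue))  # smile 1
```

The same `on_behavior` shape would also cover the direct VR-device-to-projector path mentioned above, with the projection device rather than a server consuming the data.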
For example, fig. 5A is a schematic diagram illustrating the interactive behavior of a live virtual avatar according to an embodiment. The virtual reality device 110 may capture the laughing facial expression of the off-site audience 301, the live virtual avatar 302 projected by the projection device 120 may show the same laughing facial expression, and the live audience 303 may view the laughing facial expression of the live virtual avatar 302.
For another example, fig. 5B shows a second schematic diagram of the interactive behavior of a live virtual avatar according to an embodiment. The virtual reality device 110 may capture the facial expression and waving limb movement of the off-site audience 301, the live virtual avatar 302 projected by the projection device 120 may show the same facial expression and waving limb movement, and the live audience 303 may view the facial expression and waving limb movement of the live virtual avatar 302.
In practical applications, the second interactive behavior of the live virtual avatar 302 may be synchronized with the first interactive behavior of the off-site viewer 301, so as to ensure real-time interaction between the off-site viewer 301 and other live viewers 303 in the live broadcast.
In one embodiment, the virtual avatar projected on the live broadcast site by the projection device 120 may be a stereoscopic image and/or a holographic image. For example, the projection device 120 may be embodied as a stereoscopic projection device, which generates a three-dimensional stereoscopic image from the image of the off-site audience 301 using a stereoscopic projection technique and projects it into a specified space of the live broadcast site; the live audience 303 can then see the live virtual avatar 302 displayed as a three-dimensional stereoscopic image in that space.
In practice, the image of the live virtual avatar 302 may be the real image of the off-site audience 301. Specifically, the virtual reality device 110 may photograph the off-site audience 301 and generate, from the photographed image, an image reflecting the appearance, height, body type, etc. of the off-site audience as the image of the live virtual avatar 302. Of course, the image of the live virtual avatar 302 may also be an animated image generated according to the real image of the off-site audience 301; the present embodiment does not limit the image of the live virtual avatar.
Through the live virtual environment of the live scene displayed by the virtual reality device 110, the capture of the first interactive behavior of the off-site audience 301 by the virtual reality device 110, and the projection by the projection device 120 of the live virtual avatar 302 performing the second interactive behavior, the off-site audience 301 can interact with the live audience 303 at the live broadcast site via the live virtual avatar 302.
Fig. 6A is one of the schematic diagrams of interaction in a live scene of an embodiment. Referring to fig. 6A, after a live audience 303 sees a live virtual avatar 302 projected in a live scene by the projection device 120, the live audience 303 may smile to the live virtual avatar 302.
Fig. 6B is a second schematic diagram of interaction in a live broadcast site according to an embodiment. Referring to fig. 6B, an off-site audience 301 views a live virtual environment of a live scene through the virtual reality device 110, and can see a live audience 303 in the live scene smiling thereto.
Fig. 6C is a third schematic diagram of a live audience interacting with an off-site audience in a live scene according to an embodiment. Referring to fig. 6C, the off-site audience 301 may make a smiling facial expression in return; the virtual reality device 110 captures this first interactive behavior of smiling, and the projection device 120 projects the live virtual avatar 302 in the live scene performing the corresponding second interactive behavior of smiling. In this manner, the live audience and the off-site audience interact at the live broadcast site.
Through the live virtual environment of the live scene displayed by the virtual reality device 110, the capture of the first interactive behavior of the off-site audience 301 by the virtual reality device 110, and the projection by the projection device 120 of the live virtual avatar 302 performing the second interactive behavior, the off-site audience 301 can also interact with other off-site audiences.
Fig. 7A is one of the schematic diagrams of interaction between off-site viewers in a live scene according to an embodiment. Referring to fig. 7A, an off-site audience 301 and an off-site audience 304 exist outside the live scene, viewing the live virtual environment of the live scene through the virtual reality device 110 and the virtual reality device 130, respectively. The projection device 120 may project the live virtual avatars of the off-site audience 301 and the off-site audience 304 to the live scene. The off-site audience 301 may view the live virtual avatar 305 of the off-site audience 304 in the live virtual environment displayed by the virtual reality device 110, and the off-site audience 304 may likewise view the live virtual avatar 302 of the off-site audience 301 in the live virtual environment displayed by the virtual reality device 130.
Fig. 7B is a second illustration of interaction between off-site viewers in a live scene according to an embodiment. Referring to fig. 7B, the off-site audience 301 may perform a first interactive behavior of smiling at the live virtual avatar 305 of the off-site audience 304; the virtual reality device 110 captures this behavior, and the live virtual avatar 302 of the off-site audience 301 performs the corresponding second interactive behavior of smiling. The off-site audience 304 may see the live virtual avatar 302 smiling in the live virtual environment and may then perform a first interactive behavior of waving, which the virtual reality device 130 captures, so that the live virtual avatar 305 of the off-site audience 304 performs the corresponding second interactive behavior of waving. The off-site audience 301 may then see the live virtual avatar 305 waving in the live virtual environment displayed by the virtual reality device 110. In this manner, off-site audiences interact with each other at the live broadcast site.
In the live broadcast method, the virtual reality equipment is used for displaying the live virtual environment of the live broadcast site, the first interactive behavior of the off-site audience in the live virtual environment is captured, the projection equipment is used for projecting the live virtual avatar with the second interactive behavior to the live broadcast site, so that the off-site audience can fully and effectively interact with the live audience in the live broadcast site and the environment of the live broadcast site while watching the real live virtual environment of the live broadcast site, and the problem that the off-site audience can only watch the picture of the live broadcast site and cannot interact with each other in the traditional live broadcast method is solved.
Moreover, the live broadcasting method also enables live audiences at the live broadcast site to interact with off-site audiences through the second interactive behavior of the live virtual avatar, without being limited to text and voice interaction, which solves the problem that live audiences cannot fully interact with off-site audiences in the traditional live broadcast method.
Moreover, the live broadcasting method also enables off-site audiences to interact with one another at the live broadcast site through the second interactive behaviors of their live virtual avatars, without being limited to text and voice interaction, which solves the problem that off-site audiences cannot fully interact at the live broadcast site in the traditional live broadcast method.
In an embodiment, the off-site audience has a corresponding virtual avatar position at the live broadcast site, and the live virtual environment is generated according to the picture viewed by the live virtual avatar at the virtual avatar position. In this case, step S202 may specifically include: projecting, by the projection device, the live virtual avatar of the off-site audience at the virtual avatar position in the live broadcast site.
The virtual avatar position may be a position for projecting a live virtual avatar in a live broadcast site.
In a specific implementation, a position may be preset at the live broadcast site as the virtual avatar position, and the virtual reality device 110 may display to the off-site audience 301 a live virtual environment generated according to the picture viewed by the live virtual avatar at that position. More specifically, after the virtual avatar position is determined, the virtual environment synthesis server may synthesize the live virtual environment from the perspective of the virtual avatar position and transmit it to the virtual reality device 110. The live virtual environment displayed by the virtual reality device 110 is therefore exactly the picture viewed by the live virtual avatar at the virtual avatar position. When projecting, the projection device 120 may project based on the virtual avatar position, so that the live virtual avatar is projected at the virtual avatar position of the live broadcast site.
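As a rough illustration of why the rendering viewpoint matters, the picture synthesized for the off-site audience can be modeled as whatever falls inside the avatar's field of view at the virtual avatar position. The geometry, field-of-view model, and names below are illustrative assumptions, not part of the patent.

```python
import math

def visible_from(avatar_pos, facing_deg, fov_deg, objects):
    """Return the names of scene objects inside the avatar's field of view.

    avatar_pos: (x, y) virtual avatar position at the live broadcast site
    facing_deg: direction the avatar faces, in degrees
    fov_deg:    total field-of-view angle, in degrees
    objects:    {name: (x, y)} positions of things at the live site
    """
    seen = []
    for name, (x, y) in objects.items():
        angle = math.degrees(math.atan2(y - avatar_pos[1], x - avatar_pos[0]))
        delta = (angle - facing_deg + 180) % 360 - 180  # signed angular offset
        if abs(delta) <= fov_deg / 2:
            seen.append(name)
    return seen

# A waving live audience sits directly ahead; the stage is off to the side.
scene = {"live_audience_303": (5.0, 0.0), "stage": (0.0, 5.0)}
print(visible_from((0.0, 0.0), 0.0, 90.0, scene))  # ['live_audience_303']
```

Rendering from any other position would change the returned set, which is exactly the mismatch problem discussed below for positions A and B.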
Fig. 8A is one of schematic diagrams of a projection apparatus for projecting based on a virtual avatar position according to an embodiment. Referring to fig. 8A, the virtual avatar position 306 of the off-site audience 301 can be set in the live broadcast site, so that it can be determined that the viewing angle of the live virtual avatar at the virtual avatar position 306 is as shown by the dotted arrow in the figure, and the picture viewed by the live virtual avatar at the virtual avatar position 306 contains the live audience 303 whose hands are waving. From this perspective, a corresponding live virtual environment can be generated.
Fig. 8B is a second schematic diagram of a projection apparatus for projecting based on a virtual avatar position according to an embodiment. Referring to fig. 8B, the view of the off-site audience 301 through the virtual reality device 110 is identical to the view of the live virtual avatar at the virtual avatar position 306, and includes the live audience 303 waving his/her hands.
Fig. 8C is a third schematic diagram of a projection apparatus for performing projection based on a virtual avatar position according to an embodiment. Projection device 120 may project live virtual avatar 302 of offsite audience 301 at virtual avatar position 306.
It should be noted that if the live virtual environment seen by the off-site audience 301 is not the picture viewed by the live virtual avatar at the virtual avatar position, the interaction may be ineffective. For example, suppose the picture seen by the off-site audience 301 is the picture viewed by the live virtual avatar at virtual avatar position A, while the live virtual avatar is actually projected at virtual avatar position B. When the off-site audience performs an interactive behavior with respect to the picture viewed at position A, the projected live virtual avatar performs the interactive behavior at position B, and the interaction therefore cannot be carried out effectively.
According to the live broadcast method, the live virtual avatar is projected at the virtual avatar position in the live broadcast site, so that the picture actually watched by the off-site audience is consistent with the picture viewed by the live virtual avatar at the virtual avatar position, which solves the problem of ineffective interaction caused by the picture watched by the off-site audience not being the picture viewed by the live virtual avatar at the virtual avatar position.
In an embodiment, the live broadcasting method may further include:
the virtual reality equipment captures the movement behavior of off-site audiences; the movement behavior is used to adjust the virtual avatar position of the off-site audience in the live scene.
The movement behavior may be a positional movement of the off-site audience, for example, the off-site audience moving forward.
In a specific implementation, an orientation-sensing device such as a gyroscope may be built into the virtual reality device 110. When the off-site audience 301 moves while wearing the virtual reality device 110, the virtual reality device 110 may sense the direction and speed of the movement through the gyroscope as the movement behavior described above. Corresponding movement behavior data may then be generated from the movement behavior captured by the virtual reality device 110.
According to the movement behavior data, the virtual avatar position can be adjusted, so that the live virtual avatar 302 is controlled to move in the live broadcast site. For example, the projection device 120 may be moved based on the movement behavior data, thereby adjusting the virtual avatar position. For another example, the projection device 120 may be controlled to adjust the position of the projection according to the movement behavior data, so as to adjust the virtual avatar position.
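The position update itself can be sketched as simple dead reckoning from the sensed direction and speed. The function name, coordinate convention, and units below are assumptions made for illustration only.

```python
import math

def apply_movement(position, heading_deg, speed, dt):
    """Dead-reckon a new virtual avatar position from one movement sample.

    position:    current (x, y) virtual avatar position
    heading_deg: sensed movement direction, in degrees
    speed:       sensed movement speed, in units per second
    dt:          duration of the sample, in seconds
    """
    x, y = position
    rad = math.radians(heading_deg)
    return (x + speed * dt * math.cos(rad), y + speed * dt * math.sin(rad))

pos = (0.0, 0.0)
pos = apply_movement(pos, 0.0, 1.0, 0.5)  # viewer steps forward for 0.5 s
print(pos)  # (0.5, 0.0)
```

Whether the adjustment is realized by moving the projection device 120 or by re-aiming its projection, the same updated position would drive both the projection and the re-rendered live virtual environment.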
Fig. 9 is a schematic diagram of a mobile live virtual avatar of an embodiment. Referring to fig. 9, the off-site audience 301 moves forward, the virtual reality device 110 captures the movement behavior of the off-site audience 301 moving forward, and the projection device 120 adjusts the virtual avatar position accordingly according to the movement behavior, thereby moving the live virtual avatar 302 forward.
In practical applications, when the virtual reality device 110 captures the movement behavior of the off-site audience 301, the live virtual environment can be changed accordingly according to the movement behavior. For example, the off-site audience 301 moves forward and the live virtual environment may be transformed into what the live virtual avatar 302 would see after moving forward in the live scene.
In an embodiment, when the first interactive behavior includes sound, the live broadcasting method may further include:
playing the sound of the off-site audience at the live broadcast site.
In a specific implementation, the virtual reality device 110 may collect sounds emitted by the off-site audience 301 in the on-site virtual environment through a built-in sound collection device. A voice player may be provided in the live scene to play the sound of the off-site audience 301 captured by the virtual reality device 110 as the sound emitted by the live virtual avatar. In practical applications, the projection device 120 with the sound playing function may also play the sound of the off-site audience 301 as the sound of the live virtual avatar.
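A toy sketch of this audio path is below: sound captured from the off-site audience is forwarded to a player at the live broadcast site and played back as the avatar's voice. `LiveSitePlayer` and `forward_audio` are stand-ins; real sound collection and playback hardware interfaces are not specified in the text.

```python
from typing import List

class LiveSitePlayer:
    """Stand-in for a voice player deployed at the live broadcast site."""

    def __init__(self) -> None:
        self.played: List[bytes] = []

    def play(self, pcm: bytes) -> None:
        # A real implementation would hand the buffer to an audio device.
        self.played.append(pcm)

def forward_audio(chunks: List[bytes], player: LiveSitePlayer) -> None:
    """Forward captured off-site audience audio chunks to the site player."""
    for chunk in chunks:
        player.play(chunk)

player = LiveSitePlayer()
forward_audio([b"\x00\x01", b"\x02\x03"], player)
print(len(player.played))  # 2
```

The same forwarding shape would apply whether playback happens on a dedicated voice player or on a projection device 120 with a sound playing function.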
In the live broadcasting method, the sound of the off-site audience is captured through the virtual reality equipment and is played in the live broadcasting site, so that the off-site audience can interact in the live broadcasting site in a sound mode.
In an embodiment, the live broadcasting method may further include:
the virtual reality equipment displays a behavior virtual reality animation of a live virtual avatar to off-site audiences; and the behavior virtual reality animation is generated according to the second interactive behavior of the live virtual avatar.
The behavior virtual reality animation can be a virtual reality animation reflecting the second interactive behavior of the live virtual avatar.
In a specific implementation, the virtual avatar server may generate a behavior virtual reality animation for a second interactive behavior of the live virtual avatar.
The behavior virtual reality animation is used for displaying a second interactive behavior of the live virtual avatar. For example, the off-site audience makes a first interactive behavior of waving hands, the live virtual avatar correspondingly makes a second interactive behavior of waving hands, and the behavior virtual reality animation shows the live virtual avatar to make the second interactive behavior of waving hands.
For wearable virtual reality devices such as VR glasses and VR helmets, the off-site audience cannot see their own interactive behavior, nor the second interactive behavior of their live virtual avatar in the live virtual environment; there may thus be a problem that the second interactive behavior of the live virtual avatar does not meet the expectations of the off-site audience.
According to the live broadcast method, the behavior virtual reality animation is displayed through the virtual reality device, so that the off-site audience can watch the second interactive behavior of the live virtual avatar and adjust their own first interactive behavior according to the behavior virtual reality animation, thereby adjusting the second interactive behavior of the live virtual avatar at the live broadcast site. This solves the problem that the second interactive behavior of the live virtual avatar does not meet the expectations of the off-site audience.
In practical applications, the behavior virtual reality animation displayed on the virtual reality device may also be generated according to a second interactive behavior of a live virtual avatar of another offsite audience. Therefore, the off-site audiences can interact with each other through the behavior virtual reality animation displayed by the virtual reality equipment, and the off-site audiences can more effectively interact with each other.
It should be noted that the live virtual environment and the behavior virtual reality animation can be displayed simultaneously on the display interface of the virtual reality device 110. Fig. 10 is a schematic diagram of a display interface of a virtual reality device according to an embodiment. Referring to fig. 10, the display interface of the virtual reality device 110 may display the live virtual environment on the right side, which may include the live audience 303 and a live virtual avatar 307 of another off-site audience, and display on the left side the behavior virtual reality animation 308 of the off-site audience's own live virtual avatar and the behavior virtual reality animation 309 of the other off-site audience's live virtual avatar.
As shown in fig. 11, in an embodiment, a live broadcast method based on a virtual environment is provided, and this embodiment is mainly illustrated by applying the method to the virtual reality device 110 in fig. 1. Referring to fig. 11, the live broadcasting method specifically includes the following steps:
S1102, displaying a live virtual environment of a live broadcast site to an off-site audience;
S1104, capturing a first interactive behavior of the off-site audience in the live virtual environment, so that a projection device projects the live virtual avatar having a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
In specific implementation, the virtual reality device 110 may receive the live virtual environment sent by the virtual environment synthesis server, and display the live virtual environment to the off-site audience 301, and the virtual reality device 110 may further capture a first interaction behavior that the off-site audience 301 does when watching the live virtual environment, so that the projection device 120 projects the live virtual avatar of the off-site audience to a live broadcast site, where the live virtual avatar has a second interaction behavior, and the second interaction behavior of the live virtual avatar is generated according to the first interaction behavior of the off-site audience.
Since the processing procedure of the virtual reality device 110 has been described in detail in the foregoing embodiments, it is not described herein again.
In the live broadcast method, the virtual reality equipment is used for displaying the live virtual environment of the live broadcast site, the first interactive behavior of the off-site audience to the live virtual environment is captured, the projection equipment is used for projecting the live virtual avatar of the off-site audience with the second interactive behavior to the live broadcast site, so that the off-site audience can fully and effectively interact with the live audience in the live broadcast site and the environment of the live broadcast site while watching the real live virtual environment of the live broadcast site, and the problem that the off-site audience can only watch the picture of the live broadcast site and cannot interact with the live broadcast site in the traditional live broadcast method is solved.
As shown in fig. 12, in an embodiment, a virtual environment-based live broadcast method is provided, and this embodiment is mainly illustrated by applying the method to the projection device 120 in fig. 1. Referring to fig. 12, the live broadcasting method specifically includes the following steps:
S1202, receiving projection control information from a server; the projection control information is generated according to a first interactive behavior sent by the virtual reality device and received by the server; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
S1204, projecting the live virtual avatar of the off-site audience to the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
In a specific implementation, the virtual avatar server may receive a first interaction captured by the virtual reality device 110 to an off-site audience. And the virtual avatar server generates a live virtual avatar of the off-site audience, and generates a second interactive behavior of the live virtual avatar according to the first interactive behavior. The virtual avatar server may control the projection device 120 to project the live virtual avatar performing the second interactive behavior in the live broadcast site by sending the projection control information. When receiving the projection control information sent by the virtual avatar server, the projection device 120 may project the live virtual avatar performing the second interactive behavior in the live broadcast site according to the projection control information. Since the processing procedure of the projection device 120 has been described in detail in the foregoing embodiments, it is not described herein again.
In the live broadcast method, the virtual reality equipment is used for displaying the live virtual environment picture of the live broadcast site, the first interactive behavior of the off-site audience in the live virtual environment is captured, the projection equipment is used for projecting the live virtual avatar with the second interactive behavior to the live broadcast site, so that the off-site audience can fully and effectively interact with the live audience in the live broadcast site and the environment of the live broadcast site while watching the real live virtual environment picture of the live broadcast site, and the problem that the off-site audience can only watch the picture of the live broadcast site and cannot interact with the live broadcast site in the traditional live broadcast method is solved.
As shown in fig. 13, in an embodiment, a live broadcast method based on a virtual environment is provided, and this embodiment is mainly illustrated by applying the method to the virtual avatar server described above. Referring to fig. 13, the live broadcasting method specifically includes the following steps:
S1302, receiving a first interactive behavior sent by the virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
S1304, generating a live virtual avatar of the off-site audience, and generating a second interactive behavior of the live virtual avatar according to the first interactive behavior of the off-site audience;
and S1306, controlling the projection equipment to project the live virtual avatar with the second interactive behavior to a live broadcast site.
In specific implementation, the virtual avatar server may receive a first interaction captured by the virtual reality device to the off-site audience. And the virtual avatar server generates a live virtual avatar of the off-site audience, and generates a second interactive behavior of the live virtual avatar according to the first interactive behavior. The virtual avatar server may generate projection control information according to the live virtual avatar and the second interactive behavior, and send the projection control information to the projection device 120, so as to control the projection device 120 to project the live virtual avatar performing the second interactive behavior in the live broadcast site. Since the processing procedures of the virtual reality device 110 and the projection device 120 have been described in detail in the foregoing embodiments, they are not described in detail here.
In the live broadcasting method, the live virtual avatar of the off-site audience is generated by receiving the first interactive behavior sent by the virtual reality equipment used for displaying the live virtual environment of the live broadcasting site to the off-site audience, the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience in the live virtual environment captured by the virtual reality equipment, and the projection equipment is controlled to project the live virtual avatar of the off-site audience with the second interactive behavior to the live broadcasting site, so that the off-site audience can fully and effectively interact with the live audience in the live broadcasting site and the environment of the live broadcasting site while watching the real live virtual environment of the live broadcasting site, and the problem that the off-site audience can only watch the picture of the live broadcasting site and cannot interact with the live broadcasting site in the traditional live broadcasting method is solved.
As shown in fig. 14A, in one embodiment, a live system 1400 is provided, the live system 1400 may include: a virtual reality device 1410 and a projection device 1420.
The virtual reality equipment 1410 is used for displaying a live virtual environment of a live broadcast site to an off-site audience and capturing a first interactive behavior of the off-site audience in the live virtual environment;
a projection device 1420 to project the live virtual avatar of the offsite audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
Since the processing procedures of the virtual reality device 1410 and the projection device 1420 have been described in detail in the foregoing embodiments, they are not described in detail herein.
In the live broadcast system, the virtual reality equipment is used for displaying the live virtual environment of the live broadcast site, the first interactive behavior of the off-site audience in the live virtual environment is captured, the projection equipment is used for projecting the live virtual avatar with the second interactive behavior to the live broadcast site, so that the off-site audience can fully and effectively interact with the live audience in the live broadcast site and the environment of the live broadcast site while watching the real live virtual environment of the live broadcast site, and the problem that the off-site audience can only watch the picture of the live broadcast site and cannot interact with each other in the traditional live broadcast method is solved.
On the basis of fig. 14A, fig. 14B shows another live system 1400 in an embodiment, where there may be a plurality of virtual reality devices 1410 in the live system 1400, and the live system 1400 may further include:
a panoramic camera 1430, a virtual environment synthesis server 1440, a virtual avatar server 1450 and a live tv device 1460;
the panoramic shooting equipment 1430 is used for carrying out multi-view panoramic shooting on a live broadcast site to obtain live broadcast site pictures with multiple views;
the virtual environment synthesis server 1440 is configured to synthesize live broadcast live pictures from multiple views into a live virtual environment, and send the live virtual environment to the virtual reality device 1410;
the virtual avatar server 1450 is configured to receive the first interaction behavior sent by the virtual reality device, generate a live virtual avatar of the off-site audience, generate a second interaction behavior of the live virtual avatar according to the first interaction behavior of the off-site audience, and control the projection device to project the live virtual avatar having the second interaction behavior to the live broadcast site;
and a live television broadcast device 1460 for live television broadcast of the live television broadcast site.
In a specific implementation, a plurality of panoramic cameras 1430 may be deployed at different viewing angles of the live broadcast site; the panoramic cameras 1430 obtain live broadcast site pictures at their respective viewing angles and send them to the virtual environment synthesis server 1440. The virtual environment synthesis server 1440 receives the live broadcast site pictures from the different viewing angles, synthesizes them into a live virtual environment, and sends the live virtual environment to the virtual reality device 1410, so that the virtual reality device 1410 can display the live virtual environment to the off-site audience.
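The collection step of that synthesis can be sketched as below: frames arriving from several cameras, keyed by viewing angle, are ordered before being handed to the actual (stubbed) stitching stage. The function and the angle-keyed dictionary are illustrative assumptions; the patent only says a three-dimensional modeling algorithm performs the synthesis.

```python
def compose_virtual_environment(frames_by_view):
    """Order per-view frames by viewing angle as input to a stitching stage.

    frames_by_view: {view_angle_deg: frame} collected for one capture instant.
    Returns the frames in angular order; a real system would then feed these
    into a 3D modeling / stitching algorithm rather than just listing them.
    """
    ordered = sorted(frames_by_view.items())       # stitch in angular order
    return [frame for _angle, frame in ordered]

# Frames from three panoramic cameras at 0, 120 and 240 degrees.
frames = {240: "cam_C", 0: "cam_A", 120: "cam_B"}
print(compose_virtual_environment(frames))  # ['cam_A', 'cam_B', 'cam_C']
```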
The virtual avatar server 1450 may generate a live virtual avatar of an off-site audience, and receive the first interactive behavior captured by the virtual reality device 1410, and the virtual avatar server 1450 generates a second interactive behavior of the live virtual avatar according to the first interactive behavior, and controls the projection device 1420 to project the live virtual avatar having the second interactive behavior in the live broadcast site.
The live television broadcast device 1460 can televise the live broadcast site, including the on-site audience and the live virtual avatars of the off-site audiences, so that television viewers can watch both the on-site audience and the off-site audiences' live virtual avatars in the live broadcast site.
In one embodiment, the virtual environment synthesis server 1440 and the virtual reality device 1410 may communicate based on 5G (fifth-generation mobile network) technology. Specifically, the virtual environment synthesis server 1440 may convert the synthesized live virtual environment into a 5G data format based on a 5G communication protocol, and transmit the live virtual environment in the 5G data format to the virtual reality device 1410 through the 5G network. The virtual reality device requires high real-time performance when displaying the live virtual environment of the live broadcast site: the multi-view panoramic pictures are synthesized into the live virtual environment through a three-dimensional modeling algorithm, and if the communication delay of the network is severe, the real-time performance required by the virtual reality device cannot be ensured. Therefore, by having the virtual environment synthesis server 1440 communicate with the virtual reality device 1410 based on 5G technology, the real-time performance required for displaying the live virtual environment of the live broadcast site can be ensured.
On the basis of fig. 14B, fig. 14C shows a schematic diagram of an application scene of a live system in an embodiment. Fig. 15 shows a timing diagram of a particular live flow of an embodiment. This will be explained below with reference to fig. 14C and 15. Referring to fig. 15, in one particular live flow, the following steps may be included:
s1502, the panoramic camera device 1430 performs multi-view panoramic shooting of the live broadcast site to obtain live site pictures at multiple views;
s1504, the panoramic camera device 1430 sends the live site pictures to the virtual environment synthesis server 1440;
s1506, the virtual environment synthesis server 1440 synthesizes the live site pictures from the multiple views into a live virtual environment;
s1508, the virtual environment synthesis server 1440 sends the live virtual environment to the virtual reality device 1410;
s1510, the virtual reality device 1410 displays a live virtual environment;
s1512, the virtual reality device 1410 captures a first interaction behavior of the off-site audience;
s1514, the virtual reality device 1410 sends the first interactive behavior to the virtual avatar server 1450;
s1516, the virtual avatar server 1450 generates a live virtual avatar, and generates a second interactive behavior of the live virtual avatar according to the first interactive behavior;
s1518, the projection device 1420 projects the live virtual avatar having the second interactive behavior in the live broadcast site.
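The S1502 to S1518 sequence above can be sketched as a minimal pipeline. All class and method names below are illustrative assumptions for exposition, not APIs defined by this disclosure:

```python
from dataclasses import dataclass

@dataclass
class FirstInteraction:
    """First interactive behavior captured by the virtual reality device (S1512)."""
    limb_movement: str
    facial_expression: str
    sound: str

class VirtualEnvironmentSynthesisServer:
    def synthesize(self, site_pictures):
        # S1506: synthesize multi-view live site pictures into a live virtual environment
        return {"type": "live_virtual_environment", "views": list(site_pictures)}

class VirtualAvatarServer:
    def second_behavior(self, first: FirstInteraction):
        # S1516: derive the avatar's second interactive behavior from the first
        return {"gesture": first.limb_movement, "expression": first.facial_expression}

# S1502/S1504: the panoramic camera device captures pictures at multiple views
site_pictures = ["front_view", "left_view", "right_view"]
env = VirtualEnvironmentSynthesisServer().synthesize(site_pictures)   # S1506/S1508
first = FirstInteraction("wave", "smile", "cheer")                    # S1510-S1514
behavior = VirtualAvatarServer().second_behavior(first)               # S1516
projection = {"avatar_of": "off-site viewer", **behavior}             # S1518
```

The sketch only fixes the data flow between the four roles; real implementations would carry video frames and motion-capture data instead of strings.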
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times; the sub-steps or stages are not necessarily performed sequentially, and may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
As shown in fig. 16, in one embodiment, there is provided a live device 1600 comprising:
a display module 1602, configured to display a live virtual environment of a live broadcast site to an off-site viewer;
a behavior capture module 1604, configured to capture a first interactive behavior of the off-site audience in the live virtual environment, so that a projection device projects a live virtual avatar having a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
The live broadcast device displays the live virtual environment of the live broadcast site through the virtual reality device and captures the first interactive behavior of the off-site audience in the live virtual environment, and the projection device projects the live virtual avatar having the second interactive behavior to the live broadcast site. In this way, while watching a realistic live virtual environment of the live broadcast site, the off-site audience can fully and effectively interact with the on-site audience and the environment of the live broadcast site, which solves the problem in traditional live broadcast methods that the off-site audience can only watch pictures of the live broadcast site and cannot interact with it.
In one embodiment, the first interactive behavior comprises at least one of a limb movement, a facial expression, a sound.
In one embodiment, the off-site audience has a corresponding virtual avatar position in the live broadcast site, and the live virtual environment is generated according to the view seen by the live virtual avatar at that virtual avatar position.
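As a toy illustration of how the virtual avatar position could determine the rendered view, the yaw angle from the avatar position toward a point of interest can select which synthesized perspective to display. The coordinate convention and function name below are assumptions for exposition, not part of this disclosure:

```python
import math

def view_yaw_degrees(avatar_pos, target_pos):
    """Yaw from the avatar's virtual position toward the stage; a renderer could
    pick the panoramic view whose camera angle is closest to this yaw."""
    dx = target_pos[0] - avatar_pos[0]
    dz = target_pos[1] - avatar_pos[1]
    return math.degrees(math.atan2(dx, dz))

# An avatar seated 10 units in front of the stage looks straight ahead (yaw 0).
yaw = view_yaw_degrees((0.0, -10.0), (0.0, 0.0))
```

Moving the avatar position (as the movement behavior in the next embodiment does) would change this yaw and therefore the picture shown to the off-site viewer.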
In one embodiment, the live device 1600 may further include:
the movement behavior capturing module is used for capturing the movement behavior of the off-site audience; the movement behavior is used to adjust the virtual avatar position of the off-site audience in the live broadcast site.
In one embodiment, the live device 1600 may further include:
the animation display module is used for displaying the behavior virtual reality animation of the live virtual avatar to the off-site audience; the behavior virtual reality animation is generated according to the second interactive behavior of the live virtual avatar.
As shown in fig. 17, in one embodiment, there is provided a live device 1700 including:
an information receiving module 1702, configured to receive projection control information from a server; the projection control information is generated according to a first interactive behavior that the server receives from the virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
an avatar projection module 1704, configured to project a live virtual avatar of the offsite audience to the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
The live broadcast device displays the live virtual environment picture of the live broadcast site through the virtual reality device and captures the interactive behavior of the off-site audience with respect to that picture, and the projection device projects a live virtual avatar carrying the off-site audience's interactive behavior to the live broadcast site. In this way, while watching a realistic live virtual environment picture of the live broadcast site, the off-site audience can fully and effectively interact with the on-site audience and the environment of the live broadcast site, which solves the problem in traditional live broadcast methods that the off-site audience can only watch pictures of the live broadcast site and cannot interact with it.
In an embodiment, the live broadcasting apparatus 1700 is further specifically configured to: and projecting the live virtual avatar of the off-site audience on the virtual avatar position in the live broadcast live site.
In an embodiment, the live broadcasting apparatus 1700 is further specifically configured to: adjusting the virtual avatar position of the offsite audience in the live broadcast site.
In an embodiment, the live broadcasting apparatus 1700 is further specifically configured to: playing the sound of the live audience at the live scene.
As shown in fig. 18, in one embodiment, there is provided a live device 1800 comprising:
a receiving module 1802, configured to receive a first interactive behavior sent by a virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
a generating module 1804, configured to generate a live virtual avatar of the offsite audience, and generate a second interactive behavior of the live virtual avatar according to the first interactive behavior of the offsite audience;
a control module 1806, configured to control a projection device to project the live virtual avatar with the second interactive behavior to the live broadcast site.
The live broadcast device receives the first interactive behavior sent by the virtual reality device that displays the live virtual environment of the live broadcast site to the off-site audience, generates a live virtual avatar of the off-site audience, generates a second interactive behavior of the live virtual avatar according to the first interactive behavior captured by the virtual reality device, and controls the projection device to project the live virtual avatar having the second interactive behavior to the live broadcast site. In this way, while watching a realistic live virtual environment of the live broadcast site, the off-site audience can fully and effectively interact with the on-site audience and the environment of the live broadcast site, which solves the problem in traditional live broadcast methods that the off-site audience can only watch pictures of the live broadcast site and cannot interact with it.
FIG. 19 is a diagram showing the internal structure of a computer device in one embodiment. The computer device may specifically be the live device 1600 in fig. 16, the live device 1700 in fig. 17, or the live device 1800 in fig. 18. As shown in fig. 19, the computer device includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement the virtual environment-based live broadcast method. The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform the virtual environment-based live broadcast method. The display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device may be a touch layer covering the display screen, a key, trackball, or touchpad arranged on the housing of the computer device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the structure shown in fig. 19 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, the live broadcast apparatus provided in the present application may be implemented in the form of a computer program, and the computer program may run on a computer device as shown in fig. 19. The memory of the computer device may store the program modules that make up the live broadcast apparatus, such as the display module 1602 and the behavior capture module 1604 shown in fig. 16. The computer program constituted by these program modules causes the processor to execute the steps of the virtual environment-based live broadcast method of the embodiments of the present application described in this specification.
For example, the computer device shown in fig. 19 may perform the step of displaying the live virtual environment of the live scene to the off-site audience through the display module 1602 in the live device shown in fig. 16. The computer device may perform the step of capturing a first interactive behavior of the offsite audience in the live virtual environment through a behavior capture module 1604.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the virtual environment based live method described above. Here, the steps of the virtual environment based live broadcasting method may be the steps in the virtual environment based live broadcasting method of the above embodiments.
In one embodiment, a computer readable storage medium is provided, storing a computer program that, when executed by a processor, causes the processor to perform the steps of the virtual environment based live method described above. Here, the steps of the virtual environment based live broadcasting method may be the steps in the virtual environment based live broadcasting method of the above embodiments.
Artificial Intelligence (AI) is a theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision making.
Artificial intelligence technology is a comprehensive discipline covering a wide range of fields, spanning both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics. Artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
Computer Vision (CV) is the science of how to make machines "see": using cameras and computers instead of human eyes to identify, track, and measure targets, and performing further image processing so that the processed image is more suitable for human observation or for transmission to an instrument for detection. As a scientific discipline, computer vision studies related theories and techniques in an attempt to build artificial intelligence systems that can capture information from images or multidimensional data. Computer vision technologies generally include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technologies, virtual reality, augmented reality, and simultaneous localization and mapping, and also include common biometric technologies such as face recognition and fingerprint recognition.
With the research and progress of artificial intelligence technology, it has been developed and applied in many fields, such as smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned driving, autonomous driving, unmanned aerial vehicles, robots, smart medical care, and smart customer service.
The live broadcast method based on the virtual environment provided by the embodiment of the application relates to the computer vision technology of artificial intelligence, and is specifically explained by the following embodiment:
in the process in which the virtual reality device captures the first interactive behavior of the off-site audience in the live virtual environment, the off-site audience may be photographed to obtain an audience behavior image, and the interactive behavior of the off-site audience is identified through computer vision technology.
In the process in which the virtual reality device captures the first interactive behavior of the off-site audience in the live virtual environment, the expressions on the face of the off-site audience are recognized through computer vision technology.
The virtual environment synthesis server may perform three-dimensional object modeling of the live broadcast site through computer vision technology to synthesize the live virtual environment.
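As a highly simplified stand-in for the behavior-recognition step described above, identifying an interactive behavior can be reduced to thresholding keypoint displacement between two captured frames. The keypoint format, threshold, and labels below are illustrative assumptions; a real system would use a trained computer vision model (pose estimation, expression recognition):

```python
def classify_gesture(prev_kp, curr_kp, threshold=0.2):
    """Label a 'wave' when the right-wrist keypoint moves horizontally by more
    than `threshold` (normalized image coordinates) between two frames."""
    dx = abs(curr_kp["right_wrist"][0] - prev_kp["right_wrist"][0])
    return "wave" if dx > threshold else "idle"

# Two consecutive frames of keypoints extracted from audience behavior images.
prev_frame = {"right_wrist": (0.50, 0.40)}
curr_frame = {"right_wrist": (0.80, 0.42)}
gesture = classify_gesture(prev_frame, curr_frame)
```

The resulting label would be one ingredient of the first interactive behavior sent to the virtual avatar server.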
The live broadcasting method based on the virtual environment provided by the embodiment of the application relates to a block chain, and is specifically explained by the following embodiment:
referring to the data sharing system shown in fig. 20A, the data sharing system 2000 refers to a system for performing data sharing between nodes, the data sharing system may include a plurality of nodes 2001, and the plurality of nodes 2001 may refer to respective clients in the data sharing system.
Each node 2001 may receive input information during normal operation and maintain shared data within the data sharing system based on the received input information. In order to ensure information intercommunication in the data sharing system, information connection can exist between each node in the data sharing system, and information transmission can be carried out between the nodes through the information connection. For example, when an arbitrary node in the data sharing system receives input information, other nodes in the data sharing system acquire the input information according to a consensus algorithm, and store the input information as data in shared data, so that the data stored on all the nodes in the data sharing system are consistent.
Each node in the data sharing system has a corresponding node identifier, and each node may store the node identifiers of the other nodes in the data sharing system, so that a generated block can subsequently be broadcast to the other nodes according to their node identifiers. Each node may maintain a node identifier list as shown in the following table, storing node names and node identifiers in correspondence. The node identifier may be an IP (Internet Protocol) address or any other information that can be used to identify the node; table 1 uses the IP address only as an example.
Node name | Node identification
Node 1    | 117.114.151.174
Node 2    | 117.116.189.145
Node N    | 119.123.789.258
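The node identifier list and the broadcast step it enables can be sketched as follows. The dictionary mirrors Table 1, whose last entry ("119.123.789.258") is the placeholder value from the table rather than a valid IPv4 address:

```python
# Node identifier list as in Table 1 (node name -> node identification).
node_ids = {
    "Node 1": "117.114.151.174",
    "Node 2": "117.116.189.145",
    "Node N": "119.123.789.258",  # placeholder from the table, not a valid IPv4 address
}

def broadcast_targets(sender, nodes):
    """A node sends a newly generated block to every other node by identifier."""
    return [ip for name, ip in nodes.items() if name != sender]

targets = broadcast_targets("Node 1", node_ids)
```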
Each node in the data sharing system stores an identical copy of the blockchain. As shown in fig. 20B, the blockchain is composed of a plurality of blocks. The starting block includes a block header and a block body; the block header stores an input-information characteristic value, a version number, a timestamp, and a difficulty value, and the block body stores the input information. The next block takes the starting block as its parent block and likewise comprises a block header and a block body; its block header stores the input-information characteristic value of the current block, the block-header characteristic value of the parent block, the version number, the timestamp, and the difficulty value. In this way the block data stored in each block is associated with the block data stored in its parent block, which ensures the security of the input information in the blocks.
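The parent-linking just described can be sketched in a few lines. The field names and the use of JSON plus double SHA-256 are illustrative assumptions rather than the exact encoding of any particular chain:

```python
import hashlib
import json

def header_hash(header):
    """Characteristic value of a block header (double SHA-256 of its fields)."""
    data = json.dumps(header, sort_keys=True).encode()
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

def make_block(input_info, prev_hash, version=1, timestamp=0, difficulty=4):
    """Each block header stores the parent's header hash, chaining the blocks."""
    header = {
        "version": version,
        "prev_hash": prev_hash,
        "merkle_root": hashlib.sha256(input_info.encode()).hexdigest(),
        "timestamp": timestamp,
        "difficulty": difficulty,
    }
    return {"header": header, "body": input_info}

genesis = make_block("genesis input info", "0" * 64)  # starting block has no parent
block_2 = make_block("next input info", header_hash(genesis["header"]))
```

Tampering with the genesis body would change its header hash and break the link stored in `block_2`, which is what ties each block's security to its parent.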
When each block in the blockchain is generated, referring to fig. 20C, the node where the blockchain is located verifies the input information upon receiving it; after verification is completed, the input information is stored in the memory pool and the hash tree recording the input information is updated. The timestamp is then updated to the time when the input information was received, different random numbers are tried, and the characteristic value is calculated repeatedly until the calculated characteristic value satisfies the following formula:
SHA256(SHA256(version+prev_hash+merkle_root+ntime+nbits+x))<TARGET
where SHA256 is the characteristic-value algorithm used to calculate the characteristic value; version is the version information of the relevant block protocol in the blockchain; prev_hash is the block-header characteristic value of the current block's parent block; merkle_root is the characteristic value of the input information; ntime is the update time of the timestamp; nbits is the current difficulty, which remains fixed for a period of time and is re-determined after that period elapses; x is a random number; and TARGET is the characteristic-value threshold, which can be determined from nbits.
Therefore, once a random number satisfying the above formula is found, the information can be stored accordingly, and the block header and block body are generated to obtain the current block. The node where the blockchain is located then sends the newly generated block to the other nodes in its data sharing system according to their node identifiers; the other nodes verify the newly generated block and, after verification is completed, add it to the blockchains they store.
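The nonce search described by the formula can be sketched directly. The string concatenation below is a simplification (real implementations hash a fixed binary header layout), and the very easy TARGET is chosen only so the demo terminates quickly:

```python
import hashlib

def mine(version, prev_hash, merkle_root, ntime, nbits, target):
    """Try random numbers x until SHA256(SHA256(...)) falls below TARGET."""
    x = 0
    while True:
        payload = f"{version}{prev_hash}{merkle_root}{ntime}{nbits}{x}".encode()
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if int(digest, 16) < target:
            return x, digest  # x is the nonce that satisfies the formula
        x += 1

TARGET = 1 << 252  # very easy difficulty: only the top 4 bits must be zero
nonce, feature = mine(1, "00" * 32, "ab" * 32, 1568700000, "1d00ffff", TARGET)
```

Lowering TARGET (raising nbits difficulty) makes the loop exponentially longer, which is how the difficulty value in the block header throttles block generation.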
In the virtual environment-based live broadcast method of the embodiments of the present application, the virtual avatar server may upload the live virtual avatar generated for the off-site audience and the second interactive behavior generated according to the first interactive behavior to a node 2001 in the data sharing system 2000, and that node 2001 maintains the shared data in the data sharing system 2000 using the data uploaded by the virtual avatar server as input information.
Those skilled in the art will understand that all or part of the processes of the methods in the above embodiments can be implemented by a computer program; the program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not be construed as limiting the scope of the patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A virtual environment based live method, comprising:
the virtual reality equipment displays a live virtual environment of a live broadcast site to off-site audiences and captures first interactive behaviors of the off-site audiences in the live virtual environment;
the projection equipment projects the live virtual avatar of the off-site audience to the live broadcast site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
2. The method of claim 1, wherein the offsite viewer has a corresponding virtual avatar position in the live broadcast live, and the live virtual environment is generated from a view of the live virtual avatar viewed at the virtual avatar position;
the projection device projects the live virtual avatar of the off-site audience to the live broadcast site, including:
and the projection equipment projects the live virtual avatar of the off-site audience on the virtual avatar position in the live broadcast live site.
3. The method of claim 2, further comprising:
the virtual reality device captures the movement behavior of the off-site audience; the movement behavior is used to adjust the virtual avatar position of the offsite audience in the live broadcast site.
4. The method of claim 1, further comprising:
the virtual reality equipment displays the behavior virtual reality animation of the live virtual avatar to the off-site audience; the behavior virtual reality animation is generated according to the second interactive behavior of the live virtual avatar.
5. The method of claim 1, wherein the first interactive behavior comprises at least one of a limb movement, a facial expression, and a sound.
6. The method of claim 1, wherein the live virtual avatar is a stereoscopic image and/or a holographic image.
7. A virtual environment based live method, comprising:
receiving a first interactive behavior sent by a virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
generating a live virtual avatar of the off-site audience, and generating a second interactive behavior of the live virtual avatar according to the first interactive behavior of the off-site audience;
and controlling a projection device to project the live virtual avatar with the second interactive behavior to the live broadcast site.
8. A virtual environment based live method, comprising:
displaying a live virtual environment of a live broadcast site to an off-site audience;
capturing a first interactive behavior of the off-site audience in the live virtual environment, so that a projection device projects a live virtual avatar having a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
9. A virtual environment based live method, comprising:
receiving projection control information from a server; the projection control information is generated according to a first interactive behavior that the server receives from the virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
projecting the live virtual avatar of the off-site audience to the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
10. A live system, comprising:
a virtual reality device and a projection device;
the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to off-site audiences and capturing first interactive behaviors of the off-site audiences in the live virtual environment;
the projection equipment is used for projecting the live virtual avatar of the off-site audience to the live broadcast live site; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated in accordance with the first interactive behavior of the off-site audience.
11. A live device, comprising:
the receiving module is used for receiving a first interactive behavior sent by the virtual reality device; the virtual reality device is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality device capturing the interactive behavior of the off-site audience in the live virtual environment;
the generation module is used for generating a live virtual avatar of the off-site audience and generating a second interactive behavior of the live virtual avatar according to the first interactive behavior of the off-site audience;
and the control module is used for controlling projection equipment to project the live virtual avatar with the second interactive behavior to the live broadcast site.
12. A live device, comprising:
the display module is used for displaying a live virtual environment of a live broadcast site to off-site audiences;
the behavior capturing module is used for capturing a first interactive behavior of the off-site audience in the live virtual environment, so that a projection device projects a live virtual avatar having a second interactive behavior to the live broadcast site; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
13. A live device, comprising:
the information receiving module is used for receiving projection control information from a server; the projection control information is generated according to a first interactive behavior that the server receives from virtual reality equipment; the virtual reality equipment is used for displaying a live virtual environment of a live broadcast site to an off-site audience; the first interactive behavior is obtained by the virtual reality equipment capturing the interactive behavior of the off-site audience in the live virtual environment; and
the avatar projection module is used for projecting the live virtual avatar of the off-site audience onto the live broadcast site according to the projection control information; the live virtual avatar has a second interactive behavior; the second interactive behavior of the live virtual avatar is generated according to the first interactive behavior of the off-site audience.
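A minimal sketch of the projection-side device of claim 13, with the physical projection replaced by a recorded list so the behavior is observable; the dictionary keys and class name are assumptions for illustration only:

```python
# Hypothetical sketch of the projection-side live device of claim 13.
from typing import Dict, List, Tuple


class ProjectionDevice:
    def __init__(self) -> None:
        # Record of (avatar_id, action) pairs standing in for the
        # physical projection onto the live broadcast site.
        self.projected: List[Tuple[str, str]] = []

    def receive_control_info(self, control_info: Dict[str, str]) -> None:
        # Information-receiving module: accept the projection control
        # information generated by the server and validate its shape.
        if "avatar_id" not in control_info or "action" not in control_info:
            raise ValueError("incomplete projection control information")
        self.project(control_info)

    def project(self, control_info: Dict[str, str]) -> None:
        # Avatar-projection module: project the live virtual avatar
        # performing its second interactive behavior at the site.
        self.projected.append(
            (control_info["avatar_id"], control_info["action"]))
```

Feeding it the command produced by the server sketch above would record the avatar and its second interactive behavior at the site.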
14. A computer-readable storage medium, storing a computer program which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 9.
15. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 9.
CN201910877247.4A 2019-09-17 2019-09-17 Live broadcast method, device and system based on virtual environment Active CN110602517B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910877247.4A CN110602517B (en) 2019-09-17 2019-09-17 Live broadcast method, device and system based on virtual environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910877247.4A CN110602517B (en) 2019-09-17 2019-09-17 Live broadcast method, device and system based on virtual environment

Publications (2)

Publication Number Publication Date
CN110602517A true CN110602517A (en) 2019-12-20
CN110602517B CN110602517B (en) 2021-05-11

Family

ID=68860195

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910877247.4A Active CN110602517B (en) 2019-09-17 2019-09-17 Live broadcast method, device and system based on virtual environment

Country Status (1)

Country Link
CN (1) CN110602517B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111228791A (en) * 2019-12-30 2020-06-05 塔普翊海(上海)智能科技有限公司 Real person AR shooting game equipment, and shooting fighting system and method based on AR technology
CN111399777A (en) * 2020-03-16 2020-07-10 北京平凯星辰科技发展有限公司 Differentiated key value data storage method based on data value classification
CN111399653A (en) * 2020-03-24 2020-07-10 北京文香信息技术有限公司 Virtual interaction method, device, equipment and computer storage medium
CN111541932A (en) * 2020-04-30 2020-08-14 广州华多网络科技有限公司 User image display method, device, equipment and storage medium for live broadcast room
CN112492231A (en) * 2020-11-02 2021-03-12 重庆创通联智物联网有限公司 Remote interaction method, device, electronic equipment and computer readable storage medium
TWI726577B (en) * 2019-05-15 2021-05-01 華碩電腦股份有限公司 Electronic device
CN113286162A (en) * 2021-05-20 2021-08-20 成都威爱新经济技术研究院有限公司 Multi-camera live-broadcasting method and system based on mixed reality
CN115396688A (en) * 2022-10-31 2022-11-25 北京玩播互娱科技有限公司 Multi-person interactive network live broadcast method and system based on virtual scene

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110063287A1 (en) * 2009-09-15 2011-03-17 International Business Machines Corporation Information Presentation in Virtual 3D
CN102170361A (en) * 2011-03-16 2011-08-31 西安电子科技大学 Virtual-reality-based network conference method
CN103888714A (en) * 2014-03-21 2014-06-25 国家电网公司 3D scene network video conference system based on virtual reality
CN105227990A (en) * 2015-09-25 2016-01-06 天脉聚源(北京)科技有限公司 The method and apparatus of the virtual spectators of mark VIP
CN105975622A (en) * 2016-05-28 2016-09-28 蔡宏铭 Multi-role intelligent chatting method and system
CN106162369A (en) * 2016-06-29 2016-11-23 腾讯科技(深圳)有限公司 A kind of realize in virtual scene interactive method, Apparatus and system
CN106303555A (en) * 2016-08-05 2017-01-04 深圳市豆娱科技有限公司 A kind of live broadcasting method based on mixed reality, device and system
CN106339986A (en) * 2016-08-31 2017-01-18 天脉聚源(北京)科技有限公司 Method and device for distributing head portraits to virtual seats
KR101704442B1 (en) * 2016-11-04 2017-02-09 한국프라임제약주식회사 An Eyesight measurement system using a virtual reality device
CN106789991A (en) * 2016-12-09 2017-05-31 福建星网视易信息***有限公司 A kind of multi-person interactive method and system based on virtual scene
US20170264936A1 (en) * 2016-03-14 2017-09-14 The Directv Group, Inc. Method and system for viewing sports content within a virtual reality environment
CN109271553A (en) * 2018-08-31 2019-01-25 乐蜜有限公司 A kind of virtual image video broadcasting method, device, electronic equipment and storage medium
CN109426343A (en) * 2017-08-29 2019-03-05 深圳市掌网科技股份有限公司 Cooperation training method and system based on virtual reality
CN109874021A (en) * 2017-12-04 2019-06-11 腾讯科技(深圳)有限公司 Living broadcast interactive method, apparatus and system

Also Published As

Publication number Publication date
CN110602517B (en) 2021-05-11

Similar Documents

Publication Publication Date Title
CN110602517B (en) Live broadcast method, device and system based on virtual environment
JP7135141B2 (en) Information processing system, information processing method, and information processing program
US9030486B2 (en) System and method for low bandwidth image transmission
KR102581453B1 (en) Image processing for Head mounted display devices
Fuchs et al. Immersive 3D telepresence
CN113473159B (en) Digital person live broadcast method and device, live broadcast management equipment and readable storage medium
CN107529091B (en) Video editing method and device
CN111862348B (en) Video display method, video generation method, device, equipment and storage medium
JP2019054488A (en) Providing apparatus, providing method, and program
CN104243961A (en) Display system and method of multi-view image
CN105959666A (en) Method and device for sharing 3d image in virtual reality system
JP7200935B2 (en) Image processing device and method, file generation device and method, and program
CN114900678B (en) VR end-cloud combined virtual concert rendering method and system
CN113784160A (en) Video data generation method and device, electronic equipment and readable storage medium
CN111698543B (en) Interactive implementation method, medium and system based on singing scene
CN105324994A (en) Method and system for generating multi-projection images
CN111179392A (en) Virtual idol comprehensive live broadcast method and system based on 5G communication
CN113852838A (en) Video data generation method and device, electronic equipment and readable storage medium
CN110730340B (en) Virtual audience display method, system and storage medium based on lens transformation
KR20190031220A (en) System and method for providing virtual reality content
KR102084970B1 (en) Virtual reality viewing method and virtual reality viewing system
WO2024103805A1 (en) Processing method and system for motion capture data
WO2023236656A1 (en) Method and apparatus for rendering interactive picture, and device, storage medium and program product
JP6609078B1 (en) Content distribution system, content distribution method, and content distribution program
CN109872400B (en) Panoramic virtual reality scene generation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant