CN114786023A - AR live broadcast system based on virtual reality


Info

Publication number
CN114786023A
Authority
CN
China
Prior art keywords
live broadcast
interaction
data
image
user
Prior art date
Legal status
Pending
Application number
CN202210317153.3A
Other languages
Chinese (zh)
Inventor
梁雅洁
Current Assignee
Nanjing Xiaocancan Network Technology Co ltd
Original Assignee
Nanjing Xiaocancan Network Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Xiaocancan Network Technology Co ltd
Priority to CN202210317153.3A
Publication of CN114786023A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N 21/44213 Monitoring of end-user related data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N 21/466 Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N 21/4662 Learning process for intelligent management, e.g. learning user preferences for recommending movies characterized by learning algorithms
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 Monomedia components thereof
    • H04N 21/816 Monomedia components thereof involving special video data, e.g. 3D video

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention belongs to the field of new media live broadcast and discloses an AR live broadcast system based on virtual reality. The system comprises an anchor live broadcast end, an image processing cloud and a user end. The anchor live broadcast end comprises an audio and video acquisition module, an AR/VR simulation module and a processing module; the audio and video acquisition module and the AR/VR simulation module acquire audio and video data and AR/VR image data respectively and transmit the acquired data to the processing module, which performs noise reduction and image preprocessing. The anchor live broadcast end is connected to the image processing cloud via wireless or wired communication and uploads the preprocessed data; the image processing cloud superimposes the AR/VR images onto the video data, encapsulates the result into a TS media stream carrying the AR/VR images, and transmits the stream to the user end in real time for playback. By blending AR into the anchor's broadcast, the live broadcast becomes more realistic, the viewing experience and watchability improve, and the overall live broadcast effect is better.

Description

AR live broadcast system based on virtual reality
Technical Field
The invention relates to the field of new media live broadcast, in particular to an AR live broadcast system based on virtual reality.
Background
Augmented Reality (AR) is a relatively new technology that promotes the integration of real-world information and virtual-world information. Building on computing and related technologies, it simulates entity information that is difficult to experience directly in the real world, superimposes the virtual content onto the real world, and makes it perceptible to the human senses, thereby delivering a sensory experience that goes beyond reality. Once the real environment and virtual objects are superimposed, they coexist in the same picture and the same space. With the development of Internet technology and intelligent devices, the content offered by live broadcast platforms has become increasingly diverse.
In the prior art, live broadcasts are mostly presented in 2D form; this presentation mode is relatively monotonous and cannot satisfy entertainment demands. In addition, existing live broadcast systems cannot dynamically present an interactive interface according to the user's usage habits.
Disclosure of Invention
The invention aims to provide an AR live broadcast system based on virtual reality to solve the problems raised in the background art.
To achieve the above purpose, the invention provides the following technical scheme: an AR live broadcast system based on virtual reality comprises an anchor live broadcast end, an image processing cloud and a user end. The anchor live broadcast end comprises an audio and video acquisition module, an AR/VR simulation module and a processing module; the audio and video acquisition module and the AR/VR simulation module acquire audio and video data and AR/VR image data respectively and transmit the acquired data to the processing module, which performs noise reduction and image preprocessing. The anchor live broadcast end is connected to the image processing cloud via wireless or wired communication and uploads the preprocessed data; the image processing cloud superimposes the AR/VR images onto the video data, encapsulates the result into a TS media stream carrying the AR/VR images, and transmits the stream to the user end in real time for playback.
Preferably, the image processing cloud comprises a cloud live broadcast management module and a live broadcast interaction management module, wherein the cloud live broadcast management module is used for receiving the audio and video data and the AR/VR image data and performing superposition, encapsulation and real-time transmission processing on them.
Preferably, a live broadcast software system associated with the image processing cloud is loaded on the user end; the user end provides AR display and interaction functions and communicates and interacts with the image processing cloud via wireless or wired communication.
Preferably, the live broadcast interaction management module has an interaction recognition engine and provides the following interaction interface execution logic: when a user uses the live broadcast software system on the user end, the live broadcast interaction management module judges whether the user is using the live broadcast software system for the first time; if not, the interaction engine retrieves the user's data from the user habit history library, analyzes it, and generates a dynamic interface based on the user's habits; if so, the interaction engine checks whether the user has historical data in an associated application, and if such data exists, analyzes it and generates a corresponding dynamic interactive interface.
Preferably, the user habit history library refers to a database table that stores the usage records of all users of the live broadcast software system.
Preferably, the live broadcast interaction management module further comprises an artificial intelligence interaction engine. The artificial intelligence interaction engine obtains an interaction information processing model through artificial intelligence learning and training; it acquires the interaction information sent by the user end, inputs it into the interaction information processing model, outputs the processed interaction information, and feeds back the corresponding audio and video data or AR/VR images to the user end according to the processed interaction information.
Compared with the prior art, the invention has the following beneficial effects:
By means of AR technology, AR content is blended into the anchor's broadcast, making the live broadcast more realistic, improving the viewing experience and watchability, and producing a better live broadcast effect. In addition, an interaction engine is integrated into the live broadcast interaction management; combined with machine learning and big data analysis, it can learn from the user's habit data and generate a dynamic interactive interface matched to those habits, giving a better user experience.
Drawings
FIG. 1 is a block diagram of the modules of the present invention;
fig. 2 is a schematic block diagram of an interactive interface execution logic of the live broadcast system of the present invention.
In the figure: 1. anchor live broadcast end; 101. audio and video acquisition module; 102. AR/VR simulation module; 103. processing module; 2. image processing cloud; 201. cloud live broadcast management module; 202. live broadcast interaction management module; 3. user end.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Referring to fig. 1-2, the invention provides the following technical solution: an AR live broadcast system based on virtual reality comprises an anchor live broadcast end 1, an image processing cloud 2 and a user end 3. The anchor live broadcast end 1 comprises an audio and video acquisition module 101, an AR/VR simulation module 102 and a processing module 103; the audio and video acquisition module 101 and the AR/VR simulation module 102 acquire audio and video data and AR/VR image data respectively and transmit the acquired data to the processing module 103, which performs noise reduction and image preprocessing. The anchor live broadcast end 1 is connected to the image processing cloud 2 via wireless or wired communication and uploads the preprocessed data to the image processing cloud 2; the image processing cloud 2 superimposes the AR/VR images onto the video data, encapsulates the result into a TS media stream carrying the AR/VR images, and transmits the stream to the user end 3 in real time for playback.
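The original disclosure does not specify how the superposition and TS encapsulation are implemented. The following is a minimal Python sketch of one possible cloud-side pipeline built around ffmpeg; the input names, the assumption that the AR/VR layer carries an alpha channel, and the UDP target address are all hypothetical.

```python
import subprocess

# Hypothetical inputs: the preprocessed anchor feed uploaded by the anchor end (1)
# and an AR/VR layer with an alpha channel produced by the simulation module (102).
ANCHOR_STREAM = "anchor_feed.mp4"        # assumption: local file or ingest URL
AR_LAYER = "ar_layer.mov"                # assumption: video carrying an alpha channel
TS_TARGET = "udp://239.0.0.1:1234"       # assumption: address reachable by user ends (3)

# ffmpeg overlays the AR layer on the anchor video, re-encodes the result, wraps it
# in an MPEG-TS container, and pushes it out in real time.
cmd = [
    "ffmpeg", "-re",
    "-i", ANCHOR_STREAM,
    "-i", AR_LAYER,
    "-filter_complex", "[0:v][1:v]overlay=0:0:format=auto[v]",
    "-map", "[v]", "-map", "0:a?",
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-c:a", "aac",
    "-f", "mpegts", TS_TARGET,
]
subprocess.run(cmd, check=True)
```

In practice the TS stream would more likely be handed to a CDN or wrapped in a streaming protocol such as SRT or HLS rather than pushed over raw UDP; the sketch only illustrates the superposition-then-encapsulation order described above.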
In this embodiment, the AR/VR image generation software may employ Unity3D. Unity3D is a multi-platform, integrated development tool developed by Unity Technologies that allows users to easily create interactive content such as AR and VR content, architectural visualizations, real-time three-dimensional animations, three-dimensional video games, and the like.
In this embodiment, the image processing cloud 2 includes a cloud live broadcast management module 201 and a live broadcast interaction management module 202, where the cloud live broadcast management module 201 is configured to receive audio and video data and AR/VR image data, and perform superposition processing, encapsulation processing and real-time transmission processing on the audio and video data and the AR/VR image data.
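As an illustration of the superposition processing performed by the cloud live broadcast management module 201, the per-frame alpha compositing sketch below blends an AR/VR frame over a camera frame; the frame shapes and the presence of an alpha channel in the AR frame are assumptions rather than details taken from the disclosure.

```python
import numpy as np

def superimpose(video_frame: np.ndarray, ar_frame_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an H x W x 4 RGBA AR/VR frame over an H x W x 3 RGB camera frame."""
    alpha = ar_frame_rgba[..., 3:4].astype(np.float32) / 255.0   # per-pixel opacity in [0, 1]
    ar_rgb = ar_frame_rgba[..., :3].astype(np.float32)
    base = video_frame.astype(np.float32)
    blended = alpha * ar_rgb + (1.0 - alpha) * base              # standard "over" compositing
    return blended.astype(np.uint8)

# Example with synthetic frames: a mid-grey camera frame and a half-transparent red AR layer.
camera = np.full((720, 1280, 3), 128, dtype=np.uint8)
ar = np.zeros((720, 1280, 4), dtype=np.uint8)
ar[..., 0], ar[..., 3] = 255, 128
composited = superimpose(camera, ar)
```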
In this embodiment, the user end 3 is loaded with a live broadcast software system associated with the image processing cloud 2; the user end 3 provides AR display and interaction functions and communicates and interacts with the image processing cloud 2 via wireless or wired communication.
In this embodiment, the live broadcast interaction management module 202 has an interaction recognition engine and provides the following interaction interface execution logic: when a user uses the live broadcast software system on the user end 3, the live broadcast interaction management module 202 judges whether the user is using the live broadcast software system for the first time; if not, the interaction engine retrieves the user's data from the user habit history library, analyzes it, and generates a dynamic interface based on the user's habits; if so, the interaction engine checks whether the user has historical data in an associated application, and if such data exists, analyzes it and generates a corresponding dynamic interactive interface.
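The branching just described (see also fig. 2) can be sketched as follows; the data structures passed in, the feature names and the fallback layout are hypothetical stand-ins rather than elements of the original disclosure.

```python
from collections import Counter
from typing import Iterable

def analyze_habits(records: Iterable[str]) -> list[str]:
    """Rank interaction features (e.g. 'gift', 'chat', 'ar_effect') by how often the user used them."""
    return [feature for feature, _ in Counter(records).most_common()]

def generate_dynamic_interface(ranked_features: list[str]) -> dict:
    """Place the user's most-used features first on the interactive interface."""
    return {"layout": ranked_features or ["chat", "gift", "ar_effect"]}  # default layout if no data

def build_interface(user_id: str, habit_history: dict, associated_history: dict) -> dict:
    """Execution logic of module 202: habit_history stands in for the user habit history
    library, associated_history for data held by associated applications."""
    records = habit_history.get(user_id)
    if records:                                    # not the first use of the live broadcast software
        return generate_dynamic_interface(analyze_habits(records))
    external = associated_history.get(user_id)     # first use: look for data in associated applications
    if external:
        return generate_dynamic_interface(analyze_habits(external))
    return generate_dynamic_interface([])          # no data at all: fall back to the default layout

# Example call for a returning user.
print(build_interface("user_001", {"user_001": ["gift", "gift", "chat"]}, {}))
```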
In this embodiment, the user habit history library refers to a database table that stores the usage records of all users of the live broadcast software system.
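One possible shape for such a table, using SQLite purely for illustration; the table name, the columns and the sample record are assumptions, not part of the original disclosure.

```python
import sqlite3

# Create (if needed) a table that records every interaction of every user of the
# live broadcast software system, then log one sample action.
conn = sqlite3.connect("live_broadcast.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS user_habit_history (
        record_id   INTEGER PRIMARY KEY AUTOINCREMENT,
        user_id     TEXT NOT NULL,
        action      TEXT NOT NULL,            -- e.g. 'send_gift', 'toggle_ar_effect'
        room_id     TEXT,
        occurred_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
    """
)
conn.execute(
    "INSERT INTO user_habit_history (user_id, action, room_id) VALUES (?, ?, ?)",
    ("user_001", "toggle_ar_effect", "room_42"),
)
conn.commit()
conn.close()
```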
In this embodiment, the live broadcast interaction management module 202 further has an artificial intelligence interaction engine. The artificial intelligence interaction engine obtains an interaction information processing model through artificial intelligence learning and training; it acquires the interaction information sent by the user end 3, inputs it into the interaction information processing model, outputs the processed interaction information, and feeds back the corresponding audio and video data or AR/VR images to the user end 3 according to the processed interaction information.
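A minimal sketch of the engine's request/feedback loop; a keyword rule table stands in for the trained interaction information processing model, and the intent names and feedback types are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProcessedInteraction:
    intent: str
    payload: str

class InteractionModel:
    """Stand-in for the interaction information processing model; the real engine would
    load a model obtained through AI training rather than use keyword rules."""
    INTENTS = {"gift": "play_gift_animation", "zoom": "send_ar_view", "replay": "send_video_clip"}

    def process(self, raw_message: str) -> ProcessedInteraction:
        for keyword, intent in self.INTENTS.items():
            if keyword in raw_message.lower():
                return ProcessedInteraction(intent, raw_message)
        return ProcessedInteraction("forward_to_anchor", raw_message)

def handle_interaction(raw_message: str, model: InteractionModel) -> dict:
    """Acquire interaction information from the user end (3), process it, and decide
    which audio/video data or AR/VR image to feed back."""
    result = model.process(raw_message)
    if result.intent == "send_ar_view":
        return {"type": "ar_image", "content": "current AR/VR composite frame"}
    if result.intent == "play_gift_animation":
        return {"type": "ar_effect", "content": "gift animation overlay"}
    return {"type": "message", "content": result.payload}

# Example: a viewer asks to zoom in on the AR scene.
print(handle_interaction("please zoom in on the AR model", InteractionModel()))
```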
The live broadcast system blends AR content into the anchor's broadcast by means of AR technology, making the live broadcast more realistic, improving the viewing experience and watchability, and producing a better live broadcast effect. In addition, an interaction engine is integrated into the live broadcast interaction management; combined with machine learning and big data analysis, it can learn from the user's habit data and generate a dynamic interactive interface matched to the user's habits, giving a better user experience.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. An AR live broadcast system based on virtual reality, characterized by comprising an anchor live broadcast end (1), an image processing cloud (2) and a user end (3), wherein the anchor live broadcast end (1) comprises an audio and video acquisition module (101), an AR/VR simulation module (102) and a processing module (103); the audio and video acquisition module (101) and the AR/VR simulation module (102) are respectively used for acquiring audio and video data and AR/VR image data and transmitting the acquired data to the processing module (103), and the processing module (103) performs noise reduction and image preprocessing; the anchor live broadcast end (1) is connected with the image processing cloud (2) based on wireless/wired communication, the anchor live broadcast end (1) uploads the preprocessed data to the image processing cloud (2), the image processing cloud (2) superimposes the AR/VR images onto the video data, encapsulates the result into a TS media stream carrying the AR/VR images, and transmits the TS media stream to the user end (3) in real time for playback.
2. The virtual reality based AR live broadcast system according to claim 1, wherein the image processing cloud (2) comprises a cloud live broadcast management module (201) and a live broadcast interaction management module (202), wherein the cloud live broadcast management module (201) is configured to receive audio and video data and AR/VR image data, and perform superposition processing, encapsulation processing and real-time transmission processing on the audio and video data and the AR/VR image data.
3. The virtual reality based AR live broadcasting system of claim 1, wherein the live broadcasting software system associated with the image processing cloud (2) is loaded on the user side (3), the user side (3) provides AR display and interaction functions, and the user side (3) communicates and interacts with the image processing cloud (2) based on wireless/wired communication.
4. The virtual reality based AR live broadcast system according to claim 1, wherein the live broadcast interaction management module (202) has an interaction recognition engine, and the live broadcast interaction management module (202) provides interaction interface execution logic, specifically: when a user uses the live broadcast software system on the user side (3), the live broadcast interaction management module (202) judges whether the user is using the live broadcast software system for the first time; if not, the interaction engine retrieves data from the user habit history library for analysis and generates a dynamic interface based on the user's habits; if so, the interaction engine checks whether the user has historical data in an associated application, and if such data exists, performs data analysis based on the historical data and generates a corresponding dynamic interactive interface.
5. The virtual reality based AR live broadcast system according to claim 4, wherein the user habit history library refers to a database table storing all usage records of all users of the live broadcast software system.
6. The AR live broadcast system based on virtual reality as claimed in claim 4, wherein the live broadcast interaction management module (202) further comprises an artificial intelligence interaction engine, the artificial intelligence interaction engine obtains an interaction information processing model through artificial intelligence learning training, the artificial intelligence interaction engine is used for obtaining interaction information sent by the user side (3), inputting the interaction information into the interaction information processing model, outputting the processed interaction information, and feeding back corresponding audio and video data or AR/VR images to the user side (3) according to the processed interaction information.
CN202210317153.3A 2022-03-28 2022-03-28 AR live broadcast system based on virtual reality Pending CN114786023A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210317153.3A CN114786023A (en) 2022-03-28 2022-03-28 AR live broadcast system based on virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210317153.3A CN114786023A (en) 2022-03-28 2022-03-28 AR live broadcast system based on virtual reality

Publications (1)

Publication Number Publication Date
CN114786023A (en) 2022-07-22

Family

ID=82424284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210317153.3A Pending CN114786023A (en) 2022-03-28 2022-03-28 AR live broadcast system based on virtual reality

Country Status (1)

Country Link
CN (1) CN114786023A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108200446A (en) * 2018-01-12 2018-06-22 北京蜜枝科技有限公司 Multimedia interactive system and method on the line of virtual image
CN108833892A (en) * 2018-05-28 2018-11-16 徐州昇科源信息技术有限公司 A kind of VR live broadcast system
CN109963163A (en) * 2017-12-26 2019-07-02 阿里巴巴集团控股有限公司 Internet video live broadcasting method, device and electronic equipment
CN111200737A (en) * 2019-12-29 2020-05-26 航天信息股份有限公司企业服务分公司 Intelligent robot-assisted question answering system and method for live video platform
CN112732149A (en) * 2020-12-31 2021-04-30 上海航翼网络科技有限公司 Novel method for displaying software interactive interface based on AI technology
CN113132741A (en) * 2021-03-03 2021-07-16 广州鑫泓设备设计有限公司 Virtual live broadcast system and method


Similar Documents

Publication Publication Date Title
CN104618797B (en) Information processing method, device and client
US9746912B2 (en) Transformations for virtual guest representation
CN107438183A (en) A kind of virtual portrait live broadcasting method, apparatus and system
CN107170030A (en) A kind of virtual newscaster's live broadcasting method and system
CN107801083A (en) A kind of network real-time interactive live broadcasting method and device based on three dimensional virtual technique
CN107248185A (en) A kind of virtual emulation idol real-time live broadcast method and system
WO2024078243A1 (en) Training method and apparatus for video generation model, and storage medium and computer device
CN111464828A (en) Virtual special effect display method, device, terminal and storage medium
US20230319328A1 (en) Reference of neural network model for adaptation of 2d video for streaming to heterogeneous client end-points
CN113382275A (en) Live broadcast data generation method and device, storage medium and electronic equipment
CN109413152A (en) Image processing method, device, storage medium and electronic equipment
Zerman et al. User behaviour analysis of volumetric video in augmented reality
CN113220130A (en) VR experience system for party building and equipment thereof
CN114786023A (en) AR live broadcast system based on virtual reality
CN116744027A (en) Meta universe live broadcast system
CN114422862A (en) Service video generation method, device, equipment, storage medium and program product
WO2022119612A1 (en) Set up and distribution of immersive media to heterogenous client end-points
CN112423014A (en) Remote review method and device
JP3338382B2 (en) Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space
CN112866741A (en) Gift animation effect display method and system based on 3D face animation reconstruction
CN112295211A (en) Stage performance virtual entertainment practical training system and method
US20230007067A1 (en) Bidirectional presentation datastream
US11943271B2 (en) Reference of neural network model by immersive media for adaptation of media for streaming to heterogenous client end-points
US20230007361A1 (en) Bidirectional presentation datastream using control and data plane channels
CN117544808A (en) Device control method, storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220722)