CN117041628B - Live picture rendering method, system, device, equipment and medium - Google Patents

Live picture rendering method, system, device, equipment and medium

Info

Publication number
CN117041628B
CN117041628B (application CN202311298018.XA)
Authority
CN
China
Prior art keywords
special effect
live
resource
rendering
video stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311298018.XA
Other languages
Chinese (zh)
Other versions
CN117041628A (en)
Inventor
李志成 (Li Zhicheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202311298018.XA priority Critical patent/CN117041628B/en
Publication of CN117041628A publication Critical patent/CN117041628A/en
Application granted granted Critical
Publication of CN117041628B publication Critical patent/CN117041628B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23412Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs for generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a method, system, device, equipment and medium for rendering a live picture, relating to the technical field of picture generation. The method comprises the following steps: acquiring a live video stream; acquiring a special effect rendering instruction; acquiring, based on the special effect identifier in the special effect rendering instruction, a first special effect resource represented by that identifier; and applying the first special effect resource to the live video stream to generate a special effect video stream, which is sent to at least one terminal in the live room so that the terminal renders and displays a live picture with the special effect. In this way, the special effect rendering process is carried out more efficiently by a device other than the terminal, avoiding both the high configuration difficulty of adapting to different terminals and the need for the terminal to update to the latest installation package in real time before the special effect can be fully presented, so the live picture with the special effect is displayed more smoothly. The method and device can be applied to various scenes such as cloud technology, artificial intelligence, and intelligent transportation.

Description

Live picture rendering method, system, device, equipment and medium
Technical Field
Embodiments of the present application relate to the technical field of picture generation, and in particular to a method, system, device, equipment and medium for rendering a live picture.
Background
With the rapid development of internet technology, live video streaming has gradually become a very widespread form of entertainment, for example in game live-streaming scenarios, owing to its intuitiveness, immediacy, strong interactivity, and similar characteristics. A terminal can display a live room through a live Application (APP) with a live-streaming function; in the live room, the anchor and the audience can engage in various live interactions such as liking, commenting, and gifting, and the live special effect corresponding to the interaction is rendered and displayed by the terminal.
In the related art, the terminal downloads an installation package or update package bundled with various special effect resources (such as pictures and videos). When a live interaction occurs in the live room, the back end of the APP determines the resource index identifier (Identity, ID) of the special effect corresponding to the interaction, generates an instruction containing the effect ID, and sends it to the terminal, so that the terminal locally loads the special effect resource corresponding to the effect ID from the installation package or update package and renders and presents the live picture with the special effect.
During rendering of the same special effect resource, a terminal with a better hardware configuration can render it quickly and completely, whereas a terminal with a poorer hardware configuration, owing to its weaker rendering compute power, can only render it slowly and with stuttering, and may even be unable to complete the rendering of a resource that demands more compute power. As a result, different terminal objects using the same APP see different effects, which degrades the rendering effect and the efficiency of human-computer interaction; moreover, APP development must remain compatible with many hardware configurations, which makes development more difficult.
Disclosure of Invention
Embodiments of the present application provide a method, system, device, equipment and medium for rendering a live picture, in which the special effect rendering process is carried out more efficiently by a device other than the terminal. This avoids both the high configuration difficulty of adapting to different terminals and the need for the terminal to update to the latest installation package in real time before the special effect can be fully presented, so the terminal can smoothly display the live picture with the special effect based on the acquired special effect video stream. The technical scheme is as follows.
In one aspect, there is provided a method for rendering a live view, the method being performed by a server, the method comprising:
acquiring a live video stream, wherein the live video stream is generated based on video data acquired by a main broadcasting terminal, and the live video stream is used for representing live pictures in a live broadcasting room;
obtaining a special effect rendering instruction, wherein the special effect rendering instruction comprises a special effect identifier, the special effect identifier is used for uniquely identifying a special effect resource, and the special effect rendering instruction is used for applying the special effect resource to the live video stream;
acquiring a first special effect resource characterized by the special effect identifier based on the special effect identifier in the special effect rendering instruction;
and applying the first special effect resource to the live video stream to generate a special effect video stream, wherein the special effect video stream is used for being sent to at least one terminal in the live broadcasting room, and the at least one terminal is used for rendering the special effect video stream and displaying live broadcasting pictures with special effects.
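The four steps above can be sketched end to end. The following is a minimal illustrative sketch only; all names (`apply_effect`, `render_effect_stream`, the dict-based frame and resource shapes) are assumptions made for illustration and are not taken from the patent.

```python
def apply_effect(frame: dict, resource: dict) -> dict:
    """Composite one special effect resource onto one decoded frame (simplified)."""
    out = dict(frame)
    # Attach the effect content without mutating the original frame.
    out["overlays"] = list(frame.get("overlays", [])) + [resource["content"]]
    return out

def render_effect_stream(live_stream, instruction, resource_store):
    """Steps 3-4: resolve the effect identifier, then apply the resource to every frame."""
    effect_id = instruction["effect_id"]      # the identifier carried in the instruction
    resource = resource_store[effect_id]      # step 3: acquire the first effect resource
    # Step 4: produce the special effect video stream for the terminals in the room.
    return [apply_effect(frame, resource) for frame in live_stream]

store = {"fx_001": {"content": "confetti"}}                   # cached effect resources
stream = [{"frame_no": i, "overlays": []} for i in range(3)]  # stand-in live video stream
fx_stream = render_effect_stream(stream, {"effect_id": "fx_001"}, store)
```

In a real deployment the frames would be decoded video buffers and the compositing would run on GPU-backed cloud rendering, but the data flow between the four steps is the same.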
In another aspect, there is provided a rendering apparatus of a live view, the apparatus including:
the video acquisition module is used for acquiring a live video stream, wherein the live video stream is generated based on video data acquired by the anchor terminal, and the live video stream is used for representing live pictures in a live broadcasting room;
the instruction acquisition module is used for acquiring an effect rendering instruction, wherein the effect rendering instruction comprises an effect identifier, the effect identifier is used for uniquely identifying an effect resource, and the effect rendering instruction is used for applying the effect resource to the live video stream;
the resource acquisition module is used for acquiring a first special effect resource characterized by the special effect identifier based on the special effect identifier in the special effect rendering instruction;
the special effect application module is used for applying the first special effect resource to the live video stream to generate a special effect video stream, the special effect video stream is used for being sent to at least one terminal in the live broadcasting room, and the at least one terminal is used for rendering the special effect video stream and displaying live broadcasting pictures with special effects.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement a method for rendering a live view as in any one of the embodiments of the present application.
In another aspect, a computer readable storage medium is provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by a processor to implement a method for rendering a live view as in any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method for rendering a live view according to any one of the above embodiments.
The beneficial effects of the technical solutions provided by the embodiments of the present application include at least the following:
A live video stream generated based on video data collected by the anchor terminal is acquired, together with a special effect rendering instruction containing a special effect identifier; based on that identifier, a device other than the terminal acquires the first special effect resource the identifier represents, applies it to the live video stream, and generates the special effect video stream. Because both the resource acquisition process and the special effect rendering process are performed by a device other than the terminal, the high configuration difficulty of adapting to different terminals is avoided, as is the need for the terminal to update to the latest installation package in real time before the special effect can be fully presented; at least one terminal in the live room can display the live picture with the special effect efficiently and smoothly simply by receiving the special effect video stream to which the resource has already been applied and rendering it.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
fig. 2 is a flowchart of a method for rendering a live view according to an exemplary embodiment of the present application;
fig. 3 is a flowchart of a method for rendering a live view according to another exemplary embodiment of the present application;
fig. 4 is a flowchart of a method for rendering a live view according to still another exemplary embodiment of the present application;
FIG. 5 is an interactive schematic diagram of a live view rendering system provided in an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of special effects content provided by an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a related art provided by an exemplary embodiment of the present application;
FIG. 8 is a basic interaction diagram of a cloud application provided in an exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of live-combined real-time cloud rendering processing principles and architecture provided in accordance with one exemplary embodiment of the present application;
fig. 10 is a schematic view of a scene interface of a rendering method of an application live frame according to an exemplary embodiment of the present application;
fig. 11 is a block diagram of a live view rendering apparatus according to an exemplary embodiment of the present application;
fig. 12 is a block diagram of a server according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, a brief description will be given of terms involved in the embodiments of the present application.
Cloud computing is a computing model that distributes computing tasks across a resource pool formed by a large number of computers, enabling various application systems to obtain computing power, storage space, and information services as needed. The network that provides the resources is referred to as the "cloud". To users, the resources in the cloud appear infinitely expandable: they can be acquired at any time, used on demand, expanded at any time, and paid for according to use.
As a basic capability provider of cloud computing, a cloud computing resource pool (cloud platform for short) is established, generally called an Infrastructure as a Service (IaaS) platform, in which multiple types of virtual resources are deployed for external clients to select and use. The cloud computing resource pool mainly comprises computing devices (virtualized machines, including operating systems), storage devices, and network devices.
According to logical function division, a Platform as a Service (PaaS) layer can be deployed on the IaaS layer, and a Software as a Service (SaaS) layer can be deployed on the PaaS layer; SaaS can also be deployed directly on IaaS. PaaS is a platform on which software runs, such as a database or a web container. SaaS covers the wide variety of business software, such as web portals and SMS mass-sending platforms. Generally, SaaS and PaaS are upper layers relative to IaaS.
Cloud Social is a virtual social application mode combining the Internet of Things, cloud computing, and mobile-Internet interactive applications. It aims to establish a well-known "resource sharing relationship graph" and, on that basis, develop online social networking. Its characteristic is that a large number of social resources are uniformly integrated and evaluated to form an effective resource pool that provides services to users on demand. The more users participate in the sharing, the greater the utility value that can be created.
In the related art, the terminal downloads an installation package or update package bundled with various special effect resources (such as pictures and videos). When a live interaction occurs in the live room, the back end of the APP determines the resource index ID of the special effect corresponding to the interaction, generates an instruction containing the effect ID, and sends it to the terminal, so that the terminal locally loads the special effect resource corresponding to the effect ID from the installation package or update package and renders and presents the live picture with the special effect. During rendering of the same special effect resource, a terminal with a better hardware configuration can render it quickly and completely, whereas a terminal with a poorer hardware configuration, owing to its weaker rendering compute power, can only render it slowly and with stuttering, and may even be unable to complete the rendering of a resource that demands more compute power. As a result, different terminal objects using the same APP see different effects, which degrades the rendering effect and the efficiency of human-computer interaction; moreover, APP development must remain compatible with many hardware configurations, which makes development more difficult.
In the embodiments of the present application, a method for rendering a live picture is provided: a live video stream generated based on video data collected by the anchor terminal is acquired, together with a special effect rendering instruction containing a special effect identifier, the instruction being used to apply the special effect resource represented by the identifier to the live video stream; based on the special effect identifier in the instruction, the first special effect resource represented by the identifier is then acquired and applied to the live video stream, generating the special effect video stream. Because both the resource acquisition process and the special effect rendering process are performed by a device other than the terminal, the high configuration difficulty of adapting to different terminals is avoided, as is the need for the terminal to update to the latest installation package in real time before the special effect can be fully presented; at least one terminal in the live room can display the live picture with the special effect efficiently and smoothly simply by receiving the special effect video stream to which the resource has already been applied and rendering it. The method can be applied to various live scenes, such as game live-streaming scenes, live chat scenes, live decryption scenes, and the like, which is not limited in the embodiments of the present application.
It should be noted that the information (including but not limited to user device information and user personal information), data (including but not limited to data for analysis, stored data, and displayed data), and signals involved in this application are all authorized by the user or fully authorized by all parties, and the collection, use, and processing of the relevant data must comply with the relevant laws, regulations, and standards of the relevant region. For example, the special effect resources, the live video stream, and the like are all acquired with full authorization.
Before and during collection of the user's relevant data, the method and device may display a prompt interface or pop-up window, or output a voice prompt, to inform the user that data collection is in progress. The steps that acquire the user's relevant data begin only after the user's confirmation on the prompt interface or pop-up window has been obtained; otherwise (that is, when such confirmation is not obtained), those steps end and the user's relevant data is not acquired. In other words, all user data collected in this application is collected with the user's consent and authorization, and the collection, use, and processing of relevant user data must comply with the relevant laws, regulations, and standards of the relevant region.
Next, the implementation environment involved in the embodiments of the present application is described. The method for rendering a live picture provided in the embodiments of the present application is implemented through data interaction between a terminal and a server. Referring to fig. 1, the implementation environment involves an anchor terminal 110 and a server 120, which are connected through a communication network 130.
In some embodiments, while the anchor is broadcasting through the live room, the anchor terminal 110 may start the camera with the anchor's authorization, acquire video data through the camera's capture function, and generate a live video stream based on that video data. That is, the live video stream is a video stream generated based on video data collected by the anchor terminal, and it is used to represent the live picture in the live room.
Optionally, the anchor terminal 110 transmits the live video stream to the server 120 through the communication network 130, so that the server 120 acquires the live video stream.
In some embodiments, server 120 may also obtain special effects rendering instructions.
Illustratively, the special effect rendering instruction is an instruction generated by selecting any one or more special effects. For example, the anchor terminal 110 triggers a control provided on the live interface to realize the corresponding special effect, and this triggering generates a special effect rendering instruction through the anchor terminal 110, so the special effect rendering instruction received by the server 120 comes from the anchor terminal 110; alternatively, a viewer terminal watching the live broadcast triggers a control provided on the live interface to realize the corresponding special effect, and this triggering generates a special effect rendering instruction through the viewer terminal, so the special effect rendering instruction received by the server 120 comes from the viewer terminal.
The special effect rendering instruction comprises special effect identifiers, and the special effect identifiers are used for uniquely identifying special effect resources.
Illustratively, the live interface provides a plurality of special effect controls, each special effect control corresponds to a special effect identifier, and the special effect identifier is used for uniquely characterizing special effect resources indicated by the special effect control; and determining the special effect identifier based on the selection of the special effect control, and further generating a special effect rendering instruction comprising the special effect identifier. The special effect rendering instruction is used for applying special effect resources to the live video stream.
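As a hedged sketch of the instruction described above, the mapping from a selected special effect control to a special effect rendering instruction might look as follows. The field names (`effect_id`, `room_id`, `sender`) and the control-to-identifier table are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

# Each control on the live interface maps to one unique effect identifier
# (table contents are made up for illustration).
EFFECT_CONTROLS = {"gift_rocket": "fx_101", "applause": "fx_102"}

@dataclass(frozen=True)
class EffectRenderInstruction:
    effect_id: str   # uniquely identifies one special effect resource
    room_id: str     # the live room the effect applies to
    sender: str      # which terminal triggered it: "anchor" or "viewer"

def on_control_selected(control_name: str, room_id: str, sender: str) -> EffectRenderInstruction:
    """Selecting a control resolves its effect identifier and builds the instruction."""
    return EffectRenderInstruction(EFFECT_CONTROLS[control_name], room_id, sender)

instr = on_control_selected("gift_rocket", "room_42", "viewer")
```

Either the anchor terminal or a viewer terminal could build such an instruction and send it to the server; the server only needs the identifier to locate the resource.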
In some embodiments, server 120 obtains a first effect resource characterized by the effect identification based on the effect identification in the effect rendering instruction. Illustratively, a plurality of special effects resources are cached by the server 120, and the server 120 can determine the first special effects resource to be applied to the live video stream according to the special effects identifier.
In some embodiments, the server 120, after determining the first special effects resource, applies the first special effects resource to the live video stream and generates the special effects video stream.
The process of acquiring the special effect resource and applying it to the live video stream is performed by the server 120, thereby realizing a faster special effect rendering process by means of the server 120, which has higher computing power than the terminals (the anchor terminal 110 and the viewer terminal).
Based on the special effect rendering process of applying the first special effect resource to the live video stream, the server 120 obtains a special effect video stream for transmission to at least one terminal within the live room.
Illustratively, the at least one terminal within the living room includes at least one of the anchor terminal 110 and the viewer terminal described above.
Optionally, at least one terminal is used for rendering the special effect video stream and displaying the live picture with special effect.
Illustratively, after at least one terminal receives the special effect video stream, the special effect video stream is rendered in the current terminal, so that a terminal interface displays a live broadcast picture with special effects, and the special effects are special effect contents corresponding to the first special effect resource.
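The distribution step can be sketched as a simple fan-out: the server pushes the already-rendered special effect video stream to every terminal in the live room, and each terminal only displays the frames it receives. The `Terminal` class and `push_effect_stream` function are illustrative assumptions, not names from the patent.

```python
class Terminal:
    """A terminal in the live room: it only decodes and displays, doing no effect work."""
    def __init__(self, name: str):
        self.name = name
        self.displayed = []

    def render(self, frame):
        # The terminal no longer applies effects; it just displays the received frame.
        self.displayed.append(frame)

def push_effect_stream(effect_stream, terminals):
    """Fan the finished special effect video stream out to every terminal in the room."""
    for frame in effect_stream:
        for t in terminals:
            t.render(frame)

room = [Terminal("anchor"), Terminal("viewer_1")]
push_effect_stream(["frame0+fx", "frame1+fx"], room)
```

Since every terminal receives the same pre-composited frames, terminals with weak hardware display exactly the same effect as terminals with strong hardware.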
It should be noted that the server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (Content Delivery Network, CDN), and basic cloud computing services such as big data and an artificial intelligence platform. The terminal may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc. The terminal and the server may be directly or indirectly connected through wired or wireless communication, which is not limited herein.
Cloud technology refers to a hosting technology that unifies hardware, software, network, and other resources in a wide area network or local area network to realize the computation, storage, processing, and sharing of data. It is the general term for the network, information, integration, management-platform, and application technologies based on the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support: the background services of technical network systems, such as video websites, picture websites, and other portal websites, require large amounts of computing and storage resources. As the internet industry develops further, every article may in the future carry its own identification mark that must be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data require strong back-end system support, which can only be realized through cloud computing.
In some embodiments, the servers described above may also be implemented as nodes in a blockchain system.
Combining the terminology introduced above with the application scenario, the live picture rendering method provided in the present application is described below, taking its application to a server as an example. As shown in fig. 2, the method includes the following steps 210 to 240.
Step 210, acquiring a live video stream.
The live video stream is a video stream generated based on video data collected by the anchor terminal.
Illustratively, the anchor terminal is a terminal used by an anchor object, the anchor object is an object that initiates live broadcast, or the anchor object is an object that appears in live broadcast, or the like.
Optionally, taking the anchor object as the object that initiates live broadcast as an example, the anchor terminal is correspondingly provided with a live broadcast function, and the live broadcast function is used to establish a live broadcast room through the anchor terminal so as to conduct the live broadcast process.
Illustratively, the live broadcast function is a system function of a host broadcast terminal, and a host broadcast object initiates live broadcast by using the host broadcast terminal and establishes a live broadcast room; or the live broadcasting function is a function of a host broadcasting terminal after downloading a live broadcasting application program, and a host broadcasting object enters the live broadcasting application program by using the host broadcasting terminal and initiates live broadcasting, so that a live broadcasting room is established; or the live broadcast function is an applet function built in a certain application program downloaded by the anchor terminal, and the anchor object uses the anchor terminal to enter the applet of the certain application program and initiate live broadcast, thereby establishing a live broadcast room and the like.
In the process of live broadcasting through a live broadcasting room, a host broadcasting terminal can acquire video data acquired by a camera based on the authorization of a host broadcasting object to start the camera; in addition, the anchor terminal can also acquire the audio data collected by the microphone as a component part of the video data and the like based on the authorization of the anchor object to start the microphone. Optionally, the camera and the microphone may be implemented as a component configured by the anchor terminal itself, or may be implemented as a component externally connected to the anchor terminal.
Illustratively, the anchor terminal collects a plurality of video frames with time sequences in a certain time period based on the starting of the camera, and takes the plurality of video frames with time sequence as video data; or, the anchor terminal may collect a plurality of video frames and a plurality of audio frames having a time sequence in a certain period of time based on the camera and the microphone being turned on, and use the plurality of video frames and the plurality of audio frames having a time sequence relationship as video data.
Optionally, after obtaining video data composed of a plurality of video frames, the anchor terminal combines the plurality of video frames based on a time sequence relationship to obtain a live video stream; or after obtaining video data composed of a plurality of video frames and a plurality of audio frames, the anchor terminal combines the video frames and the audio frames based on a time sequence alignment relationship, so as to obtain a live video stream.
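The timestamp-based combination described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the frame representation (`MediaFrame` with a presentation timestamp) is an assumption made for the example.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class MediaFrame:
    timestamp: float                        # presentation time in seconds
    kind: str = field(compare=False)        # "video" or "audio"
    payload: bytes = field(compare=False, default=b"")

def mux_live_stream(video_frames, audio_frames):
    """Merge timestamped video and audio frames into a single
    timestamp-ordered sequence -- a simplified 'live video stream'."""
    # Ordering compares only the timestamp field (compare=False elsewhere),
    # so heapq.merge interleaves the two sorted sequences by time.
    return list(heapq.merge(sorted(video_frames), sorted(audio_frames)))
```

Real muxers (e.g. for FLV or MPEG-TS) also handle codec headers and clock drift; the point here is only the time-sequence alignment of the two frame streams.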
The live video stream is used for representing live pictures in the live broadcasting room.
Schematically, the live broadcast room is a virtual room established when the anchor object initiates live broadcast, and the live broadcast room correspondingly displays the live picture collected by the anchor terminal of that anchor object, where the live picture is the live video stream. In addition, the anchor object that initiated the live broadcast may invite other anchor objects to participate in the live broadcast together online; in that case the live broadcast room includes other anchor objects besides the initiating anchor object, the anchor terminals corresponding to the other anchor objects also collect live pictures, and the two live pictures are synthesized to obtain the live video stream.
In some embodiments, after generating a live video stream based on video data, the anchor terminal sends the live video stream to the server, so that the server obtains the live video stream.
Optionally, the anchor terminal sends the live video stream to the live platform, so that the live platform obtains the live video stream.
In some embodiments, after the video data is acquired by the anchor terminal, the video data is directly sent to the server, so that the server encodes the video data, and a live video stream is obtained.
Step 220, obtaining special effect rendering instructions.
The special effect rendering instruction is an instruction for realizing the special effect rendering process, that is: an instruction for rendering a special effect. Special effects include various types such as character special effects, expression special effects, and image special effects; a special effect can be realized as either a static special effect or a dynamic special effect.
In some embodiments, the special effects rendering instructions are instructions generated based on special effects triggering operations on special effects content in the live interface.
Optionally, the live interface may be implemented as an interface displayed by the anchor terminal or as an interface displayed by the viewer terminal. For example: when the special effect rendering instruction comes from the anchor terminal, the special effect rendering instruction is an instruction generated by carrying out special effect triggering operation on special effect content in a live broadcast interface displayed by the anchor terminal based on an anchor object; when the special effect rendering instruction comes from the audience terminal, the special effect rendering instruction is generated by carrying out special effect triggering operation on special effect content in a live interface displayed by the audience terminal based on the audience object.
The special effect content is used for rendering special effect resources characterized by special effect identification to obtain the special effect. Namely: the special effect triggering operation is used for displaying special effects corresponding to the special effect content by triggering the special effect content. For example: and triggering and displaying the star special effect aiming at the area A in the live interface, and taking the triggering operation as the special effect triggering operation.
The special effect triggering operation is an operation of displaying the special effect while realizing different live broadcast functions. For example: triggering is carried out on the region B in the live interface to realize the praise function, meanwhile, the praise function can display a praise special effect in an accompanying manner, and the triggering operation is used as a special effect triggering operation.
Optionally, the special effect triggering operation includes a control triggering operation, a gesture triggering operation, a voice triggering operation, and other triggering forms, which are not limited herein.
In some embodiments, the special effect rendering instruction comes from at least one of the anchor terminal and a viewer terminal.
Illustratively, the audience terminal is a terminal other than the anchor terminal among the plurality of terminals in the live broadcast room; the audience terminal corresponds to an audience object, and the audience object participates in the live broadcast room by watching the live broadcast. The audience object may interact with the anchor object or with other audience objects within the live broadcast room; the anchor object may likewise interact with the audience objects or with an anchor object co-broadcasting in the live broadcast room.
Optionally, the special effect rendering instruction comes from the anchor terminal.
Schematically, a live interface displayed on the anchor terminal is laid out with a plurality of functional areas, and different functional areas are used to realize different live broadcast functions. For example: the functional areas include a comment area, used to trigger the comment function so as to publish comment content; the functional areas include an account attention area, used to trigger the account-following function so as to follow the current anchor object; the functional areas include a gift-gifting area, used to trigger the gift-gifting function so as to give a gift to the anchor; and the functional areas include a praise area, used to trigger the praise function so as to praise the anchor object and the live broadcast room and increase their popularity.
Optionally, based on default configuration of the living room or custom configuration process of the terminal usage object (such as the anchor object, the audience object, etc.), the special effect content needs to be displayed concomitantly when triggering for a part of the functional area to realize the corresponding function. For example: when triggering the praise area, the thumb special effect and/or the love special effect are required to be displayed in a concomitant way; when triggering is performed for the gift-gifting area, it is necessary to display "gift-gifting special effects" and/or "gifted gift special effects" and the like.
Schematically, after the anchor terminal displays the live interface, when the anchor object triggers against the area in which the special effect content needs to be displayed, the anchor terminal generates a special effect rendering instruction.
Similarly, taking the example in which the special effect rendering instruction comes from the audience terminal: when watching the live broadcast room, the audience terminal displays the live interface corresponding to the live broadcast room, and the live interface includes functional areas whose triggering, in addition to realizing the corresponding function, requires special effect content to be displayed concomitantly. When the audience object triggers an area in which special effect content needs to be concomitantly displayed, the audience terminal generates the special effect rendering instruction.
Alternatively, the live interface displayed by the audience terminal may be the same as the live interface displayed by the anchor terminal, or may be different from the live interface displayed by the anchor terminal, which is not limited herein.
The special effect rendering instruction comprises a special effect identifier; the special effects identification is used to uniquely identify the special effects resource.
Illustratively, when generating the special effect rendering instruction, special effect identifiers for distinguishing different special effect resources are included, and the different special effect resources are used for representing data resources for rendering to obtain different special effect contents.
For example: the triggering operation performed on the praise area is used as the special effect triggering operation, the special effect content corresponding to the praise area is 'loving special effect', the special effect is required to be rendered through the special effect resource 1 corresponding to the 'loving special effect', if the 'loving special effect' is required to be displayed based on the special effect rendering instruction, the special effect rendering instruction needs to comprise a special effect mark a for representing the 'loving special effect', and the special effect mark a is used for uniquely representing the special effect resource 1.
Similarly, the triggering operation performed on the gift-gifting area serves as the special effect triggering operation. The special effect content corresponding to the gift-gifting area differs depending on the gifted gift; if the special effect content to be displayed based on the special effect rendering instruction is the "gifted gift special effect", that content needs to be rendered through the special effect resource 2 corresponding to the "gifted gift special effect", and the special effect rendering instruction needs to include the special effect identifier b representing the "gifted gift special effect", where the special effect identifier b uniquely represents special effect resource 2.
Based on the relationship between the special effect rendering instruction and the special effect identifier included in the special effect rendering instruction, if a live picture with a special effect needs to be obtained, special effect content needs to be added on the basis of a live video stream, namely: the special effect rendering instruction is used for applying special effect resources to the live video stream.
In some embodiments, the anchor terminal may send effect rendering instructions, and the audience terminal may also send effect rendering instructions; when the two terminals send the special effect rendering instruction, the two terminals can send the special effect rendering instruction simultaneously or sequentially.
Step 230, based on the special effect identifier in the special effect rendering instruction, obtaining a first special effect resource characterized by the special effect identifier.
Illustratively, after the special effect rendering instruction is obtained, a special effect identifier is obtained from the special effect rendering instruction.
Optionally, the special effect identifier is stored at a preset position in the special effect rendering instruction, and when the special effect identifier is acquired from the special effect rendering instruction, the special effect identifier is read from the preset field position in the special effect rendering instruction.
In some embodiments, the server is configured to cache a plurality of special effect resources, where the plurality of special effect resources are labeled with special effect identifiers respectively and correspondingly; after the special effect identification in the special effect rendering instruction is obtained, comparing the special effect identification with special effect identifications respectively marked by a plurality of special effect resources to determine a first special effect resource represented by the special effect identification in the special effect rendering instruction from the plurality of special effect resources.
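As a sketch, the cache lookup in step 230 might look like the following. The cache contents, field name `effect_id`, and function name are purely illustrative assumptions, not part of the patent.

```python
# Hypothetical server-side cache: special effect identifier -> resource.
EFFECT_CACHE = {
    "a": {"name": "love special effect", "frame_count": 48},
    "b": {"name": "gifted gift special effect", "frame_count": 90},
}

def resolve_effect(instruction: dict) -> dict:
    """Read the identifier from its preset field in the special effect
    rendering instruction and look the resource up among the cached
    resources, yielding the first special effect resource."""
    effect_id = instruction["effect_id"]   # preset field position (assumed name)
    try:
        return EFFECT_CACHE[effect_id]
    except KeyError:
        raise KeyError(f"unknown special effect identifier: {effect_id!r}")
```

The comparison against the labeled identifiers reduces to a dictionary lookup when the cache is keyed by identifier, which is the natural choice given that each identifier uniquely represents one resource.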
Step 240, applying the first special effect resource to the live video stream to generate a special effect video stream.
Illustratively, after determining the first special effect resource to be applied, the first special effect resource is applied to the live video stream, thereby generating the special effect video stream.
Optionally, generating a special effect animation based on the first special effect resource, and applying the special effect animation to the live video stream, thereby generating the special effect video stream. For example: after the special effect animation is generated, determining a current time stamp corresponding to the live video stream at the current moment, so that the special effect animation is applied to the live video stream from the current time stamp, and the special effect video stream is obtained.
Illustratively, the special effect video stream is used for representing a video stream obtained after the special effect resource is applied to the live video stream.
The special effect video stream is used for being sent to at least one terminal in the live broadcasting room.
Optionally, the at least one terminal is a terminal currently within the live broadcast room. For example: the at least one terminal is implemented as the anchor terminal and a plurality of audience terminals that are watching the live broadcast room.
Optionally, the at least one terminal is a terminal that is currently within the live broadcast room and has sent the special effect rendering instruction. For example: the plurality of terminals currently in the live broadcast room are terminal 1, terminal 2, and terminal 3; if the terminal that received the special effect triggering operation and sent the special effect rendering instruction is terminal 1, then terminal 1 serves as the at least one terminal that receives the special effect video stream.
The terminal is used for rendering the special effect video stream and displaying the live broadcast picture with the special effect.
Illustratively, after at least one terminal receives the special effect video stream, each terminal renders the special effect video stream based on a picture rendering function to display a live picture with special effects.
The live picture with the special effect is realized as the live picture to which the first special effect resource has been applied, so that the special effect content represented by the first special effect resource is additionally displayed on the otherwise plain live picture, enriching the display elements of the picture and improving its sense of layering.
In summary, a live video stream generated based on video data collected by the anchor terminal is obtained, a special effect rendering instruction including a special effect identifier is obtained, and a device outside the terminal obtains, based on the special effect identifier in the special effect rendering instruction, the first special effect resource represented by that identifier, applies the first special effect resource to the live video stream, and generates the special effect video stream. In this way, the special effect resource acquisition process and the special effect rendering process are executed by the server, which avoids both the configuration difficulty caused by having to adapt to different terminals and the problem that a terminal can fully present a special effect only by updating to the latest installation package in real time. At least one terminal in the live broadcast room can efficiently and smoothly display the live picture with the special effect by directly receiving the special effect video stream to which the special effect resource has been applied and rendering it.
In an alternative embodiment, when the first special effect resource is applied to the live video stream, the special effect picture frame is obtained based on the first special effect resource first, and then the special effect picture frame is applied to the live video stream, so that the special effect video stream is obtained. Illustratively, as shown in FIG. 3, the step 240 shown in FIG. 2 described above may also be implemented as steps 310 through 330 described below.
Step 310, obtaining a special effect rendering frame rate.
The special effect rendering frame rate represents the number of special effect picture frames displayed per second when the special effect resource is rendered; the frame rate is measured in frames transmitted per second, abbreviated in English as Frames Per Second (FPS).
Illustratively, the special effect picture frame is used for being combined with the live picture frame of the live video stream to obtain the special effect video stream. Namely: the special effect picture frame is a picture frame for composing a special effect.
Optionally, the special effect rendering frame rate is a preset value. For example: a preset special effect rendering frame rate of 120 FPS means that 120 special effect picture frames are displayed per second.
Optionally, the effect rendering frame rate is a numerical value determined based on the effect resource. For example: the different special effect resources are respectively corresponding to one special effect rendering frame rate, and after the first special effect resource is obtained, the special effect rendering frame rate corresponding to the first special effect resource is determined.
Step 320, based on the first effect resource and the effect rendering frame rate, generates a plurality of effect frames for exhibiting the first effect resource.
The special effect picture frames are used for forming special effects and displaying the special effect pictures.
Illustratively, a frame-by-frame special effect picture frame is rendered in real time according to the frame rate speed represented by the special effect rendering frame rate, so that a plurality of special effect picture frames are obtained.
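The frame-by-frame schedule can be expressed as: frame k of the effect is rendered at time k / fps relative to the effect's start. A minimal sketch (the function name is an assumption for illustration):

```python
def effect_frame_times(duration: float, fps: int) -> list:
    """Render times, relative to the effect's start, of the special
    effect picture frames: one frame every 1/fps seconds for the
    whole special effect duration."""
    frame_count = int(duration * fps)
    return [k / fps for k in range(frame_count)]
```

For example, a 2-second effect at 60 FPS yields 120 special effect picture frames spaced 1/60 s apart.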
In step 330, a plurality of special effect frames are applied to the live video stream to generate a special effect video stream.
In an alternative embodiment, a plurality of special effect picture frames are combined based on a time sequence relationship to obtain a special effect picture corresponding to the first special effect resource.
The special effect picture corresponds to a special effect duration, and the special effect duration is the time period length for displaying the special effect.
Illustratively, a timing relationship exists between the plurality of special effect picture frames so that the plurality of special effect picture frames can be combined. Namely: and arranging and combining the plurality of special effect picture frames with the time sequence relation on the time sequence, so as to obtain the special effect picture corresponding to the first special effect resource.
In an alternative embodiment, the special effect picture is applied to the live video stream to obtain the special effect video stream.
Optionally, after obtaining the special effect picture, taking the special effect picture as a top layer and taking the live video stream as a bottom layer, thereby applying the special effect picture to the live video stream and obtaining the special effect video stream.
In some embodiments, determining a first timestamp for starting to apply the special effect picture frame from a time axis corresponding to the live video stream; and starting from the first timestamp, applying the special effect picture to the live video stream to obtain the special effect video stream.
The duration of the special effect video stream is special effect duration.
In an alternative embodiment, a plurality of live view frames to which a plurality of special effect view frames are applied are determined from a live video stream.
The live-broadcast picture frames correspond to the special-effect picture frames one by one, and the interval duration between two adjacent live-broadcast picture frames is determined based on the special-effect rendering frame rate.
Illustratively, a first timestamp for starting to apply the special effect picture frame is determined from a time axis corresponding to the live video stream.
In some embodiments, the first timestamp of the beginning application special effects picture frame may be implemented as the current time.
Illustratively, the moment of generating the special effect picture is taken as the current moment, so that the received live video stream is analyzed to determine a first timestamp corresponding to the live video stream at the current moment; or, taking the moment of generating the first special effect picture frame as the current moment, so as to analyze the received live video stream to determine a first timestamp corresponding to the live video stream at the current moment; or taking the moment of receiving the special effect rendering instruction as the current moment, so as to analyze the received live video stream to determine a first timestamp corresponding to the live video stream at the current moment.
In some embodiments, the first timestamp of the beginning application of the special effects picture frame may also be implemented as the selected time instant.
Illustratively, the special effect rendering instruction includes a selected time, where the selected time is determined based on a special effect triggering operation of the anchor object or the audience object, and is determined after the time for displaying the special effect is configured.
Optionally, the selected time is a future time. For example: the current time is 10 points 20 minutes zero 30 seconds, the selected time is 10 points 25 minutes and the like.
In some embodiments, starting from the first timestamp, a plurality of live view frames is determined to which the plurality of effect view frames are applied. Illustratively, the live picture frame is a picture frame constituting a live video stream, and the plurality of live picture frames is a plurality of picture frames determined from the first timestamp.
Wherein, a plurality of time stamps are determined from the first time stamp, and each time stamp corresponds to a live broadcast picture frame and a special effect picture frame respectively, namely: the plurality of time stamps are in one-to-one correspondence with the plurality of live-broadcast picture frames, the plurality of time stamps are in one-to-one correspondence with the plurality of special-effect picture frames, and the first time stamp is in correspondence with the first special-effect picture frame, namely the live-broadcast picture frame of the first special-effect picture frame is applied.
Illustratively, the interval duration between two adjacent special effect picture frames is determined based on the special effect rendering frame rate, so the interval duration between two adjacent live broadcast picture frames is also determined based on the special effect rendering frame rate, and similarly, the interval duration between two adjacent time stamps is also determined based on the special effect rendering frame rate.
For example: the special effect rendering frame rate is 60FPS, 60 special effect picture frames are displayed in one second, and the interval duration between two adjacent live broadcast picture frames is 1/60 second; similarly, the interval duration between two adjacent special effect picture frames is also 1/60 second, and the interval duration between two adjacent time stamps is also 1/60 second, etc.
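The mapping from the first timestamp to the live picture frames that receive effect frames can be sketched as follows (names are assumptions for illustration):

```python
def target_timestamps(first_ts: float, fps: int, frame_count: int) -> list:
    """Timestamps on the live video stream's time axis that receive the
    special effect picture frames: spaced 1/fps apart, starting at the
    first timestamp, one per effect frame."""
    return [first_ts + k / fps for k in range(frame_count)]
```

Each returned timestamp corresponds one-to-one with a live picture frame and with the special effect picture frame applied to it, matching the 1/60-second spacing in the 60 FPS example above.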
In an alternative embodiment, a plurality of special effect picture frames are respectively applied to corresponding live broadcast picture frames to obtain a plurality of special effect video picture frames.
The special effect video picture frame is used for generating a special effect video stream.
Illustratively, the first special effect picture frame is applied to a first live picture frame of the plurality of live picture frames, and similarly, the nth special effect picture frame is applied to an nth live picture frame of the plurality of live picture frames, where n is a positive integer greater than 1. Optionally, the live-broadcast picture frame to which the special effect picture frame is applied is called a special effect video picture frame, and based on the process that the special effect picture frames are respectively applied to the live-broadcast picture frames, the special effect video picture frames respectively corresponding to the special effect picture frames are obtained, that is, the special effect video picture frames respectively corresponding to the live-broadcast picture frames are obtained.
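The per-frame application ("top layer over bottom layer") can be sketched with frames modeled as pixel lists, where an effect pixel of `None` lets the live pixel show through. The frame representation is an assumption made for the example, not the patent's data format.

```python
def overlay(live_frame: list, effect_frame: list) -> list:
    """Top layer (effect) wins wherever it has a pixel; elsewhere the
    bottom layer (live picture) shows through."""
    return [fx if fx is not None else lv
            for lv, fx in zip(live_frame, effect_frame)]

def composite(live_frames: list, effect_frames: list) -> list:
    """Apply the n-th special effect picture frame to the n-th live
    picture frame, yielding the special effect video picture frames."""
    return [overlay(lv, fx) for lv, fx in zip(live_frames, effect_frames)]
```

A production renderer would alpha-blend pixel values rather than replace them outright, but the one-to-one pairing of effect frames with live frames is the same.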
In an alternative embodiment, a plurality of special effects video frames are combined based on a timing relationship to generate a special effects video stream.
Illustratively, the plurality of special effect picture frames correspond one-to-one with the plurality of special effect video picture frames, so the time sequence relationship of the special effect picture frames is also the time sequence relationship of the special effect video picture frames; the special effect video picture frames can therefore be combined based on this time sequence relationship to generate the special effect video stream.
Optionally, the special effect video stream may be generated by combining a plurality of special effect video picture frames according to a time sequence based on a time axis corresponding to the live video stream.
The duration of the special effect video stream is special effect duration. Namely: the time length of the special effect video stream for displaying the live picture with the special effect is the same as the time length of the special effect time length corresponding to the special effect picture.
Illustratively, after determining the first timestamp, when the special effect picture frame is applied to a plurality of live picture frames in the live video stream, starting from the first timestamp, applying the special effect picture frame to the corresponding live picture frame, thereby obtaining the special effect video stream with the special effect duration. Illustratively, the special effect duration is a preset duration, such as 2 seconds, 3 seconds, etc.
In some embodiments, after the last special effect picture frame is applied to the last live picture frame among the plurality of live picture frames, the video segment to which the plurality of special effect picture frames has been applied can be determined, and this video segment is referred to as the special effect video stream.
Optionally, the special effect video stream is sent to at least one terminal in the live broadcast room, so that the at least one terminal renders the special effect video stream and displays live broadcast pictures with special effects.
In some embodiments, after the special effect picture frames are applied to the live video stream, special effect video picture frames for forming the special effect video stream are obtained, the special effect picture frames are in one-to-one correspondence with the special effect video picture frames for forming the special effect video stream, so that the live video picture with special effect can be conveniently displayed on at least one terminal as soon as possible, the special effect video picture frames obtained after the special effect picture frames are applied are sequentially sent to at least one terminal, and therefore the at least one terminal can obtain the live video picture with special effect which is displayed more quickly through a sequential rendering process of the special effect video picture frames.
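Sending each special effect video picture frame as soon as it is composited, rather than waiting for the whole segment, might look like the following sketch; `send` stands in for the network transport and the pixel-list frame model is an assumption carried over for illustration.

```python
def stream_effect_frames(live_frames, effect_frames, send):
    """Composite each special effect video picture frame and hand it to
    `send` immediately, so terminals can begin rendering frame by frame
    without waiting for the full special effect video stream."""
    for lv, fx in zip(live_frames, effect_frames):
        frame = [p_fx if p_fx is not None else p_lv   # effect pixel on top
                 for p_lv, p_fx in zip(lv, fx)]
        send(frame)
```

This sequential hand-off is what lets the at least one terminal display the live picture with the special effect sooner than a whole-segment delivery would allow.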
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
In summary, the special effect resource obtaining process and the special effect rendering process are carried out by equipment outside the terminal, so that the problem of high configuration difficulty caused by the need of adapting to different terminals is avoided, and the problem that the terminal can completely present the special effect only by updating the latest installation package in real time is also avoided; at least one terminal in the live broadcasting room can efficiently and smoothly display live broadcasting pictures with special effects by directly receiving the special effect video stream with special effect resources applied and rendering the special effect video stream.
In the embodiment of the present application, a scheme is introduced in which, when the first special effect resource is applied to the live video stream, the special effect picture is first obtained based on the first special effect resource and the special effect rendering frame rate, and the special effect picture is then applied to the live video stream to obtain the special effect video stream. This avoids the stuttering and inefficiency that may occur when the terminal performs the special effect rendering process itself. By applying, in real time, the special effect picture frames that form the special effect picture to the live picture frames in the live video stream, the special effect video picture frames that form the special effect video stream can be obtained in real time; these special effect video picture frames are then sent to the at least one terminal, so that the at least one terminal can display the live picture with the special effect in real time based on the special effect video picture frames, which substantially improves special effect rendering efficiency and achieves high human-computer interaction efficiency.
In an optional embodiment, the live picture rendering method is applied to a live picture rendering system, where the live picture rendering system is used for caching a plurality of special effect resources, each of which is correspondingly marked with a special effect identifier. After the special effect rendering instruction is received, the special effect identifier in the special effect rendering instruction is compared with the special effect identifiers respectively marked on the at least one special effect resource stored in the live picture rendering system, thereby obtaining the first special effect resource. Illustratively, as shown in fig. 4, step 230 shown in fig. 2 is performed by the live picture rendering system, and step 230 may be further implemented as the following steps 410 to 432.
Step 410, search the stored at least one special effect resource based on the special effect identifier in the special effect rendering instruction.
Illustratively, the rendering system of the live broadcast picture is used for caching a plurality of special effect resources, each special effect resource is respectively marked with a special effect identifier, and the special effect identifier is used for uniquely identifying the special effect resource.
In some embodiments, the rendering system of the live broadcast picture already stores at least one special effect resource; after the special effect rendering instruction is received and the special effect identifier in the special effect rendering instruction is determined, the special effect identifiers marked on the at least one stored special effect resource are searched based on that special effect identifier.
Optionally, searching the at least one special effect resource yields a search result, where the search result includes at least one of the following cases.
(1) The search result indicates that the special effect identifiers marked on the special effect resources stored in the rendering system of the live broadcast picture include the special effect identifier in the special effect rendering instruction.
Illustratively, the rendering system of the live broadcast picture stores at least one special effect resource, each correspondingly marked with a special effect identifier. The at least one special effect identifier is searched, and when it is determined that the at least one special effect identifier includes the special effect identifier in the special effect rendering instruction, this search result is obtained.
(2) The search result indicates that the special effect identifiers marked on the special effect resources stored in the rendering system of the live broadcast picture do not include the special effect identifier in the special effect rendering instruction.
Illustratively, the rendering system of the live broadcast picture stores at least one special effect resource, each correspondingly marked with a special effect identifier. The at least one special effect identifier is searched, and when it is determined that the at least one special effect identifier does not include the special effect identifier in the special effect rendering instruction, this other search result is obtained.
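A minimal sketch of the two search-result cases, assuming the cached special effect resources are kept in a dictionary keyed by special effect identifier (the data structures and names are illustrative, not mandated by the patent):

```python
# Hypothetical sketch of step 410: look up the special effect identifier from
# a rendering instruction among the identifiers marked on locally cached
# special effect resources.

def search_effect_resource(cache, instruction):
    """Return (found, resource): case (1) is a hit, case (2) a miss."""
    effect_id = instruction["effect_id"]
    if effect_id in cache:            # case (1): identifier is marked locally
        return True, cache[effect_id]
    return False, None                # case (2): identifier is not cached

cache = {"a": "hot-pot-resource", "b": "warning-resource"}
hit, res = search_effect_resource(cache, {"effect_id": "a"})
miss, _ = search_effect_resource(cache, {"effect_id": "z"})
```

The hit case corresponds to step 420 below (use the cached resource directly); the miss case corresponds to steps 430 to 432 (request the resource from the holder).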
In an alternative embodiment, the rendering system of the live frame includes a rendering server, the rendering server is used for caching a plurality of special effects resources, and the rendering server has at least one special effects resource stored.
Illustratively, the rendering server is a component of the rendering system of the live broadcast picture. The rendering system of the live broadcast picture may be implemented as a cloud server cluster, with the rendering server implemented as at least one cloud server; or the rendering system of the live broadcast picture may be implemented as a server cluster, with the rendering server implemented as at least one server, which may be a physical server, a cloud server, or a combination of physical servers and cloud servers, which is not limited herein.
In an alternative embodiment, a first effect resource characterized by the effect identification is obtained based on the search result.
Illustratively, based on the difference of the search results, the first special effect resource characterized by the special effect identifier is obtained in different modes.
Alternatively, the following steps 420 or 430 to 432 are methods for obtaining the first special effects resource based on different search results.
Step 420, in response to the search result indicating that the stored at least one special effect resource includes the first special effect resource, obtain the first special effect resource from the at least one special effect resource.
Optionally, the special effect identifier corresponding to the first special effect resource is called the first special effect identifier. When the search result indicates that the special effect identifiers marked on the special effect resources stored by the rendering system of the live broadcast picture include the special effect identifier in the special effect rendering instruction, the at least one special effect identifier corresponding to the rendering system of the live broadcast picture includes the first special effect identifier.
Illustratively, since a special effect resource marked with the first special effect identifier exists among the at least one special effect resource stored in the rendering system of the live broadcast picture, when the first special effect resource is obtained, the special effect resource marked with the first special effect identifier is taken from the at least one special effect resource as the first special effect resource.
The first special effect identifier is a special effect identifier which is included in the special effect rendering instruction and used for acquiring the first special effect resource.
Step 430, in response to the search result indicating that the first special effect resource is not included in the stored at least one special effect resource, generate a special effect resource acquisition request.
Optionally, the special effect identifier corresponding to the first special effect resource is called the first special effect identifier. When the search result indicates that the special effect identifiers marked on the special effect resources stored by the rendering system of the live broadcast picture do not include the special effect identifier in the special effect rendering instruction, the at least one special effect identifier corresponding to the rendering system of the live broadcast picture does not include the first special effect identifier.
Because the rendering system of the live broadcast picture does not store the first special effect resource, if it is to perform the special effect rendering process at the local end with the first special effect resource, the first special effect resource needs to be acquired based on the special effect identifier in the special effect rendering instruction.
Optionally, the rendering system of the live broadcast picture generates a special effect resource acquisition request for acquiring the first special effect resource, where the special effect resource acquisition request is used to acquire the first special effect resource from the special effect resource holder.
Illustratively, the special effects resource holder is an object holding special effects resources.
In some embodiments, the live broadcast process is based on a live broadcast application run by a live broadcast platform; the live broadcast platform is responsible for developing the live broadcast application and for enriching and perfecting it, including providing special effect resources within the live broadcast application. Optionally, the special effect resource holder is implemented as the live broadcast platform that develops and maintains the special effect resources, such as the application background of a live broadcast APP.
In some embodiments, the special effect resources applied in the live broadcast process are resources designed by resource development objects and published to the live broadcast room. Optionally, the special effect resource holder is implemented as a resource development object or the like that designs and publishes the special effect resources.
In an alternative embodiment, the special effects resource acquisition request includes a special effects identifier for acquiring the first special effects resource.
Illustratively, the special effect identifier used for acquiring the first special effect resource is the first special effect identifier. When the special effect resource acquisition request is generated, the first special effect identifier is written into the special effect resource acquisition request; alternatively, a binding relationship is established between the first special effect identifier and the special effect resource acquisition request, and so on.
Optionally, the special effect resource acquisition request is used not only for acquiring the first special effect resource but also for acquiring, at the current moment, other special effect resources that have been generated by the special effect resource holder but are not yet stored at the local end.
For example: the method comprises the steps that a special effect resource 1 and a special effect resource 2 are stored in a rendering system of a live picture, a special effect identifier included in a special effect rendering instruction is used for acquiring a special effect resource 3, and the rendering system of the live picture is used for acquiring the special effect resource 3 and pulling the special effect resource which is already generated but not stored at a local end to the local end when generating a special effect resource acquisition request. If the special effect resource holder generates the special effect resource 1, the special effect resource 2, the special effect resource 3 and the special effect resource 4, the special effect resource acquisition request is also used for pulling the special effect resource 4 to the local end for storage.
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
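The miss path in the example above (resources 1 and 2 stored, resource 3 requested, the holder also returning resource 4) could be sketched as follows; the request layout and the `holder_lookup` callable are hypothetical stand-ins for the network exchange, not the patent's actual protocol:

```python
# Hypothetical sketch of steps 430-432 from the renderer side: when the first
# special effect resource is not cached, build an acquisition request carrying
# the locally stored identifiers, then merge whatever the holder returns.

def acquire_missing_resources(local_cache, wanted_id, holder_lookup):
    request = {"wanted": wanted_id, "stored": set(local_cache)}
    returned = holder_lookup(request)     # resources absent at the local end
    local_cache.update(returned)
    return local_cache[wanted_id]

def fake_holder(request):
    """Stand-in for the special effect resource holder."""
    held = {"1": "r1", "2": "r2", "3": "r3", "4": "r4"}
    # Send back every held resource whose identifier the renderer lacks,
    # which necessarily includes the wanted one.
    return {k: v for k, v in held.items() if k not in request["stored"]}

cache = {"1": "r1", "2": "r2"}
first = acquire_missing_resources(cache, "3", fake_holder)
```

After the call, the local cache holds resources 1 through 4, matching the example in which resource 4 is pulled alongside the requested resource 3.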
Step 431, send the special effect resource acquisition request to the special effect resource holder.
Illustratively, after the rendering system of the live broadcast picture generates the special effect resource acquisition request, the special effect resource acquisition request is sent to the special effect resource holder.
Optionally, when the special effects resource holder is implemented as a live platform for developing and maintaining special effects resources, a special effects resource acquisition request is sent to the live platform to obtain the first special effects resources not stored at the local end from the live platform.
Step 432, receive the first special effect resource sent by the special effect resource holder.
In some embodiments, the first effect identification is included in the effect resource acquisition request.
Optionally, after receiving the special effect resource obtaining request, the special effect resource holder reads a first special effect identifier in the special effect resource obtaining request, further determines a first special effect resource from the generated plurality of special effect resources based on the first special effect identifier, and sends the first special effect resource to a rendering system of the live broadcast picture.
Illustratively, when generating the special effect resource acquisition request, the rendering system of the live broadcast picture writes the first special effect identifier into a designated position of the request and sends the request to the special effect resource holder. After receiving the special effect resource acquisition request, the special effect resource holder reads the designated position in the request to obtain the first special effect identifier. Because the special effect resource holder holds a plurality of special effect resources, it can search those special effect resources based on the first special effect identifier to find the special effect resource marked with the first special effect identifier, which is the first special effect resource.
In some embodiments, the special effect resource obtaining request includes special effect identifiers corresponding to at least one special effect resource stored in the rendering system of the live image.
Optionally, after receiving the special effect resource acquisition request, the special effect resource holder reads the at least one special effect identifier in the request, and compares the at least one special effect identifier with the special effect identifiers respectively corresponding to the plurality of generated special effect resources, thereby determining the other special effect identifiers beyond the at least one special effect identifier; the special effect resources corresponding to those other special effect identifiers, including the first special effect resource, are then sent to the rendering system of the live broadcast picture.
Illustratively, when generating the special effect resource acquisition request, the rendering system of the live broadcast picture writes the special effect identifiers respectively corresponding to the at least one special effect resource it stores into a designated position of the request, and sends the request to the special effect resource holder. For example: the rendering system of the live broadcast picture stores three special effect resources, namely special effect resource A, special effect resource B and special effect resource C, whose corresponding special effect identifiers are special effect identifier a, special effect identifier b and special effect identifier c; special effect identifier a, special effect identifier b and special effect identifier c are written into the designated position of the special effect resource acquisition request, and the request is sent to the special effect resource holder.
Illustratively, after receiving the special effect resource acquisition request, the special effect resource holder reads the designated position in the request and thus learns, from the three special effect identifiers, which three special effect resources are stored in the rendering system of the live broadcast picture. Because the special effect resource holder holds a plurality of special effect resources, it compares the three special effect identifiers with the special effect identifiers respectively marked on those special effect resources, thereby determining the other special effect identifiers beyond the three. For example: the special effect resource holder holds special effect resource A, special effect resource B, special effect resource C, special effect resource D and special effect resource E; comparing the read special effect identifier a, special effect identifier b and special effect identifier c with the special effect identifiers respectively corresponding to the five special effect resources shows that the other special effect identifiers beyond the three include special effect identifier d corresponding to special effect resource D and special effect identifier e corresponding to special effect resource E.
Illustratively, the special effect resource holder sends the special effect resources characterized by the other special effect identifiers beyond the three to the rendering system of the live broadcast picture, so that the rendering system of the live broadcast picture updates its currently stored special effect resources and obtains special effect resource D and special effect resource E, which were not stored at the local end.
Because the special effect resource holder holds the special effect resources related to the live broadcast process, after updating and adding the special effect resources that were stored by the special effect resource holder but not at the local end, the rendering system of the live broadcast picture can meet the requirement of the current special effect rendering instruction and obtain the first special effect resource characterized by the special effect identifier in the special effect rendering instruction, namely: the first special effect resource is included among the other special effect resources newly acquired from the special effect resource holder and stored.
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
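On the holder's side, determining "the other special effect identifiers" beyond those reported by the renderer reduces to a set difference; a hypothetical sketch using the A-E example above (data structures are illustrative):

```python
# Hypothetical holder-side sketch: the special effect resource holder reads
# the identifiers carried in the acquisition request (a, b, c in the example)
# and returns the resources marked with every other identifier it holds.

def holder_diff(held, stored_ids):
    """Return resources whose identifiers are absent from the renderer cache."""
    return {eid: res for eid, res in held.items() if eid not in stored_ids}

held = {"a": "A", "b": "B", "c": "C", "d": "D", "e": "E"}
extra = holder_diff(held, {"a", "b", "c"})
```

Here `extra` contains exactly the resources marked d and e, which the holder would then send to the rendering system of the live broadcast picture.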
In an alternative embodiment, a special effects resource package sent by a special effects resource holder is obtained.
The special effect resource package is a data package which is automatically sent to the rendering system of the live broadcast picture by the special effect resource holder based on the special effect resource updating process, and the special effect resource package comprises a plurality of supplementary special effect resources.
Wherein the supplementing of the special effects resource includes at least one of adding the special effects resource and updating the special effects resource.
Illustratively, after the special effect resource holder develops or generates a new special effect resource, in order to enable the rendering system of the live broadcast picture to perform the special effect rendering process more quickly based on a special effect rendering instruction, the special effect resource holder proactively sends a special effect resource package to the rendering system of the live broadcast picture so as to deliver the newly added special effect resource.
Illustratively, when the special effect resource holder updates an existing special effect resource (such as a style update or a color update), in order to enable the rendering system of the live broadcast picture to perform the special effect rendering process more accurately based on a special effect rendering instruction, the special effect resource holder proactively sends a special effect resource package to the rendering system of the live broadcast picture so as to deliver the updated special effect resource, and so on.
In an alternative embodiment, the special effects resource packages are stored in a rendering system of the live view.
Illustratively, after receiving the special effect resource package, the rendering system of the live broadcast picture stores the special effect resource package in a designated storage location for special effect resources; alternatively, after receiving the special effect resource package, the rendering system of the live broadcast picture extracts the delivered plurality of special effect resources from the package and stores them in the designated storage location for special effect resources, and so on.
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
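Storing a pushed special effect resource package, where "supplementing" covers both adding a new resource and updating an existing one, might be sketched as follows (the package layout is an assumption for illustration):

```python
# Hypothetical sketch: unpack a special effect resource package proactively
# pushed by the holder. A resource already present under the same identifier
# is overwritten (an update); a new identifier is simply added.

def store_resource_package(storage, package):
    for eid, resource in package["resources"].items():
        storage[eid] = resource      # add or update in place
    return storage

storage = {"a": "old-style"}
package = {"resources": {"a": "new-style", "f": "firework"}}
store_resource_package(storage, package)
```

After the call, identifier a maps to the updated style and identifier f has been newly added, covering both supplement cases named above.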
In summary, because the special effect resource acquisition process and the special effect rendering process are performed by a device other than the terminal, the high configuration difficulty caused by the need to adapt to different terminals is avoided, as is the problem that a terminal can fully present a special effect only after updating to the latest installation package in real time. At least one terminal in the live broadcast room can efficiently and smoothly display the live picture with the special effect by directly receiving and rendering the special effect video stream to which the special effect resource has been applied.
In the embodiment of the present application, a scheme is introduced in which the rendering system of the live broadcast picture caches a plurality of special effect resources marked with special effect identifiers, so that after obtaining a special effect rendering instruction, the rendering system of the live broadcast picture can search the special effect resources stored at the local end according to the special effect identifier in the instruction and rapidly apply the first special effect resource to the live video stream. Even if the local end does not store the special effect resource corresponding to the special effect identifier in the special effect rendering instruction, the first special effect resource can still be acquired from the special effect resource holder through the special effect resource acquisition request, and the process of applying the first special effect resource to the live video stream is completed within the rendering system of the live broadcast picture. This differentiated processing improves the efficiency of applying the first special effect resource to the live video stream; processes demanding large storage space and large computing power, such as resource storage and resource processing, are completed by the live picture rendering system, which avoids the rendering errors that terminal performance might otherwise cause and further improves rendering accuracy and rendering speed.
In an optional embodiment, the live picture rendering method is applied to a live picture rendering system, where the live picture rendering system includes a video mixed stream server and a rendering server; a special effect animation is generated by the rendering server, and the live video stream and the special effect animation are then mixed by the video mixed stream server to obtain the special effect video stream. Illustratively, as shown in fig. 5, the embodiment shown in fig. 2 described above is performed by a live picture rendering system that includes a video mixed stream server 510 and a rendering server 520.
Video mixed stream server 510
Illustratively, the video mixed stream server 510 may be implemented as either a physical server or a cloud server, which is not limited herein.
In an alternative embodiment, video mixed stream server 510 is used to obtain a live video stream.
The live video stream is a video stream generated based on video data collected by the anchor terminal.
Illustratively, the anchor terminal is the terminal used by the anchor object; with the authorization of the anchor object, the anchor terminal collects video data through a camera and generates the live video stream.
In some embodiments, the live video stream sent by the anchor terminal is obtained.
Schematically, after the video data is collected by the anchor terminal, the video data is encoded to obtain the live video stream.
In some embodiments, video data sent by the anchor terminal is obtained, and the live video stream is generated based on the video data.
Illustratively, after the video data is acquired by the anchor terminal, the video data is directly sent to the video mixed stream server 510, so that the video mixed stream server 510 encodes the video data, thereby obtaining a live video stream.
The anchor terminal sends the live video stream to the video mixed stream server 510 deployed in the live picture rendering system. The video mixed stream server 510 is configured to perform at least one of mixed stream processing and rebroadcast processing on the live video stream.
Illustratively, mixed stream processing is used to perform video processing on the live video stream so that the processed video stream conforms to the expectations of the anchor object and the viewer objects, such as: adding text annotations to the live video stream, rendering special effect content, compressing the video, and changing the video style (e.g., size, color scale) based on the operations of the anchor object and/or viewer objects.
Illustratively, rebroadcast processing is used to rebroadcast the live video stream and/or the processed video stream, forwarding it to terminals other than the anchor terminal so that viewer objects other than the anchor object can watch the live broadcast. Optionally, during rebroadcast processing, the rebroadcast can be adjusted by changing the rebroadcast addresses (such as adding viewer terminals to the live broadcast room, removing viewer terminals, or the anchor terminal ending the live broadcast), changing the rebroadcast form (such as changing the video compression format), changing the rebroadcast mode (such as real-time rebroadcast, delayed rebroadcast or timed rebroadcast), and so on.
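The rebroadcast-address bookkeeping described above — adding and removing viewer terminals and fanning the same processed stream out to each registered address — can be sketched minimally; the class and method names here are illustrative assumptions, not the patent's design:

```python
# Hypothetical sketch of the video mixed stream server's rebroadcast step:
# the same special effect video stream frame is forwarded to every rebroadcast
# address currently registered for the live broadcast room. Addresses change
# as viewer terminals join or leave.

class RebroadcastTable:
    def __init__(self):
        self.addresses = set()

    def add(self, addr):      # a viewer terminal joins the live broadcast room
        self.addresses.add(addr)

    def remove(self, addr):   # a viewer terminal leaves
        self.addresses.discard(addr)

    def fan_out(self, frame):
        """Return (address, frame) pairs to be sent over the network."""
        return [(addr, frame) for addr in sorted(self.addresses)]

table = RebroadcastTable()
table.add("viewer-1")
table.add("viewer-2")
sent = table.fan_out("frame-0")
```

A real implementation would also carry the rebroadcast mode (real-time, delayed, timed) per address; this sketch shows only the address-set mechanics.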
In some embodiments, the anchor terminal implements the live broadcast process through a live broadcast application. After the anchor terminal generates the live video stream, the live video stream is sent to the application background of the live broadcast application that created the live broadcast room, and the application background then rebroadcasts the live video stream to the video mixed stream server 510 deployed in the live picture rendering system.
Illustratively, a live broadcast application for realizing the live broadcast function is installed in the anchor terminal. The live broadcast application includes an application front end and an application background: the application front end is the interface content displayed to the anchor object and viewer objects and represents the interface that directly interacts with these objects, such as the live broadcast interface; the application background is used for managing the content of the live broadcast application and is oriented toward authorized program developers, program operators and the like.
Optionally, because the live broadcast process is performed through the live broadcast application on the anchor terminal, it is convenient for the live broadcast application to manage the live video stream: the live video stream collected and generated by the live broadcast application through the authorized camera is sent to the application background of the live broadcast application. The application background can authenticate the live video stream in order to screen it, for example: checking whether the live video stream is compliant and whether it contains content that does not meet the platform requirements, and so on.
Illustratively, when the application background successfully authenticates the live video stream (e.g., determines that the live video stream is compliant), the live video stream is forwarded to the video mixed stream server 510 of the live picture rendering system.
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
Rendering server 520
Illustratively, rendering server 520 may be implemented as either a physical server or a cloud server, which is not limited herein.
In an alternative embodiment, the rendering server is configured to obtain special effects rendering instructions.
The special effect rendering instruction is an instruction for rendering a special effect. The special effects comprise various special effect types such as character special effects, expression special effects and image special effects; a special effect can be implemented as either a static special effect or a dynamic special effect.
Illustratively, the live broadcast process is described as implemented by a live broadcast application. Fig. 6 shows a schematic diagram of the special effect content provided for the anchor and/or viewer objects in a live broadcast application 610. The special effect content is shown in the right area 620, which includes various special effects such as "blood bottle", "hot pot", "high energy warning" and "magic book".
Optionally, different special effect content renders different special effects, for example: when the special effect content "hot pot" is clicked, a virtual bubbling hot pot special effect is displayed in the live broadcast interface; after the special effect content "high energy warning" is clicked, virtual warning special effects such as fireworks and firecrackers are displayed in the live broadcast interface.
Illustratively, the special effect rendering instruction is an instruction generated based on a special effect triggering operation, which is an operation for displaying a special effect. For example: at least one piece of special effect content is displayed in the live broadcast interface, as shown in area 620 in fig. 6, and a selection operation on the special effect content in the live broadcast interface serves as the special effect triggering operation, so that the special effect corresponding to that special effect content is displayed in the live broadcast interface based on the special effect triggering operation.
The special effect rendering instruction comprises special effect identifiers, and the special effect identifiers are used for uniquely identifying special effect resources.
Illustratively, each piece of special effect content corresponds to a special effect identifier, and when special effect content is selected to render a special effect, the rendering process is realized by acquiring the special effect resource corresponding to that special effect.
Optionally, each special effect content displayed in the live interface corresponds to one special effect resource, and the special effect corresponding to the special effect content is rendered and displayed through the corresponding special effect resource. In order to distinguish different special effects/special effect contents/special effect resources, the special effect resources are uniquely identified by special effect identification, namely: the special effect content is uniquely identified by the special effect identification.
Illustratively, as shown in the area 620 in fig. 6, the "hot pot" corresponds to the special effect identifier a, the "high-energy early warning" corresponds to the special effect identifier b, and the like, and different special effect contents correspond to different special effect identifiers.
In some embodiments, the special effect rendering instruction comes from at least one of the anchor terminal and the audience terminal.
Schematically, the anchor terminal displays a live interface and generates a special effect rendering instruction based on the anchor object's selection of special effect content in the live interface; alternatively, the audience terminal displays a live interface and generates a special effect rendering instruction based on the audience object's selection of special effect content in the live interface.
Optionally, taking the case where the special effect rendering instruction comes from the audience terminal as an example: the special effect identifier corresponding to the special effect content is determined based on the audience object's selection of the special effect content in the live interface, so that when the special effect rendering instruction is generated, it includes the special effect identifier and can uniquely determine the special effect resource required for the special effect rendering process.
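The generation of a special effect rendering instruction containing the special effect identifier can be sketched as follows (a minimal illustration; the dict layout, field names, and identifier values are hypothetical, following the "hot pot" / "high-energy early warning" examples above):

```python
# Hypothetical content-to-identifier table, as in region 620 of Fig. 6:
# each special effect content maps to one unique special effect identifier.
EFFECT_IDS = {
    "hot pot": "a",
    "high-energy early warning": "b",
}

def build_effect_rendering_instruction(selected_content: str) -> dict:
    """Generate a special effect rendering instruction that uniquely
    identifies the special effect resource via the special effect identifier."""
    return {
        "type": "effect_render",               # assumed instruction type tag
        "effect_id": EFFECT_IDS[selected_content],
    }
```

On selection of a special effect content, the terminal would send such an instruction so that the receiving server can resolve the resource from the identifier alone.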
In an alternative embodiment, rendering server 520 obtains the first effect resource characterized by the effect identification based on the effect identification in the effect rendering instruction.
Illustratively, the special effect identifier is stored at a preset position in the special effect rendering instruction, and when the special effect identifier is obtained from the special effect rendering instruction, the rendering server reads out the special effect identifier from the preset field position in the special effect rendering instruction.
In some embodiments, a plurality of special effect resources are cached in the rendering server, and the special effect resources are respectively and correspondingly marked with special effect identifications.
Optionally, after the special effect identifier in the special effect rendering instruction is obtained, comparing the special effect identifier with special effect identifiers respectively marked by the plurality of special effect resources, and when the special effect identifier in the special effect rendering instruction is found out from the plurality of special effect identifiers, determining the special effect resource corresponding to the special effect identifier as the first special effect resource.
Optionally, after obtaining the special effect identifier in the special effect rendering instruction, the special effect identifier is compared with the special effect identifiers respectively marked on the plurality of special effect resources; when the special effect identifier in the special effect rendering instruction is not found among the plurality of special effect identifiers, the rendering server generates a special effect resource acquisition request, where the special effect resource acquisition request is used to obtain the first special effect resource from the special effect resource holder; then, the rendering server sends the special effect resource acquisition request to the special effect resource holder to obtain the first special effect resource, where the special effect identifier corresponding to the first special effect resource is the special effect identifier in the special effect rendering instruction.
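The cache-first lookup with fallback to the special effect resource holder can be sketched as follows (a minimal illustration; `fetch_from_holder` stands in for the special effect resource acquisition request, and all names are assumptions):

```python
from typing import Callable, Dict

def get_first_effect_resource(
    effect_id: str,
    cache: Dict[str, bytes],
    fetch_from_holder: Callable[[str], bytes],
) -> bytes:
    """Return the special effect resource for effect_id, preferring the
    rendering server's local cache and falling back to the resource holder."""
    if effect_id in cache:
        # Identifier found among the identifiers marked on cached resources.
        return cache[effect_id]
    # Not cached: issue the special effect resource acquisition request.
    resource = fetch_from_holder(effect_id)
    cache[effect_id] = resource  # cache for subsequent instructions
    return resource
```

A later instruction carrying the same identifier would then hit the cache and skip the round trip to the holder.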
In an alternative embodiment, after obtaining the first special effect resource, the rendering server generates a plurality of special effect picture frames corresponding to the first special effect resource, for rendering onto the live video stream.
In some embodiments, a special effect rendering frame rate is obtained, the special effect rendering frame rate being used to characterize the number of special effect picture frames displayed per second when the special effect resource is rendered.
Optionally, the special effect rendering frame rate is a preset value. Such as: the preset special effect rendering frame rate is 60FPS, and the number of frames representing the special effect frames displayed per second is 60.
Optionally, the special effect rendering frame rate is a numerical value determined based on the special effect resource. For example: different special effect resources each correspond to one special effect rendering frame rate, and after the first special effect resource is obtained, the special effect rendering frame rate corresponding to the first special effect resource is determined.
In some embodiments, a special effects picture corresponding to the first special effects resource is generated based on the first special effects resource and the special effects rendering frame rate.
Optionally, a plurality of special effect picture frames for exhibiting the special effect of the first special effect resource are generated based on the first special effect resource and the special effect rendering frame rate.
Illustratively, special effect picture frames are rendered frame by frame in real time at the frame rate characterized by the special effect rendering frame rate, so that a plurality of special effect picture frames are obtained.
Optionally, the plurality of special effect picture frames are combined according to the time sequence to obtain the special effect picture corresponding to the first special effect resource.
Illustratively, a timing relationship exists between the plurality of special effect picture frames so that the plurality of special effect picture frames can be combined. Namely: and combining the plurality of special effect picture frames with the time sequence relation according to the time sequence, so as to obtain the special effect picture corresponding to the first special effect resource.
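The two steps above (rendering frame by frame at the special effect rendering frame rate, then combining the frames in time order) can be sketched as follows; the timestamped-tuple representation and the `render_frame` callback are assumptions:

```python
from typing import Callable, List, Tuple

def generate_effect_picture(
    duration_s: float,
    fps: int,  # special effect rendering frame rate, e.g. a preset 60 FPS
    render_frame: Callable[[int], bytes],
) -> List[Tuple[float, bytes]]:
    """Render special effect picture frames one by one at the frame rate,
    then combine them in time order into the special effect picture."""
    n_frames = int(duration_s * fps)
    # Each frame carries a timestamp, giving the frames a timing relationship.
    frames = [(i / fps, render_frame(i)) for i in range(n_frames)]
    # Combining by timestamp yields the special effect picture.
    return sorted(frames, key=lambda f: f[0])
```

The timestamp on each frame is what later lets the mixing server align special effect picture frames with live picture frames.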
In an alternative embodiment, a plurality of effect picture frames are sent to video mixed stream server 510.
Illustratively, the video mixed-stream server 510 is further configured to perform video processing on the live video stream, for example, on the basis of storing the live video stream: and adding special effects to the live video stream by applying a plurality of special effect picture frames to the live video stream so as to display live pictures with special effects and the like.
Video mixed stream server 510
In an alternative embodiment, the video mixed-stream server 510 is further configured to apply a plurality of special effect picture frames to the live video stream to obtain a special effect video stream.
Illustratively, after receiving the special effect picture frames, the video mixed-stream server 510 determines, on the time axis corresponding to the live video stream, a first timestamp at which to start applying the special effect picture, and applies the special effect picture to the live video stream from the first timestamp onward, thereby obtaining the special effect video stream.
Optionally, starting from the current timestamp, taking the special effect picture as a top layer and taking the live video stream as a bottom layer, thereby realizing the process of applying the special effect picture to the live video stream and obtaining the special effect video stream.
Illustratively, when a plurality of special effect picture frames are applied to a live video stream, the plurality of special effect picture frames are respectively applied to corresponding live picture frames, so that a plurality of special effect video picture frames are obtained.
Optionally, after each special effect picture frame is applied to its corresponding live picture frame, a special effect video picture frame corresponding to that special effect picture frame is obtained, and adjacent special effect video picture frames are sent one by one to at least one terminal according to the time sequence relation of the special effect picture frames, so that the terminal performs the rendering process more promptly based on the special effect video picture frames. Namely: the at least one terminal can perform the special effect rendering process based on the received special effect video picture frames, so that a live picture with special effects is displayed in real time.
Illustratively, the at least one special effect video picture frame may be referred to as a special effect video stream, namely: the special effect video stream comprises at least one special effect video picture frame. In this way, picture frames serve as the granularity of rendering, which improves special effect rendering efficiency.
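The frame-by-frame application described above can be sketched as follows (a minimal illustration in which string labels stand in for real picture frames and `overlay` is a placeholder for actual alpha blending; all names are assumptions):

```python
from typing import List

def overlay(live_frame: str, effect_frame: str) -> str:
    # Placeholder composition: a real mixer would alpha-blend the special
    # effect frame (top layer) over the live frame (bottom layer).
    return f"{live_frame}+{effect_frame}"

def apply_effect_frames(live_frames: List[str], effect_frames: List[str]) -> List[str]:
    """Apply each special effect picture frame to its corresponding live
    picture frame, yielding special effect video picture frames in order."""
    mixed = [overlay(lf, ef) for lf, ef in zip(live_frames, effect_frames)]
    # Live picture frames beyond the special effect pass through unchanged.
    return mixed + live_frames[len(mixed):]
```

Because each output frame is ready as soon as its pair is mixed, frames can be streamed out one by one rather than waiting for the whole special effect to finish.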
In an alternative embodiment, after obtaining the special effect video stream, video mixed-stream server 510 transmits the special effect video stream to at least one terminal within the live broadcast room.
Optionally, video mixed stream server 510 sends a special effect video stream to each terminal within the live room, including the anchor terminal and the audience terminal; each terminal can render the special effect video stream to display a live picture with special effects.
Optionally, the video mixed-stream server 510 transmits the special effect video stream to the terminal that sent the special effect rendering instruction. For example: the special effect rendering instruction is generated after a special effect triggering operation is performed by an audience object through the live interface displayed by audience terminal A, and audience terminal A sends the special effect rendering instruction to rendering server 520 in the live picture rendering system, so that the video mixed-stream server 510 generates the special effect video stream; when the special effect video stream is sent, it is sent to audience terminal A, so that audience terminal A can render the special effect video stream and display a live picture with special effects; the other terminals, having not received the special effect video stream, still display the live picture without special effects.
Optionally, the video mixed stream server 510 transmits the special effect video stream to the anchor terminal and the terminal (e.g., the viewer terminal a described above) that transmits the special effect rendering instruction, so that the anchor terminal and the viewer terminal a may render the special effect video stream to display a live picture with special effects; other terminals still display live pictures and the like without special effects because the other terminals do not receive the special effect video stream.
In some embodiments, the special effect rendering instruction includes a terminal identifier, and the terminal identifier is obtained from the special effect rendering instruction.
The terminal identification is used for indicating at least one terminal for receiving the special effect video stream, and the terminal identification is determined based on at least one terminal selected during special effect triggering operation.
Illustratively, in addition to sending the special effect picture to the video mixed-stream server 510, the rendering server 520 forwards the special effect rendering instruction to the video mixed-stream server 510; alternatively, in addition to sending the special effect picture, the rendering server 520 transmits the terminal identifier read from the special effect rendering instruction to the video mixed-stream server 510, so that the video mixed-stream server 510 can learn the terminal identifier.
Alternatively, at least one terminal characterized by a terminal identity is referred to as a special effect trigger terminal.
Schematically, the special effect trigger terminal is the terminal to which the special effect is applied. Optionally, the terminal identifier is implemented as the identification information marked by the terminal that sent the special effect rendering instruction. Namely: the terminal identifier indicates that the special effect video stream is to be sent to the terminal that issued the special effect rendering instruction, and that terminal is the special effect trigger terminal.
Optionally, the terminal identifier is implemented as identification information marked by the terminal for which the special effect rendering instruction is directed.
Illustratively, the special effect rendering instruction is an instruction sent by the audience terminal a, and the purpose of the special effect rendering instruction is to display the special effect on a live interface corresponding to the anchor terminal, such as giving a gift to an anchor object, sending an encouraging utterance to the anchor object, and the like; if the audience object operating the audience terminal A hopes that the special effect can be only displayed on a live broadcast interface corresponding to the anchor terminal (such as a privately gifting gift, etc.), selecting the anchor terminal as the special effect triggering terminal in the process of special effect triggering operation, and writing a terminal identifier corresponding to the anchor terminal into the special effect rendering instruction when generating the special effect rendering instruction; similarly, if the audience object operating the audience terminal a wishes to apply the gift or other special effects on the live broadcast interface of the audience terminal B, the audience terminal B is selected as the special effect triggering terminal in the process of special effect triggering operation, and the terminal identifier corresponding to the audience terminal B is written into the special effect rendering instruction when the special effect rendering instruction is generated.
In some embodiments, video mixed stream server 510 sends the special effects video stream to the special effects trigger terminal based on the terminal identification.
Illustratively, based on the terminal identifier, the special effect video stream is selectively received by the selected special effect trigger terminal, so that the special effect video stream can be rendered on the special effect trigger terminal in a more targeted manner and a live picture with special effects can be displayed.
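The sending decision described above (broadcast to every terminal in the live broadcast room, or target only the special effect trigger terminals indicated by the terminal identifier) can be sketched as follows; the `terminal_ids` parameter and list-of-strings representation are assumptions:

```python
from typing import List, Optional

def select_receivers(
    room_terminals: List[str],
    terminal_ids: Optional[List[str]],
) -> List[str]:
    """Decide which terminals receive the special effect video stream.

    If the special effect rendering instruction carries terminal identifiers,
    send only to those special effect trigger terminals; otherwise send
    equally to every terminal in the live broadcast room.
    """
    if terminal_ids:
        return [t for t in room_terminals if t in terminal_ids]
    return list(room_terminals)
```

Terminals outside the returned list simply keep rendering the live video stream without special effects.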
It is noted that the special effect trigger terminal is a terminal that has authorized its selection.
Schematically, in the process of selecting the special effect triggering terminal, request information is sent to the selected terminal, the request information is used for requesting whether to agree to display the special effect selected by other terminals, and if the selected terminal agrees, the terminal can be used as the special effect triggering terminal. For example: in the process that the audience terminal A selects the special effect trigger terminal, request information is sent to the selected audience terminal B, and if the selected audience terminal B agrees, the audience terminal B can be used as the special effect trigger terminal.
Alternatively, the selected terminal may, with the object's permission, be set to a display-permitted state or the like; this is merely an illustrative example, and the embodiment of the present application is not limited thereto.
Namely: the special effect video stream may be sent equally to each terminal in the living broadcast room or may be sent differentially to individual terminals (such as the terminal sending special effect rendering instructions described above), which are merely illustrative examples and the embodiments of the present application are not limited thereto.
In some embodiments, in the special effect triggering operation, in addition to selecting the special effect content and selecting at least one terminal, special effect attributes such as the position of the special effect display (special effect position) and the size of the special effect display (special effect size) may be configured.
Illustratively, special effect positions (such as being displayed below a live broadcast interface, being displayed in the center of the live broadcast interface and the like) are configured in special effect triggering operation, and/or special effect sizes (such as being larger in special effect size, smaller in special effect size and the like) are configured in special effect triggering operation.
Optionally, the special effect attributes are written into the special effect rendering instruction, so that when the special effect resource is applied to the live video stream based on the special effect rendering instruction, the special effect resource is applied according to the information characterized by the special effect attributes; the special effect video stream conforming to the special effect attributes is thereby obtained, and a live picture with special effects adapted to the configured special effect attributes is displayed through the at least one terminal.
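A minimal sketch of how the configured special effect attributes might parameterize where the special effect picture is mixed onto the live picture (the attribute vocabulary `size`/`position` and the scale factors are assumptions):

```python
def place_effect(frame_w: int, frame_h: int, attrs: dict) -> dict:
    """Compute the on-screen rectangle for the special effect from the
    special effect attributes carried in the special effect rendering
    instruction (hypothetical attribute values)."""
    scale = {"small": 0.25, "large": 0.5}[attrs.get("size", "small")]
    w, h = int(frame_w * scale), int(frame_h * scale)
    if attrs.get("position", "center") == "bottom":
        # Displayed below the live interface.
        x, y = (frame_w - w) // 2, frame_h - h
    else:
        # Displayed in the center of the live interface.
        x, y = (frame_w - w) // 2, (frame_h - h) // 2
    return {"x": x, "y": y, "w": w, "h": h}
```

The mixing server could use such a rectangle when overlaying the special effect picture frames onto the live picture frames.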
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
In summary, the special effect resource obtaining process and the special effect rendering process are carried out by equipment outside the terminal, which avoids the high configuration difficulty caused by the need to adapt to different terminals, and also avoids the problem that the terminal can fully present the special effect only by updating to the latest installation package in real time; by directly receiving the special effect video stream to which the special effect resource has been applied and rendering it, at least one terminal in the live broadcast room can efficiently and smoothly display a live picture with special effects.
In the embodiment of the application, the live picture rendering method as applied to the live picture rendering system has been introduced: the live video stream is stored by the video mixed-stream server in the live picture rendering system, the special effect resources are stored and rendered by the rendering server to generate special effect pictures, and the live video stream and the special effect pictures are then mixed by the video mixed-stream server to obtain the special effect video stream. Dividing the special effect rendering process more finely between the video mixed-stream server and the rendering server makes the architecture of the live picture rendering process clearer, with different modules responsible for their corresponding functions, which facilitates overall management of the live picture rendering system.
In an alternative embodiment, the effect rendering process in the related art is described as follows.
In the related technology, when a live broadcast process is carried out through a live APP, a live broadcast room is created, and the gift special effects of the live broadcast room are all rendered and generated at the terminal on which the live APP is installed.
Illustratively, the material formats of the gift special effects provided in the live broadcast room generally include the Graphics Interchange Format (GIF), Animated Portable Network Graphics (APNG), the WebP picture file format combining lossy and lossless compression, the animation rendering library Lottie, SVGA (Super Video Graphics Array), and the like.
Optionally, special effect materials (special effect resources) are designed with design software; the special effect materials are then packaged, using a special effect resource production tool, into special effect material packages supported by each terminal; the special effect material packages are bundled into the live APP installation package and released to an application market, and a user who has already installed the live APP can download the updated special effect material packages through an incremental update process.
Namely: in the process of installing or updating the live APP, packaging pictures, videos, two-dimensional special effect resources, three-dimensional special effect resources and the like related to rendering, and configuring the pictures, videos, two-dimensional special effect resources, three-dimensional special effect resources and the like in an installation package or an updating package so as to download configuration information of the live APP to a local terminal provided with the live APP.
In the live broadcast process, when the live broadcast interaction process between the audience object and the anchor object needs to render the special effect, the audience terminal and/or the anchor terminal generate rendering instructions. Illustratively, taking the special effect rendering process performed by the audience terminal as an example, as shown in fig. 7, a special effect rendering schematic diagram in the related art is shown.
The audience terminal 701 generates a rendering instruction (an instruction for starting the execution of a rendering process) based on a triggering operation of the audience object, and transmits the rendering instruction to the live APP backend 720; the live APP backend 720 determines the special effect identifier of the special effect resource to be applied based on the rendering instruction and the triggering operation of the audience object; then, the live APP backend 720 generates a special effect rendering instruction comprising the special effect identifier and sends it to the audience terminal 701 through Circuit Switch (CS) signaling or SEI frame information; after receiving the special effect rendering instruction comprising the special effect identifier, the audience terminal 701 searches the special effect resources in the downloaded installation package or update package for the special effect resource corresponding to the special effect identifier, and loads that special effect resource locally, so that a live picture with the special effect is rendered and displayed on the interface.
Namely: the resource index ID (namely, special effect identification) of special effect resources (pictures, video effects and the like) to be presented or displayed is sent to the terminal through the APP back end, and after the terminal receives the resource index ID, special effect resources corresponding to the relevant resource index ID are loaded locally and then presented in a rendering mode.
The above process has the following problems:
(1) Different using objects (anchor objects, audience objects and the like) have different terminal hardware configurations. For the same special effect, a special effect picture, video, 2D special effect resource, 3D special effect resource and the like with a relatively lower rendering computation cost must be designed for compatibility with terminals of poor hardware configuration; otherwise, because special effect resources that cannot be rendered are simply not rendered, the effects seen by the using objects of different terminals with the same live APP installed differ. Live APP development therefore needs to be compatible with various low-end hardware configurations, so the development difficulty is high, and the development efficiency and the special effect display effect are relatively unsatisfactory (for example, because compatibility with various low-end device models is considered during special effect production, the produced and rendered special effect is not striking), resulting in a poor experience of using the live APP;
(2) When a live APP adds new special effect resources, all terminals with the live APP installed can obtain the special effect resources only by updating the live APP and downloading installation packages, update packages and the like. However, as the number of special effect resources increases, the installation package of the live APP becomes larger and larger, which not only occupies the storage capacity of the terminal but also makes the special effect rendering effect depend to a certain extent on the downloading process: if a new special effect resource requires updating the special effect material package of the live APP, and the special effect material package corresponding to that special effect resource has not been updated, the special effect cannot be successfully rendered and the user cannot see the rendering effect. Namely: if the terminal cannot download the latest special effect resource in time, the corresponding special effect cannot be displayed, which affects the human-computer interaction efficiency;
(3) The special effect identifiers are carried by signaling or by information in Supplemental Enhancement Information (SEI) frames of the live video stream, where SEI belongs to the category of the code stream and provides a method for adding additional information to the video code stream. Because the network conditions of different using objects differ, the delay of the live broadcast process differs, and different delays in turn cause the using objects to see the gift or special effect displayed at different times when watching the live broadcast, so the live broadcast experience is poor.
It should be noted that the above is only an illustrative example, and the embodiments of the present application are not limited thereto.
In an alternative embodiment, the method for rendering the live video is referred to as a "live video special effect implementation solution based on real-time cloud rendering", and the special effect rendering process is performed in a cloud environment, so as to avoid the possible problems caused by the terminal execution, such as: configuration difference problems, update difference problems, latency problems, etc.
Schematically, the principle of cloud technology will be briefly described.
The cloud application applying the cloud technology works as follows: the application is run on a cloud server; the audio and video acquired through image acquisition and voice acquisition are encoded by the cloud server and transmitted to the terminal in a streaming mode; the terminal then decodes the received audio and video stream and renders it, so that the terminal (such as a television, a cell phone, a PC or a tablet) can realize the application function without installing the application. The cloud technology frees the user from the problem of adapting games and applications to different software and hardware platforms, and from problems such as insufficient rendering performance of the terminal.
As shown in fig. 8, a basic interaction diagram of a cloud application (e.g., a cloud game) is shown. In the uplink part, the player performs an operation by means of a device such as a keyboard, a mouse, a handle or a touch screen connected (wired or wireless) to the terminal 810, and the terminal transmits the operation instruction corresponding to the operation together with the coordinate position to the cloud application server instance 820 (cloud server) of the cloud game, namely: transmitting the interactive operation in real time; the cloud application server instance 820 maps the received operation instruction to the corresponding in-game keyboard or mouse event, and delivers it to the real game application through the keyboard and mouse driver, completing the application service experience of the whole game.
Alternatively, the cloud application server instance 820 obtains a compressed audio and video stream based on the operation instruction through a rendering calculation process, and feeds back the audio and video stream to the terminal 810 to render and display a corresponding application screen (game screen) or the like on the terminal 810.
In some embodiments, the rendering principles and architecture for live video special effects for real-time cloud rendering are described.
As shown in fig. 9, a schematic diagram of a live-combined real-time cloud rendering processing principle and architecture is shown. Including a live platform basic interaction architecture 910, a real-time cloud rendering platform 920, and a live APP application background 930. Optionally, the real-time cloud rendering platform 920 is composed of a cloud server, and the live platform basic interaction architecture 910 and the real-time cloud rendering platform 920 may be collectively referred to as a live frame rendering system.
Taking the example in which the anchor terminal 941 initiates a live broadcast, the anchor terminal generates a live video stream and pushes it to the access module in the live platform basic interaction architecture 910; optionally, the access module forwards the received live video stream to the transcoding mixed stream module for storage.
If no audience object in the live broadcast room watches the live broadcast through the audience terminal 942, the edge data center (OC) is not triggered; the OC is used to trigger, through the CDN, a back-to-source request to the intermediate source, so as to trigger the transcoding mixed stream module to be pulled up and acquire the live video stream;
When an audience object in the live broadcast room watches the live broadcast through the audience terminal 942, the audience terminal 942 connects to the CDN and triggers OC access, thereby triggering a back-to-source request to the intermediate source; the intermediate source mainly plays the role of converging OC back-to-source requests, and then pulls up the transcoding mixed stream module, so as to obtain the live video stream pushed upstream by the anchor terminal.
In some embodiments, when a host object or a viewer object within a living room needs to render a special effect, the rendering flow is as follows.
(1) The anchor terminal 941 or the audience terminal 942 sends a special effect rendering instruction to the APP application background 930, where the special effect rendering instruction carries the special effect identifier of the special effect resource (e.g., gift, text, etc.) to be rendered.
(2) After receiving the special effect rendering instruction sent by the anchor terminal 941 or the audience terminal 942, the APP application background 930 checks whether the special effect rendering instruction and the special effect identifier are legal; if the special effect is legal, a special effect rendering instruction is sent to a transcoding mixed stream module of the basic interaction architecture 910 of the live broadcast platform, wherein the special effect rendering instruction carries a special effect identifier.
(3) The transcoding mixed stream module of the live platform basic interaction architecture 910 authenticates whether the special effect rendering instruction sent by the APP application background 930 is legal or not; if the special effect rendering command is legal, the special effect rendering command is timely sent to the real-time cloud rendering platform 920.
The request carried by the special effect rendering instruction is asynchronous, namely: the transcoding mixed stream module does not wait for the real-time cloud rendering platform 920 to return the rendered real-time special effect picture frames; once the real-time cloud rendering platform 920 sends the special effect picture frames, the transcoding mixed stream module performs the mixed stream transcoding process on them and outputs the result to the anchor terminal 941 or the audience terminal 942, rather than transcoding only the live video stream for output to the anchor terminal 941 or the audience terminal 942.
(4) After receiving the real-time special effect rendering instruction of the transcoding mixed stream module, the real-time cloud rendering platform 920 checks whether the special effect resource represented by the special effect identifier is cached locally according to the special effect identifier in the special effect rendering instruction.
(5) If no special effect resource labeled with the special effect identifier exists in the local special effect resource package (used for storing a plurality of local special effect resources) of the real-time cloud rendering platform 920, the real-time cloud rendering platform 920 requests to download the relevant special effect resource package from the APP application background 930 in real time.
Optionally, when the APP application background 930 has a new special effect resource package, it also supports pushing the new package to the real-time cloud rendering platform 920 in time, so that the real-time cloud rendering platform 920 can add it to the local special effect resource package cache.
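A minimal sketch of this lookup-with-fallback behavior in step (5), under the assumption that resources are keyed by their identifier (`EffectResourceCache` and `download_from_background` are illustrative names, not APIs from the patent): a cache miss triggers a real-time download, and the background can also push a new package proactively.

```python
class EffectResourceCache:
    """Local special-effect resource cache with on-demand download."""

    def __init__(self, downloader):
        self._resources = {}          # effect_id -> resource payload
        self._download = downloader

    def get(self, effect_id):
        if effect_id not in self._resources:   # cache miss: fetch in real time
            self._resources[effect_id] = self._download(effect_id)
        return self._resources[effect_id]

    def push_update(self, package):
        """Background proactively pushes a new resource package."""
        self._resources.update(package)

def download_from_background(effect_id):
    # Stand-in for the real download request to the application background.
    return f"resource:{effect_id}"

cache = EffectResourceCache(download_from_background)
print(cache.get("gift_42"))          # miss, downloaded in real time
cache.push_update({"text_7": "resource:text_7(v2)"})
print(cache.get("text_7"))           # served from the pushed package
```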
Illustratively, the real-time cloud rendering platform 920 renders special effect picture frames frame by frame in real time according to the special effect resource corresponding to the special effect identifier and the special effect rendering frame rate, and sends them frame by frame in real time to the transcoding mixed stream module of the live platform basic interaction architecture 910; alternatively, the real-time cloud rendering platform 920 first combines the special effect picture frames into a special effect picture and then sends it to the transcoding mixed stream module of the live platform basic interaction architecture 910, and so on.
(6) The transcoding mixed stream module receives the special effect picture frames rendered frame by frame by the real-time cloud rendering platform 920 and processes them according to the requirements of the APP application background 930, for example: determining the position information, frame rate information, size information, and the like of the special effect picture frames in the live stream. It then applies the special effect picture frames to the live video stream, thereby mixing the special effect picture frames with the live video stream, and outputs the resulting special effect video stream to the intermediate source; the intermediate source feeds an OC access point, which distributes the special effect video stream to at least one terminal in the live broadcast room.
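A hedged sketch of the mixing step in (6): an effect picture frame is pasted onto a live picture frame at a position dictated by the application background. Frames are modeled here as 2-D lists of pixels purely for illustration; real mixing would operate on decoded video planes with alpha blending.

```python
def apply_effect(live_frame, effect_frame, x, y):
    """Overlay effect_frame onto a copy of live_frame at position (x, y)."""
    mixed = [row[:] for row in live_frame]          # copy the live frame
    for dy, row in enumerate(effect_frame):
        for dx, pixel in enumerate(row):
            if pixel is not None:                   # None marks transparency
                mixed[y + dy][x + dx] = pixel
    return mixed

live = [["L"] * 4 for _ in range(3)]                # 3x4 live picture frame
effect = [["E", None], ["E", "E"]]                  # 2x2 effect with a hole
mixed = apply_effect(live, effect, x=1, y=0)
print(mixed[0])  # ['L', 'E', 'L', 'L']
print(mixed[1])  # ['L', 'E', 'E', 'L']
```

The original live frame is left untouched, so the module can still output the plain live stream to terminals that were not selected for the effect.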
Fig. 10 is a schematic diagram of a scene interface of the live picture rendering method. The method can be applied to multiple scenes such as a virtual commentary scene 1010, a video rebroadcast scene 1020, a blind box opening scene 1030, and the like; when a special effect is displayed in response to a special effect triggering operation in these different scenes, the processes of applying the special effect resource and of downloading and storing the special effect resource are executed by a device outside the terminal.
Namely: through steps (1) to (6) above, the live picture rendering system solution that combines the real-time cloud rendering platform 920 with the live platform basic interaction architecture 910 performs real-time rendering and resource package updating by hosting the special effects rendered by the APP in the cloud; the special effect picture rendered in real time is mixed with the live video stream in real time and then delivered to users for viewing. This well solves the problem of the APP special effect resource package not being updated in time, avoids the development cost of adapting rendering to the computing power of different terminal configurations, and, to a certain extent, avoids the problem of the live special effects watched by different user objects being inconsistent in picture time.
In summary, the special effect resource obtaining process and the special effect rendering process are performed by a device outside the terminal, which avoids the high configuration difficulty of adapting to different terminals and avoids requiring the terminal to update to the latest installation package in real time before it can fully present the special effect. By directly receiving the special effect video stream to which the special effect resource has been applied and rendering it, at least one terminal in the live broadcast room can efficiently and smoothly display a live picture with the special effect.
In the embodiment of the application, the special effect to be rendered is rendered in real time, and the resource package is updated, in the cloud; the special effect picture rendered in real time is mixed with the live video stream in real time and then delivered to the terminal, which renders and displays it for the user to watch. This well solves the problems of updating the existing special effect resource package and of adapting rendering to the computing power of different terminal configurations, and, to a certain extent, avoids the problem of the live effect pictures displayed by different terminals being inconsistent in time. The method can be applied to a wide range of live scenes and enriches the display effect of the live interface.
Fig. 11 is a block diagram of a live view rendering apparatus according to an exemplary embodiment of the present application, and as shown in fig. 11, the apparatus includes:
The video acquisition module 1110 is configured to acquire a live video stream, where the live video stream is a video stream generated based on video data acquired by a hosting terminal, and the live video stream is used to characterize live pictures in a live broadcasting room;
the instruction obtaining module 1120 is configured to obtain an effect rendering instruction, where the effect rendering instruction includes an effect identifier, the effect identifier is used to uniquely identify an effect resource, and the effect rendering instruction is used to apply the effect resource to the live video stream;
the resource obtaining module 1130 is configured to obtain, based on the special effect identifier in the special effect rendering instruction, a first special effect resource represented by the special effect identifier;
the special effects application module 1140 is configured to apply the first special effects resource to the live video stream to generate a special effects video stream, where the special effects video stream is used to be sent to at least one terminal in the live broadcast room, and the at least one terminal is used to render the special effects video stream and display a live broadcast picture with special effects.
In an optional embodiment, the special effects application module 1140 is further configured to obtain a special effects rendering frame rate, where the special effects rendering frame rate is used to characterize a frame number of special effects frames displayed per second when the special effects resource is rendered, and the special effects frame is used to combine with a live frame of the live video stream to obtain the special effects video stream; generating a plurality of special effect picture frames for showing the first special effect resource based on the first special effect resource and the special effect rendering frame rate, wherein the plurality of special effect picture frames have a time sequence relation; and applying the plurality of special effect picture frames to the live video stream to generate the special effect video stream.
In an optional embodiment, the special effects application module 1140 is further configured to determine, from the live video stream, a plurality of live frames to which the plurality of special effects frames are applied, where the plurality of live frames correspond one-to-one to the plurality of special effects frames, and an interval duration between two adjacent live frames is determined based on the special effects rendering frame rate; respectively applying the plurality of special effect picture frames to corresponding live picture frames to obtain a plurality of special effect video picture frames, wherein the special effect video picture frames are used for generating the special effect video stream; and combining the plurality of special effect video picture frames according to a time sequence based on a time axis corresponding to the live video stream to generate the special effect video stream.
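The one-to-one correspondence in this embodiment can be sketched as follows (an assumption for illustration: the live stream runs at a higher frame rate than the effect, so each effect frame is applied to the live frame whose index matches its timestamp, and the interval between the chosen live frames follows the special effect rendering frame rate):

```python
def select_live_frame_indices(effect_fps, live_fps, effect_frame_count):
    """Indices of the live frames to which successive effect frames apply."""
    step = live_fps / effect_fps          # live frames per effect frame
    return [round(i * step) for i in range(effect_frame_count)]

# 4 effect frames at 15 fps over a 60 fps live stream: every 4th live frame.
print(select_live_frame_indices(15, 60, 4))   # [0, 4, 8, 12]
```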
In an optional embodiment, the special effects application module 1140 is further configured to combine the plurality of special effects frames based on the time sequence relationship to obtain a special effects frame corresponding to the first special effects resource, where the special effects frame corresponds to a special effects duration; determining a first time stamp for starting to apply the special effect picture frame from a time axis corresponding to the live video stream; and starting from the first timestamp, applying the special effect picture to the live video stream to obtain the special effect video stream, wherein the duration of the special effect video stream is the special effect duration.
In an optional embodiment, the special effects application module 1140 is further configured to combine the plurality of special effects frames based on the time sequence relationship to obtain a special effects frame corresponding to the first special effects resource, where the special effects frame corresponds to a special effects duration; determining a first timestamp corresponding to the live video stream at the current time; and starting from the first timestamp, applying the special effect picture to the live video stream to obtain the special effect video stream, wherein the duration of the special effect video stream is the special effect duration.
In an optional embodiment, the device is implemented as a rendering system of a live broadcast picture, and at least one special effect resource is stored in the rendering system of the live broadcast picture, and the at least one special effect resource corresponds to a special effect identifier respectively;
the resource obtaining module 1130 is further configured to search the stored at least one special effect resource based on the special effect identifier in the special effect rendering instruction; and acquiring the first special effect resource characterized by the special effect identification based on the search result.
In an optional embodiment, the resource obtaining module 1130 is further configured to obtain the first special effects resource from the at least one special effects resource in response to the search result indicating that the stored at least one special effects resource includes the first special effects resource.
In an optional embodiment, the resource obtaining module 1130 is further configured to generate, in response to the search result indicating that the stored at least one special effect resource does not include the first special effect resource, a special effect resource obtaining request, where the special effect resource obtaining request is used to obtain the first special effect resource from a special effect resource holder; transmitting the special effect resource acquisition request to the special effect resource holder; and receiving the first special effect resource sent by the special effect resource holder.
In an optional embodiment, the resource obtaining module 1130 is further configured to obtain a special effect resource packet sent by a special effect resource holder, where the special effect resource packet is a data packet sent by the special effect resource holder to the rendering system of the live broadcast picture automatically based on a special effect resource update process, and the special effect resource packet includes a plurality of supplementary special effect resources; and storing the special effect resource package in a rendering system of the live broadcast picture.
In an optional embodiment, the video obtaining module 1110 is further configured to obtain the live video stream sent by the anchor terminal; or, acquiring the video data sent by the anchor terminal; generating the live video stream based on the video data.
In an optional embodiment, the instruction obtaining module 1120 is further configured to obtain the special effect rendering instruction sent by the anchor terminal; or, acquiring the special effect rendering instruction sent by the audience terminal, wherein the audience terminal is a terminal in the live broadcasting room except the anchor terminal; the special effect rendering instruction is an instruction generated based on special effect triggering operation on special effect content in a live interface, and the special effect content is used for rendering the special effect resource represented by the special effect identifier to obtain the special effect.
In an optional embodiment, the special effects application module 1140 is further configured to obtain a terminal identifier in the special effects rendering instruction, where the terminal identifier is used to indicate the at least one terminal that receives the special effects video stream, and the terminal identifier is determined based on the at least one terminal selected during a special effects triggering operation; and transmitting the special effect video stream to the at least one terminal based on the terminal identification.
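The terminal-identifier step above amounts to a selective fan-out: the special effect video stream goes only to the terminals named in the rendering instruction, while the rest of the room keeps receiving the plain live stream. A minimal sketch (all names are illustrative assumptions):

```python
def route_streams(room_terminals, target_ids, effect_stream, live_stream):
    """Map each terminal in the room to the stream it should receive."""
    return {
        t: (effect_stream if t in target_ids else live_stream)
        for t in room_terminals
    }

routing = route_streams(
    room_terminals=["anchor", "viewer_1", "viewer_2"],
    target_ids={"anchor", "viewer_2"},       # selected at effect trigger time
    effect_stream="effect_stream",
    live_stream="live_stream",
)
print(routing["viewer_1"])   # live_stream
print(routing["viewer_2"])   # effect_stream
```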
In summary, the special effect resource obtaining process and the special effect rendering process are performed by a device outside the terminal, which avoids the high configuration difficulty of adapting to different terminals and avoids requiring the terminal to update to the latest installation package in real time before it can fully present the special effect. By directly receiving the special effect video stream to which the special effect resource has been applied and rendering it, at least one terminal in the live broadcast room can efficiently and smoothly display a live picture with the special effect.
It should be noted that the live picture rendering apparatus provided in the above embodiment is illustrated only by the division of the above functional modules; in practical application, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the live picture rendering apparatus provided in the foregoing embodiment belongs to the same concept as the live picture rendering method embodiments; its detailed implementation process is described in the method embodiments and will not be repeated here.
Fig. 12 shows a schematic structural diagram of a server according to an exemplary embodiment of the present application. The server 1200 includes a central processing unit (Central Processing Unit, CPU) 1201, a system Memory 1204 including a random access Memory (Random Access Memory, RAM) 1202 and a Read Only Memory (ROM) 1203, and a system bus 1205 connecting the system Memory 1204 and the central processing unit 1201. The server 1200 also includes a mass storage device 1206 for storing an operating system 1213, application programs 1214, and other program modules 1215.
The mass storage device 1206 is connected to the central processing unit 1201 through a mass storage controller (not shown) connected to the system bus 1205. The mass storage device 1206 and its associated computer-readable media provide non-volatile storage for the server 1200. That is, the mass storage device 1206 may include a computer readable medium (not shown) such as a hard disk or compact disk read only memory (Compact Disc Read Only Memory, CD-ROM) drive.
Without loss of generality, computer readable media may include computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. The system memory 1204 and mass storage device 1206 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1200 may also operate by being connected to a remote computer on a network, such as the Internet. That is, the server 1200 may be connected to the network 1212 through a network interface unit 1211 coupled to the system bus 1205, or alternatively, the network interface unit 1211 may be used to connect to other types of networks or remote computer systems (not shown).
The memory also includes one or more programs, which are stored in the memory and configured to be executed by the CPU.
The embodiment of the application also provides a computer device, which comprises a processor and a memory, wherein at least one instruction, at least one section of program, code set or instruction set is stored in the memory, and the at least one instruction, the at least one section of program, the code set or the instruction set is loaded and executed by the processor to realize the method for rendering the live pictures provided by the embodiments of the method.
The embodiment of the application also provides a computer readable storage medium, on which at least one instruction, at least one section of program, code set or instruction set is stored, and the at least one instruction, the at least one section of program, the code set or the instruction set is loaded and executed by a processor, so as to implement the method for rendering the live broadcast picture provided by the above method embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method for rendering a live view according to any one of the above embodiments.
The foregoing description of the preferred embodiments is merely exemplary in nature and is in no way intended to limit the invention; all modifications, equivalents, improvements, and the like that fall within the spirit and scope of the invention are intended to be embraced therein.

Claims (13)

1. A method of rendering a live view, the method performed by a server, the method comprising:
acquiring a live video stream, wherein the live video stream is generated based on video data acquired by a main broadcasting terminal, and the live video stream is used for representing live pictures in a live broadcasting room;
acquiring a special effect rendering instruction, wherein the special effect rendering instruction comprises a special effect identifier and a selected time, the special effect identifier is used for uniquely identifying a special effect resource, the selected time is a time determined after the time for displaying the special effect is configured, and the special effect rendering instruction is used for applying the special effect resource to the live video stream;
acquiring a first special effect resource characterized by the special effect identifier based on the special effect identifier in the special effect rendering instruction;
obtaining a special effect rendering frame rate, wherein the special effect rendering frame rate is used for representing the number of picture frames of special effect picture frames displayed every second when the special effect resource is rendered, and the special effect picture frames are used for being combined with live picture frames of the live video stream to obtain a special effect video stream;
Generating a plurality of special effect picture frames for showing the first special effect resource based on the first special effect resource and the special effect rendering frame rate, wherein the plurality of special effect picture frames have a time sequence relation;
determining a first time stamp for starting to apply the special effect picture frame from a time axis corresponding to the live video stream, wherein the first time stamp is realized as the selected time;
starting from the first timestamp, applying the plurality of special effect picture frames to the live video stream to generate the special effect video stream;
acquiring a terminal identifier in the special effect rendering instruction, wherein the terminal identifier is used for indicating at least one terminal for receiving the special effect video stream, the terminal identifier is determined based on the at least one terminal selected during special effect triggering operation, the special effect triggering operation is also used for configuring at least one special effect attribute in the position of special effect display and the size of special effect display, and the special effect attribute is used for writing in the special effect rendering instruction;
and based on the terminal identification, sending the special effect video stream to the at least one terminal indicated by the terminal identification, wherein the at least one terminal is used for rendering the special effect video stream and displaying a live broadcast picture with special effects.
2. The method of claim 1, wherein the applying the plurality of effect picture frames to the live video stream to generate the effect video stream comprises:
determining a plurality of live broadcast picture frames applying the plurality of special effect picture frames from the live broadcast video stream, wherein the plurality of live broadcast picture frames correspond to the plurality of special effect picture frames one by one, and the interval duration between two adjacent live broadcast picture frames is determined based on the special effect rendering frame rate;
respectively applying the plurality of special effect picture frames to corresponding live picture frames to obtain a plurality of special effect video picture frames, wherein the special effect video picture frames are used for generating the special effect video stream;
and combining the plurality of special effect video picture frames based on the time sequence relation to generate the special effect video stream.
3. The method of claim 1, wherein the applying the plurality of effect picture frames to the live video stream to generate the effect video stream comprises:
combining the plurality of special effect picture frames based on the time sequence relation to obtain a special effect picture corresponding to the first special effect resource, wherein the special effect picture corresponds to a special effect time length;
Determining a first time stamp for starting to apply the special effect picture frame from a time axis corresponding to the live video stream;
and starting from the first timestamp, applying the special effect picture to the live video stream to obtain the special effect video stream, wherein the duration of the special effect video stream is the special effect duration.
4. A method according to any one of claims 1 to 3, wherein the method is performed by a rendering system of a live view, wherein at least one special effect resource is stored in the rendering system of the live view, and wherein the at least one special effect resource corresponds to a special effect identifier;
the obtaining, based on the special effect identifier in the special effect rendering instruction, a first special effect resource characterized by the special effect identifier includes:
searching the stored at least one special effect resource based on the special effect identification in the special effect rendering instruction;
and acquiring the first special effect resource characterized by the special effect identification based on the search result.
5. The method of claim 4, wherein the obtaining the first effect resource characterized by the effect identification based on the search result comprises:
and acquiring the first special effect resource from the at least one special effect resource in response to the searching result indicating that the stored at least one special effect resource comprises the first special effect resource.
6. The method of claim 4, wherein the obtaining the first effect resource characterized by the effect identification based on the search result comprises:
generating a special effect resource acquisition request for acquiring the first special effect resource from a special effect resource holder in response to the search result indicating that the first special effect resource is not included in the stored at least one special effect resource;
transmitting the special effect resource acquisition request to the special effect resource holder;
and receiving the first special effect resource sent by the special effect resource holder.
7. The method according to claim 4, wherein the method further comprises:
acquiring a special effect resource package sent by a special effect resource holder, wherein the special effect resource package is a data package automatically sent to a rendering system of the live broadcast picture by the special effect resource holder based on a special effect resource updating process, and the special effect resource package comprises a plurality of supplementary special effect resources;
and storing the special effect resource package in a rendering system of the live broadcast picture.
8. A method according to any one of claims 1 to 3, wherein said obtaining a live video stream comprises:
Acquiring the live video stream sent by the anchor terminal; or,
acquiring the video data sent by the anchor terminal; generating the live video stream based on the video data.
9. A method according to any one of claims 1 to 3, wherein said obtaining special effect rendering instructions comprises:
acquiring the special effect rendering instruction sent by the anchor terminal; or,
acquiring the special effect rendering instruction sent by an audience terminal, wherein the audience terminal is a terminal in the live broadcasting room except the anchor terminal;
the special effect rendering instruction is an instruction generated based on special effect triggering operation on special effect content in a live interface, and the special effect content is used for rendering the special effect resource represented by the special effect identifier to obtain the special effect.
10. A rendering system of live pictures is characterized by comprising a video mixed-stream server and a rendering server;
the video mixed stream server is used for acquiring a live video stream, wherein the live video stream is generated based on video data acquired by a main broadcasting terminal, and the live video stream is used for representing live pictures in a live broadcasting room;
The rendering server is used for acquiring a special effect rendering instruction, wherein the special effect rendering instruction comprises a special effect identifier and a selected time, the special effect identifier is used for uniquely identifying a special effect resource, the selected time is a time determined after the time for displaying the special effect is configured, and the special effect rendering instruction is used for applying the special effect resource to the live video stream; acquiring a first special effect resource characterized by the special effect identifier based on the special effect identifier in the special effect rendering instruction; obtaining a special effect rendering frame rate, wherein the special effect rendering frame rate is used for representing the number of picture frames of special effect picture frames displayed every second when the special effect resource is rendered, and the special effect picture frames are used for being combined with live picture frames of the live video stream to obtain a special effect video stream; generating a plurality of special effect picture frames for showing the first special effect resource based on the first special effect resource and the special effect rendering frame rate, wherein the plurality of special effect picture frames have a time sequence relation; transmitting the special effect picture frames to the video mixed stream server;
the video mixed stream server is further configured to determine, from a time axis corresponding to the live video stream, a first timestamp for starting to apply the special effect picture frame, where the first timestamp is implemented as the selected time; starting from the first timestamp, applying the plurality of special effect picture frames to the live video to obtain the special effect video stream; acquiring a terminal identifier in the special effect rendering instruction, wherein the terminal identifier is used for indicating at least one terminal for receiving the special effect video stream, the terminal identifier is determined based on the at least one terminal selected during special effect triggering operation, the special effect triggering operation is also used for configuring at least one special effect attribute in the position of special effect display and the size of special effect display, and the special effect attribute is used for writing in the special effect rendering instruction; and based on the terminal identification, sending the special effect video stream to the at least one terminal indicated by the terminal identification, wherein the at least one terminal is used for rendering the special effect video stream and displaying a live broadcast picture with special effects.
11. A live-view rendering apparatus, the apparatus comprising:
the video acquisition module is used for acquiring a live video stream, wherein the live video stream is generated based on video data acquired by the anchor terminal, and the live video stream is used for representing live pictures in a live broadcasting room;
the instruction acquisition module is used for acquiring an effect rendering instruction, wherein the effect rendering instruction comprises an effect identifier and a selected time, the effect identifier is used for uniquely identifying an effect resource, the selected time is a time determined after the time for displaying the effect is configured, and the effect rendering instruction is used for applying the effect resource to the live video stream;
the resource acquisition module is configured to acquire, based on the special effect identifier in the special effect rendering instruction, a first special effect resource identified by the special effect identifier;
the special effect application module is configured to: obtain a special effect rendering frame rate, the special effect rendering frame rate indicating the number of special effect picture frames displayed per second when the special effect resource is rendered, the special effect picture frames being combined with live picture frames of the live video stream to obtain a special effect video stream; generate, based on the first special effect resource and the special effect rendering frame rate, a plurality of special effect picture frames for presenting the first special effect resource, the plurality of special effect picture frames having a time-sequence relationship; determine, on a time axis corresponding to the live video stream, a first timestamp at which application of the special effect picture frames starts, the first timestamp being implemented as the selected time; apply, starting from the first timestamp, the plurality of special effect picture frames to the live video stream to generate the special effect video stream; acquire a terminal identifier in the special effect rendering instruction, the terminal identifier indicating at least one terminal that is to receive the special effect video stream, the terminal identifier being determined based on the at least one terminal selected during a special effect triggering operation, the special effect triggering operation further being used to configure at least one special effect attribute among a display position and a display size of the special effect, the special effect attribute being written into the special effect rendering instruction; and send, based on the terminal identifier, the special effect video stream to the at least one terminal indicated by the terminal identifier, the at least one terminal being configured to render the special effect video stream and display a live picture with the special effect.
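The claim above describes a pipeline: generate effect frames at a chosen rendering frame rate, align them to a first timestamp on the live stream's time axis, and composite them onto matching live frames to produce the special effect video stream. A minimal sketch of that timing and compositing logic is shown below; the names (`EffectFrame`, `generate_effect_frames`, `apply_to_stream`) and the dict-based frame representation are illustrative assumptions, not part of the patented implementation.

```python
from dataclasses import dataclass


@dataclass
class EffectFrame:
    """One special effect picture frame, positioned on the live stream's time axis."""
    timestamp_ms: int  # where this effect frame lands on the live time axis
    effect_id: str     # the special effect identifier from the rendering instruction
    index: int         # position in the time-sequence of effect frames


def generate_effect_frames(effect_id: str, render_fps: int,
                           start_ts_ms: int, duration_ms: int) -> list[EffectFrame]:
    """Generate effect frames at `render_fps` frames per second, starting from the
    first timestamp selected on the live stream's time axis."""
    interval_ms = 1000 // render_fps          # spacing between effect frames
    count = duration_ms // interval_ms        # how many frames fit in the duration
    return [EffectFrame(start_ts_ms + i * interval_ms, effect_id, i)
            for i in range(count)]


def apply_to_stream(live_frames: list[dict],
                    effect_frames: list[EffectFrame]) -> list[dict]:
    """Combine each effect frame with the live frame sharing its timestamp,
    yielding the special effect video stream; untouched frames pass through."""
    by_ts = {f.timestamp_ms: f for f in effect_frames}
    out = []
    for frame in live_frames:
        merged = dict(frame)                  # copy the live picture frame
        ef = by_ts.get(frame["ts_ms"])
        if ef is not None:                    # an effect frame starts here
            merged["effect"] = ef.effect_id
        out.append(merged)
    return out
```

In a real system the merge step would blend pixel data (honoring the configured display position and size attributes) and the resulting stream would then be pushed only to the terminals named by the terminal identifier; here the merge is reduced to tagging frames so the timestamp alignment is easy to see.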
12. A computer device, comprising a processor and a memory, wherein at least one program is stored in the memory, the at least one program being loaded and executed by the processor to implement the live picture rendering method according to any one of claims 1 to 9.
13. A computer-readable storage medium, wherein at least one program is stored in the storage medium, the at least one program being loaded and executed by a processor to implement the live picture rendering method according to any one of claims 1 to 9.
CN202311298018.XA 2023-10-09 2023-10-09 Live picture rendering method, system, device, equipment and medium Active CN117041628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311298018.XA CN117041628B (en) 2023-10-09 2023-10-09 Live picture rendering method, system, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN117041628A CN117041628A (en) 2023-11-10
CN117041628B (en) 2024-02-02

Family

ID=88637618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311298018.XA Active CN117041628B (en) 2023-10-09 2023-10-09 Live picture rendering method, system, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117041628B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108289159A (en) * 2017-05-25 2018-07-17 广州华多网络科技有限公司 A kind of terminal live streaming special efficacy add-on system, method and terminal live broadcast system
CN109246445A (en) * 2018-11-29 2019-01-18 广州市百果园信息技术有限公司 Method, apparatus, system, equipment and the storage medium explained in a kind of direct broadcasting room
CN110418155A (en) * 2019-08-08 2019-11-05 腾讯科技(深圳)有限公司 Living broadcast interactive method, apparatus, computer readable storage medium and computer equipment
CN110599396A (en) * 2019-09-19 2019-12-20 网易(杭州)网络有限公司 Information processing method and device
CN111385639A (en) * 2018-12-28 2020-07-07 广州市百果园信息技术有限公司 Video special effect adding method, device, equipment and storage medium
CN112218108A (en) * 2020-09-18 2021-01-12 广州虎牙科技有限公司 Live broadcast rendering method and device, electronic equipment and storage medium
CN113438490A (en) * 2021-05-27 2021-09-24 广州方硅信息技术有限公司 Live broadcast interaction method, computer equipment and storage medium
CN115665437A (en) * 2022-12-21 2023-01-31 深圳市易云数字科技有限责任公司 Scene customizable on-site interactive AR slow live broadcast system
CN115761090A (en) * 2022-11-17 2023-03-07 北京字跳网络技术有限公司 Special effect rendering method, device, equipment, computer readable storage medium and product
CN115767181A (en) * 2022-11-17 2023-03-07 北京字跳网络技术有限公司 Live video stream rendering method, device, equipment, storage medium and product

Similar Documents

Publication Publication Date Title
WO2018010682A1 (en) Live broadcast method, live broadcast data stream display method and terminal
US9129448B2 (en) Visualization of a natural language text
CN113099258B (en) Cloud guide system, live broadcast processing method and device, and computer readable storage medium
EP2940940B1 (en) Methods for sending and receiving video short message, apparatus and handheld electronic device thereof
WO2017206398A1 (en) Method and device for video sharing
US20210044644A1 (en) Systems, devices, and methods for streaming haptic effects
CN110149518B (en) Method, system, device, equipment and storage medium for processing media data
CN103947221A (en) User interface display method and device using same
CN112073754B (en) Cloud game screen projection method and device, computer equipment, computer readable storage medium and cloud game screen projection interaction system
WO2016074326A1 (en) Channel switching method, apparatus and system
JP2011501501A (en) Apparatus and method for providing stereoscopic 3D video content for LASeR-based terminals
CN110505511B (en) Method, device and system for playing video in webpage and computing equipment
CN102231851A (en) Scalable video insertion control
CN114450966A (en) Data model for representation and streaming of heterogeneous immersive media
CN112055252A (en) Multi-screen interaction method and device, computer readable medium and electronic equipment
CN111261133A (en) Singing processing method and device, electronic equipment and storage medium
CN113630618B (en) Video processing method, device and system
WO2024104333A1 (en) Cast picture processing method and apparatus, electronic device, and storage medium
US20230217047A1 (en) Method, system, and computer-readable recording medium for implementing fast-switching mode between channels in multi-live transmission environment
CN117041628B (en) Live picture rendering method, system, device, equipment and medium
US20230362460A1 (en) Dynamically generated interactive video content
CN114071170B (en) Network live broadcast interaction method and device
CN109408757A (en) Question and answer content share method, device, terminal device and computer storage medium
KR102376348B1 (en) Method, system, and computer readable record medium to implement seamless switching mode between channels in multiple live transmission environment
CN112887786B (en) Video playing method and device and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant