CN110719493A - Barrage display method and device, electronic equipment and readable storage medium


Info

Publication number
CN110719493A
CN110719493A (application number CN201911080076.9A)
Authority
CN
China
Prior art keywords
bullet screen
node
queue
bullet
barrage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911080076.9A
Other languages
Chinese (zh)
Inventor
邱俊琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huya Technology Co Ltd
Original Assignee
Guangzhou Huya Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huya Technology Co Ltd filed Critical Guangzhou Huya Technology Co Ltd
Priority: CN201911080076.9A
Publication: CN110719493A
Related application: PCT/CN2020/127052 (published as WO2021088973A1)
Related application: US17/630,187 (published as US20220279234A1)
Legal status: Pending

Classifications

    • H ELECTRICITY > H04 ELECTRIC COMMUNICATION TECHNIQUE > H04N PICTORIAL COMMUNICATION, e.g. TELEVISION > H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/2187 Live feed (servers specifically adapted for the distribution of content; source of audio or video content, e.g. local disk arrays)
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4782 Web browsing, e.g. WebTV
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N21/8153 Monomedia components involving graphical data comprising still images, e.g. texture, background image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a barrage display method and device, an electronic device and a readable storage medium. A received live stream is rendered onto a target model object in an Augmented Reality (AR) recognition plane so that the live stream is displayed on the target model object; barrage data corresponding to the live stream can then be rendered into the AR recognition plane so that the barrage data moves in the AR recognition plane. In this way, the barrage can be displayed in the AR real scene: after opening the camera, a viewer can see the barrage moving in the AR real scene, which enhances the realism of barrage display, improves the playability of the live broadcast, and thus effectively improves user retention.

Description

Barrage display method and device, electronic equipment and readable storage medium
Technical Field
The application relates to the field of internet live broadcast, in particular to a barrage display method and device, electronic equipment and a readable storage medium.
Background
In the internet live broadcast process, the barrage embodies the spirit of sharing among users; for example, while watching a live stream, viewers can discuss with each other and share opinions on the live stream through barrages. However, in the conventional scheme the barrage is usually displayed attached to the live stream, which makes it difficult for viewers to have an immersive experience, resulting in low playability of the live broadcast and difficulty in effectively improving user retention.
Disclosure of Invention
In view of this, an object of the present application is to provide a bullet screen display method and device, an electronic device and a readable storage medium, which can display a bullet screen in an AR real scene: after opening the camera, a viewer can see the bullet screen moving in the AR real scene, which enhances the realism of bullet screen display, improves the playability of the live broadcast, and thus effectively improves user retention.
According to an aspect of the present application, there is provided a bullet screen display method applied to a live viewing terminal, the method including:
rendering the received live stream to a target model object in an Augmented Reality (AR) recognition plane so that the live stream is displayed on the target model object;
and rendering the barrage data corresponding to the live stream into the AR identification plane so that the barrage data moves in the AR identification plane.
In a possible embodiment, the step of rendering the received live stream onto a target model object in an augmented reality AR recognition plane to display the live stream on the target model object includes:
creating a tracing point on a preset point of the AR identification plane, wherein the tracing point is used for fixing the target model object on the preset point;
creating a corresponding display node at the position of the tracing point, and creating a first child node, wherein the first child node is used for adjusting and displaying the target model object, and the display node is the parent node of the first child node;
creating a second child node inheriting from the first child node, so that the second child node can be used in place of a bone adjustment node when a request to add the bone adjustment node is detected, wherein the bone adjustment node is used for adjusting the bone points of the target model object.
In a possible implementation, the step of rendering the bullet screen data corresponding to the live stream into the AR identification plane so that the bullet screen data moves in the AR identification plane includes:
acquiring bullet screen data corresponding to the live stream from a live server, and adding the bullet screen data to a bullet screen queue;
initializing node information of a preset number of bullet screen nodes, wherein the parent node of each bullet screen node is the second child node, and each bullet screen node is used for displaying one bullet screen;
and extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least part of bullet screen nodes in the preset number of bullet screen nodes, so that the bullet screen data moves in the AR identification plane.
In a possible implementation manner, the step of adding the bullet screen data to a bullet screen queue includes:
judging whether the queue length of the bullet screen queue is greater than the number of bullet screens of the bullet screen data;
if the queue length of the bullet screen queue is not greater than the bullet screen number of the bullet screen data, adding the bullet screen data into the bullet screen queue;
if the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data, extending the bullet screen queue by a preset length each time this occurs, and then continuing to add the bullet screen data to the bullet screen queue;
and if the extended queue length of the bullet screen queue is greater than a preset threshold, discarding a set number of bullet screens from the bullet screen queue in order of bullet screen time from earliest to latest.
In a possible implementation manner, the step of initially configuring a preset number of bullet screen nodes includes:
configuring a preset number of bullet screen nodes with the second child node as their parent node;
and respectively configuring display information of each bullet screen node in the AR identification plane.
In a possible embodiment, the AR recognition plane includes an X axis, a Y axis, and a Z axis with the second child node as the coordinate center axis, and the step of configuring the display information of each bullet screen node in the AR recognition plane includes:
respectively configuring world coordinates of each bullet screen node in the AR identification plane along different offset displacement points on the Y axis and the Z axis so as to enable each bullet screen node to be arranged at intervals along the Y axis and the Z axis;
setting the position on the X axis offset from the parent node by a preset unit displacement in the first direction as the world coordinate at which each bullet screen node starts displaying, and the position offset from the parent node by a preset unit displacement in the second direction as the world coordinate at which each bullet screen node finishes displaying.
In a possible implementation, before the step of extracting the barrage data from the barrage queue to render the barrage data into the AR identification plane through at least some of the preset number of barrage nodes, so that the barrage data moves in the AR identification plane, the method further includes:
configuring the preset number of bullet screen nodes into an inoperable state;
the step of extracting the barrage data from the barrage queue to render the barrage data into the AR identification plane through at least part of the barrage nodes in the preset number of barrage nodes, so that the barrage data moves in the AR identification plane includes:
extracting bullet screen data from a bullet screen data queue, and extracting at least part of bullet screen nodes from the preset number of bullet screen nodes according to the number of bullet screens of the bullet screen data;
after the extracted at least part of bullet screen nodes are adjusted from the inoperable state to the operable state, loading a character string display component corresponding to each target bullet screen node in the at least part of bullet screen nodes;
rendering the bullet screen data to the AR identification plane through a character string display component corresponding to each target bullet screen node;
adjusting the world coordinate change of the bullet screen corresponding to each target bullet screen node in the AR identification plane according to the node information of each target bullet screen node so as to enable the bullet screen data to move in the AR identification plane;
and when the display of any bullet screen is finished, reconfiguring a target bullet screen node corresponding to the bullet screen into an inoperable state.
According to another aspect of the present application, there is provided a bullet screen display device applied to a live viewing terminal, the device including:
the model rendering module is used for rendering the received live stream to a target model object in an Augmented Reality (AR) recognition plane so as to display the live stream on the target model object;
and the barrage rendering module is used for rendering the barrage data corresponding to the live stream into the AR identification plane so as to enable the barrage data to move in the AR identification plane.
According to another aspect of the present application, an electronic device is provided, which includes a machine-readable storage medium and a processor, where the machine-readable storage medium stores machine-executable instructions, and the processor, when executing the machine-executable instructions, implements the bullet screen display method described above.
According to another aspect of the present application, there is provided a readable storage medium having stored therein machine executable instructions which, when executed, implement the foregoing bullet screen display method.
Based on any of the above aspects, the present application renders the received live stream onto a target model object in the augmented reality AR recognition plane so that the live stream is displayed on the target model object, and can then render the barrage data corresponding to the live stream into the AR recognition plane so that the barrage data moves in the AR recognition plane. In this way, the barrage can be displayed in the AR real scene: after opening the camera, a viewer can see the barrage moving in the AR real scene, which enhances the realism of barrage display, improves the playability of the live broadcast, and thus effectively improves user retention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic view illustrating an interaction scene of a live broadcast system provided in an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating a bullet screen display method provided in an embodiment of the present application;
FIG. 3 shows a flow diagram of the sub-steps of step S110 shown in FIG. 2;
FIG. 4 shows a flow diagram of the substeps of step S120 shown in FIG. 2;
FIG. 5 shows a sub-step flow diagram of sub-step S123 shown in FIG. 4;
fig. 6 is a schematic diagram illustrating a bullet screen displayed in a live stream according to an embodiment of the present application;
fig. 7 is a schematic diagram illustrating a bullet screen displayed on an AR recognition plane according to an embodiment of the present application;
fig. 8 is a schematic diagram illustrating functional modules of a bullet screen display device provided in an embodiment of the present application;
fig. 9 is a block diagram illustrating a schematic structure of an electronic device for implementing the bullet screen display method according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some of the embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 shows an interaction scene schematic diagram of a live broadcast system 10 provided in an embodiment of the present application. For example, the live broadcast system 10 may serve a platform such as an internet live broadcast service platform. The live broadcast system 10 may include a live broadcast server 100, a live viewing terminal 200, and a live providing terminal 300, where the live broadcast server 100 is in communication connection with the live viewing terminal 200 and the live providing terminal 300, respectively, and is configured to provide live broadcast services for both. For example, an anchor may provide a live stream online in real time to viewers through the live providing terminal 300 and transmit the live stream to the live server 100, and the live viewing terminal 200 may pull the live stream from the live server 100 for online viewing or playback. For another example, the live broadcast server 100 may receive the barrage data transmitted by the live viewing terminal 200 and the live providing terminal 300 and synchronize the barrage to each live viewing terminal 200.
In some implementation scenarios, the live viewing terminal 200 and the live providing terminal 300 may be used interchangeably. For example, an anchor of the live providing terminal 300 may use it to provide a live video service to viewers, or view live video provided by other anchors as a viewer. For another example, a viewer of the live viewing terminal 200 may also use it to view live video provided by an anchor of interest, or provide live video to other viewers as an anchor.
In this embodiment, the live viewing terminal 200 and the live providing terminal 300 may include, but are not limited to, a mobile device, a tablet computer, a laptop computer, or any combination of two or more thereof. In particular implementations, there may be zero, one, or more live viewing terminals 200 and live providing terminals 300 accessing the live server 100; only one of each is shown in fig. 1. The live viewing terminal 200 and the live providing terminal 300 may be installed with internet products for providing internet live broadcast services; for example, the internet products may be applications (APPs), web pages, applets, and the like used on a computer or smartphone and related to internet live broadcast services.
In this embodiment, the live server 100 may be a single physical server, or may be a server group including a plurality of physical servers for executing different data processing functions. The server groups may be centralized or distributed (e.g., the live server 100 may be a distributed system). In some possible embodiments, such as where the live server 100 employs a single physical server, different logical server components may be assigned to the physical server based on different live service functions.
It is understood that the live system 10 shown in fig. 1 is only one possible example, and in other possible embodiments, the live system 10 may include only a portion of the components shown in fig. 1 or may include other components.
In order to display the barrage in the AR real scene, improve live playability, and thus effectively improve user retention, fig. 2 shows a flowchart of the barrage display method provided in the embodiment of the present application. In this embodiment, the barrage display method may be executed by the live viewing terminal 200 shown in fig. 1, or, when the anchor of the live providing terminal 300 acts as a viewer, by the live providing terminal 300 shown in fig. 1.
It should be understood that, in other embodiments, the order of some steps in the bullet screen display method of this embodiment may be interchanged according to actual needs, or some steps may be omitted or deleted. The detailed steps of the bullet screen display method are described as follows.
Step S110, rendering the received live stream to a target model object in an augmented reality AR recognition plane, so that the live stream is displayed on the target model object.
And step S120, rendering the barrage data corresponding to the live stream into an AR identification plane so as to move the barrage data in the AR identification plane.
In this embodiment, when a viewer of the live viewing terminal 200 logs into the live room to be viewed, the viewer can choose to display the live room in AR mode, or the live viewing terminal 200 can automatically use AR display when entering the live room; either triggers an AR display instruction. When the live viewing terminal 200 detects the augmented reality AR display instruction, it may turn on the camera to enter the AR recognition plane, and then generate a corresponding target model object in the AR recognition plane.
When the target model object is displayed in the AR recognition plane, the live viewing terminal 200 may render the received live stream onto the target model object, so that the live stream is displayed on the target model object.
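As an illustration of how a live stream might be rendered onto a model object, the sketch below assumes Google's Sceneform framework on Android (the patent does not name an AR framework): a MediaPlayer decodes the stream into an ExternalTexture that is bound to the model's material. The stream URL, the model asset name, and the material parameter name "videoTexture" are hypothetical.

```kotlin
import android.content.Context
import android.media.MediaPlayer
import android.net.Uri
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.rendering.ExternalTexture
import com.google.ar.sceneform.rendering.ModelRenderable

fun renderLiveStreamOnModel(context: Context, targetModelNode: Node, streamUrl: String) {
    val liveTexture = ExternalTexture()
    MediaPlayer().apply {
        setDataSource(streamUrl)            // hypothetical live-stream URL
        setSurface(liveTexture.surface)     // decoded frames feed the AR texture
        setOnPreparedListener { start() }
        prepareAsync()
    }
    ModelRenderable.builder()
        .setSource(context, Uri.parse("target_model.sfb"))   // hypothetical model asset
        .build()
        .thenAccept { renderable ->
            // Bind the live-stream texture to the model's material (parameter name assumed),
            // so the live stream is displayed on the target model object (step S110).
            renderable.material.setExternalTexture("videoTexture", liveTexture)
            targetModelNode.renderable = renderable
        }
}
```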
While the viewer watches the live stream through the target model object displayed in the AR recognition plane, the bullet screen data to be played can be obtained from the live server and rendered into the AR recognition plane so that the bullet screen data moves in the AR recognition plane. Compared with the conventional scheme of moving the bullet screen within the rendered live stream picture, this improves the realism of bullet screen playback and enhances the real experience of bullet screen display. In this way, the bullet screen is displayed in the AR real scene: after opening the camera, the viewer can see the bullet screen moving in the AR real scene, which improves the playability of the live broadcast and thus effectively improves user retention.
In a possible implementation manner, for step S110, in order to ensure that the target model object does not subsequently change with the movement of the camera in the AR recognition plane, and to allow the target model object to be adjusted in response to user operations while the corresponding target model object is generated in the AR recognition plane, a possible example of the generation process of the target model object is given below in conjunction with fig. 3. Referring to fig. 3, step S110 may be implemented by the following sub-steps:
Sub-step S111, creating a tracing point at a preset point of the AR recognition plane, where the tracing point can be used to fix the target model object at the preset point.
And a substep S112, creating a corresponding display node at the tracing point position, and creating a first child node, where the first child node is used for adjusting and displaying the target model object, and the display node is the parent node of the first child node.
And a substep S113, creating a second child node inheriting from the first child node, so that the second child node can be used in place of a bone adjustment node when a request to add the bone adjustment node is detected, where the bone adjustment node can be used for adjusting the bone points of the target model object.
In one possible embodiment, for sub-step S112, the adjusting the target model object by the first child node includes one or more of the following adjustment modes:
1) the target model object may be scaled, for example, by adjusting the entire target model object to be enlarged or reduced, or by adjusting a part of the target model object to be enlarged or reduced.
2) The target model object is translated; for example, the target model object may be moved by a preset distance in any direction (up, down, left, right, or diagonally).
3) The target model object is rotated. For example, the target model object may be rotated in a clockwise or counterclockwise direction.
In one possible implementation, for sub-step S112, a binding setting method of the first child node may be invoked to bind the target model object to the first child node to complete the display of the target model object in the AR recognition plane.
Therefore, in the process of generating the corresponding target model object in the AR recognition plane, the target model object is fixed at the preset point through the tracing point, which ensures that the target model object does not subsequently change with the movement of the camera in the AR recognition plane; the target model object is adjusted and displayed through the first child node, so it can be adjusted and displayed in real time in response to user operations; and, considering that a bone adjustment node may later be added to adjust the bones of the target model object, a second child node inheriting from the first child node can be reserved so that it can be used in place of the bone adjustment node when such an addition is requested.
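A minimal sketch of sub-steps S111 to S113, again under the Sceneform assumption; the use of TransformableNode for the scaling/translation/rotation adjustments and all identifier names are illustrative, not taken from the patent.

```kotlin
import com.google.ar.core.HitResult
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

// Returns the reserved second child node.
fun createModelNodes(arFragment: ArFragment, hit: HitResult, model: ModelRenderable): Node {
    val anchor = hit.createAnchor()                 // S111: "tracing point" fixed at the preset point
    val displayNode = AnchorNode(anchor).apply {    // S112: display node at the tracing point position
        setParent(arFragment.arSceneView.scene)
    }
    // First child node: adjusts (scale / translate / rotate) and displays the target model.
    val firstChildNode = TransformableNode(arFragment.transformationSystem).apply {
        setParent(displayNode)
        renderable = model                          // "binding setting": attach the model object
    }
    // S113: reserve a second child node inheriting from the first child node; it can stand in
    // for a bone adjustment node if a request to add one is detected later.
    return Node().apply { setParent(firstChildNode) }
}
```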
On the basis of the above, in a possible implementation manner of step S120, bullet screens may be published frequently and densely, causing excessive memory usage and instability of the AR display process. To improve the stability of the bullet screen AR display process, referring to fig. 4, step S120 may be implemented by the following sub-steps:
and a substep S121, obtaining bullet screen data corresponding to the live stream from the live server, and adding the bullet screen data to a bullet screen queue.
And a substep S122 of initializing and configuring node information of a preset number of bullet screen nodes.
And a substep S123 of extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least part of bullet screen nodes in the preset number of bullet screen nodes, so that the bullet screen data moves in the AR identification plane.
In this embodiment, after the barrage data corresponding to the live stream is obtained from the live server, the barrage data is not directly rendered into the AR recognition plane, but is first added to a barrage queue. On this basis, a certain number (for example, 60) of barrage nodes may be configured for the AR recognition plane; the parent node of each barrage node may be the second child node of the foregoing sub-step S113, and each barrage node may be configured to display one barrage.
Then, in the process of rendering the barrage data into the AR recognition plane, the barrage data can be rendered through at least some of the preset number of barrage nodes so that it moves in the AR recognition plane. The number of barrage nodes used can be determined according to the actual number of barrages, which avoids the excessive memory usage and unstable AR display process caused by densely published barrages, thereby improving the stability of the barrage AR display process.
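A sketch of how the barrage node pool of sub-step S122 might be pre-created, under the same Sceneform assumption; the pool size of 60 follows the example above, and disabling the nodes up front anticipates the inoperable state described later.

```kotlin
import com.google.ar.sceneform.Node

// Pre-create the barrage node pool, each node parented to the second child node.
fun initBarrageNodes(secondChildNode: Node, count: Int = 60): List<Node> =
    List(count) {
        Node().apply {
            setParent(secondChildNode)   // parent of every barrage node is the second child node
            isEnabled = false            // inoperable until a barrage is assigned
        }
    }
```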
For example, in one possible implementation of sub-step S121, it may be determined whether the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data; if not, the bullet screen data is added to the bullet screen queue. If the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data, the bullet screen queue is extended by a preset length each time this occurs, and the bullet screen data then continues to be added to the bullet screen queue. If the extended queue length of the bullet screen queue is greater than a preset threshold, a set number of bullet screens are discarded from the bullet screen queue in order of bullet screen time from earliest to latest.
For example, if the preset threshold is 200 and the preset length of each extension is 20, then when the queue length of the bullet screen queue is not greater than the number of bullet screens in the bullet screen data, the bullet screen data is added to the bullet screen queue. When the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data, the bullet screen queue is extended by 20 and the bullet screen data then continues to be added. If the extended queue length of the bullet screen queue is greater than 200, the 20 earliest bullet screens are discarded from the bullet screen queue in order of bullet screen time from earliest to latest.
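The queue policy above might be modeled as follows (plain Kotlin, no AR dependency). This sketch interprets the comparison as extending the queue whenever it cannot hold the incoming barrages; the threshold 200, extension step 20, and discard count 20 follow the example values.

```kotlin
// Plain-Kotlin model of the barrage queue policy; values follow the example above.
class BarrageQueue(
    private val maxCapacity: Int = 200,   // preset threshold
    private val extendStep: Int = 20,     // preset extension length
    private val discardCount: Int = 20,   // number of earliest barrages to discard
) {
    private var capacity = extendStep
    private val queue = ArrayDeque<String>()   // barrages kept oldest-first (time order)

    fun addAll(barrages: List<String>) {
        // Extend the queue by the preset length each time it cannot hold the new data.
        while (queue.size + barrages.size > capacity) {
            capacity += extendStep
        }
        queue.addAll(barrages)
        // If the extended length exceeds the threshold, drop the earliest barrages.
        if (capacity > maxCapacity) {
            repeat(discardCount) { queue.removeFirstOrNull() }
            capacity = maxCapacity
        }
    }

    fun poll(): String? = queue.removeFirstOrNull()
}
```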
In a possible implementation manner, for the sub-step S122, after configuring a preset number of bullet screen nodes with the second child node as a parent node, display information of each bullet screen node in the AR identification plane may be configured, and the display information may be used to subsequently configure how to display and move bullet screens corresponding to the bullet screen nodes.
For example, in one possible example, the AR recognition plane may include an X axis, a Y axis, and a Z axis with the second child node as the coordinate center axis, and the world coordinates of each bullet screen node in the AR recognition plane may be configured at different offset displacement points along the Y axis and the Z axis, so that the bullet screen nodes are arranged at intervals along the Y axis and the Z axis; this gives the bullet screens a sense of layering and depth in the AR display. Meanwhile, the position on the X axis offset from the parent node by a preset unit displacement (e.g., 1.5 units) in the first direction may be set as the world coordinate at which each bullet screen node starts displaying, and the position offset from the parent node by a preset unit displacement (e.g., 1.5 units) in the second direction as the world coordinate at which each bullet screen node finishes displaying. This arrangement makes it convenient to adjust the start and end positions of the bullet screens.
Alternatively, the first direction may be a left direction of the screen and the second direction may be a right direction of the screen, or the first direction may be a right direction of the screen and the second direction may be a left direction of the screen. Alternatively, the first direction and the second direction may be any other directions.
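A sketch of the display-information configuration under the same Sceneform assumption: positions are set relative to the parent (second child) node, the Y/Z step values are illustrative, and the 1.5-unit start and end displacements follow the example above.

```kotlin
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Vector3

const val START_X = -1.5f   // first direction: where each barrage starts displaying
const val END_X = 1.5f      // second direction: where each barrage finishes displaying

// Stagger the nodes along Y and Z so the barrages show layering and depth.
fun configureDisplayInfo(barrageNodes: List<Node>) {
    barrageNodes.forEachIndexed { i, node ->
        val y = 0.08f * (i % 6)          // interval along the Y axis (illustrative step)
        val z = -0.05f * (i / 6)         // interval along the Z axis (illustrative step)
        node.localPosition = Vector3(START_X, y, z)   // position relative to the parent node
    }
}
```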
In a possible implementation manner, when the number of bullet screens is small, keeping all bullet screen nodes in use would add unnecessary performance overhead. Therefore, before the bullet screen data is extracted from the bullet screen queue and rendered into the AR recognition plane through at least some of the preset number of bullet screen nodes, the preset number of bullet screen nodes may be configured in an inoperable state; in the inoperable state, a bullet screen node does not participate in the bullet screen display process.
Then, with reference to sub-step S123 and with reference to fig. 5, the following sub-steps can be further implemented:
and a substep S1231, extracting bullet screen data from the bullet screen data queue, and extracting at least part of bullet screen nodes from a preset number of bullet screen nodes according to the number of bullet screens of the bullet screen data.
And a substep S1232 of adjusting the extracted at least part of bullet screen nodes from the inoperable state to the operable state, and then loading the character string display component corresponding to each target bullet screen node in the at least part of bullet screen nodes.
And a substep S1233 of rendering the bullet screen data to the AR identification plane through the character string display component corresponding to each target bullet screen node.
And a substep S1234, adjusting the world coordinate change of the bullet screen corresponding to each target bullet screen node in the AR identification plane according to the node information of each target bullet screen node, so that the bullet screen data moves in the AR identification plane.
And a substep S1235, after the display of any bullet screen is finished, reconfiguring the target bullet screen node corresponding to the bullet screen into an inoperable state.
In this embodiment, for the substep S1231, the number of extracted bullet screen nodes may be determined according to the number of bullet screens of the extracted bullet screen data. For example, if the number of bullet screens is 10, 10 target bullet screen nodes may be extracted as the display nodes of the 10 bullet screens.
Next, for sub-step S1232, after the extracted 10 target bullet screen nodes are adjusted from the inoperable state to the operable state, the character string display component corresponding to each of the 10 target bullet screen nodes is loaded. The character string display component can be understood as a view component used for displaying character strings on the live viewing terminal; for example, on a live viewing terminal running the Android system, the character string display component may be a TextView. Optionally, a correspondence between each bullet screen node and its character string display component may be configured in advance, so that once a target bullet screen node is determined, the corresponding character string display component for displaying a bullet screen can be obtained. In this way, the bullet screen data can be rendered into the AR recognition plane through the character string display component corresponding to each target bullet screen node.
In the above process, a coordinate-update method may be overridden in the bullet screen node and executed once every preset period (e.g., 16 ms), so that the world coordinates of each bullet screen are updated according to the display information set above: the display starts at the position offset from the parent node by the preset unit displacement in the first direction on the X axis, and the world coordinates are then updated by a preset displacement every period until the updated world coordinates reach the position offset from the parent node by the preset unit displacement in the second direction, at which point the bullet screen display ends. The target bullet screen node corresponding to that bullet screen is then reconfigured into the inoperable state.
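Combining sub-steps S1231 to S1235, a barrage node might look like the sketch below, still assuming Sceneform on Android: a TextView wrapped in a ViewRenderable serves as the character string display component, and the overridden onUpdate, which Sceneform calls once per frame (about every 16 ms at 60 fps), plays the role of the coordinate-update method. The movement speed is an illustrative value.

```kotlin
import android.content.Context
import android.widget.TextView
import com.google.ar.sceneform.FrameTime
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Vector3
import com.google.ar.sceneform.rendering.ViewRenderable

class BarrageNode(private val startX: Float, private val endX: Float) : Node() {
    private val speed = 0.3f    // units per second; illustrative value

    fun show(context: Context, barrageText: String) {
        isEnabled = true                                    // operable state (S1232)
        ViewRenderable.builder()
            .setView(context, TextView(context).apply { setText(barrageText) })
            .build()
            .thenAccept { renderable = it }                 // render barrage into the plane (S1233)
        localPosition = Vector3(startX, localPosition.y, localPosition.z)
    }

    override fun onUpdate(frameTime: FrameTime) {
        val p = localPosition
        p.x += speed * frameTime.deltaSeconds               // advance toward the end position (S1234)
        localPosition = p
        if (p.x >= endX) isEnabled = false                  // display finished: inoperable again (S1235)
    }
}
```

A pool manager would call show() on a disabled node for each barrage taken from the queue; when a barrage reaches the end position, onUpdate returns the node to the inoperable state so it can be reused.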
To illustrate these scenes in detail, schematic diagrams of a bullet screen displayed in the live stream and a bullet screen displayed in the AR recognition plane are briefly described below in conjunction with fig. 6 and fig. 7.
Referring to fig. 6, it shows an interface schematic diagram of an exemplary AR recognition plane entered after the live viewing terminal 200 opens the camera. The target model object shown in fig. 6 may be adaptively placed at a certain position in the real scene, for example, a middle position; the live stream may then be rendered onto the target model object shown in fig. 6 for display according to the foregoing embodiment. In this scheme, it can be seen that the bullet screen is still displayed within the live stream on the target model object.
Referring to fig. 7, it also shows an interface schematic diagram of an exemplary AR recognition plane entered after the live viewing terminal 200 opens the camera. Here the barrage is rendered into the AR recognition plane according to the foregoing embodiment, and it can be seen that the barrage is displayed in the AR real scene rather than in the live stream.
Thus, for the viewer, the barrage can be displayed in the AR real scene: after opening the camera, the viewer can see the barrage moving in the AR real scene, which enhances the realism of barrage display, improves the playability of the live broadcast, and thus effectively improves user retention.
Based on the same inventive concept, please refer to fig. 8, which shows a functional module diagram of the bullet screen display device 410 provided in the embodiment of the present application. The functional modules of the bullet screen display device 410 can be divided according to the above method embodiment; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module can be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiment of the present application is schematic and is only one way of dividing logical functions; other divisions are possible in actual implementation. For example, in the case of dividing functional modules by function, the bullet screen display device 410 shown in fig. 8 is only a schematic diagram. The bullet screen display device 410 may include a model rendering module 411 and a bullet screen rendering module 412; the functions of these modules are described in detail below.
And the model rendering module 411 is configured to render the received live stream to a target model object in the augmented reality AR recognition plane, so that the live stream is displayed on the target model object. It is understood that the model rendering module 411 can be used to execute the above step S110, and the detailed implementation of the model rendering module 411 can refer to the above contents related to step S110.
And a barrage rendering module 412, configured to render the barrage data corresponding to the live stream into the AR recognition plane, so that the barrage data moves in the AR recognition plane. It is understood that the bullet screen rendering module 412 can be used to perform the step S120, and for the detailed implementation of the bullet screen rendering module 412, reference can be made to the contents related to the step S120.
In one possible implementation, the model rendering module 411 may render the received live stream onto the target model object in the augmented reality AR recognition plane to cause the live stream to be displayed on the target model object by:
creating a tracing point on a preset point of the AR identification plane, wherein the tracing point is used for fixing the target model object on the preset point;
creating a corresponding display node at the position of the tracing point, and creating a first child node, wherein the first child node is used for adjusting the target model object, and the display node is the parent node of the first child node;
and creating a second child node, and calling a binding setting method of the second child node to bind the target model object to the second child node, so as to complete the display of the target model object in the AR identification plane, wherein the first child node is the parent node of the second child node.
In one possible implementation, the barrage rendering module 412 may render the barrage data corresponding to the live stream into the AR recognition plane by:
acquiring bullet screen data corresponding to the live streaming from a live streaming server, and adding the bullet screen data to a bullet screen queue;
initializing node information of a preset number of bullet screen nodes, wherein the parent node of each bullet screen node is the second child node, and each bullet screen node is used for displaying one bullet screen;
and extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least part of bullet screen nodes in the preset number of bullet screen nodes, so that the bullet screen data move in the AR identification plane.
In one possible implementation, the bullet screen rendering module 412 can add bullet screen data to the bullet screen queue by:
judging whether the queue length of the bullet screen queue is greater than the number of bullet screens of the bullet screen data;
if the queue length of the bullet screen queue is not more than the number of bullet screens of the bullet screen data, adding the bullet screen data into the bullet screen queue;
if the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data, extending the bullet screen queue by a preset length each time this occurs, and then continuing to add the bullet screen data to the bullet screen queue;
and if the extended queue length of the bullet screen queue is greater than a preset threshold, discarding a set number of bullet screens from the bullet screen queue in order of bullet screen time from earliest to latest.
In one possible implementation, the bullet screen rendering module 412 may initially configure a preset number of bullet screen nodes by:
configuring a preset number of bullet screen nodes with the second child node as their parent node;
and respectively configuring display information of each bullet screen node in the AR identification plane.
In a possible implementation manner, the AR recognition plane includes an X axis, a Y axis, and a Z axis with the second child node as the coordinate center axis, and the bullet screen rendering module 412 may configure the display information of each bullet screen node in the AR recognition plane by:
respectively configuring world coordinates of each bullet screen node in an AR identification plane along different offset displacement points on the Y axis and the Z axis so as to enable each bullet screen node to be arranged at intervals along the Y axis and the Z axis;
setting the position on the X axis offset from the parent node by a preset unit displacement in the first direction as the world coordinate at which each bullet screen node starts displaying, and the position offset from the parent node by a preset unit displacement in the second direction as the world coordinate at which each bullet screen node finishes displaying.
In a possible implementation, the bullet screen rendering module 412 is further configured to configure a preset number of bullet screen nodes to be in an inoperable state;
in one possible implementation, the step of the bullet screen rendering module 412 extracting bullet screen data from the bullet screen queue to render at least part of bullet screen nodes in the preset number of bullet screen nodes into the AR identification plane, so that the bullet screen data moves in the AR identification plane, includes:
extracting bullet screen data from the bullet screen data queue, and extracting at least part of bullet screen nodes from a preset number of bullet screen nodes according to the number of bullet screens of the bullet screen data;
after adjusting at least part of extracted bullet screen nodes from an inoperable state to an operable state, loading a character string display component corresponding to each target bullet screen node in at least part of bullet screen nodes;
rendering the bullet screen data to an AR identification plane through a character string display component corresponding to each target bullet screen node;
adjusting the world coordinate change of the bullet screen corresponding to each target bullet screen node in the AR identification plane according to the node information of each target bullet screen node so as to enable bullet screen data to move in the AR identification plane;
and when the display of any bullet screen is finished, reconfiguring a target bullet screen node corresponding to the bullet screen into an inoperable state.
Based on the same inventive concept, please refer to fig. 9, which shows a schematic block diagram of the structure of an electronic device 400 for executing the bullet screen display method according to an embodiment of the present application. The electronic device 400 may be the live viewing terminal 200 shown in fig. 1, or, when the anchor of the live providing terminal 300 acts as a viewer, the live providing terminal 300 shown in fig. 1. As shown in fig. 9, the electronic device 400 may include a bullet screen display device 410, a machine-readable storage medium 420, and a processor 430.
In this embodiment, the machine-readable storage medium 420 and the processor 430 are both located in the electronic device 400 and are separately located. However, it should be understood that the machine-readable storage medium 420 may also be separate from the electronic device 400 and accessible by the processor 430 through a bus interface. Alternatively, the machine-readable storage medium 420 may be integrated into the processor 430, e.g., may be a cache and/or general registers.
The processor 430 is a control center of the electronic device 400, connects various parts of the entire electronic device 400 using various interfaces and lines, performs various functions of the electronic device 400 and processes data by operating or executing software programs and/or modules stored in the machine-readable storage medium 420 and calling data stored in the machine-readable storage medium 420, thereby performing overall monitoring of the electronic device 400. Alternatively, processor 430 may include one or more processing cores; for example, processor 430 may integrate an application processor that handles primarily the operating system, user interface, applications, etc., and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
Among other things, processor 430 may include one or more processing cores (e.g., a single-core or multi-core processor). Merely by way of example, a processor may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction Set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The machine-readable storage medium 420 may be mass storage, removable storage, volatile read-write memory, or Read-Only Memory (ROM), among others, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable memory may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM); the RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-Based Random Access Memory (T-RAM), Zero-Capacitor RAM (Z-RAM), and the like. By way of example, ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like. The machine-readable storage medium 420 may be self-contained and coupled to the processor 430 via a communication bus, or may be integrated with the processor. The machine-readable storage medium 420 is used for storing machine-executable instructions for performing aspects of the present application. The processor 430 is configured to execute the machine-executable instructions stored in the machine-readable storage medium 420 to implement the bullet screen display method provided by the foregoing method embodiments.
The bullet screen display device 410 may include various functional modules (e.g., the model rendering module 411 and the bullet screen rendering module 412) described in fig. 8, and may be stored in the machine-readable storage medium 420 in the form of software program codes, and the processor 430 may execute the various functional modules of the bullet screen display device 410 to implement the bullet screen display method provided by the foregoing method embodiment.
Since the electronic device 400 provided in the embodiment of the present application is another implementation form of the method embodiment executed by the electronic device 400, and the electronic device 400 can be used to execute the bullet screen display method provided in the method embodiment, the technical effect obtained by the electronic device 400 can refer to the method embodiment, and is not described herein again.
Further, an embodiment of the present application also provides a readable storage medium containing computer-executable instructions, where the computer-executable instructions can be used to implement the bullet screen display method provided by the foregoing method embodiment when executed.
Of course, the storage medium provided in the embodiments of the present application and containing the computer-executable instructions is not limited to the above method operations, and may also perform related operations in the bullet screen display method provided in any embodiments of the present application.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The above description is only for various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and all such changes or substitutions are included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A barrage display method is applied to a live viewing terminal, and comprises the following steps:
rendering the received live stream to a target model object in an Augmented Reality (AR) recognition plane so that the live stream is displayed on the target model object;
and rendering the barrage data corresponding to the live stream into the AR identification plane so that the barrage data moves in the AR identification plane.
2. The bullet screen display method according to claim 1, wherein the step of rendering the received live stream to a target model object in an Augmented Reality (AR) identification plane so that the live stream is displayed on the target model object comprises:
creating a tracing point on a preset point of the AR identification plane, wherein the tracing point is used for fixing the target model object to the preset point;
creating a corresponding display node at the position of the tracing point, and creating a first child node, wherein the first child node is used for adjusting and displaying the target model object, and the display node is the parent node of the first child node;
and creating a second child node inheriting from the first child node, so that upon detection of a request to add a bone adjustment node, the bone adjustment node is replaced with the second child node, wherein the bone adjustment node is used to adjust a bone point of the target model object.
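For illustration only, the node hierarchy of claim 2 can be sketched in Kotlin as follows. The `Node` class and every identifier in it are hypothetical stand-ins, not the patent's implementation or the API of any particular AR SDK.

```kotlin
// Minimal hypothetical scene-graph node; not tied to any AR SDK.
open class Node(val name: String) {
    val children = mutableListOf<Node>()
    var parent: Node? = null
        set(value) {
            field?.children?.remove(this)
            field = value
            value?.children?.add(this)
        }
}

fun buildHierarchy(): Node {
    // Display node created at the tracing point, which fixes the target
    // model object to the preset point on the AR identification plane.
    val displayNode = Node("display")
    // First child node: adjusts and displays the target model object.
    val firstChild = Node("modelAdjust").apply { parent = displayNode }
    // Second child node inherits from the first child and stands in for
    // the bone adjustment node when a request to add one is detected.
    Node("boneAdjustStandIn").apply { parent = firstChild }
    return displayNode
}
```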
3. The bullet screen display method according to claim 2, wherein the step of rendering the bullet screen data corresponding to the live stream into the AR identification plane so that the bullet screen data moves in the AR identification plane comprises:
acquiring bullet screen data corresponding to the live stream from a live server, and adding the bullet screen data to a bullet screen queue;
initializing node information of a preset number of bullet screen nodes, wherein the parent node of each bullet screen node is the second child node, and each bullet screen node is used for displaying one bullet screen;
and extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least some of the preset number of bullet screen nodes, so that the bullet screen data moves in the AR identification plane.
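Reusing the hypothetical `Node` above, the claim-3 flow reduces to a queue feeding a fixed pool of barrage nodes whose common parent is the second child node; `Barrage` and the pool size are assumptions for illustration, not values from the patent.

```kotlin
data class Barrage(val text: String, val timestampMs: Long)

class BarragePipeline(secondChild: Node, presetCount: Int = 16) {
    // Barrage data acquired from the live server waits here.
    val queue = ArrayDeque<Barrage>()
    // Preset number of barrage nodes, each displaying one bullet screen;
    // the second child node is the parent node of every barrage node.
    val pool = List(presetCount) { Node("barrage-$it").apply { parent = secondChild } }

    fun enqueue(batch: List<Barrage>) = queue.addAll(batch)

    // Drain queued barrages through at least part of the node pool.
    fun drain(): List<Pair<Node, Barrage>> =
        pool.mapNotNull { node -> queue.removeFirstOrNull()?.let { node to it } }
}
```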
4. The bullet screen display method according to claim 3, wherein the step of adding the bullet screen data to the bullet screen queue comprises:
judging whether the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data;
if the queue length of the bullet screen queue is not greater than the number of bullet screens in the bullet screen data, adding the bullet screen data to the bullet screen queue;
if the queue length of the bullet screen queue is greater than the number of bullet screens in the bullet screen data, extending the bullet screen queue by a preset length each time this condition holds, and then continuing to add the bullet screen data to the bullet screen queue;
and if the extended queue length of the bullet screen queue is greater than a preset threshold, discarding a set number of bullet screens from the bullet screen queue in order of bullet screen time from earliest to latest.
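For illustration, the queue bookkeeping of claim 4 might look like the following, reusing `Barrage` from the sketch above. This sketch reads the claim's comparison as a capacity check (the queue is extended whenever the incoming barrages would not fit); `capacity`, `extendBy`, `maxLength`, and `discardCount` are hypothetical parameters.

```kotlin
class BoundedBarrageQueue(
    private var capacity: Int,        // current queue length
    private val extendBy: Int,        // preset extension length
    private val maxLength: Int,       // preset threshold
    private val discardCount: Int     // set number of barrages to discard
) {
    // Kept in arrival order, so the earliest barrage time is at the front.
    private val queue = ArrayDeque<Barrage>()

    fun add(batch: List<Barrage>) {
        // Extend the queue by the preset length each time the batch exceeds it.
        while (queue.size + batch.size > capacity) capacity += extendBy
        queue.addAll(batch)
        // Once the extended length passes the preset threshold, discard a set
        // number of barrages in order of barrage time, earliest first.
        if (capacity > maxLength) {
            repeat(minOf(discardCount, queue.size)) { queue.removeFirst() }
        }
    }
}
```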
5. The bullet screen display method according to claim 3, wherein the step of initializing node information of the preset number of bullet screen nodes comprises:
configuring the preset number of bullet screen nodes with the second child node as the parent node;
and respectively configuring display information of each bullet screen node in the AR identification plane.
6. The bullet screen display method according to claim 5, wherein the AR identification plane includes an X axis, a Y axis and a Z axis with the second child node as the coordinate center, and the step of respectively configuring the display information of each bullet screen node in the AR identification plane comprises:
configuring the world coordinates of each bullet screen node in the AR identification plane at different offset displacement points along the Y axis and the Z axis, so that the bullet screen nodes are spaced apart along the Y axis and the Z axis;
and setting the position on the X axis offset from the parent node by a preset unit displacement in a first direction as the world coordinate at which each bullet screen node starts displaying, and the position offset from the parent node by a preset unit displacement in a second direction as the world coordinate at which each bullet screen node finishes displaying.
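Claim 6's layout can be sketched as below, assuming the Y/Z offsets stagger the barrage lanes and the start/end positions are symmetric preset displacements from the parent node along the X axis; `Vector3`, `Lane`, `laneSpacing`, and `travel` are hypothetical names.

```kotlin
data class Vector3(val x: Float, val y: Float, val z: Float)

data class Lane(val start: Vector3, val end: Vector3)

// World coordinates relative to the second child node as the coordinate
// center: nodes are spaced apart along the Y and Z axes, and each barrage
// travels along the X axis from a preset displacement in the first
// direction to the same displacement in the second direction.
fun layoutLanes(count: Int, laneSpacing: Float, travel: Float): List<Lane> =
    (0 until count).map { i ->
        val y = i * laneSpacing          // offset along the Y axis
        val z = i * laneSpacing * 0.5f   // illustrative offset along the Z axis
        Lane(
            start = Vector3(travel, y, z),   // world coordinate where display starts
            end = Vector3(-travel, y, z)     // world coordinate where display finishes
        )
    }
```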
7. The bullet screen display method according to claim 3, wherein before the step of extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least some of the preset number of bullet screen nodes, the method further comprises:
configuring the preset number of bullet screen nodes to an inoperable state;
and the step of extracting the bullet screen data from the bullet screen queue to render the bullet screen data into the AR identification plane through at least some of the preset number of bullet screen nodes, so that the bullet screen data moves in the AR identification plane, comprises:
extracting bullet screen data from the bullet screen queue, and extracting at least some bullet screen nodes from the preset number of bullet screen nodes according to the number of bullet screens in the bullet screen data;
after the extracted bullet screen nodes are adjusted from the inoperable state to an operable state, loading a character string display component corresponding to each target bullet screen node among the extracted bullet screen nodes;
rendering the bullet screen data into the AR identification plane through the character string display component corresponding to each target bullet screen node;
adjusting the change in world coordinates, in the AR identification plane, of the bullet screen corresponding to each target bullet screen node according to the node information of that target bullet screen node, so that the bullet screen data moves in the AR identification plane;
and when the display of any bullet screen is finished, reconfiguring the target bullet screen node corresponding to that bullet screen to the inoperable state.
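Finally, the claim-7 node lifecycle under the same assumptions: nodes start inoperable, become operable when they pick up a barrage, move it across the plane, and return to the inoperable state once display finishes. The `render` function is a stand-in for the character string display component, and the sketch reuses the types defined above.

```kotlin
class BarrageNode(name: String, val lane: Lane) : Node(name) {
    var operable = false        // preset nodes start in the inoperable state
    var position = lane.start
}

// Assign queued barrages to inoperable nodes and switch those nodes to operable.
fun show(pool: List<BarrageNode>, queue: ArrayDeque<Barrage>) {
    for (node in pool) {
        if (node.operable) continue
        val barrage = queue.removeFirstOrNull() ?: return
        node.operable = true
        node.position = node.lane.start
        render(node, barrage)   // stand-in for loading the string display component
    }
}

fun render(node: BarrageNode, barrage: Barrage) =
    println("${node.name} displays '${barrage.text}' at ${node.position}")

// Per frame: adjust each operable node's world coordinates toward its lane
// end; when a barrage finishes displaying, reconfigure its node back to the
// inoperable state so the node can be reused.
fun step(pool: List<BarrageNode>, dx: Float) {
    for (node in pool) {
        if (!node.operable) continue
        node.position = node.position.copy(x = node.position.x - dx)
        if (node.position.x <= node.lane.end.x) node.operable = false
    }
}
```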
8. A bullet screen display device, applied to a live viewing terminal, the device comprising:
a model rendering module, configured to render the received live stream to a target model object in an Augmented Reality (AR) identification plane, so that the live stream is displayed on the target model object;
and a barrage rendering module, configured to render the barrage data corresponding to the live stream into the AR identification plane, so that the barrage data moves in the AR identification plane.
9. An electronic device, comprising a machine-readable storage medium having stored thereon machine-executable instructions and a processor, wherein the processor, when executing the machine-executable instructions, implements the bullet screen display method of any one of claims 1-7.
10. A readable storage medium having stored therein machine executable instructions which, when executed, implement the bullet screen display method of any one of claims 1 to 7.
CN201911080076.9A 2019-11-07 2019-11-07 Barrage display method and device, electronic equipment and readable storage medium Pending CN110719493A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201911080076.9A CN110719493A (en) 2019-11-07 2019-11-07 Barrage display method and device, electronic equipment and readable storage medium
PCT/CN2020/127052 WO2021088973A1 (en) 2019-11-07 2020-11-06 Live stream display method and apparatus, electronic device, and readable storage medium
US17/630,187 US20220279234A1 (en) 2019-11-07 2020-11-06 Live stream display method and apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911080076.9A CN110719493A (en) 2019-11-07 2019-11-07 Barrage display method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN110719493A (en) 2020-01-21

Family

ID=69214768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911080076.9A Pending CN110719493A (en) 2019-11-07 2019-11-07 Barrage display method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN110719493A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021088973A1 (en) * 2019-11-07 2021-05-14 广州虎牙科技有限公司 Live stream display method and apparatus, electronic device, and readable storage medium
CN113542846A (en) * 2020-04-21 2021-10-22 上海哔哩哔哩科技有限公司 AR barrage display method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011112368A2 (en) * 2010-03-10 2011-09-15 Empire Technology Development Llc Robust object recognition by dynamic modeling in augmented reality
CN106792072A (en) * 2016-12-28 2017-05-31 苏州商信宝信息科技有限公司 The method and its system of barrage and video are shown under a kind of separating medium
CN106792096A (en) * 2016-12-28 2017-05-31 苏州商信宝信息科技有限公司 A kind of augmented reality method and its system based on barrage
CN108347657A (en) * 2018-03-07 2018-07-31 北京奇艺世纪科技有限公司 A kind of method and apparatus of display barrage information
CN108421240A (en) * 2018-03-31 2018-08-21 成都云门金兰科技有限公司 Court barrage system based on AR

Similar Documents

Publication Publication Date Title
US10499035B2 (en) Method and system of displaying a popping-screen
CN108010112B (en) Animation processing method, device and storage medium
CN110784733B (en) Live broadcast data processing method and device, electronic equipment and readable storage medium
CN113457160B (en) Data processing method, device, electronic equipment and computer readable storage medium
CN113286159B (en) Page display method, device and equipment of application program
CN109045694B (en) Virtual scene display method, device, terminal and storage medium
CN111880877B (en) Animation switching method, device, equipment and storage medium
CN109788212A (en) A kind of processing method of segmenting video, device, terminal and storage medium
WO2018166470A1 (en) Animation display method based on frame rate and terminal device
WO2023226814A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN110719493A (en) Barrage display method and device, electronic equipment and readable storage medium
CN113705520A (en) Motion capture method and device and server
CN111131910B (en) Bullet screen implementation method and device, electronic equipment and readable storage medium
CN110856005A (en) Live stream display method and device, electronic equipment and readable storage medium
CN110102057B (en) Connecting method, device, equipment and medium for cut-scene animations
CN111756952A (en) Preview method, device, equipment and storage medium of effect application
WO2024131577A1 (en) Method and apparatus for creating special effect, and device and medium
US10328336B1 (en) Concurrent game functionality and video content
US20220279234A1 (en) Live stream display method and apparatus, electronic device, and readable storage medium
CN111127607A (en) Animation generation method, device, equipment and medium
CN115311397A (en) Method, apparatus, device and storage medium for image rendering
CN113559503A (en) Video generation method, device and computer readable medium
CN111475240B (en) Data processing method and system
CN113975802A (en) Game control method, device, storage medium and electronic equipment
CN113318441A (en) Game scene display control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200121