CN111541928A - Live broadcast display method, device, equipment and storage medium - Google Patents

Info

Publication number
CN111541928A (application CN202010311523.3A; granted publication CN111541928B)
Authority
CN
China
Prior art keywords
live, virtual, live broadcast, user, displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010311523.3A
Other languages
Chinese (zh)
Other versions
CN111541928B (en)
Inventor
杨宇晓
吕婧
徐伟裕
江全海
邓小明
谢唐瑜
朱武辉
张京燕
骆绍豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202010311523.3A
Publication of CN111541928A
Application granted
Publication of CN111541928B
Legal status: Active
Anticipated expiration: listed

Classifications

All classifications fall under H ELECTRICITY • H04 ELECTRIC COMMUNICATION TECHNIQUE • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]:

    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/2187 Live feed (source of audio or video content)
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/472 End-user interface for requesting content, additional data or services; end-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application, communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a live broadcast display method, apparatus, device, and storage medium, belonging to the field of computer technology. The method comprises: displaying a user interface of a live client, the user interface including a live picture of an anchor user; displaying a virtual mouth superimposed on a target position of the live picture; and, in response to receiving a feeding operation, displaying a flying virtual food superimposed on the live picture, the flight direction of the virtual food pointing toward the virtual mouth. The application provides a new human-computer interaction mode for live broadcast display.

Description

Live broadcast display method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a live broadcast display method, apparatus, device, and storage medium.
Background
In a network live broadcast platform, an anchor client forwards a captured live stream to multiple viewer clients through a server. Each viewer client receives the live stream and displays the anchor user's live picture from it.
When a viewer watches an anchor user's live broadcast, the viewer client's user interface displays not only the live picture but also at least one of a bullet-screen control, a like control, and a gift control. With these, the viewer can post bullet-screen comments on the live picture, send likes to the anchor user, or give the anchor user virtual gifts.
Because the viewer and the anchor user are geographically separated, common human-computer interaction is limited to these three forms, which constrains the interaction between viewers and anchor users during a network live broadcast.
Disclosure of Invention
The application provides a live broadcast display method, apparatus, device, and storage medium that can enrich the forms of human-computer interaction between viewer users and anchor users during a network live broadcast. The technical scheme is as follows:
In one aspect, a live broadcast display method is provided, the method comprising:
displaying a user interface of a live client, the user interface including a live picture of an anchor user;
displaying a virtual mouth superimposed on a target position of the live picture;
in response to receiving a feeding operation, displaying a flying virtual food superimposed on the live picture, the flight direction of the virtual food pointing toward the virtual mouth.
In another aspect, a live broadcast display apparatus is provided, the apparatus comprising:
a display module, configured to display a user interface of a live client, the user interface including a live picture of an anchor user;
the display module being further configured to display a virtual mouth superimposed on a target position of the live picture;
the display module being further configured to, in response to an interaction module receiving a feeding operation, display a flying virtual food superimposed on the live picture, the flight direction of the virtual food pointing toward the virtual mouth.
Optionally, a feeding button is further displayed on the user interface, and the feeding operation is a touch operation on the feeding button;
the display module is configured to, in response to receiving the touch operation, display the flying virtual food superimposed on the live picture, the flight starting point of the virtual food being the feeding button and its flight end point being the virtual mouth.
Optionally, the display module is configured to obtain flight information of the virtual food from a server, the flight information including a first coordinate of the feeding button in the live picture and a second coordinate of the virtual mouth in the live picture; to display the flying virtual food superimposed on the live picture with the first coordinate as its flight starting point and the second coordinate as its flight end point; and to set the transparency of the virtual food to a target value in response to the virtual food reaching the flight end point.
Optionally, the feeding operation is a continuous click operation on the feeding button, and the number of virtual foods is positively correlated with the number of clicks of the continuous click operation.
Optionally, the display module is configured to identify the picture position of the human face's mouth in the live picture, and to display the virtual mouth superimposed at that picture position.
Optionally, the apparatus further comprises:
a receiving module, configured to receive a gift-giving operation that sends a feeding gift to the anchor user;
the display module being further configured to, in response to the gift-giving operation, display the feeding button superimposed on the live picture and perform the step of displaying the virtual mouth superimposed on the target position of the live picture.
Optionally, the display module is further configured to, in response to the gift-giving operation, display a countdown animation superimposed on the live picture, and, when the countdown of the countdown animation ends, switch the countdown animation to the feeding button on the live picture.
Optionally, the apparatus further comprises:
a first determining module, configured to determine a user score according to the number of clicks of the continuous click operation on the feeding button;
a second determining module, configured to determine ranking information according to the user score, the ranking information including a live-room ranking and/or a platform ranking.
In yet another aspect, a computer device is provided, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the live broadcast display method of the above aspect.
In yet another aspect, a computer storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the live broadcast display method of the above aspect.
The beneficial effects of the technical scheme provided by the application include at least the following:
A user interface of a live client is displayed, the user interface including a live picture of an anchor user; a virtual mouth is then displayed superimposed on a target position of the live picture. When a feeding operation is received, a flying virtual food is displayed superimposed on the live picture, its flight direction pointing toward the virtual mouth. This simulates the intimacy of a viewer feeding the anchor user at close range, breaks the geographic separation between viewer and anchor user to create a new human-computer interaction atmosphere, and provides a new human-computer interaction mode for live broadcast display.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The following drawings show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic interface diagram of an "eating challenge" game provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a live broadcast system provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a live broadcast display method according to an embodiment of the present application;
FIG. 4 is a schematic view of a live interface provided in an embodiment of the present application;
fig. 5 is a schematic view of an interface for delivering a gift in a live interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an interface for displaying a flight animation of a virtual food provided by an embodiment of the present application;
fig. 7 is a schematic flowchart of another live display method provided in an embodiment of the present application;
fig. 8 is a schematic interface diagram illustrating displaying gift-sending operation information in a live interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an interface for displaying a countdown animation according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a live display device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another live display apparatus provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of another live display apparatus provided in the embodiment of the present application;
fig. 13 is a schematic structural diagram of a terminal according to an embodiment of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
FIG. 1 is a schematic interface diagram of the "eating challenge" game provided by an embodiment of the present application. As shown in FIG. 1, the live interface 101 of the live client includes a live picture of the anchor user. When a viewer at a live client presents an "eating challenge" gift 102 to an anchor user, the live client displays a feeding button 103 in the live interface and a virtual mouth 104 at the anchor user's mouth.
According to the number of the viewer's clicks on the feeding button 103, the live client displays an animation of virtual food 105 flying from the feeding button to the virtual mouth 104. For example, each click of the feeding button triggers one animation of virtual food flying from the button to the virtual mouth, and the virtual food disappears completely when it reaches the virtual mouth.
According to the number of the viewer's clicks on the feeding button 103, the server generates the viewer's user score, and the live client displays user score information 106.
The server ranks the user scores of all viewers on the live client, and the live client displays the ranking results. The ranking results include a ranking 107 of the user scores of all viewers on the live client and a ranking 108 of the user scores of the viewers in each individual live room. The server generates reward information from rankings 107 and 108, and according to that information the live client issues virtual-currency rewards to the viewers and to the anchor users who received their gifts.
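As an illustrative sketch of the ranking step (the function name is hypothetical; the patent does not specify tie-breaking or the reward formula), user scores can be ordered into a ranking like this:

```python
def rank_scores(scores: dict) -> list:
    """Rank viewers by user score, highest first.

    `scores` maps viewer id -> user score (e.g. derived from click count).
    The same helper can serve both the platform-wide ranking and a
    per-live-room ranking by passing the corresponding score set.
    Returns (rank, viewer_id, score) triples.
    """
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(i + 1, vid, s) for i, (vid, s) in enumerate(ordered)]
```

A per-room ranking would simply call `rank_scores` on the subset of scores belonging to one live room.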
While watching the live broadcast, a viewer sends the "eating challenge" gift to the anchor user, which starts the "eating challenge" game in the viewer's live client. In the game, the viewer can "simulate feeding" the anchor user by clicking the corresponding button, creating a playful atmosphere similar to feeding someone at close range. The viewers' user scores are then displayed and ranked, and rewards are issued to the viewers as well as the anchor user. This provides a new human-computer interaction mode for live broadcast display and makes the interaction between viewers and anchor users more engaging.
Fig. 2 is a schematic structural diagram of a live broadcast system provided in an embodiment of the present application. As shown in fig. 2, the system may include: a live server 210, an anchor terminal 220 of a live room, and at least one viewer terminal 230 of the live room.
The live server 210 may be a single server, a server cluster composed of multiple servers, a cloud computing service center, or the like, which is not limited herein. The anchor terminal 220 may be a terminal device with a camera, such as a smart phone, tablet computer, desktop computer, or notebook computer. The viewer terminal 230 may be a smart phone, computer, television, multimedia player, e-reader, etc. The live server 210 may connect to the anchor terminal 220, and to the viewer terminal 230, through a wired or wireless network. As shown in fig. 2, in this embodiment of the present application the anchor terminal 220 is a desktop computer and the viewer terminal 230 is a smart phone.
It should be noted that a live client is installed on the viewer terminal 230, which connects to the live server 210 through that client; the live server 210 is the server corresponding to the live client. Likewise, a live client is installed on the anchor terminal 220, which connects to the live server 210 through it. The live client on the viewer terminal and the live client on the anchor terminal may be the same or different.
Fig. 3 is a schematic flow chart of a live broadcast display method according to an embodiment of the present application. The method may be used for a live client on a viewer terminal 230 as shown in fig. 2. As shown in fig. 3, the method includes:
step 301, displaying a user interface of the live client, where the user interface includes a live frame of the anchor user.
The user interface displayed by the live client may be a live interface that includes the anchor user's live picture. Optionally, the live picture is produced as follows: the anchor terminal captures a live stream, the live client on the anchor terminal sends the stream to the server, the server forwards it to the viewer's live client, and the viewer's live client decodes the stream into the displayed live picture.
Exemplarily, fig. 4 is a schematic view of a live interface provided in an embodiment of the present application. As shown in fig. 4, the live interface 401 may include a live picture 402 of the anchor user, a send-bullet-screen button 403, and bullet-screen information 404. A send-gift button 405 may also be included in the live interface.
Step 302, displaying a virtual mouth superimposed on the target position of the live picture.
The target position may be the position of the anchor user's mouth in the live picture, the center of the live picture, or any specified position in the live picture. For example, when the anchor user's face and mouth appear in the live picture, the live client may display the virtual mouth superimposed at the mouth position; when they do not, the live client may display the virtual mouth at the center of the live picture. The virtual mouth may change position as the anchor user moves.
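The fallback described above (mouth position when a face is detected, picture center otherwise) can be sketched as a small helper; the function name and coordinate convention are illustrative assumptions, not from the patent:

```python
from typing import Optional, Tuple

def virtual_mouth_position(
    mouth_coords: Optional[Tuple[float, float]],
    frame_size: Tuple[int, int],
) -> Tuple[float, float]:
    """Return where to overlay the virtual mouth.

    Uses the detected face-mouth coordinates when available,
    otherwise falls back to the center of the live picture.
    """
    if mouth_coords is not None:
        return mouth_coords
    w, h = frame_size
    return (w / 2, h / 2)
```

Re-evaluating this every frame lets the overlay follow the anchor user as they move.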
The virtual mouth superimposed on the target position of the live picture may be static or animated; for example, it may be an animation that simulates a mouth opening and closing. Optionally, the server selects, from a set of candidate virtual mouths, the one that best matches the anchor user's facial features, and the live client displays it. For example, when the anchor user is female, the live client may display a cute virtual mouth; when the anchor user is male, an ordinary one.
Optionally, the live client determines the coordinates of the anchor user's mouth in the live picture in real time through face recognition and displays the virtual mouth at those coordinates; alternatively, the server determines the coordinates in real time and sends them to the live client, which displays the virtual mouth accordingly. The mouth position may be determined by feature comparison: first determining at least one of the contour size, position, and distance attributes of the mouth, then determining its geometric feature information, and comparing that information against the attribute(s) to locate the mouth. The mouth position may also be determined by a face template method: several face templates are stored in advance, and the anchor user's live picture is compared against them to locate the anchor user's mouth. Neither is limited herein.
The live client may display the virtual mouth superimposed on the target position of the live picture when the viewer presents a feeding gift to the anchor user. The live client can recognize the feeding gift through a feeding-gift identifier, which indicates that the step of displaying the virtual mouth should begin when a gift carrying that identifier is presented. The live client may also display a start-feeding button in the live interface; when the viewer clicks it, the live client displays the virtual mouth superimposed on the target position of the live picture. Clicking the start-feeding button instructs the client to begin the step of displaying the virtual mouth.
By way of example, fig. 5 is a schematic interface diagram of giving a gift in a live interface provided in an embodiment of the present application. As shown in fig. 5, the upper half of the interface is the anchor user's live video, and the lower half is a gift area, which may contain one or more gifts, for example a feeding gift 501 and other gifts 503. A feeding-gift label 502 may be displayed next to the feeding gift.
Step 303, in response to receiving the feeding operation, displaying the flying virtual food superimposed on the live picture, the flight direction of the virtual food pointing toward the virtual mouth.
The feeding operation instructs the live client to begin displaying flying virtual food superimposed on the live picture. Optionally, the feeding operation is deemed received when the terminal detects a click on the feeding button corresponding to the feeding operation, when the terminal recognizes a corresponding voice instruction, and/or when the terminal detects a slide operation in a specified area of the live interface. For example, the voice instruction corresponding to the feeding operation may be "feed" or "feed him/her food", etc.
The live client may determine the number of flying virtual foods to display according to the number of clicks on the feeding button corresponding to the feeding operation. Considering the flight time of the virtual food, when the click count is high, at most 3 virtual foods may be displayed per 0.5 seconds.
For example, referring to fig. 1, when the terminal detects a click operation on the feeding button 103, it is determined that the feeding operation is received.
Optionally, the virtual food is any random virtual item to be fed into the virtual mouth, such as a cake, an apple, an ice cream, or a lipstick. The flight direction of the virtual food points toward the virtual mouth; its flight starting point may be the position of the feeding button corresponding to the feeding operation, and its flight end point may be the position of the virtual mouth. The virtual food may fly from the starting point to the end point in a straight line or along a thrown arc. Its speed during flight may be constant or variable; when variable, the live client may cap the flight duration. The flight duration may be determined by the live client as the distance between the flight starting point and end point divided by a set flight speed. Illustratively, with a distance of 3 cm between the starting point and end point and a set speed of 1.5 cm/s, the flight duration is 2 seconds. The display size may shrink gradually during flight: for example, 100% at the starting point and 50% at the end point. When the virtual food reaches the end point, its transparency may change from 1 to 0, i.e. the virtual food disappears completely.
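A minimal sketch of the flight parameters described above, assuming straight-line flight at a constant set speed, with the display size shrinking from 100% to 50% and the food turning fully transparent on arrival (names and coordinate units are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class FoodFrame:
    x: float
    y: float
    scale: float   # display size, 1.0 -> 100%
    alpha: float   # transparency, 1.0 -> fully opaque

def flight_duration(start, end, speed: float) -> float:
    """Duration = straight-line distance / set flight speed, as in the text."""
    dist = ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5
    return dist / speed

def food_at(t: float, duration: float, start, end) -> FoodFrame:
    """State of one virtual food t seconds into its flight.

    Position is linearly interpolated from start to end; size shrinks
    from 100% to 50%; alpha drops to 0 when the food reaches the end.
    """
    p = min(max(t / duration, 0.0), 1.0)
    x = start[0] + (end[0] - start[0]) * p
    y = start[1] + (end[1] - start[1]) * p
    scale = 1.0 - 0.5 * p            # 100% at the starting point, 50% at the end
    alpha = 1.0 if p < 1.0 else 0.0  # disappears completely on arrival
    return FoodFrame(x, y, scale, alpha)
```

With the example figures from the text, `flight_duration((0, 0), (3, 0), 1.5)` gives the stated 2-second flight.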
Illustratively, fig. 6 is a schematic interface diagram of the flight animation of the virtual food provided by an embodiment of the present application. The interface includes the anchor user's live picture. The virtual food 601 displayed at a first flight moment has a display size of 90%, and the virtual food 602 displayed at a second flight moment has a display size of 60%, the second flight moment being later than the first.
In summary, the live broadcast display method provided by the embodiment of the present application displays a user interface of a live client that includes the anchor user's live picture, and then displays a virtual mouth superimposed on a target position of the live picture. When a feeding operation is received, flying virtual food is displayed superimposed on the live picture, its flight direction pointing toward the virtual mouth. This simulates the intimacy of a viewer feeding the anchor user at close range and breaks the geographic separation between viewer and anchor user to create a new human-computer interaction atmosphere.
Fig. 7 is a flowchart illustrating another live display method according to an embodiment of the present application. The method may be used for a live client on a viewer terminal 230 as shown in fig. 2. As shown in fig. 7, the method includes:
and 701, displaying a user interface of the live client, wherein the user interface comprises a live broadcast picture of a main broadcast user.
For an example of the live client's user interface, refer to fig. 4; it is not described again here.
Step 702, receiving a gift-giving operation that sends a feeding gift to the anchor user.
When the viewer gives a feeding gift to the anchor user, the live client may display gift-giving operation information for the feeding gift in the live interface.
Fig. 8 is a schematic interface diagram of displaying gift-giving operation information in a live interface according to an embodiment of the present application. As shown in fig. 8, a live picture is displayed in the interface, and the viewer's gift-giving operation information 801 is displayed superimposed on the live picture to prompt the viewer that the gift-giving operation has been completed.
The live client may receive the gift-giving operation after the viewer selects the feeding gift and completes the virtual-currency payment for it. Alternatively, when the anchor user confirms acceptance of the feeding gift sent by the viewer and the server sends this confirmation to the live client, the live client receives the gift-giving operation that sends the feeding gift to the anchor user.
Step 703, in response to the gift-giving operation, displaying a feeding button superimposed on the live picture and displaying a virtual mouth superimposed on the target position of the live picture.
The target position may be the position of the anchor user's mouth in the live broadcast picture as recognized by the live broadcast client, the center of the live broadcast picture, or any specified position in the live broadcast picture. The live broadcast client may position the feeding button according to the position of the virtual mouth. For example, the feeding button may be displayed directly below the virtual mouth at a distance of 2 cm, and the displayed position of the feeding button may change in real time as the position of the virtual mouth changes.
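The button placement described above can be sketched as follows. This is only an illustrative snippet; the `dpi` value, the function name, and the coordinate convention (y growing downward, positions in pixels) are assumptions, not part of the embodiment.

```python
def feeding_button_position(mouth_xy, dpi=160, offset_cm=2.0):
    """Place the feeding button directly below the virtual mouth.

    mouth_xy: (x, y) of the virtual mouth in screen pixels, y grows downward.
    The 2 cm gap from the example is converted to pixels via the screen DPI.
    """
    offset_px = offset_cm / 2.54 * dpi  # centimetres -> inches -> pixels
    x, y = mouth_xy
    return (x, y + offset_px)

# Recomputing this each time the recognized mouth position changes keeps
# the button tracking the virtual mouth in real time.
```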
The live broadcast client may also display a countdown animation on the live broadcast picture in an overlapping manner. When the countdown of the countdown animation ends, the countdown animation is switched to the feeding button on the live broadcast picture. The countdown animation may be, for example, a three-second countdown animation.
Illustratively, fig. 9 is a schematic interface diagram of displaying a countdown animation according to an embodiment of the present application. As shown in fig. 9(a), the live broadcast picture of the anchor user is displayed in the live broadcast interface. When the gift-sending operation is received, a countdown animation 901 is displayed, which is the animation at the 3-second mark of the countdown. As shown in fig. 9(b), the countdown animation 902 is the animation at the 1-second mark of the countdown. As shown in fig. 9(c), when the countdown of the countdown animation ends, the countdown animation is switched to the feeding button 903 on the live broadcast picture.
Step 704, in response to receiving a feeding operation, displaying flying virtual food on the live broadcast picture in an overlapping manner.
Optionally, a feeding button is also displayed on the live broadcast interface. The live broadcast client may display the feeding button on the live broadcast interface for a set duration; when the set duration is exceeded, the feeding button disappears and can no longer be clicked. The set duration may be, for example, 6 seconds. The feeding operation may be a touch operation on the feeding button, and receiving the feeding operation means responding to the feeding instruction triggered by that touch operation. For example, the feeding operation may be a continuous click operation on the feeding button. Optionally, the number of flying virtual foods is positively correlated with the number of clicks of the continuous click operation. For example, each click of the feeding button causes the live broadcast client to display one flying virtual food in an overlapping manner, and when the feeding button is clicked more than 3 times within 0.5 seconds, the live broadcast client may display at most three flying virtual foods per 0.5 seconds.
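The click-to-food correlation with the cap from the example above can be sketched as a sliding-window limiter. The window length and cap are taken from the example; the class and method names are hypothetical.

```python
from collections import deque

class FeedClickLimiter:
    """Spawn one flying virtual food per click, capped at `max_per_window`
    foods within any sliding window of `window` seconds."""

    def __init__(self, window=0.5, max_per_window=3):
        self.window = window
        self.max_per_window = max_per_window
        self.spawn_times = deque()  # timestamps of recently spawned foods

    def on_click(self, now):
        """Return True if a flying virtual food should be spawned."""
        # Drop spawn timestamps that have fallen out of the sliding window.
        while self.spawn_times and now - self.spawn_times[0] >= self.window:
            self.spawn_times.popleft()
        if len(self.spawn_times) >= self.max_per_window:
            return False  # cap reached: extra clicks spawn no food
        self.spawn_times.append(now)
        return True
```

Clicks beyond the cap are simply ignored until earlier spawns age out of the window, so rapid tapping never floods the live broadcast picture.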
The flight direction of the virtual food points toward the virtual mouth. The flight starting point of the virtual food may be the position of the feeding button, and the flight end point of the virtual food may be the position of the virtual mouth.
The live broadcast client may obtain flight information of the virtual food from the server. The flight information may include a first coordinate of the feeding button in the live broadcast picture, a second coordinate of the virtual mouth in the live broadcast picture, and a flight time of the virtual food. The live broadcast client displays the flying virtual food on the live broadcast picture in an overlapping manner, with the first coordinate as the flight starting point of the virtual food, the second coordinate as the flight end point, and the flight time as the flight duration.
Optionally, in response to the virtual food flying to the flight end point, the transparency of the virtual food is set to a target value. The target value may be 0; that is, when the virtual food reaches the flight end point, it disappears completely.
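The flight described by the first coordinate, second coordinate, and flight time can be sketched as linear interpolation with the end-point fade. This is a minimal illustration assuming straight-line flight and an alpha convention where 0.0 means fully transparent; the function name is hypothetical.

```python
def food_state(t, start, end, flight_time):
    """Position and transparency of the virtual food at time t (seconds
    since launch), flying linearly from `start` to `end` over `flight_time`.

    start: first coordinate (feeding button); end: second coordinate
    (virtual mouth). Returns ((x, y), alpha).
    """
    frac = min(max(t / flight_time, 0.0), 1.0)  # clamp progress to [0, 1]
    x = start[0] + (end[0] - start[0]) * frac
    y = start[1] + (end[1] - start[1]) * frac
    # On reaching the flight end point, transparency is set to the target
    # value (0 here: the virtual food disappears completely).
    alpha = 0.0 if frac >= 1.0 else 1.0
    return (x, y), alpha
```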
Step 705, determining the user score according to the number of clicks of the continuous click operation on the feeding button.
The live broadcast client may determine the user score according to the interval into which the number of clicks of the continuous click operation falls. Alternatively, the server determines the user score according to those intervals and sends the user score to the live broadcast client. For example:
1. When the number of clicks is 1 or more and 10 or less, the final score is determined to be 99.
2. When the number of clicks is greater than 10 and equal to or less than 30, a final score is randomly generated between 100 and 1000.
3. When the number of clicks is greater than 30 and equal to or less than 50, a final score is randomly generated between 1000 and 3000.
4. When the number of clicks is greater than 50 and equal to or less than 70, a final score is randomly generated between 3000 and 4500.
5. When the number of clicks is greater than 70, a final score is randomly generated between 1500 and 2500.
Illustratively, when the number of clicks is 30, the user score is determined to be 987. When the number of clicks is 60, the user score is determined to be 4120.
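The five intervals above can be sketched directly. Whether the interval endpoints are inclusive follows the "greater than ... and equal to or less than ..." wording; the function name and the injectable `rng` parameter are assumptions for the sketch.

```python
import random

def user_score(clicks, rng=random):
    """Map the click count of the continuous click operation to a user
    score, using the intervals listed in items 1-5 above."""
    if 1 <= clicks <= 10:
        return 99                       # fixed score for 1-10 clicks
    if 10 < clicks <= 30:
        return rng.randint(100, 1000)   # examples: 30 clicks -> 987
    if 30 < clicks <= 50:
        return rng.randint(1000, 3000)
    if 50 < clicks <= 70:
        return rng.randint(3000, 4500)  # examples: 60 clicks -> 4120
    return rng.randint(1500, 2500)      # more than 70 clicks
```

The patent's worked examples (30 clicks yielding 987, 60 clicks yielding 4120) are consistent with these bounds.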
Step 706, displaying the user score.
For an example of an interface displaying the user score, reference may be made to fig. 1; details are not repeated here.
Step 707, obtaining ranking information determined according to the user score.
The ranking information includes a live broadcast room ranking and/or a platform ranking. The live broadcast room ranking refers to the ranking of the user scores of the audience users in any one live broadcast room of the live broadcast client. The platform ranking refers to the ranking of the user scores of all audience users in the live broadcast client. When the ranking information is determined, it may be calculated according to each audience user's highest historical user score. Illustratively, if an audience user obtains two user scores, 987 and 4120, within a certain live broadcast session, then the score 4120 is used for that audience user when the live broadcast room ranking is determined.
The user score may be sent by the live broadcast client to the server, which determines the ranking information and sends it back to the live broadcast client. The period of the live broadcast room ranking may be one day: at 0:00 every day the live broadcast room ranking is cleared, and whenever the server obtains a user score, it updates the live broadcast room ranking and sends it to the live broadcast client. The period of the platform ranking may be one week: at 0:00 every Monday the platform ranking is cleared, and whenever the server obtains a user score, it updates the platform ranking and sends it to the live broadcast client.
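The server-side bookkeeping described above — ranking by each audience user's highest score within a period, cleared at the period boundary — can be sketched as follows. The class and method names are hypothetical; the daily/weekly reset is represented by a `reset` call rather than a real scheduler.

```python
class ScoreRanking:
    """Ranking keyed by each audience user's highest score in the current
    period (cleared daily for the live broadcast room ranking, weekly for
    the platform ranking)."""

    def __init__(self):
        self.best = {}  # user id -> highest user score this period

    def submit(self, user, score):
        # Only an audience user's highest historical score counts.
        if score > self.best.get(user, 0):
            self.best[user] = score

    def top(self, n=3):
        """Return the top-n (user, score) pairs, highest score first."""
        return sorted(self.best.items(), key=lambda kv: -kv[1])[:n]

    def reset(self):
        # Called at 0:00 daily (live room) or 0:00 every Monday (platform).
        self.best.clear()
```

With the example scores 987 and 4120 from the same audience user, only 4120 enters the ranking, matching the behavior described in step 707.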
Step 708, obtaining the reward information determined according to the ranking information.
The reward information may include reward information for the audience users and reward information for the anchor user, which may be used to issue rewards to the audience users and the anchor user respectively. The reward information may be determined by the server according to the ranking information and sent to the live broadcast client. Here, the anchor user refers to the anchor user who received the feeding gift given by an audience user.
The server may determine first user reward information and first anchor reward information according to the live broadcast room ranking, and the live broadcast client may issue rewards, through the first user reward information, to the audience users ranked in the top 3 of the live broadcast room ranking. The server may determine second user reward information and anchor reward information according to the platform ranking, and the live broadcast client may issue rewards, through the second user reward information and the anchor reward information, to the audience users ranked in the top 3 of the platform ranking and to the anchor users corresponding to the user scores of those top-3 audience users. The reward may be virtual coins in the live broadcast client.
For example, the first of the top 3 audience users in the live broadcast room ranking may be issued a reward of 10,000 virtual coins, the second a reward of 5,000 virtual coins, and the third a reward of 2,000 virtual coins. A reward of 100,000 virtual coins may be issued to the anchor user corresponding to the user score of the first of the top 3 audience users in the platform ranking, a reward of 50,000 virtual coins to the anchor user corresponding to the second, and a reward of 30,000 virtual coins to the anchor user corresponding to the third.
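The two reward schedules that the example quantifies can be tabulated as follows. The constant and function names are assumptions; the audience-side platform rewards carried by the second user reward information are not itemized in the example, so they are omitted from the sketch.

```python
# Reward amounts (in virtual coins) from the example above.
LIVE_ROOM_REWARDS = {1: 10_000, 2: 5_000, 3: 2_000}           # top-3 audience users
PLATFORM_ANCHOR_REWARDS = {1: 100_000, 2: 50_000, 3: 30_000}  # their anchor users

def reward_for(rank, schedule):
    """Virtual-coin reward for a given rank; 0 outside the top 3."""
    return schedule.get(rank, 0)
```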
It should be noted that the order of the steps of the live broadcast display method provided in the embodiments of the present application may be adjusted appropriately, and steps may be added or removed as required. Any variation readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application, and is therefore not described again.
To sum up, in the live broadcast display method provided by the embodiments of the present application, when the gift-sending operation of giving a feeding gift to the anchor user is received, a feeding button is displayed on the live broadcast picture in an overlapping manner, and a virtual mouth is displayed at the target position of the live broadcast picture in an overlapping manner. When the feeding operation is received, flying virtual food is displayed on the live broadcast picture in an overlapping manner. This simulates the intimacy of an audience user feeding the anchor user at close range, breaking through the geographical separation between audience users and anchor users and creating a new human-computer interaction atmosphere.
In addition, the user score is determined according to the number of clicks of the continuous click operation on the feeding button and is displayed, ranking information is determined according to the user score, and rewards are issued to the audience users and the anchor users based on the ranking information. The embodiments of the present application thereby increase the interest of the human-computer interaction mode.
Fig. 10 is a schematic structural diagram of a live display device according to an embodiment of the present application. The apparatus may be used for a live client on a viewer terminal 230 as shown in fig. 2. As shown in fig. 10, the apparatus 100 includes:
a display module 1001, configured to display a user interface of a live client, where the user interface includes a live frame of a host user.
The display module 1001 is further configured to display a virtual mouth in an overlay manner at the target position of the live view.
The display module 1001 is further configured to, in response to the receiving of the feeding operation by the interaction module 1002, superimpose and display the flying virtual food on the live screen, where a flying direction of the virtual food points to the virtual mouth.
To sum up, the live broadcast display device provided by the embodiments of the present application displays, through the display module, the user interface of the live broadcast client, where the user interface includes the live broadcast picture of the anchor user. The virtual mouth is then displayed at the target position of the live broadcast picture in an overlapping manner through the display module. When the feeding operation is received through the interaction module, the flying virtual food is displayed on the live broadcast picture in an overlapping manner through the display module, with its flight direction pointing toward the virtual mouth. This simulates the intimacy of an audience user feeding the anchor user at close range, breaks through the geographical separation between audience users and anchor users, creates a new human-computer interaction atmosphere, and thus provides a new human-computer interaction mode for live broadcast display.
Optionally, a feeding button is further displayed on the user interface, and the feeding operation is a touch operation on the feeding button. A display module 1001 for:
and responding to the received touch operation, superimposing and displaying flying virtual food on the live broadcast picture, wherein the flying starting point of the virtual food is a feeding button, and the flying end point of the virtual food is a virtual mouth.
Optionally, a display module 1001 configured to:
the method comprises the steps of obtaining flight information of virtual food from a server, wherein the flight information comprises a first coordinate of a feeding button in a live broadcast picture and a second coordinate of a virtual mouth in the live broadcast picture.
Displaying the flying virtual food on the live broadcast picture in an overlapping manner, with the first coordinate as the flight starting point of the virtual food and the second coordinate as the flight end point of the virtual food.
In response to the virtual food flying to the flight end point, the transparency of the virtual food is set to a target value.
Optionally, the feeding operation is a continuous click operation on the feeding button, and the number of virtual foods and the number of clicks of the continuous click operation are positively correlated.
Optionally, a display module 1001 configured to:
and recognizing the picture position of the human face and the mouth in the live broadcast picture.
And displaying the virtual mouth on the picture position in an overlapping way.
Optionally, as shown in fig. 11, the apparatus 100 further includes:
the receiving module 1003 is configured to receive a gift sending operation of sending out a feeding gift to the anchor user.
The display module 1001 is further configured to display a feeding button on the live broadcast picture in an overlapping manner in response to the gift-sending operation, and to perform the step of displaying the virtual mouth at the target position of the live broadcast picture in an overlapping manner.
Optionally, a display module 1001 configured to:
and responding to the gift giving operation, and overlaying and displaying the countdown animation on the live broadcast picture.
And in response to the countdown of the countdown animation ending, switching and displaying the countdown animation as a feeding button on the live broadcast picture.
Optionally, as shown in fig. 12, the apparatus 100 further includes:
the first determining module 1004 is used for determining the user score according to the clicking times of the continuous clicking operation on the feeding button;
a second determining module 1005, configured to determine ranking information according to the user score, where the ranking information includes a live room ranking and/or a platform ranking.
To sum up, the live broadcast display device provided by the embodiments of the present application receives, through the receiving module, the gift-sending operation of giving a feeding gift to the anchor user, and, through the display module, displays the feeding button in an overlapping manner and displays the virtual mouth at the target position of the live broadcast picture in an overlapping manner. When the feeding operation is received through the interaction module, the flying virtual food is displayed on the live broadcast picture in an overlapping manner through the display module. This simulates the intimacy of an audience user feeding the anchor user at close range, breaks through the geographical separation between audience users and anchor users, creates a new human-computer interaction atmosphere, and provides a new human-computer interaction mode for live broadcast display.
In addition, the user score is determined by the first determining module according to the number of clicks of the continuous click operation on the feeding button and is displayed, ranking information is determined by the second determining module according to the user score, and rewards are issued to the audience users and the anchor users based on the ranking information. The embodiments of the present application thereby increase the interest of the human-computer interaction mode.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Embodiments of the present application further provide a computer device, including a processor and a memory. The memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the live broadcast display method provided by the foregoing method embodiments.
The computer device may be a terminal. Exemplarily, fig. 13 is a schematic structural diagram of a terminal provided in an embodiment of the present application.
In general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the live display method provided by method embodiments herein.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, display screen 1305, camera assembly 1306, audio circuitry 1307, positioning assembly 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment of the present application.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1305 is a touch display screen, the display screen 1305 also has the ability to capture touch signals on or over the surface of the display screen 1305. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the display 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1305 may be one, providing the front panel of terminal 1300; in other embodiments, display 1305 may be at least two, either on different surfaces of terminal 1300 or in a folded design; in still other embodiments, display 1305 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1300. Even further, the display 1305 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal 1300 and the rear camera is disposed on the back of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
The positioning component 1308 is used to determine the current geographic position of the terminal 1300 to implement navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1309 is used to provide power to various components in terminal 1300. The power source 1309 may be alternating current, direct current, disposable or rechargeable. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect the body direction and the rotation angle of the terminal 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the user with respect to the terminal 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1313 may be disposed on a side bezel of terminal 1300 and/or underlying touch display 1305. When the pressure sensor 1313 is disposed on the side frame of the terminal 1300, a user's holding signal to the terminal 1300 may be detected, and the processor 1301 performs left-right hand recognition or shortcut operation according to the holding signal acquired by the pressure sensor 1313. When the pressure sensor 1313 is disposed at a lower layer of the touch display screen 1305, the processor 1301 controls an operability control on the UI interface according to a pressure operation of the user on the touch display screen 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user, and the processor 1301 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 1314, or the fingerprint sensor 1314 identifies the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the terminal 1300. When a physical button or vendor Logo is provided on the terminal 1300, the fingerprint sensor 1314 may be integrated with the physical button or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
The proximity sensor 1316, also known as a distance sensor, is typically disposed on the front panel of the terminal 1300. The proximity sensor 1316 is used to collect the distance between the user and the front face of the terminal 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually decreases, the processor 1301 controls the touch display 1305 to switch from the bright screen state to the dark screen state; when the proximity sensor 1316 detects that the distance gradually increases, the processor 1301 controls the touch display 1305 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
The embodiment of the present application further provides a computer storage medium, where at least one instruction, at least one program, a code set, or an instruction set may be stored in the storage medium, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the live broadcast display method provided by the foregoing method embodiments.
It will be understood by those skilled in the art that all or part of the steps of implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and is not intended to limit it. Any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A live display method, characterized in that the method comprises:
displaying a user interface of a live client, wherein the user interface comprises a live frame of a main broadcast user;
displaying a virtual mouth on the target position of the live broadcast picture in an overlapping manner;
in response to receiving a feeding operation, displaying a flying virtual food on the live screen in an overlapping manner, wherein the flying direction of the virtual food points to the virtual mouth.
2. The method according to claim 1, wherein a feeding button is further displayed on the user interface, and the feeding operation is a touch operation on the feeding button;
wherein the displaying, in response to receiving the feeding operation, of the flying virtual food in an overlapping manner on the live broadcast picture with the flying direction of the virtual food pointing to the virtual mouth comprises:
in response to receiving the touch operation, displaying the flying virtual food in an overlapping manner on the live broadcast picture, wherein a flight start point of the virtual food is the feeding button and a flight end point of the virtual food is the virtual mouth.
3. The method of claim 2, wherein the displaying of the flying virtual food in an overlapping manner on the live broadcast picture comprises:
acquiring flight information of the virtual food from a server, wherein the flight information comprises a first coordinate of the feeding button in the live broadcast picture and a second coordinate of the virtual mouth in the live broadcast picture;
displaying the flying virtual food in an overlapping manner on the live broadcast picture with the first coordinate as a flight start point of the virtual food and the second coordinate as a flight end point of the virtual food; and
setting a transparency of the virtual food to a target value in response to the virtual food flying to the flight end point.
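The steps of claim 3 can be sketched as a per-frame interpolation from the first coordinate (the feeding button) to the second coordinate (the virtual mouth), with the transparency set to the target value on arrival. The function name and the target value of 0.0 (fully transparent) are assumptions for illustration:

```python
TARGET_ALPHA = 0.0  # assumed target transparency on arrival

def food_frame(first_coordinate, second_coordinate, progress):
    """Position and opacity of the virtual food at flight progress in [0, 1].

    first_coordinate: flight start point (feeding button), second_coordinate:
    flight end point (virtual mouth), both in live-picture pixels.
    """
    x = first_coordinate[0] + (second_coordinate[0] - first_coordinate[0]) * progress
    y = first_coordinate[1] + (second_coordinate[1] - first_coordinate[1]) * progress
    alpha = TARGET_ALPHA if progress >= 1.0 else 1.0
    return (x, y), alpha
```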
4. The method according to claim 2, wherein the feeding operation is a continuous click operation on the feeding button, and the number of virtual foods is positively correlated with the number of clicks of the continuous click operation.
5. The method according to any one of claims 1 to 4, wherein the displaying of the virtual mouth in an overlapping manner at the target position of the live broadcast picture comprises:
recognizing a picture position of a mouth of a human face in the live broadcast picture; and
displaying the virtual mouth in an overlapping manner at the picture position.
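A sketch of placing the virtual mouth over the recognized picture position: the face/mouth recognition itself (e.g. a landmark model) is outside this sketch, which assumes its output is a mouth bounding box; the function name and the `scale` parameter are illustrative, not from the claim:

```python
def mouth_overlay_rect(mouth_box, scale=1.2):
    """Rectangle (x, y, w, h) where the virtual mouth sticker is drawn,
    given the detected mouth bounding box in live-picture pixels.

    `scale` slightly enlarges the sticker around the mouth's center so it
    fully covers the real mouth.
    """
    x, y, w, h = mouth_box
    cx, cy = x + w / 2, y + h / 2          # center of the detected mouth
    new_w, new_h = w * scale, h * scale    # enlarged sticker size
    return (cx - new_w / 2, cy - new_h / 2, new_w, new_h)
```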
6. The method of any of claims 2 to 4, further comprising:
receiving a gift sending operation of giving a feeding gift to the anchor user; and
in response to the gift sending operation, displaying the feeding button in an overlapping manner on the live broadcast picture, and performing the step of displaying the virtual mouth in an overlapping manner at the target position of the live broadcast picture.
7. The method of any one of claims 2 to 4, wherein the displaying of the feeding button in an overlapping manner on the live broadcast picture in response to the gift sending operation comprises:
in response to the gift sending operation, displaying a countdown animation in an overlapping manner on the live broadcast picture; and
in response to the countdown of the countdown animation ending, switching the countdown animation displayed on the live broadcast picture to the feeding button.
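The countdown-then-button behavior of claim 7 can be sketched as a function of elapsed time since the gift sending operation; the names and the 3-second duration are assumptions for illustration:

```python
COUNTDOWN_S = 3.0  # assumed countdown duration in seconds

def gift_panel_widget(elapsed_s):
    """Widget overlaid on the live broadcast picture after a feeding gift is
    sent: the countdown animation (with seconds remaining) until the countdown
    ends, then the feeding button."""
    if elapsed_s < COUNTDOWN_S:
        return ("countdown", COUNTDOWN_S - elapsed_s)
    return ("feed_button", 0.0)
```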
8. The method of any of claims 2 to 4, further comprising:
determining a user score according to the number of clicks of the continuous click operation on the feeding button; and
determining ranking information according to the user score, wherein the ranking information comprises a live room ranking and/or a platform ranking.
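Claim 8's scoring and ranking can be sketched as follows, assuming the user score is simply the click count; the function name is illustrative, and ties are broken alphabetically only to make the sketch deterministic:

```python
def ranking(scores_by_user):
    """User ids ordered by score, highest first (ties broken by id).

    The same function would be applied to the users of one live room for the
    live room ranking, or to all users for the platform ranking.
    """
    return [user for user, _ in
            sorted(scores_by_user.items(), key=lambda kv: (-kv[1], kv[0]))]
```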
9. A live display apparatus, the apparatus comprising:
the display module is used for displaying a user interface of a live client, wherein the user interface comprises a live broadcast picture of an anchor user;
the display module is further used for displaying a virtual mouth in an overlapping manner at a target position of the live broadcast picture;
the display module is further used for displaying a flying virtual food in an overlapping manner on the live broadcast picture in response to the interaction module receiving a feeding operation, wherein a flying direction of the virtual food points to the virtual mouth.
10. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the live display method of any of claims 1 to 8.
11. A computer storage medium having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by a processor to implement a live display method as claimed in any one of claims 1 to 8.
CN202010311523.3A 2020-04-20 2020-04-20 Live broadcast display method, device, equipment and storage medium Active CN111541928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010311523.3A CN111541928B (en) 2020-04-20 2020-04-20 Live broadcast display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111541928A true CN111541928A (en) 2020-08-14
CN111541928B CN111541928B (en) 2022-11-18

Family

ID=71976854

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010311523.3A Active CN111541928B (en) 2020-04-20 2020-04-20 Live broadcast display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111541928B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107566911A (en) * 2017-09-08 2018-01-09 广州华多网络科技有限公司 A kind of live broadcasting method, device, system and electronic equipment
CN108134964A (en) * 2017-11-22 2018-06-08 上海掌门科技有限公司 Net cast stage property stacking method, computer equipment and storage medium
CN109194973A (en) * 2018-09-26 2019-01-11 广州华多网络科技有限公司 A kind of more main broadcaster's direct broadcasting rooms give the methods of exhibiting, device and equipment of virtual present

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112672175A (en) * 2020-12-11 2021-04-16 北京字跳网络技术有限公司 Live broadcast interaction method and device, electronic equipment and storage medium
WO2023045708A1 (en) * 2021-09-24 2023-03-30 北京字跳网络技术有限公司 Interaction method and apparatus, electronic device, readable storage medium, and program product
JP7284909B1 (en) 2022-11-18 2023-06-01 17Live株式会社 game chip gift
JP2024074213A (en) * 2022-11-18 2024-05-30 17Live株式会社 Game Chip Gift
CN116156268A (en) * 2023-02-20 2023-05-23 北京乐我无限科技有限责任公司 Virtual resource control method and device for live broadcasting room, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111541928B (en) 2022-11-18

Similar Documents

Publication Publication Date Title
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
CN111541928B (en) Live broadcast display method, device, equipment and storage medium
CN109729411B (en) Live broadcast interaction method and device
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN112118477B (en) Virtual gift display method, device, equipment and storage medium
CN111083516B (en) Live broadcast processing method and device
WO2023000677A1 (en) Content item display method and apparatus
CN112181572A (en) Interactive special effect display method and device, terminal and storage medium
CN111355974A (en) Method, apparatus, system, device and storage medium for virtual gift giving processing
CN110533585B (en) Image face changing method, device, system, equipment and storage medium
CN110740340B (en) Video live broadcast method and device and storage medium
CN111246095B (en) Method, device and equipment for controlling lens movement and storage medium
CN112328091B (en) Barrage display method and device, terminal and storage medium
CN111368114B (en) Information display method, device, equipment and storage medium
CN114116053B (en) Resource display method, device, computer equipment and medium
CN113613028B (en) Live broadcast data processing method, device, terminal, server and storage medium
CN111028566A (en) Live broadcast teaching method, device, terminal and storage medium
CN112827166A (en) Card object-based interaction method and device, computer equipment and storage medium
CN112130945A (en) Gift presenting method, device, equipment and storage medium
CN113134232B (en) Virtual object control method, device, equipment and computer readable storage medium
CN112367533B (en) Interactive service processing method, device, equipment and computer readable storage medium
CN111061369B (en) Interaction method, device, equipment and storage medium
CN110312144B (en) Live broadcast method, device, terminal and storage medium
CN110152309B (en) Voice communication method, device, electronic equipment and storage medium
CN112023403A (en) Battle process display method and device based on image-text information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant