CN111083505B - Live broadcast room virtual gift interaction method, electronic equipment and device - Google Patents


Publication number
CN111083505B
CN111083505B (granted patent; application number CN201911136198.5A)
Authority
CN
China
Prior art keywords
gift
live
striker
live broadcast
interface
Prior art date
Legal status (an assumption, not a legal conclusion)
Active
Application number
CN201911136198.5A
Other languages
Chinese (zh)
Other versions
CN111083505A
Inventor
郭俊杰 (Guo Junjie)
Current Assignee (the listed assignees may be inaccurate)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd filed Critical Guangzhou Cubesili Information Technology Co Ltd
Priority to CN201911136198.5A priority Critical patent/CN111083505B/en
Publication of CN111083505A publication Critical patent/CN111083505A/en
Application granted granted Critical
Publication of CN111083505B publication Critical patent/CN111083505B/en
Legal status: Active

Classifications

All classifications fall under H (Electricity), H04 (Electric communication technique), H04N (Pictorial communication, e.g. television), H04N21/00 (Selective content distribution, e.g. interactive television or video on demand [VOD]):

    • H04N21/2187 Live feed (source of audio or video content, e.g. local disk arrays)
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/44213 Monitoring of end-user related data
    • H04N21/4438 Window management, e.g. event handling following interaction with the user interface
    • H04N21/4784 Supplemental services receiving rewards
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Marketing (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a virtual gift interaction method for a live broadcast room, together with electronic equipment and a device. The method comprises the following steps: determining a virtual gift to be presented and a gift recipient; displaying, on the live interface, a first collision piece corresponding to the virtual gift and a target area; triggering the first collision piece to move; and, if the first collision piece finally moves into the target area, displaying on the live interface a preset special effect showing the gift recipient receiving the virtual gift. In this way, the application makes virtual gift interaction and gift giving in the live broadcast room more intuitive.

Description

Live broadcast room virtual gift interaction method, electronic equipment and device
Technical Field
The application relates to the technical field of live broadcast, in particular to a live broadcast room virtual gift interaction method, electronic equipment and device.
Background
For live broadcasting, interaction is a very important link: the anchor interacts with viewers during the broadcast. Moreover, existing live broadcast technology supports mic-connect interaction among multiple anchors and between anchors and viewers, so the interactive environment provided by the live broadcast room is very important.
The existing process of presenting a virtual gift is dull and unintuitive: the gift giver often cannot visually see the gift being presented or the object it is presented to, so gifts are easily sent to the wrong party.
Disclosure of Invention
The technical problem mainly solved by the application is to provide a live broadcast room virtual gift interaction method, electronic equipment and a device that make the virtual gift interaction process more intuitive and visible.
In order to solve the technical problem, the application adopts a technical scheme that: a live broadcast room virtual gift interaction method is provided, and comprises the following steps: determining a virtual gift to be presented and a gift recipient corresponding to at least one live broadcast area; displaying a first collision piece corresponding to the virtual gift and a target area corresponding to a gift recipient on a live broadcast interface; triggering the first collision piece to move on a live interface; and if the first collision piece is finally moved to the target area, displaying a preset special effect that the gift receiver receives the virtual gift on a live broadcast interface.
In order to solve the above technical problem, another technical solution adopted by the present application is: an electronic device is provided that includes a determination module, a display module, and a processing module. The determining module is used for determining a virtual gift to be presented and a gift receiver corresponding to at least one live broadcast area; the display module is used for displaying a first collision piece corresponding to the virtual gift and a target area corresponding to the gift receiving party on a live broadcast interface; the processing module is used for triggering the first collision piece to move on the live broadcast interface; the display module is further used for displaying a preset special effect of the virtual gift received by the gift receiver on the live broadcast interface when the first collision piece is finally moved to the target area.
In order to solve the above technical problem, the present application adopts another technical solution: an electronic device is provided whose processor executes stored program data to implement the live broadcast room virtual gift interaction method described above.
In order to solve the above technical problem, the present application adopts yet another technical solution: a device with a storage function is provided, which stores program data that can be executed to implement the live broadcast room virtual gift interaction method described above.
Compared with the prior art, the beneficial effects of this application are as follows. By displaying on the live interface a first collision piece and a target area corresponding to at least one live broadcast area, and by triggering the first collision piece to move during gift giving, the gift-giving process is visualized. When the first collision piece finally moves into the target area, the user can intuitively follow the whole process of giving the virtual gift and see which object receives it. Visualizing the process reduces the probability of gifts being sent to the wrong party, adds an interactive function to the live broadcast room, and improves the room's interactive effect.
Drawings
Fig. 1 is a schematic block diagram of a live broadcast system of an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 2 is a schematic view of a first interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 3 is a first flowchart of an embodiment of a live broadcast room virtual gift interaction method according to the present application;
fig. 4 is a second flowchart of an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 5 is a second schematic view of a live broadcast interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 6 is a third schematic view of a live broadcast interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 7 is a fourth schematic view of a live broadcast interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 8 is a fifth schematic view of a live interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 9 is a sixth schematic view of a live broadcast interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 10 is a third flowchart of an embodiment of a live broadcast room virtual gift interaction method according to the present application;
fig. 11 is a fourth flowchart illustrating a live room virtual gift interaction method according to an embodiment of the present application;
fig. 12 is a seventh schematic view illustrating a live interface of an electronic device according to an embodiment of a live broadcast room virtual gift interaction method of the present application;
fig. 13 is a fifth flowchart illustrating a live room virtual gift interaction method according to an embodiment of the present application;
FIG. 14 is a block diagram schematically illustrating the structure of a first embodiment of the electronic device of the present application;
FIG. 15 is a block diagram schematically illustrating the structure of a second embodiment of the electronic device of the present application;
fig. 16 is a block diagram schematically illustrating the structure of the device having a storage function according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The inventor of the application has found through long-term research that when a user watching a live broadcast as a viewer wants to interact with the anchor and guests by giving a virtual gift, the user simply clicks the virtual gift and it goes directly into the account of the anchor or guest. The user therefore cannot directly and clearly see which anchor or guest the gift went to, and the live interaction effect is poor. To improve on or solve this technical problem, the present application proposes at least the following embodiments.
Referring to fig. 1, the following embodiments may be applied to a live system 1. By way of example, the live system 1 comprises at least a plurality of viewer ends 20, an anchor end 30 and a server 10. The viewer end 20 and the anchor end 30 may each be an electronic device, specifically a mobile terminal, a computer, a server or another terminal; the mobile terminal may be a mobile phone, a notebook computer, a tablet computer, a smart wearable device or the like, and the computer may be a desktop computer or the like. The server 10 may pull the live data stream from the anchor end 30 and push the retrieved stream to the viewer ends 20. After acquiring the live data stream, a viewer end 20 can present the live broadcast of the anchor or guests. Video or voice connections may be made between anchor ends 30, and between an anchor end 30 and a viewer end 20.
Referring to fig. 2, for video live broadcasting, live broadcast areas 101 for a plurality of video pictures can be displayed on the live interface 100 of the corresponding live broadcast room. For voice live broadcasting, a live broadcast area 101 corresponding to each broadcasting party can be displayed on the live interface 100, and icons such as the party's avatar, nickname, text or other images can be shown in the live broadcast area 101. Each live broadcast area may correspond to a live user, which may also be referred to as a live account. When an anchor is hosting a mic-connect session, that anchor may be referred to as the host, and the other anchors connected to the session may be referred to as guests. A user who enters the anchor's live broadcast room at a viewer end 20 can watch the anchor's broadcast or the interaction between the anchor and guests, and can give gifts to the anchor or guests to interact. Embodiments of the present application may also be applied to other types of live systems 1 and are not limited to the above examples.
Referring to fig. 2 and 3, an embodiment of a live broadcast room virtual gift interaction method of the present application includes:
s100: a virtual gift to be gifted and a gift recipient corresponding to the at least one live broadcast area are determined.
For example, a user may enter the live broadcast room through their electronic device and select, on the live interface 100, the virtual gift 104 to be presented and the intended gift recipient; the electronic device then determines the virtual gift 104 and the gift recipient specified by the user. The gift recipient may be at least one of the live users corresponding to the multiple live broadcast areas 101, that is, a gift recipient corresponding to at least one live broadcast area 101. The user may be a viewer, or an anchor or guest taking part in the broadcast. A viewer selects the virtual gift 104 and the gift recipient through the viewer end 20; an anchor may select, through the anchor end, the virtual gift 104 and the gift recipient from among the other anchors or guests, and the anchor end 30 determines them according to the anchor's selection.
As shown in fig. 2, when a user enters a live room, the live interface 100 of that room may be displayed on the display screen of their electronic device. The live interface 100 can provide one or more live broadcast areas 101; if the room is in a mic-connect session, several live broadcast areas 101 can be displayed. The live broadcast areas 101 may be arranged in an array in or adjacent to the top region of the live interface 100. The live interface 100 may also provide a gift selection interface 102 spaced from the live broadcast areas 101; the gift selection interface 102 may be located in the bottom area of the live interface 100.
Referring to fig. 4, prior to step S100, the virtual gift 104 may be called up through the gift selection interface 102 so that it can be selected, for example by the following steps:
S011: detect an interface selection operation selecting the gift selection interface; in response, display a gift selection area corresponding to the gift selection interface on the live interface, with a plurality of virtual gifts arranged in an arc displayed in the gift selection area.
As shown in fig. 2, a user may select a gift selection interface 102 on a live interface 100 of their electronic device. For example, for the electronic device being a smart mobile terminal, the user may select the gift selection interface 102 by clicking on the gift selection interface 102 via the touch screen display. Referring to fig. 5, the electronic device detects an interface selection operation for selecting the gift selection interface 102, and may form an interface selection instruction to be sent to a processor of the electronic device, and the processor, in response to the interface selection instruction, displays a gift selection area 103 corresponding to the gift selection interface 102 on the live interface 100. A plurality of virtual gifts 104 in an arc arrangement are displayed on the gift selection area 103 for selection by the user. By providing the gift selection area 103 in which the arc-shaped arrangement of virtual gifts 104 can be displayed, more virtual gifts 104 can be provided and the virtual gifts 104 can be presented more intuitively.
As shown in fig. 5, each virtual gift 104 is displayed, for example, on a wheel model, which is a circle. The wheel model may be symmetric about the perpendicular bisector of the bottom edge of the live interface 100; for example, the center point of the wheel model may lie on that edge. The gift selection area 103 may present one portion of the wheel model while the other portion is hidden, so that some virtual gifts 104 are displayed in the gift selection area 103 and the rest are hidden.
As shown in fig. 5, the electronic device may dynamically calculate the spacing between the virtual gifts 104 based on their number. For example, assuming the number of virtual gifts 104 is n, the virtual gifts 104 may be evenly distributed over the wheel model. The angle θ between two adjacent virtual gifts 104 can be calculated by the following expression:
θ=360°/n;
the virtual gift 104 may appear circular in its appearance when displayed and the center of the virtual gift 104 may be located on the circumference of the wheel model.
S012: detect a gift switching operation, and scroll-switch the virtual gifts displayed in the gift selection area in wheel mode according to the switching direction and amplitude of the operation.
Referring to fig. 6, the user may, for example, slide the virtual gifts 104 with a touch gesture: by touching a position on the live interface 100 and sliding roughly along the arc of the wheel model, the wheel model is rotated, so the virtual gifts 104 rotate around the wheel's center point, scroll across the display, and hidden virtual gifts 104 are switched into the gift selection area 103. For example, the virtual gift 104 labeled gift 6 in fig. 6 is switched from its hidden state in fig. 5 into the gift selection area 103, so different virtual gifts 104 can be shown by switching. Compared with page turning, switching virtual gifts 104 in wheel mode makes the transition more continuous, makes the switching operation more convenient, and improves the user's sense of participation.
Specifically, a coordinate system is established with the center point of the wheel model as the origin, the edge on which that center point lies as the x-axis, and the perpendicular bisector of that edge as the y-axis. In this coordinate system, the electronic device can record the coordinates (x1, y1) of the touch point and then calculate the angle α between the x-axis and the line connecting the touch point to the origin:
α = arctan(y1/x1);
Next, the coordinates (x2, y2) of the touch position after the gesture has moved may be recorded, and the angle β between the x-axis and the line connecting the moved position to the origin may be calculated:
β = arctan(y2/x2);
then, calculating a rotation angle v corresponding to the touch gesture by using the alpha and the beta:
ν=β-α;
can judge the switching direction of gesture through the positive and negative of judgement v value, and judge through the numerical value of v value itself and switch the range, so the rim plate model can be according to switching direction pivoted range for rotation angle v, and then switch and show different virtual gifts 104. Of course, the touch direction detected by the touch screen of the electronic device itself may be used as the switching direction.
As shown in fig. 6, to select a virtual gift 104 on the wheel model, a preset position 105 may be provided in the live interface 100. The preset position 105 may also be circular, and its area may be greater than or equal to that of a virtual gift 104.
The preset position 105 may be fixedly displayed on the live interface 100, on the arc-shaped path of the virtual gifts 104, so that it surrounds a single virtual gift 104 when that gift enters it. The center of the preset position 105 may lie on the perpendicular bisector, through the center of the wheel model, of the bottom edge of the live interface 100. It may further lie on the circumference of the wheel model, so that when a virtual gift 104 enters the preset position 105 their centers coincide.
As shown in fig. 4, the preset position 105 may be used for selecting the virtual gift 104, which may be realized by the following steps included in step S100:
s101: and judging whether the duration time of the virtual gift currently positioned in the preset position is greater than the preset time.
The user scrolls and displays the virtual gifts 104 by means of wheel rotation, and if a certain virtual gift 104 is to be given as a gift, the user can rotate the virtual gift 104 into the preset position 105 by means of touch gestures or mouse operation.
S102: if so, determining the virtual gift currently located at the preset position for a duration time longer than the preset time as the virtual gift to be presented.
The virtual gift 104 may stay within the predetermined position 105 after entering the position. When the virtual gift 104 stays for a duration greater than a predetermined time, the virtual gift 104 is selected as the virtual gift 104 to be gifted.
If not, the step S101 is continuously executed to continuously perform the detection and judgment.
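Steps S101 and S102 amount to a dwell-time selector: the gift currently in the preset position is chosen once it has stayed there longer than the preset time. A minimal sketch with an injectable clock for testing; the class and method names are illustrative assumptions, not from this document:

```python
import time

class DwellSelector:
    """Selects the gift in the preset position once it has stayed
    there longer than a threshold, mirroring steps S101/S102."""

    def __init__(self, dwell_seconds=1.0, clock=time.monotonic):
        self.dwell_seconds = dwell_seconds
        self.clock = clock
        self._gift = None   # gift currently occupying the preset position
        self._since = None  # timestamp when that gift entered it

    def update(self, gift_in_slot):
        """Call every frame with the gift in the preset position (or
        None). Returns the selected gift once the dwell threshold is
        exceeded, otherwise None."""
        now = self.clock()
        if gift_in_slot != self._gift:
            # A different gift (or no gift) entered: restart the timer.
            self._gift, self._since = gift_in_slot, now
            return None
        if gift_in_slot is not None and now - self._since > self.dwell_seconds:
            return gift_in_slot
        return None
```

Injecting the clock keeps the dwell logic deterministic under test instead of depending on wall time.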
As shown in fig. 4, before step S100, a live user corresponding to the live broadcast area 101 may be selected as a gift recipient by operating the live broadcast area 101 by a user, for example, by:
s013: a selection operation of selecting at least one live zone among a plurality of live zones is detected.
As shown in fig. 5, for example, a user clicks at least one live broadcast area 101 on the live broadcast interface 100 to select an object to be presented first, that is, clicks the live broadcast area 101 corresponding to the object to be presented. When the user performs a selected operation on the live interface 100, for example, the electronic device may form a selected operation instruction to be sent to a processor or the like.
S014: and responding to the selection operation, and determining the live broadcast user corresponding to the live broadcast area specified by the selection operation as a gift-receiving party.
The electronic equipment responds to the selection operation, namely a processor and the like responds to the selection operation instruction, and the live broadcast user corresponding to the live broadcast area 101 specified by the selection operation is determined as a gift-receiving party. The virtual gift 104 to be given by the user will be given to the gift recipient.
S200: display, on the live interface, a first collision piece corresponding to the virtual gift and a target area corresponding to the gift recipient.
Referring to fig. 7, after the virtual gift 104 and the gift recipient are determined, the live interface 100 may display a first collision piece 120 (also referred to below as the first striker 120) corresponding to the virtual gift 104, and a target area 130 corresponding to the gift recipient. The first collision piece 120 may be displayed in a partial area between the gift selection area 103 and the plurality of live broadcast areas 101.
The first collision piece 120 can move on the live interface 100, and its movement can be displayed there. The first collision piece 120 may be triggered to move toward the target area 130; it need not move straight toward the target area 130, but may reach it after multiple collisions with the edges of the live interface 100.
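The multi-collision movement just described, in which the piece may reach the target area only after bouncing off the edges of the live interface, can be sketched as a simple per-frame update. The document does not specify the motion model, so this assumes straight-line motion with mirror reflection at the edges; all names are illustrative:

```python
def step_with_bounce(pos, vel, width, height):
    """Advance the collision piece one frame, reflecting off the
    edges of a width x height interface. Returns the new position
    and (possibly reflected) velocity."""
    x, y = pos[0] + vel[0], pos[1] + vel[1]
    vx, vy = vel
    if x < 0 or x > width:   # bounce off the left/right edge
        vx = -vx
        x = max(0, min(x, width))
    if y < 0 or y > height:  # bounce off the top/bottom edge
        vy = -vy
        y = max(0, min(y, height))
    return (x, y), (vx, vy)
```

Calling this each frame until the piece overlaps the target area reproduces the indirect, multi-bounce path toward the target.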
In some embodiments, the live interface 100 displays only the first striker 120 and the target area 130, and the first striker 120 is triggered directly so that it moves, performing step S300 described below. For example, if the electronic device is a smart device with a touch screen, the first striker 120 may be triggered to move by touching or clicking it.
In other embodiments, as shown in FIG. 8, the live interface 100 may display the second impactor 110 in addition to the first impactor 120 and the target area 130. For example, step S200 may include:
s210: the first impactor, the second impactor, and the target area are displayed on the live interface.
The second striker 110 and the first striker 120 may be misaligned. When displayed, the second striker 110 and the first striker 120 may be displayed in a partial area of the live interface 100 between the plurality of live areas 101 and the gift selection area 103. Of course, the display of the second striker 110 and the first striker 120 may be provided as appropriate. The first striker 120 and the second striker 110 may be shown spaced apart, or may be in edge contact.
The second striker 110 and the first striker 120 may each be provided in a circular or spherical shape. The second striker 110 and the first striker 120 may be the same size. As shown in fig. 8, the second striker 110 may be shown adjacent to the preset position 105. For example, the second striker 110 may be tangent to the preset position 105. The center of the second striker 110 and the center of the preset position 105 and the center of the wheel model are located on the same central vertical line. Alternatively, the first striker 120 may be randomly displayed on the live interface 100 with the second striker 110 and the first striker 120 not being coincident.
Unlike the above-described embodiment, which displays only the first striker 120 and the target area 130, this embodiment displays the first striker 120, the second striker 110, and the target area 130, so that the later-described step S300 can be performed by triggering the second striker 110 to collide with the first striker 120, causing the first striker 120 to move as a result of the collision.
As shown in fig. 4 and 8, in order to facilitate the user to operate the second striker 110 and the first striker 120, the user may be instructed to operate by providing a guide line, which may be implemented by the following steps further included in step S200:
s220: and displaying a guide line on the live broadcasting interface, wherein the guide line sequentially passes through the second collision member and the first collision member and extends to the edge of the live broadcasting interface.
The guide line may be a straight line. The guide lines are used to indicate the path of the second striker 110 striking the first striker 120, so that the user can know the direction and angle of the second striker 110 striking the first striker 120. The guide lines may be shown as dashed lines, but may be of other linear types. The guide line extends from the second striker 110 to the first striker 120 and then to the edge of the live interface 100, where the edge of the live interface 100 is the edge that is adjacent to the first striker 120 and can intersect the guide line.
Specifically, the guide line may pass through the center of the second striker 110 and the center of the first striker 120, so it is possible to improve the collision efficiency of the second striker 110 and the first striker 120 and to improve the accuracy of the moving path after the collision.
As shown by the guide line, the present embodiment can use the guide line to calculate the incident angle and the reflection angle at which the second striker 110 collides with the first striker 120. Step S200 may further include:
s230: an angle of incidence and an angle of reflection of the second impactor to impact the first impactor are determined based on the angle between the director line and the edge of the direct broadcast interface.
As shown in fig. 8, the angle between the guide line and the perpendicular of the edge of the live interface 100, that is, the incident angle, can be calculated from the angle between the guide line and the edge of the live interface 100 that the guide line intersects. Similar to the law of reflection of light, the incident angle is equal to the reflection angle, so the incident angle and the reflection angle at which the second striker 110 strikes the first striker 120 can be determined.
That is, the second striker 110 strikes the first striker 120 at the incident angle; the first striker 120 then strikes an edge of the live interface 100 at the same incident angle, bounces off at the reflection angle, and continues moving, possibly colliding with other edges.
By showing the guiding lines, it is convenient for the user to observe the positional relationship between the second striker 110 and the first striker 120 and the incident angle and the reflection angle of the subsequent collision, so that it is convenient for the user to adjust the positional relationship between the second striker 110 and the first striker 120, and further adjust the incident angle and the reflection angle, thereby increasing the probability of the subsequent first striker 120 entering the target zone 130. Also, the subsequent calculation of the movement path of the first striker 120 can be facilitated by the incident angle and the reflection angle.
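As a minimal sketch of this angle determination, assuming screen coordinates and a vertical interface edge (the function name and coordinate convention are illustrative assumptions, not taken from the patent), the incident angle, which by the reflection law above also equals the reflection angle, can be derived from the guide line through the two strikers' centers:

```python
import math

def incident_angle(second_center, first_center):
    """Angle (degrees) between the guide line through the two strikers'
    centers and the perpendicular (normal) of a vertical interface edge.

    Hypothetical convention: x grows rightward, y grows downward, and the
    struck edge is the right-hand vertical edge, whose normal is
    horizontal. By the reflection law used in the embodiment, the
    reflection angle equals this incident angle."""
    dx = first_center[0] - second_center[0]
    dy = first_center[1] - second_center[1]
    # Angle between the guide-line direction and the horizontal normal.
    return abs(math.degrees(math.atan2(dy, dx)))

# Second striker at (100, 300), first striker at (200, 200):
theta = incident_angle((100, 300), (200, 200))
```

With the two example centers above, the guide line rises at 45 degrees to the horizontal normal, so the incident angle and reflection angle are both 45 degrees.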
As shown in fig. 4, if the user wants to adjust the incident angle and the reflection angle, the position of the first striker 120 may be adjusted, which may be implemented by the following steps after S210:
s240: it is detected whether there is a moving operation for moving the second collision member.
As shown in fig. 9, the user may perform a moving operation on the first striker 120. For an electronic device with a touch screen, the user may touch and drag the first striker 120 to move its position. For an electronic device such as a computer, the user may select and drag the first striker 120 through a mouse, a touch panel, or the like. Upon detecting the user's moving operation, the electronic device may form a movement operation instruction and send it to a processor of the electronic device, so that the moving operation is processed accordingly.
S250: in response to the moving operation, the second striker is moved to a final position designated by the moving operation.
As shown in fig. 9, the electronic device may move the first striker 120 to the final position designated by, or corresponding to, the moving operation, i.e., the processor of the electronic device responds to the movement operation instruction and moves the first striker 120 from its previous position to the final position. For example, in response to the moving operation, the first striker 120 may be moved from position A to position B, and the corresponding incident angle and reflection angle are thereby adjusted.
The position of the second striker 110 may be fixed. Of course, the position of the second striker 110 can also be set adjustable, and this embodiment can also be implemented by moving the second striker 110 to adjust the angle of incidence and the angle of reflection at which the second striker 110 strikes the first striker 120, in a manner similar to the adjustment process of the first striker 120 described above.
After moving the second striker 110 or the first striker 120, the process may return to steps S220 and S230 to display the guide line and recalculate the incident angle and the reflection angle.
If no adjustment is made, the next process can be continued. The process flow shown in fig. 4 continues into the process flow shown in fig. 10.
As shown in fig. 10, after the incident angle and the reflection angle are calculated, the movement path of the first striker 120 may be calculated, specifically by the following step after S230:
S260: the movement path of the first impactor is calculated based on the angle of incidence and the angle of reflection.
As shown in fig. 8, based on the incident angle and the reflection angle at which the second striker 110 strikes the first striker 120, determined in S230, the first striker 120 bounces off an edge of the live interface 100 at the reflection angle and moves in a straight line until it strikes the next edge of the live interface 100 or directly enters the target area 130. If the first striker 120 continues to hit edges of the live interface 100, its incident angle at each edge is determined and can be calculated, and hence so is its reflection angle at that edge. In other words, no matter how many times the first striker 120 hits the edges of the live interface 100 before entering the target area 130, every incident angle and reflection angle along the way is related to the incident angle and reflection angle at which the second striker 110 strikes the first striker 120, so the movement path of the first striker 120 after the collision can be calculated from that initial incident angle and reflection angle.
In some implementations, the movement path may also be displayed on the live interface 100 in the form of an auxiliary line. The auxiliary line may show only the trajectory that the first striker 120 has already traveled, with the untraveled portion not shown. Of course, after the movement path is calculated, the entire movement path may be displayed directly on the live interface 100 in the form of an auxiliary line, so that the user may make corresponding adjustments. For example, steps S240 and S250 may be performed to adjust the position of the first striker 120, and thereby the incident angle and the reflection angle, so as to adjust the movement path and increase the success rate of the first striker 120 entering the target area 130.
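The path calculation described above can be sketched as a simple marching simulation, under the assumption of a rectangular live interface and mirror-like reflection at each edge; the function name, the sampling step, and the step cap are illustrative choices, not part of the patent:

```python
import math

def movement_path(start, angle_deg, rect, target_center, target_r,
                  step=1.0, max_steps=100000):
    """March the first collision member from `start` at `angle_deg`
    (degrees, standard math convention) inside the rectangle
    `rect` = (xmin, ymin, xmax, ymax). At every edge the normal component
    of the direction flips, so the reflection angle equals the incident
    angle. Marching stops when a sampled trajectory point falls inside
    the target circle, or after `max_steps` samples."""
    x, y = start
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    xmin, ymin, xmax, ymax = rect
    path = [(x, y)]
    for _ in range(max_steps):
        x += dx * step
        y += dy * step
        if x < xmin or x > xmax:            # bounce off a vertical edge
            dx = -dx
            x = min(max(x, xmin), xmax)
        if y < ymin or y > ymax:            # bounce off a horizontal edge
            dy = -dy
            y = min(max(y, ymin), ymax)
        path.append((x, y))
        if math.hypot(x - target_center[0], y - target_center[1]) <= target_r:
            return path, True               # a trajectory point falls in the target area
    return path, False
```

The returned point list can feed the auxiliary-line display, and the boolean corresponds to the judgment of whether a trajectory point falls into the target area.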
S300: the first collision member is triggered to move on the live interface.
For embodiments in which the first striker 120 and the target area 130 are displayed on the live interface 100, a guide line extending through the first striker 120 to the edge of the live interface 100 may also be displayed. For example, the electronic device may determine the point of impact of the first striker 120 with the edge of the live interface 100 and take the line between the first striker 120 and the impact point as the guide line. In particular, the impact point may be selected by the user, for example by touching the display screen, and the electronic device determines the impact point based on the user's selection. The incident angle and the reflection angle of the first striker 120, and thus its movement path, can then be determined from the guide line.
The first striker 120 may be directly triggered to move along a movement path on the live interface 100. For example, the user can trigger the first striker 120 to move by clicking or touching a certain key/icon, or by directly clicking or touching the first striker 120. Of course, the electronic device may also automatically trigger the first striker 120 to move.
For embodiments in which the first striker 120, the second striker 110, and the target area 130 are displayed on the live interface 100, this may be achieved by the following further steps included in step S300:
s310: triggering the second collision member to collide with the first collision member so that the first collision member is moved in collision on the live interface.
Specifically, the second striker 110 may be triggered to impact the first striker 120 according to a gift-offering trigger instruction. The gift-offering triggering instruction can be automatically formed by the electronic equipment or formed based on the triggering operation of the user. For example, if the electronic device does not detect that the second striker 110 or the first striker 120 is moved or has not been adjusted in position within a preset time period, a gift-offering trigger instruction is automatically generated to trigger the second striker 110 to strike the first striker 120, so that the first striker 120 is moved by the strike. For example, for an electronic device with a touch screen, a user may touch and click on the second striker 110 or the first striker 120 for triggering the second striker 110 to impact the first striker 120, and the electronic device may form a gift-giving triggering instruction. For an electronic device such as a computer, a user may click on the second striker 110 or the first striker 120 through a peripheral device such as a mouse. Of course, a key on the electronic device may be provided for triggering the second striker 110 to strike the first striker 120, and the electronic device may form a gift giving triggering instruction when the user presses the key.
Referring to fig. 10, the first striker 120 may move along a moving path for the collision movement, which may be specifically shown as the following steps included in step S310:
s311: triggering the second collision member to strike the first collision member at an incident angle such that the second collision member moves on the live interface according to a movement path.
After the moving path of the first collision member 120 is calculated, the second collision member 110 is triggered to impact the first collision member 120 at an incident angle according to the gift sending trigger instruction, and the first collision member 120 moves after being impacted and moves according to the moving path, so that a user can visually see the presenting process of the virtual gift 104 and clearly know which anchor or guest the gift sent by the user is sent to.
After the first collision member 120 is triggered to move, it needs to be determined whether the first collision member can move to the target area 130 corresponding to the gift recipient on the live interface 100, which may be specifically implemented by the following steps after step S300:
s320: and judging whether the moving path has track points falling into the target area.
As shown in fig. 8, determining whether there is a track point falling within the target area 130 in the moving path corresponds to determining whether the first striker 120 can enter the target area 130. For example, the second striker 110 and the first striker 120 are spherical, and the target area 130 serves as a hole. If there is a track point falling into the target area 130 in the moving path of the first striker 120, this indicates that the first striker 120 finally enters the hole.
Referring to fig. 11, for a specific judgment process, the following steps may be implemented, which are included in step S320:
s321: and determining a near track point of the moving path closest to the center of the target area.
Referring to fig. 12, a perpendicular may be drawn from the center of the target area 130 to the moving path, and the intersection of the shortest such perpendicular with the moving path is taken as the near track point. Determining the near track point determines the point with the highest probability of entering the target area 130.
S322: the distance between the near trajectory point and the center of the target region is calculated based on the coordinates of the near trajectory point and the coordinates of the center of the target region.
As shown in FIG. 12, let the coordinates of the near track point be (x0, y0), let the coordinates of the center of the target area 130 be (x3, y3), and let the radius of the target area 130 be r. The distance H between the near track point and the center of the target area 130 is then:

H = √((x0 − x3)² + (y0 − y3)²)
s323: and comparing the distance between the near track point and the center of the target area with the radius of the target area to judge whether the track point falling into the target area exists in the moving path.
If H-r >0, the near-track point does not enter the target region 130.
If H-r ≦ 0, the near trajectory point can enter the target area 130, i.e., the first striker 120 can fall into the target area 130 ("hole").
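Steps S321 to S323 amount to projecting the center of the target area onto the moving path and comparing the resulting distance H with the radius r. A minimal sketch for one straight path segment (the helper name and segment representation are assumptions for illustration):

```python
import math

def enters_target(p0, p1, center, r):
    """Sketch of steps S321-S323: project the target-area center onto the
    path segment p0 -> p1 to obtain the near track point (x0, y0),
    compute H = sqrt((x0 - x3)^2 + (y0 - y3)^2) to the center (x3, y3),
    and compare H with the radius r."""
    (ax, ay), (bx, by) = p0, p1
    cx, cy = center
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    # Clamp the projection parameter so the near point stays on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((cx - ax) * abx + (cy - ay) * aby) / denom))
    x0, y0 = ax + t * abx, ay + t * aby       # near track point
    h = math.hypot(x0 - cx, y0 - cy)          # distance H
    return h - r <= 0                          # H - r <= 0: enters the "hole"
```

For a full moving path made of several bounce segments, the same test would be applied to each segment in turn.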
S330: if so, it is determined that the first impactor can eventually enter the target area, and the first impactor is eventually moved to the target area along the path of movement.
That is, if H-r ≦ 0, indicating that the moving path has a track point falling within the target area 130, it may be determined that the first striker 120 can finally enter the target area 130, and the first striker 120 is moved along the moving path to the target area 130. Of course, a time limit may also be set for the entire collision process; if the time limit is exceeded before the target area 130 is entered, the second striker 110 and the first striker 120 may be subjected to disappearance processing.
S340: if not, the first collision member may continue to follow the moving path, and then S350-S360 may be performed to determine the time length and perform corresponding processing.
After the second striker 110 is triggered to strike the first striker 120, the first striker 120 may take a long time to enter the target area 130, or may be unable to enter the target area 130 at all because of its moving path; in either case the first striker 120 moves in the live interface 100 for too long, which may affect the user's experience of watching the live room. As shown in fig. 10, whether or not the first striker 120 can enter the target area while moving along the moving path, a time limit may be set for the collided movement of the first striker 120, which may be implemented by the following steps after S300:
s350: the length of time between the current point in time and the point in time at which the second striker is triggered to strike the first striker is calculated.
The point in time at which the second striker 110 is triggered is taken as the start of the time length, and the current point in time is taken as its end. Alternatively, the point in time at which the first striker 120 is struck by the second striker 110 may be taken as the start of the time length. For example, after the second striker 110 is triggered to strike the first striker 120, the first striker 120 moves in the live interface 100 as a result of the collision, and the current point in time falls within this movement. In this way, the duration of the whole process of the second striker 110 striking the first striker 120 can be effectively controlled.
For the embodiment in which the first striker 120 is displayed without the second striker 110, the first striker 120 is triggered to move directly, and the point in time at which the first striker 120 is triggered is taken as the start of the time length.
S360: the time length is compared with a preset time length.
The preset time length can be set according to actual conditions and can be adjusted in the background through the server 10. Of course, it can also be adjusted by the user. For example, the preset time length is 1 min or 30 s.
S370: if the time length is longer than the preset time length, the first collision piece is subjected to disappearance processing at the current time point so as not to be displayed on the live broadcast interface.
If the time length from the triggering of the second striker 110 to strike the first striker 120 is longer than the preset time length, meaning that the process of the second striker 110 striking the first striker 120 is taking relatively long and may affect the user's viewing or gift-giving experience, the first striker 120 may be subjected to disappearance processing at the current time point, so that it disappears from, or is no longer displayed on, the live interface 100. Of course, the second striker 110 may be subjected to the disappearance processing at the same time.
Even if the first striker 120 is made to disappear at the current point in time, the virtual gift 104 can still be delivered to the gift recipient; only the display of the gift-giving process disappears. Of course, gift-giving failure processing may be performed instead.
If the time length is less than the preset time length, the first striker 120 continues to move along the movement path. If the first striker 120 is able to move to the target area 130 corresponding to the gift recipient on the live interface 100 within the preset time length, i.e., the first striker 120 successfully reaches its destination, the user sees the entire gift-giving process.
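The time-limit handling of steps S350 to S370, together with the success case of step S400, can be sketched as a small state decision; the function name, the return values, and the 30 s default are illustrative assumptions, not from the patent:

```python
import time

def gift_animation_tick(start_time, reached_target, preset_seconds=30.0):
    """Sketch of steps S350-S370: measure the time since the second
    collision member was triggered and decide whether the first collision
    member keeps moving, vanishes, or the preset special effect is shown.
    `start_time` is a time.monotonic() timestamp taken at triggering."""
    elapsed = time.monotonic() - start_time
    if reached_target:
        return "show_preset_effect"        # S400: striker reached the target area
    if elapsed > preset_seconds:
        return "vanish_striker"            # S370: too long, stop displaying it
    return "keep_moving"                   # below the limit: continue along path
```

Such a check would typically run on every animation frame while the first collision member is in motion.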
S400: and if the first collision piece is finally moved to a target area corresponding to the gift receiver on the live broadcast interface, displaying a preset special effect that the virtual gift receiver receives on the live broadcast interface.
The target area 130 may be displayed in the live area 101 corresponding to the gift recipient. For example, the target area 130 is displayed floating on the live zone 101. The center of the target area 130 may coincide with the geometric center of the live zone 101.
When the distance between the near track point and the center of the target area 130 is less than or equal to the radius of the target area 130, a preset special effect corresponding to the virtual gift 104 may be acquired first. After the first striker 120 finally moves into the target area 130, the acquired preset special effect is displayed on the live interface 100. The preset special effect can be an expression special effect, an explosion special effect, a voice special effect, a character special effect, or the like. By acquiring the preset special effect in advance, the response efficiency of the preset special effect can be improved, the delay before the preset special effect is displayed can be shortened, and the processing pressure on the system can be reduced.
By displaying, on the live interface 100, the first striker 120 and the target area 130 corresponding to at least one live broadcast area 101, and by triggering the first striker 120 to move during the gift-giving process, the presentation of the virtual gift 104 can be visualized. When the first striker 120 finally moves into the target area 130, the user can intuitively see the whole process of presenting the virtual gift 104 as well as the object it is given to. This intuitive gift-giving process can reduce the probability of giving a gift to the wrong recipient, add interactive functions to the live broadcast room, and improve its interactive effect.
Further, by providing the second striker 110 and the first striker 120 and triggering the second striker 110 to strike the first striker 120, the interactive functions and effects of gift giving can be enriched, the intuitiveness of the virtual gift presentation process can be further enhanced, the functions of the live broadcast system 11 can be enriched, and the live interaction effect and user experience can be improved.
As shown in fig. 13, for displaying the preset special effect on the live interface 100, the following steps included in step S400 may be implemented:
s410: and carrying out face recognition in a live broadcast area corresponding to the gift recipient.
The live broadcast area 101 is, for example, a video live broadcast area or an area displaying an avatar. Before the preset special effect is displayed, face recognition is performed on the live broadcast area 101 in order to improve the display effect of the preset special effect.
S420: and when the preset face is identified, displaying the preset special effect on the preset part on the preset face.
The preset position is, for example, the nose, the mouth, the eyes, or another location. Specifically, the preset special effect may be set as a dynamic goal special effect, and the preset special effect is attached to the nose or the mouth of the preset face according to the face model.
By performing face recognition on the live broadcast area 101 corresponding to the gift recipient and displaying the preset special effect at the preset position of the preset face, the presentation process of the virtual gift 104 and its recipient can be shown more intuitively, further improving the display effect of the preset special effect.
The interaction process described in the above embodiment of the live broadcast virtual gift interaction method in the present application may be presented on an electronic device of a user, and may also be synchronized to an electronic device of another viewer or an electronic device of an anchor through the server 10 for live broadcast viewing.
As shown in fig. 14, the electronic device according to the first embodiment of the present application includes: the device comprises a determination module 211, a display module 212, a detection module 214, a processing module 213, a judgment module 215, a comparison module 216, an acquisition module 217 and an identification module 218.
Wherein the determining module 211 is configured to determine a virtual gift to be gifted and a gift recipient corresponding to at least one live broadcast area. The display module 212 is configured to display a first collision piece corresponding to the virtual gift and a target area corresponding to the gift recipient on the live interface. The processing module 213 is configured to trigger the first impactor to move on the live interface. The display module 212 is further configured to display a preset special effect that the gift recipient receives the virtual gift on the live broadcast interface if the first collision member finally moves to the target area.
Optionally, the display module 212 is further configured to display the first collision member, the second collision member, and the target area on the live interface.
Optionally, the processing module 213 is further configured to trigger the second striker to impact the first striker on the live interface, so that the first striker is moved by impact on the live interface.
Optionally, the display module 212 may be configured to display a guideline on the live interface that passes through the second collision member and the first collision member in sequence and extends to an edge of the live interface. The processing module 213 may be configured to determine an angle of incidence and an angle of reflection at which the second impactor hits the first impactor based on an angle between the guide line and an edge of the live interface. The processing module 213 may also be used to calculate the movement path of the first impactor based on the angle of incidence and the angle of reflection.
Optionally, the processing module 213 may also be used to trigger the second striker to strike the first striker at the incident angle, so as to cause the first striker to move in accordance with the movement path.
Optionally, the determining module 215 may be configured to determine whether the moving path has a track point falling into the target area. If so, the processing module 213 is configured to determine that the first impactor is able to eventually enter the target area and to eventually move the first impactor to the target area along the path of travel.
Optionally, the processing module 213 may also be configured to determine a close trajectory point in the movement path that is closest to the center of the target area. The processing module 213 may also be configured to calculate a distance between the near trajectory point and the center of the target region based on the coordinates of the near trajectory point and the coordinates of the center of the target region.
Optionally, the comparing module 216 may be configured to compare the distance between the near track point and the center of the target area with the radius of the target area to determine whether there is a track point falling into the target area on the moving path.
Optionally, the obtaining module 217 may be configured to obtain the preset special effect when a distance between the near trajectory point and the center of the target area is smaller than or equal to a radius of the target area.
Optionally, the recognition module 218 may be configured to perform face recognition in a live video window corresponding to the gift recipient. The display module 212 may be further configured to display the preset special effect at a preset position on the preset face when the preset face is recognized.
Alternatively, the detection module 214 may be used to detect a moving operation for moving the first striker. The processing module 213 may be configured to move the first impactor to a final position specified by the moving operation in response to the moving operation.
Optionally, the processing module 213 may also be configured to calculate the length of time between a current point in time and the point in time at which the second striker is triggered to impact the first striker. The processing module 213 may also be configured to compare the time length with a preset time length. The processing module 213 may further be configured to, if the time length is greater than the preset time length, perform disappearance processing on the first collision member at the current time point so that it is not displayed on the live interface.
Optionally, the detection module 214 may also be configured to detect an interface selection operation that selects a gift selection interface. In response to the interface selection operation, the processing module 213 displays a gift selection area corresponding to the gift selection interface on the live interface, and a plurality of virtual gifts arranged in an arc are displayed in the gift selection area through the display module 212.
Optionally, the detection module 214 may also be used to detect a gift switching operation. The processing module 213 may also scroll through the virtual gifts displayed in the gift selection area on the display module 212 in a roulette manner according to the switching direction and magnitude corresponding to the gift switching operation.
Alternatively, the determining module 215 may be configured to determine whether the duration of the virtual gift currently located at the preset position is greater than a preset time. The determining module 211 may be configured to determine the virtual gift located at the preset position for a duration greater than a preset time as the virtual gift to be gifted.
Optionally, the detecting module 214 is configured to detect a selection operation of selecting at least one live video window. The determining module 211 is configured to respond to the selection operation, and use a live user corresponding to the live video window specified by the selection operation as a gift-receiving party. Wherein the target area is located within the live video window specified by the selected operation.
Referring to fig. 15, the electronic device according to the second embodiment of the present application includes a processor 221, a memory 222, a display 223, and a communication circuit 224. The memory 222, the communication circuit 224 and the display 223 are respectively coupled to the processor 221. The memory 222 stores program data, and the processor 221 is configured to execute the program data to implement the live-air virtual gift interaction method embodiment as described above.
The communication circuit 224 is used for the electronic device of the present embodiment to communicate with an external device, and the electronic device can transmit data to the external device or receive data from the external device through the communication circuit 224. The display screen 223 is used to implement the process of live room virtual gift interaction. The memory 222 is used for storing program data and may be a RAM, a ROM, or other type of storage device.
The processor 221 is used for controlling the operation of the electronic device, and the processor 221 may also be referred to as a CPU (Central Processing Unit). The processor 221 may be an integrated circuit chip having signal processing capabilities. The processor 221 may also be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor 221 may be any conventional processor or the like.
For detailed explanation of each functional module or component in the first embodiment and the second embodiment of the electronic device of the present application, reference may be made to the explanation in the above embodiment of the live broadcast room virtual gift interaction method of the present application, and details are not described here again.
In the several embodiments provided in the present application, it should be understood that the disclosed electronic device and live broadcast room virtual gift interaction method may be implemented in other ways. For example, the above-described embodiments of the electronic device are merely illustrative: the division of the modules or units is merely a division by logical function, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
Referring to fig. 16, the integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the device with storage function 230. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage device and includes instructions (program data) for causing a computer (which may be a personal computer, the server 10, or a network device) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage device includes media such as a USB flash disk, a portable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, as well as electronic devices such as a computer, a mobile phone, a notebook computer, a tablet computer, or a camera provided with such a storage medium.
For a description of the execution process of the program data in the device with storage function 230, reference may be made to the above embodiments of the live broadcast room virtual gift interaction method of the present application, and details are not repeated here.
To sum up, the above embodiments present the giving of the virtual gift 104 through the visualized moving process of the first collision piece 120, and when the first collision piece 120 moves into the target area 130, the preset special effect is displayed. The giving process is thus more intuitive: the user can clearly see both the process of giving the virtual gift 104 and its recipient, which reduces the probability of giving a gift to the wrong party.
The above description covers only embodiments of the present application and is not intended to limit the scope of the present application. All equivalent structural or process modifications made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, likewise fall within the scope of the present application.

Claims (14)

1. A live broadcast room virtual gift interaction method is characterized in that a live broadcast room is provided with a live broadcast interface, the live broadcast interface is provided with a plurality of live broadcast areas which are arranged at intervals, and the method comprises the following steps:
determining a virtual gift to be presented and a gift recipient corresponding to at least one live broadcast area;
displaying, on the live broadcast interface, a first collision piece and a second collision piece corresponding to the virtual gift and a target area corresponding to the gift recipient;
triggering the second collision piece to strike the first collision piece, so that the first collision piece moves on the live broadcast interface as a result of the collision; and
if the first collision piece finally moves into the target area, displaying, on the live broadcast interface, a preset special effect of the gift recipient receiving the virtual gift.
2. The live broadcast room virtual gift interaction method of claim 1 wherein said displaying a first collision piece corresponding to the virtual gift, a second collision piece, and a target area corresponding to the gift recipient on the live broadcast interface comprises:
displaying a guide line on the live broadcast interface, the guide line passing sequentially through the second collision piece and the first collision piece and extending to an edge of the live broadcast interface.
3. The live broadcast room virtual gift interaction method of claim 2, wherein the displaying of the first collision piece and the second collision piece corresponding to the virtual gift and the target area corresponding to the gift recipient on the live broadcast interface comprises:
determining an incident angle and a reflection angle at which the second collision piece strikes the first collision piece, according to an angle between the guide line and an edge of the live broadcast interface; and
calculating a movement path of the first collision piece based on the incident angle and the reflection angle;
wherein the triggering of the second collision piece to strike the first collision piece so that the first collision piece moves on the live broadcast interface comprises:
triggering the second collision piece to strike the first collision piece at the incident angle, so that the first collision piece moves along the movement path.
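The path computation described in claim 3 amounts to mirror reflection against the edges of the live interface (reflection angle equal to incident angle). A minimal sketch follows; the function name, the step count, and the flat-edge reflection model are illustrative assumptions for this write-up, not the patent's actual implementation:

```python
import math

def movement_path(start, angle_deg, width, height, speed=1.0, n_steps=30):
    """Sketch of the moving path of the first collision piece: it travels
    along the guide-line direction and mirror-reflects off the edges of
    the live interface (reflection angle equals incident angle)."""
    x, y = start
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    path = [(x, y)]
    for _ in range(n_steps):
        x, y = x + vx, y + vy
        if x < 0:            # bounced off the left edge
            x, vx = -x, -vx
        elif x > width:      # bounced off the right edge
            x, vx = 2 * width - x, -vx
        if y < 0:            # bounced off one horizontal edge
            y, vy = -y, -vy
        elif y > height:     # bounced off the other horizontal edge
            y, vy = 2 * height - y, -vy
        path.append((x, y))
    return path

# Launch from (1, 1) at 30 degrees inside an 8x6 interface.
path = movement_path((1.0, 1.0), 30.0, 8.0, 6.0)
```

The trajectory points returned here are the same points that claims 4 and 5 later test against the target area.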
4. The live broadcast room virtual gift interaction method of claim 3, wherein the first collision piece finally moving into the target area comprises:
determining whether the movement path contains a trajectory point falling into the target area; and
if so, determining that the first collision piece finally enters the target area when it moves to the trajectory point falling into the target area.
5. The live broadcast room virtual gift interaction method of claim 4, wherein the target area is circular, and the determining whether the movement path contains a trajectory point falling into the target area comprises:
determining the near trajectory point of the movement path that is closest to the center of the target area;
calculating the distance between the near trajectory point and the center of the target area based on the coordinates of the near trajectory point and the coordinates of the center of the target area; and
comparing the distance between the near trajectory point and the center of the target area with the radius of the target area, so as to determine whether the movement path contains a trajectory point falling into the target area.
6. The live broadcast room virtual gift interaction method of claim 5, wherein, before the displaying of the preset special effect of the gift recipient receiving the virtual gift on the live broadcast interface, the method comprises:
when the distance between the near trajectory point and the center of the target area is smaller than or equal to the radius of the target area, acquiring the preset special effect corresponding to the virtual gift.
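The hit test of claims 4-6 reduces to a nearest-point-to-circle check: find the trajectory point closest to the target center and compare its distance with the radius. A small sketch (function and variable names are illustrative):

```python
import math

def hits_target(path, center, radius):
    """Sketch of claims 4-6: find the trajectory point of the moving path
    nearest to the center of the circular target area, then compare its
    distance from the center with the radius of the target area."""
    near = min(path, key=lambda p: math.dist(p, center))
    return math.dist(near, center) <= radius

# A path that skims a target of radius 1.0 centered at (5, 0):
print(hits_target([(0, 2), (2.5, 1), (5, 0.5), (7.5, 1)], (5, 0), 1.0))  # True
print(hits_target([(0, 3), (2.5, 3), (5, 3)], (5, 0), 1.0))              # False
```

When the check succeeds, per claim 6, the preset special effect corresponding to the virtual gift is acquired for display.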
7. The live broadcast room virtual gift interaction method of claim 6, wherein the live broadcast area is used to present a live video picture of a corresponding live broadcast user, and the displaying of the preset special effect of the gift recipient receiving the virtual gift on the live broadcast interface comprises:
performing face recognition in the live broadcast area corresponding to the gift recipient; and
when a preset face is recognized, displaying the preset special effect on a preset part of the preset face.
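Claim 7 anchors the special effect to a preset part of a recognized face. The face recognition itself is outside the scope of this sketch; assuming a detector has already produced a face bounding box, the placement step could look like the following (the part-to-offset table and all names are hypothetical):

```python
def effect_anchor(face_box, part="forehead"):
    """Sketch of the placement step in claim 7: given the bounding box of
    a recognized preset face as (x, y, width, height), return the point
    at which the preset special effect is drawn.  The offset table maps a
    preset facial part to a fractional position within the box and is
    purely illustrative."""
    offsets = {"forehead": (0.5, 0.15), "nose": (0.5, 0.55), "chin": (0.5, 0.9)}
    fx, fy = offsets[part]
    x, y, w, h = face_box
    return (x + fx * w, y + fy * h)

print(effect_anchor((100, 50, 80, 120), "nose"))  # (140.0, 116.0)
```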
8. The live broadcast room virtual gift interaction method of claim 2, wherein, after the displaying of the first collision piece, the second collision piece, and the target area on the live broadcast interface, the method comprises:
detecting a moving operation for moving the first collision piece; and
in response to the moving operation, moving the first collision piece to a final position specified by the moving operation.
9. The live broadcast room virtual gift interaction method of claim 1, wherein, after the triggering of the second collision piece to strike the first collision piece, the method comprises:
calculating the length of time between the current time point and the time point at which the second collision piece was triggered;
comparing the length of time with a preset duration; and
if the length of time is greater than the preset duration, performing disappearance processing on the first collision piece at the current time point so that it is no longer displayed on the live broadcast interface.
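The disappearance rule of claim 9 is a simple elapsed-time comparison. A minimal sketch, with all names and the use of second-valued timestamps as illustrative assumptions:

```python
def first_piece_visible(trigger_time_s, now_s, preset_duration_s):
    """Sketch of claim 9: the first collision piece stays displayed only
    while the time elapsed since the second collision piece was triggered
    does not exceed the preset duration; afterwards it is hidden."""
    return (now_s - trigger_time_s) <= preset_duration_s

print(first_piece_visible(10.0, 14.0, 5.0))  # True: still displayed
print(first_piece_visible(10.0, 16.0, 5.0))  # False: disappearance processed
```

In a real client the current time would come from a monotonic clock (for example `time.monotonic()`) rather than being passed in explicitly.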
10. The live broadcast room virtual gift interaction method of claim 1, wherein a gift selection interface arranged at an interval from the live broadcast areas is further displayed on the live broadcast interface, and before the determining of the virtual gift to be presented and the gift recipient corresponding to at least one of the live broadcast areas, the method comprises:
detecting an interface selection operation for selecting the gift selection interface, displaying, in response to the interface selection operation, a gift selection area corresponding to the gift selection interface on the live broadcast interface, and displaying a plurality of virtual gifts arranged in an arc in the gift selection area; and
detecting a gift switching operation, and scrolling the virtual gifts displayed in the gift selection area in a wheel manner according to the switching direction and amplitude corresponding to the gift switching operation;
wherein the determining of the virtual gift to be presented and the gift recipient corresponding to at least one of the live broadcast areas comprises:
determining whether the duration for which the virtual gift currently located at a preset position has stayed at the preset position is longer than a preset time, wherein the preset position is on the arc-shaped arrangement path of the plurality of virtual gifts displayed in the gift selection area;
if so, determining the virtual gift currently located at the preset position as the virtual gift to be presented;
detecting a selection operation selecting at least one of the plurality of live broadcast areas; and
in response to the selection operation, taking the live broadcast user corresponding to the live broadcast area specified by the selection operation as the gift recipient, wherein the target area is displayed in the live broadcast area specified by the selection operation.
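The determination step of claim 10 is a dwell-time check: the gift sitting at the preset position on the arc wheel is only confirmed once it has stayed there longer than the preset time. A sketch under the assumption that the wheel logic records when the current gift entered the preset position (all names are illustrative):

```python
def gift_to_present(gift_at_preset, entered_at_s, now_s, preset_time_s):
    """Sketch of claim 10's determination step: the virtual gift currently
    at the preset position on the arc-shaped wheel becomes the gift to be
    presented only once its dwell time there exceeds the preset time;
    otherwise no gift has been selected yet."""
    if now_s - entered_at_s > preset_time_s:
        return gift_at_preset
    return None

print(gift_to_present("rose", entered_at_s=0.0, now_s=2.5, preset_time_s=2.0))  # rose
print(gift_to_present("rose", entered_at_s=0.0, now_s=1.0, preset_time_s=2.0))  # None
```

This debouncing keeps a gift from being committed while the user is still scrolling the wheel, which is consistent with the stated goal of reducing mistaken gift giving.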
11. The live broadcast room virtual gift interaction method of claim 10, wherein the first collision piece and the second collision piece are each circular; the second collision piece is displayed adjacent to the preset position, the first collision piece is displayed at a random position on the live broadcast interface, and the first collision piece and the second collision piece do not overlap.
12. An electronic device capable of displaying a live broadcast interface having a plurality of live broadcast areas arranged at intervals, the electronic device comprising:
a determining module, configured to determine a virtual gift to be presented and a gift recipient corresponding to at least one live broadcast area;
a display module, configured to display, on the live broadcast interface, a first collision piece and a second collision piece corresponding to the virtual gift and a target area corresponding to the gift recipient; and
a processing module, configured to trigger the second collision piece to strike the first collision piece so that the first collision piece moves on the live broadcast interface;
wherein the display module is further configured to display, on the live broadcast interface, a preset special effect of the gift recipient receiving the virtual gift if the first collision piece finally moves into the target area.
13. An electronic device comprising a processor, a memory, a communication circuit, and a display screen, the memory, the communication circuit, and the display screen being respectively coupled to the processor, the memory storing program data, and the processor being configured to execute the program data to implement the live broadcast room virtual gift interaction method of any one of claims 1-11.
14. An apparatus having a storage function, characterized in that program data is stored thereon, the program data being executable to implement the live broadcast room virtual gift interaction method of any one of claims 1-11.
CN201911136198.5A 2019-11-19 2019-11-19 Live broadcast room virtual gift interaction method, electronic equipment and device Active CN111083505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911136198.5A CN111083505B (en) 2019-11-19 2019-11-19 Live broadcast room virtual gift interaction method, electronic equipment and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911136198.5A CN111083505B (en) 2019-11-19 2019-11-19 Live broadcast room virtual gift interaction method, electronic equipment and device

Publications (2)

Publication Number Publication Date
CN111083505A CN111083505A (en) 2020-04-28
CN111083505B true CN111083505B (en) 2021-12-28

Family

ID=70311192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911136198.5A Active CN111083505B (en) 2019-11-19 2019-11-19 Live broadcast room virtual gift interaction method, electronic equipment and device

Country Status (1)

Country Link
CN (1) CN111083505B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541909A (en) * 2020-04-30 2020-08-14 广州华多网络科技有限公司 Panoramic live broadcast gift delivery method, device, equipment and storage medium
CN111526412A (en) * 2020-04-30 2020-08-11 广州华多网络科技有限公司 Panoramic live broadcast method, device, equipment and storage medium
CN111901624A (en) * 2020-08-06 2020-11-06 广州虎牙科技有限公司 Live broadcast display method and device, electronic equipment and storage medium
CN112891944B (en) * 2021-03-26 2022-10-25 腾讯科技(深圳)有限公司 Interaction method and device based on virtual scene, computer equipment and storage medium
CN113126875B (en) * 2021-04-21 2022-08-16 广州博冠信息科技有限公司 Virtual gift interaction method and device, computer equipment and storage medium
CN113438490A (en) * 2021-05-27 2021-09-24 广州方硅信息技术有限公司 Live broadcast interaction method, computer equipment and storage medium
CN116521038A (en) * 2022-01-24 2023-08-01 腾讯科技(深圳)有限公司 Data processing method, device, computer equipment and readable storage medium
CN115767117A (en) * 2022-10-26 2023-03-07 腾讯音乐娱乐科技(深圳)有限公司 Method, equipment and storage medium for live broadcast interactive operation

Citations (3)

Publication number Priority date Publication date Assignee Title
CN105373306A (en) * 2015-10-13 2016-03-02 广州酷狗计算机科技有限公司 Virtual goods presenting method and device
CN108260021A (en) * 2018-03-08 2018-07-06 乐蜜有限公司 Living broadcast interactive method and apparatus
CN109194973A (en) * 2018-09-26 2019-01-11 广州华多网络科技有限公司 A kind of more main broadcaster's direct broadcasting rooms give the methods of exhibiting, device and equipment of virtual present

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN104104703B (en) * 2013-04-09 2018-02-13 广州华多网络科技有限公司 More people's audio-video-interactive method, client, server and systems
CN108024134B (en) * 2017-11-08 2020-01-21 北京密境和风科技有限公司 Live broadcast-based data analysis method and device and terminal equipment
CN107784524A (en) * 2017-11-08 2018-03-09 上海壹账通金融科技有限公司 Exchange method, electric terminal and the computer-readable recording medium of live platform
CN116761007A (en) * 2017-12-29 2023-09-15 广州方硅信息技术有限公司 Method for giving virtual gift to multicast live broadcasting room and electronic equipment
CN109104641B (en) * 2018-09-29 2021-02-12 广州方硅信息技术有限公司 Method and device for presenting virtual gift in multi-main broadcast live broadcast room

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN105373306A (en) * 2015-10-13 2016-03-02 广州酷狗计算机科技有限公司 Virtual goods presenting method and device
CN108260021A (en) * 2018-03-08 2018-07-06 乐蜜有限公司 Living broadcast interactive method and apparatus
CN109194973A (en) * 2018-09-26 2019-01-11 广州华多网络科技有限公司 A kind of more main broadcaster's direct broadcasting rooms give the methods of exhibiting, device and equipment of virtual present

Also Published As

Publication number Publication date
CN111083505A (en) 2020-04-28

Similar Documents

Publication Publication Date Title
CN111083505B (en) Live broadcast room virtual gift interaction method, electronic equipment and device
CN111225226B (en) Interactive method, electronic equipment and device for presenting virtual gift
JP7326328B2 (en) Power Management for Optical Position Tracking Devices
JP4387242B2 (en) PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
US10257423B2 (en) Method and system for determining proper positioning of an object
US10015402B2 (en) Electronic apparatus
US8737693B2 (en) Enhanced detection of gesture
US8549418B2 (en) Projected display to enhance computer device use
EP2594895B1 (en) Object position and orientation detection system
US11364439B2 (en) Game processing system, game processing program, and game processing method
US9690475B2 (en) Information processing apparatus, information processing method, and program
KR101156324B1 (en) Game device, computer-readable recording medium having game control program stored thereon and game control method
US20130055143A1 (en) Method for manipulating a graphical user interface and interactive input system employing the same
US20120268369A1 (en) Depth Camera-Based Relative Gesture Detection
US9317171B2 (en) Systems and methods for implementing and using gesture based user interface widgets with camera input
JP5598703B2 (en) Program, game device, and control method thereof
CN102436327B (en) Screen input system and implementation method thereof
Schwaller et al. Pointing in the air: measuring the effect of hand selection strategies on performance and effort
CN111580652A (en) Control method and device for video playing, augmented reality equipment and storage medium
CN110413187B (en) Method and device for processing annotations of interactive intelligent equipment
JP2024517367A (en) Apparatus and method for implementing user interface for live auction
Hürst et al. Drawing outside the lines: Tracking-based gesture interaction in mobile augmented entertainment
US11188206B2 (en) Information processing apparatus and information processing method
WO2016085498A1 (en) Virtual representation of a user portion
WO2020132783A1 (en) Control method for self-service device and self-service device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210114

Address after: 511442 3108, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 511449 28th floor, block B1, Wanda Plaza, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Applicant before: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant