WO2023279745A1 - Interaction method and apparatus based on a playback object - Google Patents

Interaction method and apparatus based on a playback object

Info

Publication number
WO2023279745A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
interaction data
live
virtual space
anchor
Prior art date
Application number
PCT/CN2022/079584
Other languages
English (en)
French (fr)
Inventor
陆勇
Original Assignee
北京达佳互联信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京达佳互联信息技术有限公司 filed Critical 北京达佳互联信息技术有限公司
Publication of WO2023279745A1 publication Critical patent/WO2023279745A1/zh

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L21/00 Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
    • G10L21/02 Speech enhancement, e.g. noise reduction or echo cancellation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Definitions

  • the present disclosure relates to the technical field of the Internet, and in particular to an interactive method, device, electronic equipment and storage medium based on playing objects.
  • Audio and video live streaming has been widely used on mobile devices, including one-way live content viewing and two-way real-time audio and video communication. These applications bring users convenient access to live broadcast content and real-time communication capabilities in mobile scenarios.
  • the present disclosure provides an interactive method, device, electronic equipment and storage medium based on playing objects.
  • an interaction method based on a playback object including:
  • the user interaction data is played in the virtual space; the user interaction data is generated based on the interaction behavior of the anchor and the guests;
  • the live broadcast data and the user interaction data are played synchronously.
  • the interactive method based on the playback object also includes:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user interaction data are played synchronously.
  • the playback page includes a playback speed adjustment control, and in response to detecting the playback speed adjustment instruction, adjusting the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction includes:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • adjusting the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction includes:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the user interaction data includes the user's voice interaction data, and adjusting the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user interaction data are played synchronously includes:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user's voice interaction data are played synchronously.
  • an interaction method based on a playback object including:
  • the virtual space created based on the first user account is displayed on the playback page; the virtual space includes live data playback controls;
  • the guest interaction data is played in the virtual space.
  • the interactive method based on the playback object also includes:
  • User interaction data is provided to the server.
  • the host interaction data after eliminating the first echo data includes the host's voice interaction data, and the guest interaction data includes the guest's voice interaction data;
  • the voice interaction data of the anchor and the voice interaction data of the guests are merged to obtain the voice interaction data of the user.
  • the interactive method based on the playback object also includes:
  • the anchor interaction data and the guest interaction data after the first echo data are eliminated are aligned to obtain user interaction data
  • User interaction data is provided to the server.
  • the interactive method based on the playback object also includes:
  • Echo cancellation processing is performed on the anchor interaction data in the user interaction data based on the live data to obtain the user interaction data after the first echo data is eliminated;
  • the user interaction data after the first echo data is eliminated is provided to the server.
  • the first echo data is generated by the sound collection device collecting the live data.
  • an interaction method based on a playback object including:
  • the interactive method based on the playback object also includes:
  • the second echo data is generated by the sound collection device collecting the live data.
  • an interactive device based on a playback object including:
  • a display module configured to display the virtual space created based on the first user account on the play page
  • the first playback module is configured to play the user interaction data in the virtual space in response to detecting that the playback data corresponding to the virtual space contains live data and user interaction data; the user interaction data is data generated based on the interaction behavior of the anchor and the guests;
  • the second playback module is configured to call the live broadcast player to play the live data in the virtual space
  • the live broadcast data and the user interaction data are played synchronously.
  • the playback object-based interactive device also includes:
  • the speed adjustment module is configured to adjust the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction in response to the detection of the playback speed adjustment instruction, so that the live data and the user interaction data are played synchronously.
  • the playback page includes a playback speed adjustment control
  • the speed adjustment module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the speed regulation module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the user interaction data includes voice interaction data of the user
  • the speed adjustment module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user's voice interaction data are played synchronously.
  • an interactive device based on a playback object including:
  • the display module is configured to display a virtual space created based on the first user account on the playback page; the virtual space includes a live data playback control;
  • the first playing module is configured to call the live player to play the live data in the virtual space in response to detecting the live data play instruction triggered by the live data play control;
  • the second playing module is configured to play the guest interaction data in the virtual space in response to detecting the guest interaction data sent by the guest terminal.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the anchor interaction data, perform echo cancellation processing on the anchor interaction data based on the live broadcast data, and obtain the anchor interaction data after the first echo data is eliminated;
  • the merging module is configured to perform merging processing on the host interaction data and the guest interaction data after the first echo data is eliminated, to obtain user interaction data;
  • the sending module is configured to provide user interaction data to the server.
  • the host interaction data after eliminating the first echo data includes the host's voice interaction data, and the guest interaction data includes the guest's voice interaction data;
  • the merging module is configured to:
  • the voice interaction data of the anchor and the voice interaction data of the guests are merged to obtain the voice interaction data of the user.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the anchor interaction data, perform echo cancellation processing on the anchor interaction data based on the live broadcast data, and obtain the anchor interaction data after the first echo data is eliminated;
  • the timestamp determination module is configured to determine the first timestamp of the anchor interaction data after the first echo data is eliminated; determine the second timestamp of the guest interaction data;
  • the data acquisition module is configured to perform alignment processing on the anchor interaction data and the guest interaction data after the first echo data is eliminated based on the first timestamp and the second timestamp, to obtain user interaction data;
  • the sending module is configured to provide user interaction data to the server.
  • the playback object-based interactive device also includes:
  • the recording module is configured to record the played guest interaction data and anchor interaction data to obtain user interaction data
  • the echo processing module is configured to perform echo cancellation processing on the anchor interaction data in the user interaction data based on the live broadcast data, and obtain the user interaction data after the first echo data is eliminated;
  • the sending module is configured to provide the user interaction data after the first echo data is eliminated to the server.
  • the first echo data is generated by the sound collection device collecting the live data.
  • an interactive device based on a playback object including:
  • a display module configured to display the virtual space created based on the first user account on the play page
  • the first playback module is configured to call a live player to play the live data in the virtual space in response to detecting the live data corresponding to the virtual space;
  • the second playing module is configured to play the anchor interaction data in the virtual space in response to detecting the anchor interaction data sent by the anchor terminal.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the guest interaction data, perform echo cancellation processing on the guest interaction data based on the live broadcast data, and obtain the guest interaction data after the second echo data is eliminated;
  • the sending module is configured to send the guest interaction data after the second echo data is eliminated to the server.
  • the second echo data is generated by the sound collection device collecting the live data.
  • an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to execute the instructions so as to implement the method of any one of the above-mentioned first aspect, second aspect, or third aspect.
  • a non-volatile computer-readable storage medium; when the instructions in the computer-readable storage medium are executed by the processor of the electronic device, the electronic device can execute the method of any one of the first aspect, the second aspect, or the third aspect of the embodiments of the present disclosure.
  • a computer program product includes a computer program, the computer program being stored in a readable storage medium; when at least one processor of a computer device reads and executes the computer program from the readable storage medium, the computer device is enabled to execute the method of any one of the first aspect, the second aspect, or the third aspect of the embodiments of the present disclosure.
  • the virtual space created based on the first user account is displayed on the playing page; in response to detecting that the playing data corresponding to the virtual space contains live data and user interaction data, the user interaction data is played in the virtual space, the user interaction data being data generated based on the interaction behavior of the anchor and the guests; and the live broadcast player is called to play the live data in the virtual space, wherein the live data and the user interaction data are played synchronously.
  • in this way, the embodiments of the present application can improve the interaction efficiency between users through the user interaction data, and at the same time avoid delay between the two kinds of data by playing the live data and the user interaction data synchronously.
  • Fig. 1 is a schematic diagram showing an application environment according to an exemplary embodiment
  • Fig. 2 is a schematic diagram showing an application environment according to an exemplary embodiment
  • Fig. 3 is a flowchart of an interaction method based on playing objects according to an exemplary embodiment
  • Fig. 4 is a flowchart of an interaction method based on playing objects according to an exemplary embodiment
  • Fig. 5 is a flowchart of an interaction method based on playing objects according to an exemplary embodiment
  • Fig. 6 is a flowchart of an interaction method based on playing objects according to an exemplary embodiment
  • Fig. 7 is a block diagram of an interactive device based on playing objects according to an exemplary embodiment
  • Fig. 8 is a block diagram of an interactive device based on playing objects according to an exemplary embodiment
  • Fig. 9 is a block diagram of an interactive device based on playing objects according to an exemplary embodiment
  • Fig. 10 is a block diagram showing an electronic device for recommendation according to an exemplary embodiment.
  • FIG. 1 is a schematic diagram of an application environment of an interaction method based on a playback object according to an exemplary embodiment.
  • the application environment may include an audience client 011, an anchor client 021, a guest client 031, and a server 041.
  • the above-mentioned audience client 011 displays the virtual space created based on the first user account on the playing page; in response to detecting that the playing data corresponding to the virtual space contains live data and user interaction data, it plays the user interaction data in the virtual space, the user interaction data being generated based on the interaction between the anchor and the guests; and it calls the live broadcast player to play the live data in the virtual space, where the live data and the user interaction data are played synchronously.
  • the above-mentioned viewer client 011 may include, but is not limited to, electronic devices such as smartphones, desktop computers, tablet computers, notebook computers, smart speakers, digital assistants, augmented reality (AR)/virtual reality (VR) devices, and smart wearable devices.
  • the operating system running on the electronic device may include, but is not limited to, Android, iOS, Linux, Windows, Unix, and so on.
  • the above-mentioned anchor client 021 displays a virtual space created based on the first user account on the play page; the virtual space includes a live data play control; in response to detecting the live data play instruction triggered by the live data play control, Invoking the live broadcast player to play the live data in the virtual space; in response to detecting the guest interaction data sent by the guest terminal, playing the guest interaction data in the virtual space.
  • the above-mentioned anchor client 021 may include, but is not limited to, electronic devices such as smartphones, desktop computers, tablet computers, notebook computers, smart speakers, digital assistants, augmented reality (AR)/virtual reality (VR) devices, and smart wearable devices.
  • the operating system running on the electronic device may include, but is not limited to, Android, iOS, Linux, Windows, Unix, and so on.
  • the above-mentioned guest client 031 displays the virtual space created based on the first user account on the play page; in response to detecting the live data corresponding to the virtual space, call the live player to play the live data in the virtual space; respond After detecting the anchor interaction data sent by the anchor terminal, the anchor interaction data is played in the virtual space.
  • the above-mentioned guest client 031 may include, but is not limited to, electronic devices such as smartphones, desktop computers, tablet computers, notebook computers, smart speakers, digital assistants, augmented reality (AR)/virtual reality (VR) devices, and smart wearable devices. It may also be software running on the aforementioned electronic devices, such as application programs, applets, and the like.
  • the operating system running on the electronic device may include, but is not limited to, Android, iOS, Linux, Windows, Unix, and so on.
  • the server 041 can be a content distribution device built between the above-mentioned clients; the server 041 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, CDN (Content Delivery Network), and big data and artificial intelligence platforms.
  • FIG. 1 is only one application environment of the playback object-based interaction method provided by the present disclosure; in actual applications, other application environments may also be included.
  • FIG. 2 is a schematic diagram of an application environment of an interaction method based on a playback object according to an exemplary embodiment; as shown in FIG. 2, the application environment may include an audience client 012, an anchor client 022, and a guest client 032.
  • Fig. 3 is a flow chart of an interaction method based on a playback object according to an exemplary embodiment.
  • the interaction method based on a playback object can be applied to the audience client and includes the following steps: step S301, displaying the virtual space created based on the first user account on the playback page; step S303, in response to detecting that the playing data corresponding to the virtual space contains live data and user interaction data, playing the user interaction data in the virtual space, the user interaction data being data generated based on the interaction behavior of the anchor and the guests; step S305, calling the live broadcast player to play the live data in the virtual space, wherein the live data and the user interaction data are played synchronously.
  • step S301 the virtual space created based on the first user account is displayed on the play page.
  • steps S301 to S305 are described with the viewer client as the execution subject.
  • the virtual space in the text may be a live broadcast room on a certain live broadcast platform.
  • the anchor can use the first user account to log in to the application corresponding to the live streaming platform on the anchor's own client (such as a mobile phone), and create a virtual space corresponding to the first user account (that is, the live broadcast room) in the application
  • the virtual space (live room) can be entered by querying the first user account.
  • the viewer client can also display a page containing the virtual space, and enter the virtual space (live room) based on a detected entry instruction for the virtual space.
  • the audience client can also enter the virtual space (live room) based on the received virtual space sharing information. In this way, the viewer client can display the virtual space created based on the first user account on the playing page.
  • the application program corresponding to the live broadcast platform may be a music application program, a short video application program, a social application program and the like.
  • the above-mentioned first user account may be the account information of the anchor on the live broadcast platform, such as a unique identification code assigned by the platform, a mobile phone number, an email address, and a nickname of the anchor.
  • the above-mentioned second user account may be account information of the viewer on the live broadcast platform, such as a unique identification code assigned by the platform, a mobile phone number, an email address, and a nickname of the viewer.
  • the guest's user account can be called the third user account, which is the guest's account information on the live broadcast platform, including a unique identification code assigned by the platform, a mobile phone number, an email address, and the guest's nickname. Since the virtual space is created by the anchor through the anchor's own client, there is only one anchor or moderator in the virtual space. In some embodiments, preset identification information of the anchor user may be included on the playing page.
  • the above-mentioned preset identification information may be used to distinguish the first user account, the third user account of the invited guests, and the second user account of ordinary viewers entering the virtual space.
  • the preset identification information may be used to represent the identity of the anchor, and may also be used to represent the identity of the moderator.
  • step S303 in response to detecting that the broadcast data corresponding to the virtual space contains live data and user interaction data, the user interaction data is played in the virtual space; the user interaction data is generated based on the interaction behaviors of the host and guests.
  • the above-mentioned live data may be real-time live data in the current outside world.
  • for example, it may be live data of currently ongoing sports events, that is, real-time event live data (such as live data of football games, live data of basketball games, live data of marathon races, etc.), or it may be live data of currently ongoing artistic performances (such as live data of concerts, live data of dance teaching, live data of a large-scale theatrical performance, etc.).
  • the currently ongoing live data of sports events and the currently ongoing live data of artistic performances are merely some examples listed in this application; of course, other live data currently being broadcast may also be included.
  • the above-mentioned live broadcast data may be non-real-time play data in the current outside world.
  • for example, it can be recorded-and-broadcast event playback data (playback data of football games, basketball games, marathon races, etc.), or recorded-and-broadcast artistic performance playback data (playback data of concerts, dance teaching, a large-scale theatrical performance, etc.).
  • the embodiment of this application will take the real-time live event data as an example for illustration, and other live data implementation methods can refer to the real-time event live data implementation mode, which will not be repeated here.
  • the user interaction data may be data generated based on the interaction behavior between the host and the guests.
  • the interaction data between the anchor and the guests can be generated through interactive behaviors in the form of text, through interactive behaviors in the form of emoticons, or through interactive behaviors in the form of voice.
  • step S305 the live broadcast player is invoked to play the live data in the virtual space; wherein, the live data and the user interaction data are played synchronously.
  • the broadcast page of the anchor client may include a live data playback control; in response to the anchor client detecting the live data playback instruction triggered by the live data playback control, the live broadcast player can be called to play the live data in the virtual space of the anchor client, and at the same time the live data can be pushed to the audience client.
  • in response to the viewer client detecting the live data corresponding to the virtual space, the live broadcast player is invoked to play the live data in the virtual space of the viewer client.
  • the live broadcast data and the user interaction data are played synchronously.
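To make the viewer-side flow of steps S301 to S305 concrete, the following is a minimal sketch in Python of how a viewer client might handle playback data that contains both live data and user interaction data. The class and field names (ViewerClient, PlaybackData, LivePlayer, interaction_offset_ms) are illustrative assumptions and are not defined by the disclosure.

```python
# Hypothetical sketch of the viewer-client flow described in steps S301-S305.
# Names such as LivePlayer, PlaybackData, and ViewerClient are illustrative only;
# they are not part of the disclosure or of any real SDK.

from dataclasses import dataclass

@dataclass
class PlaybackData:
    live_stream_url: str          # live data pulled for the virtual space
    interaction_audio: bytes      # user interaction data (anchor/guest voice mix)
    interaction_offset_ms: int    # offset of the interaction data relative to the live data

class LivePlayer:
    def play(self, url: str, delay_ms: int = 0) -> None:
        print(f"playing live stream {url} with {delay_ms} ms start delay")

class ViewerClient:
    def __init__(self) -> None:
        self.live_player = LivePlayer()

    def on_playback_data(self, data: PlaybackData) -> None:
        # Step S303: the playback data contains both live data and user
        # interaction data, so play the interaction audio in the virtual space.
        self.play_interaction_audio(data.interaction_audio)
        # Step S305: invoke the live player; delaying the live stream by the
        # interaction offset keeps the two kinds of data played synchronously.
        self.live_player.play(data.live_stream_url, delay_ms=data.interaction_offset_ms)

    def play_interaction_audio(self, audio: bytes) -> None:
        print(f"playing {len(audio)} bytes of user interaction audio")

if __name__ == "__main__":
    client = ViewerClient()
    client.on_playback_data(PlaybackData("rtmp://example/live", b"\x00" * 960, 120))
```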
  • Fig. 4 is a flowchart of an interaction method based on a playback object according to an exemplary embodiment. As shown in Fig. 4, the interaction method based on a playback object further includes:
  • Step S401 in response to detecting the playback speed adjustment instruction, adjust the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and user interaction data are played synchronously.
  • the playback page includes a playback speed adjustment control.
  • in response to detecting the playback speed adjustment instruction triggered by the playback speed adjustment control, the playback speed of the live data can be adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the viewer client receives the playback speed adjustment instruction sent by the server, and adjusts the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction.
  • in this way, the live data and the user interaction data are played synchronously, and the playback speed of the live data can be adjusted through playback speed adjustment instructions from different sources, which increases the flexibility of the solution.
  • the user interaction data includes the user's voice interaction data
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user interaction data are played synchronously.
  • the terminal can adjust the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment command, so that the live data and the user's voice interaction data can be played synchronously.
  • the guest client and anchor client can communicate based on low-latency audio services.
  • this will cause the user interaction data to arrive at the audience client later than the live data, resulting in the user interaction data and the live data being out of sync. Therefore, the speed adjustment information carried in the playback speed adjustment instruction may indicate that the live data is to be delayed by n microseconds, where n can be obtained based on empirical values or debugging values.
  • the above playback speed adjustment instruction may be based on a Manipulable Delayed Live Stream (MDLS) technique, so as to ensure that the live content and the user interaction data are played synchronously.
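As a rough illustration of the speed adjustment described above, the sketch below assumes the speed adjustment information simply carries the measured lag between the live data and the user interaction data, and converts it into a clamped playback rate; the catch-up window and rate bounds are invented for the example and are not specified by the disclosure.

```python
# Minimal sketch of delay-driven playback-speed adjustment, assuming the speed
# adjustment information is the measured lag (in ms) between the live data and
# the user interaction data. The window size and rate bounds are illustrative.

def adjusted_playback_rate(lag_ms: float, catchup_window_ms: float = 5000.0,
                           min_rate: float = 0.75, max_rate: float = 1.25) -> float:
    """Return the live-stream playback rate needed to absorb `lag_ms`.

    A positive lag means the live data is ahead of the interaction data, so the
    live stream is slowed down; a negative lag means it has fallen behind, so it
    is sped up. The rate is clamped so the change stays barely noticeable.
    """
    rate = 1.0 - lag_ms / catchup_window_ms
    return max(min_rate, min(max_rate, rate))

# Example: interaction data arrives 200 ms after the live data, so the viewer
# client slows the live stream slightly until the two are aligned again.
print(adjusted_playback_rate(200.0))   # 0.96
print(adjusted_playback_rate(-300.0))  # 1.06
```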
  • Fig. 5 is a flowchart of an interaction method based on a playback object according to an exemplary embodiment.
  • the interaction method based on a playback object can be applied to the anchor client and includes the following steps: step S501, displaying the virtual space created based on the first user account on the playback page, the virtual space including a live data playback control; step S503, in response to detecting the live data playback instruction triggered by the live data playback control, calling the live player to play the live data in the virtual space; step S505, in response to detecting the guest interaction data sent by the guest terminal, playing the guest interaction data in the virtual space.
  • step S501 the virtual space created based on the first user account is displayed on the playing page; the virtual space includes live data playing controls.
  • steps S501 to S505 are described with the host client as the execution subject.
  • the virtual space in the text may be a live broadcast room on a certain live broadcast platform.
  • the anchor can use the first user account to log in to the application corresponding to the live streaming platform on the anchor's own client (such as a mobile phone), and create a virtual space (that is, the live broadcast room) corresponding to the first user account in the application; in this way, the anchor client can display the virtual space based on the first user account on the playback page.
  • the virtual space may contain a live data playback control.
  • the live data playback control can be directly displayed on the playback page.
  • in some embodiments, the live data playback control is hidden on the playback page; in response to the anchor client detecting a display instruction for the live data playback control (for example, detecting that the screen of the anchor client is touched), the live data playback control can be displayed on the playback page.
  • step S503 in response to detecting the live data play instruction triggered by the live data play control, the live player is called to play the live data in the virtual space.
  • in response to the anchor client detecting the live data play instruction triggered by the live data play control, the live broadcast player can be invoked to play the live data in the virtual space, so that the live data can be displayed on the play page.
  • the above-mentioned live broadcast player may be a player built into the application program that can be called, or it may be another player on the anchor client.
  • the live data play instruction triggered by the live data play control can be embodied as follows: when the live data play control on the anchor client is touched, the anchor client can jump to a live data display page, and in some embodiments the live data display page may include a plurality of live data items; after one of the live data items is selected, the live player may be invoked and the anchor client jumps back to the playback page to play the selected live data in the virtual space.
  • the live data may be real-time live data in the current outside world.
  • for example, it may be live data of currently ongoing sports events, that is, real-time event live data (such as live data of football games, live data of basketball games, live data of marathon races, etc.), or it may be live data of currently ongoing artistic performances (such as live data of concerts, live data of dance teaching, live data of a large-scale theatrical performance, etc.).
  • the currently ongoing live data of sports events and the currently ongoing live data of artistic performances are merely some examples listed in this application; of course, other live data currently being broadcast may also be included.
  • the above-mentioned live broadcast data may be non-real-time play data in the current outside world.
  • for example, it can be recorded-and-broadcast event playback data (playback data of football games, basketball games, marathon races, etc.), or recorded-and-broadcast artistic performance playback data (playback data of concerts, dance teaching, a large-scale theatrical performance, etc.).
  • step S505 in response to detecting the guest interaction data sent by the guest terminal, play the guest interaction data in the virtual space.
  • the embodiment of the present application mainly addresses the application scenario in which the guest interacts with the anchor based on the live data played in the virtual space. Therefore, in response to the anchor client detecting the guest interaction data sent by the guest client, the guest interaction data can be played in the virtual space.
  • the anchor client may also receive the anchor interaction data. Since the anchor interaction data is generated from the anchor's interactive behavior with respect to the live data played in the virtual space, it carries the live data. Therefore, in response to detecting the anchor interaction data, the anchor client may perform echo cancellation processing on the anchor interaction data based on the live data, to obtain anchor interaction data after the first echo data is eliminated. This is because the first echo data can be generated by the sound collection device picking up the live data; in some embodiments, the sound collection device may be a microphone. Subsequently, the anchor client may merge the anchor interaction data after the first echo data has been eliminated with the guest interaction data to obtain user interaction data, and provide the user interaction data to the server, which sends it to the viewers (viewer clients). The guest interaction data here may be guest interaction data from which the second echo data has been eliminated by the guest client.
  • the anchor client may also send the anchor interaction data after the first echo data is eliminated to the guest client, or send it to the guest client through the server.
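The echo cancellation step can be pictured as an adaptive filter that uses the live data played out by the anchor client as a reference signal and subtracts its estimated echo from the microphone capture. The sketch below uses a generic normalized LMS filter in Python with NumPy; it is not the APM implementation referred to in the disclosure, and all parameter values are illustrative.

```python
# Illustrative sketch of the echo-cancellation step: the live data played out by
# the anchor client is the reference signal, and its echo is removed from the
# microphone capture (anchor interaction data). Generic NLMS, not the APM.

import numpy as np

def nlms_echo_cancel(mic: np.ndarray, reference: np.ndarray,
                     taps: int = 128, mu: float = 0.5, eps: float = 1e-6) -> np.ndarray:
    """Return the microphone signal with the reference (live-data) echo removed."""
    weights = np.zeros(taps)
    out = np.zeros_like(mic)
    padded_ref = np.concatenate([np.zeros(taps - 1), reference])
    for n in range(len(mic)):
        x = padded_ref[n:n + taps][::-1]           # most recent reference samples
        echo_estimate = weights @ x
        error = mic[n] - echo_estimate             # near-end speech + residual echo
        weights += mu * error * x / (x @ x + eps)  # NLMS weight update
        out[n] = error
    return out

# Example: anchor speech mixed with a delayed copy of the live audio.
rng = np.random.default_rng(0)
live = rng.standard_normal(8000)                   # reference: live data
anchor_speech = rng.standard_normal(8000) * 0.1
mic_capture = anchor_speech + 0.6 * np.roll(live, 10)
cleaned = nlms_echo_cancel(mic_capture, live)
```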
  • the figures of the host and guests can also be displayed on the play page.
  • the live broadcast data can be displayed on the playback page.
  • in some embodiments, the anchor interaction data after the first echo data is eliminated includes the anchor's voice interaction data, and the guest interaction data includes the guest's voice interaction data; therefore, the anchor client can perform voice merge processing on the anchor's voice interaction data and the guest's voice interaction data to obtain the user's voice interaction data.
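A simple way to picture the voice merge processing is to sum the anchor's echo-cancelled voice and the guest's voice into one stream and guard against clipping, as in the hedged sketch below; the frame handling and normalization strategy are assumptions rather than details given in the disclosure.

```python
# Simple sketch of the voice merging step: the anchor's voice (after echo
# cancellation) and the guest's voice are mixed into a single user
# voice-interaction stream that can be pushed to the server.

import numpy as np

def merge_voice(anchor_voice: np.ndarray, guest_voice: np.ndarray) -> np.ndarray:
    """Mix two mono PCM float signals into the user voice interaction data."""
    length = max(len(anchor_voice), len(guest_voice))
    mixed = np.zeros(length, dtype=np.float32)
    mixed[:len(anchor_voice)] += anchor_voice
    mixed[:len(guest_voice)] += guest_voice
    peak = np.max(np.abs(mixed))
    return mixed / peak if peak > 1.0 else mixed  # avoid clipping after the sum

user_voice = merge_voice(np.ones(480, dtype=np.float32) * 0.4,
                         np.ones(320, dtype=np.float32) * 0.5)
```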
  • in this way, the live data can be selected to be played on the play pages of the anchor client, the guest client, and the audience client, so that everyone pays more attention to the live data, while user interaction data can be transmitted through voice, thereby meeting users' interaction needs.
  • after the anchor client plays the guest interaction data in the virtual space, the method may also include: in response to the anchor client detecting the anchor interaction data, performing echo cancellation processing on the anchor interaction data based on the live data to obtain the anchor interaction data after the first echo data is eliminated.
  • the anchor client can determine the first timestamp of the anchor interaction data after the first echo data is eliminated, and determine the second timestamp of the guest interaction data; based on the first timestamp and the second timestamp, the anchor interaction data after the first echo data is eliminated and the guest interaction data are aligned to obtain user interaction data, and the user interaction data is provided to the server, which can send it to the audience client.
  • in this case, the anchor side does not need to perform merge processing; it only needs to determine the order of the anchor interaction data after the first echo data is eliminated and the guest interaction data according to the first timestamp and the second timestamp, and thereby obtain the user interaction data.
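For the timestamp-based alternative, the alignment can be thought of as merging two time-sorted frame streams into a single ordered timeline instead of mixing the audio, as in the following sketch; the VoiceFrame structure and its field names are hypothetical.

```python
# Sketch of the timestamp-based alternative: instead of mixing, the anchor
# client orders the two streams by their capture timestamps so the server (and
# ultimately the audience client) can play them back in the right sequence.

from dataclasses import dataclass
from heapq import merge

@dataclass(order=True)
class VoiceFrame:
    timestamp_ms: int   # first/second timestamp of the frame
    source: str = ""    # "anchor" or "guest"
    payload: bytes = b""

def align_interaction_data(anchor_frames: list[VoiceFrame],
                           guest_frames: list[VoiceFrame]) -> list[VoiceFrame]:
    """Merge two already time-sorted frame lists into one timeline."""
    return list(merge(anchor_frames, guest_frames))

user_interaction_data = align_interaction_data(
    [VoiceFrame(0, "anchor"), VoiceFrame(40, "anchor")],
    [VoiceFrame(20, "guest"), VoiceFrame(60, "guest")],
)
```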
  • the aforementioned anchor interaction data and guest interaction data after the first echo data has been eliminated may be recorded separately by the anchor client and the guest client.
  • the anchor interaction data and the guest interaction data after the first echo data is eliminated may also be recorded together.
  • for example, the anchor client can record the played guest interaction data and anchor interaction data to obtain the user interaction data, and perform echo cancellation processing on the anchor interaction data in the user interaction data based on the live data, to obtain the user interaction data after the first echo data is eliminated.
  • the user interaction data after the first echo data is eliminated is provided to the server.
  • after the anchor client selects the live data, the anchor client, the guest client, and the viewer client can each call the live broadcast player to play it on their play pages.
  • the anchor client can push the voice sent by the anchor to the live stream, and the anchor client can also receive the voice of the guest sent by the guest client.
  • the anchor client can start the audio processing module (Audio Process Module, APM) to take over the live broadcast data and anchor interaction data, perform echo cancellation processing on the anchor interaction data based on the live broadcast data, and obtain the anchor interaction data after the first echo data is eliminated.
  • the anchor client can combine the voice of the anchor and the voice of the guest after the first echo data has been eliminated, and push it to the server for the audience client to pull.
  • Fig. 6 is a flowchart of an interaction method based on a playback object according to an exemplary embodiment.
  • the interaction method based on a playback object can be applied to a guest client and includes the following steps: step S601, displaying the virtual space created based on the first user account on the playback page; step S603, in response to detecting the live data corresponding to the virtual space, calling the live player to play the live data in the virtual space; step S605, in response to detecting the anchor interaction data sent by the anchor terminal, playing the anchor interaction data in the virtual space.
  • step S601 the virtual space created based on the first user account is displayed on the play page.
  • steps S601 to S605 are described with the guest client as the execution subject.
  • the virtual space in the text may be a live broadcast room on a certain live broadcast platform.
  • the anchor can use the first user account to log in to the application corresponding to the live streaming platform on the anchor's own client (such as a mobile phone), and create a virtual space (that is, the live broadcast room) corresponding to the first user account in the application; in this way, the guest client can display the virtual space based on the first user account on the playing page.
  • step S603 in response to detecting the live data corresponding to the virtual space, a live player is called to play the live data in the virtual space.
  • the broadcast page of the anchor client may include a live data playback control; in response to the anchor client detecting the live data playback instruction triggered by the live data playback control, the live broadcast player can be called to play the live data in the virtual space of the anchor client, and at the same time the live data can be pushed to the guest client.
  • in response to the guest client detecting the live data corresponding to the virtual space, the live player is invoked to play the live data in the virtual space of the guest client.
  • step S605 in response to detecting the anchor interaction data sent by the anchor terminal, play the anchor interaction data in the virtual space.
  • the embodiment of the present application mainly addresses the application scenario in which the guest interacts with the anchor based on the live data played in the virtual space. Therefore, in response to the guest client detecting the anchor interaction data sent by the anchor client, the anchor interaction data can be played in the virtual space.
  • in addition to the above-mentioned anchor interaction data, the guest client can receive the guest interaction data sent by the guest. Since the guest interaction data is generated from the guest's interactive behavior with respect to the live data played in the virtual space, it carries the live data. Therefore, in response to detecting the guest interaction data, the guest client may perform echo cancellation processing on the guest interaction data based on the live data, to obtain the guest interaction data after the second echo data is eliminated. This is because the second echo data can be generated by the sound collection device picking up the live data; in some embodiments, the sound collection device may be a microphone. Subsequently, the guest client can send the above-mentioned guest interaction data after the second echo data is eliminated to the server, which sends it to the anchor client.
  • the guest interaction data includes the voice interaction data of the guest
  • the host interaction data may include the voice interaction data of the host.
  • figures of the anchor and guests may also be displayed on the playing page of the guest client.
  • in some embodiments, only the live data is displayed on the playback page, and other data is played only in the form of voice.
  • in this way, the live data can be selected to be played on the play pages of the anchor client, the guest client, and the audience client, so that everyone pays more attention to the live data, while user interaction data can be transmitted through voice, thereby meeting users' interaction needs.
  • the guest client after receiving the live data, can call the live player to play the live data on the play page, and receive the anchor interaction data sent by the anchor client.
  • the guest client can start the audio processing module (Audio Process Module, APM) to take over the live data and guest interaction data, perform echo cancellation processing on the guest interaction data based on the live data, and obtain the guest interaction data after the second echo data is eliminated. In this way, the live data picked up by the microphone is prevented from being carried back in the guest interaction data.
  • the guest client and the host client may communicate based on low-latency audio services.
  • the host client, guest client and audience client can jointly watch the live broadcast data in the virtual live broadcast room, and based on the live broadcast data, the guests and the host can communicate in real time.
  • the number of guests can be multiple and variable, and the audience can be adjusted to the status of guests by the anchor.
  • the user interaction data can be well synchronized with the live data at the audience client, thereby ensuring that communication is carried out smoothly, and at the same time it is more efficient than the text-based communication used in some technologies.
  • Fig. 7 is a block diagram of an interaction device based on playing objects according to an exemplary embodiment.
  • the device includes a display module 701 , a first playback module 702 and a second playback module 703 .
  • the display module 701 is configured to display the virtual space created based on the first user account on the play page;
  • the first playback module 702 is configured to play user interaction data in the virtual space in response to detecting that the playback data corresponding to the virtual space contains live data and user interaction data; the user interaction data is data generated based on the interaction behavior of the anchor and the guests;
  • the second playing module 703 is configured to call the live player to play the live data in the virtual space
  • the live broadcast data and the user interaction data are played synchronously.
  • the playback object-based interactive device also includes:
  • the speed adjustment module is configured to adjust the playback speed of the live data according to the speed adjustment information carried in the playback speed adjustment instruction in response to the detection of the playback speed adjustment instruction, so that the live data and the user interaction data are played synchronously.
  • the playback page includes a playback speed adjustment control
  • the speed adjustment module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the speed regulation module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction.
  • the user interaction data includes voice interaction data of the user
  • the speed adjustment module is configured to:
  • the playback speed of the live data is adjusted according to the speed adjustment information carried in the playback speed adjustment instruction, so that the live data and the user's voice interaction data are played synchronously.
  • Fig. 8 is a block diagram of an interaction device based on playing objects according to an exemplary embodiment.
  • the device includes a display module 801 , a first playback module 802 and a second playback module 803 .
  • the display module 801 is configured to display a virtual space created based on the first user account on the play page; the virtual space includes a live data play control;
  • the first playing module 802 is configured to call the live player to play the live data in the virtual space in response to detecting the live data play instruction triggered by the live data play control;
  • the second playing module 803 is configured to play the guest interaction data in the virtual space in response to detecting the guest interaction data sent by the guest terminal.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the anchor interaction data, perform echo cancellation processing on the anchor interaction data based on the live broadcast data, and obtain the anchor interaction data after the first echo data is eliminated;
  • the merging module is configured to perform merging processing on the host interaction data and the guest interaction data after the first echo data is eliminated, to obtain user interaction data;
  • the sending module is configured to provide the user interaction data to the server.
  • the anchor interaction data after eliminating the first echo data includes the anchor's voice interaction data, and the guest interaction data includes the guest's voice interaction data;
  • the merging module is configured to:
  • the voice interaction data of the anchor and the voice interaction data of the guests are merged to obtain the voice interaction data of the user.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the anchor interaction data, perform echo cancellation processing on the anchor interaction data based on the live broadcast data, and obtain the anchor interaction data after the first echo data is eliminated;
  • the timestamp determination module is configured to determine the first timestamp of the anchor interaction data after the first echo data is eliminated; determine the second timestamp of the guest interaction data;
  • the data acquisition module is configured to perform alignment processing on the anchor interaction data and the guest interaction data after the first echo data is eliminated based on the first timestamp and the second timestamp, to obtain user interaction data;
  • the sending module is configured to provide user interaction data to the server.
  • the playback object-based interactive device also includes:
  • the recording module is configured to record the played guest interaction data and anchor interaction data to obtain user interaction data
  • the echo processing module is configured to perform echo cancellation processing on the anchor interaction data in the user interaction data based on the live broadcast data, and obtain the user interaction data after the first echo data is eliminated;
  • the sending module is configured to provide the user interaction data after the first echo data is eliminated to the server.
  • the first echo data is generated by the sound collection device collecting the live data.
  • Fig. 9 is a block diagram of an interaction device based on playing objects according to an exemplary embodiment.
  • the device includes a display module 901 , a first playback module 902 and a second playback module 903 .
  • the display module 901 is configured to display the virtual space created based on the first user account on the play page;
  • the first playing module 902 is configured to call a live player to play the live data in the virtual space in response to detecting the live data corresponding to the virtual space;
  • the second playing module 903 is configured to play the anchor interaction data in the virtual space in response to detecting the anchor interaction data sent by the anchor terminal.
  • the playback object-based interactive device also includes:
  • the echo processing module is configured to, in response to detecting the guest interaction data, perform echo cancellation processing on the guest interaction data based on the live broadcast data, and obtain the guest interaction data after the second echo data is eliminated;
  • the sending module is configured to send the guest interaction data after the second echo data is eliminated to the server.
  • the second echo data is generated by the sound collection device collecting the live data.
  • Fig. 10 is a block diagram of an electronic device 1000 for recommendation according to an exemplary embodiment.
  • the electronic device may be an anchor client, a guest client or an audience client, and its internal structure may be as shown in FIG. 10 .
  • the electronic device includes a processor, memory and network interface connected by a system bus. Wherein, the processor of the electronic device is used to provide calculation and control capabilities.
  • the memory of the electronic device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and computer programs.
  • the internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium.
  • the network interface of the electronic device is used to communicate with an external terminal through a network connection. When the computer program is executed by the processor, an interactive method based on playing objects is realized.
  • FIG. 10 is only a block diagram of a partial structure related to the disclosed solution, and does not constitute a limitation on the electronic device to which the disclosed solution is applied; a specific electronic device may include more or fewer components than shown in the figure, or combine some components, or have a different arrangement of components.
  • an electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions, so as to implement the interaction method based on the playback object in the embodiments of the present disclosure.
  • a non-volatile computer-readable storage medium is also provided; when instructions in the computer-readable storage medium are executed by a processor of the electronic device, the electronic device can execute the playback object-based interaction method in the embodiments of the present disclosure.
  • a computer program product includes a computer program, the computer program being stored in a readable storage medium; at least one processor of a computer device reads and executes the computer program from the readable storage medium, which enables the computer device to execute the playback object-based interaction method of the embodiments of the present disclosure.
  • Nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in many forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Quality & Reliability (AREA)
  • Databases & Information Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to an interaction method and apparatus, an electronic device, and a storage medium based on a playback object. The method includes: displaying, on a playback page, a virtual space created based on a first user account; in response to detecting that playback data corresponding to the virtual space contains live data and user interaction data, playing the user interaction data in the virtual space, the user interaction data being data generated based on the interaction behavior of an anchor and guests; and calling a live player to play the live data in the virtual space, wherein the live data and the user interaction data are played synchronously.

Description

Interaction method and apparatus based on a playback object
Cross-reference to related applications
This application is filed based on the Chinese patent application with application number 202110757594.0 filed on July 5, 2021, and claims priority to that Chinese patent application, the entire content of which is incorporated into this application by reference.
Technical field
The present disclosure relates to the technical field of the Internet, and in particular to an interaction method and apparatus, an electronic device, and a storage medium based on a playback object.
Background
Audio and video live streaming has been widely used on mobile devices, including one-way viewing of live content and two-way real-time audio and video communication. In mobile scenarios, these applications provide users with convenient access to live content and real-time communication capabilities.
At present, real-time communication around live content is mainly presented through bullet comments and regular comments; for example, while watching live content, a user expresses opinions by entering text or emoticons. However, for users who need real-time communication, this approach is relatively inefficient.
Summary
The present disclosure provides an interaction method and apparatus, an electronic device, and a storage medium based on a playback object.
根据本公开实施例的第一方面,提供一种基于播放对象的交互方法,包括:
在播放页面显示基于第一用户账号创建的虚拟空间;
响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在虚拟空间中播放用户交互数据;用户交互数据是基于主播和嘉宾的交互行为生成的数据;
调用直播播放器在虚拟空间中播放直播数据;
其中,直播数据和用户交互数据保持同步播放。
在一些实施例中,该基于播放对象的交互方法还包括:
响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放。
在一些实施例中,播放页面包括播放速度调节控件,响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度包括:
响应于检测到播放速度调节控件触发的播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度包括:
接收服务器发送的播放速度调节指令;
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,用户交互数据包括用户的语音交互数据,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放包括:
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户的语音交互数据保持同步播放。
根据本公开实施例的第二方面,提供一种基于播放对象的交互方法,包括:
在播放页面显示基于第一用户账号创建的虚拟空间;虚拟空间包含有直播数据播放控件;
响应于检测到直播数据播放控件触发的直播数据播放指令,调用直播播放器在虚拟空间中播放直播数据;
响应于检测到嘉宾端发送的嘉宾交互数据,在虚拟空间中播放嘉宾交互数据。
在一些实施例中,该基于播放对象的交互方法还包括:
响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据;
将用户交互数据提供至服务器。
在一些实施例中,消除第一回声数据后的主播交互数据包括主播的语音交互数据,嘉宾交互数据包括嘉宾的语音交互数据;
对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据包括:
对主播的语音交互数据和嘉宾的语音交互数据进行语音合流处理,得到用户的语音交互数据。
在一些实施例中,该基于播放对象的交互方法还包括:
响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
确定消除第一回声数据后的主播交互数据的第一时间戳;
确定嘉宾交互数据的第二时间戳;
基于第一时间戳和第二时间戳对消除第一回声数据后的主播交互数据和嘉宾交互数据进行对齐处理,得到用户交互数据;
将用户交互数据提供至服务器。
在一些实施例中,该基于播放对象的交互方法还包括:
对播放的嘉宾交互数据和主播交互数据进行录制,得到用户交互数据;
基于直播数据对用户交互数据中的主播交互数据进行回声消除处理,得到消除第一回声数据后的用户交互数据;
将消除第一回声数据后的用户交互数据提供至服务器。
在一些实施例中，第一回声数据是由声音采集装置采集直播数据生成的。
根据本公开实施例的第三方面，提供一种基于播放对象的交互方法，包括：
在播放页面显示基于第一用户账号创建的虚拟空间;
响应于检测到虚拟空间对应的直播数据,调用直播播放器在虚拟空间中播放直播数据;
响应于检测到主播端发送的主播交互数据,在虚拟空间中播放主播交互数据。
在一些实施例中,该基于播放对象的交互方法还包括:
响应于检测到嘉宾交互数据,基于直播数据对嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据;
将消除第二回声数据后的嘉宾交互数据发送至服务器。
在一些实施例中,第二回声数据是由声音采集装置采集直播数据生成的。
根据本公开实施例的第四方面,提供一种基于播放对象的交互装置,包括:
显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
第一播放模块,被配置为响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在虚拟空间中播放用户交互数据;用户交互数据是基于主播和嘉宾的交互行为生成的数据;
第二播放模块,被配置为调用直播播放器在虚拟空间中播放直播数据;
其中,直播数据和用户交互数据保持同步播放。
在一些实施例中,该基于播放对象的交互装置还包括:
速度调节模块,被配置为响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放。
在一些实施例中,播放页面包括播放速度调节控件,速度调节模块,被配置为:
响应于检测到播放速度调节控件触发的播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,速度调节模块,被配置为:
接收服务器发送的播放速度调节指令;
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,用户交互数据包括用户的语音交互数据,速度调节模块,被配置为:
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户的语音交互数据保持同步播放。
根据本公开实施例的第五方面,提供一种基于播放对象的交互装置,包括:
显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;虚拟空间包含有直播数据播放控件;
第一播放模块,被配置为响应于检测到直播数据播放控件触发的直播数据播放指令,调用直播播放器在虚拟空间中播放直播数据;
第二播放模块,被配置为响应于检测到嘉宾端发送的嘉宾交互数据,在虚拟空间中播放嘉宾交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
合流模块,被配置为对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据;
发送模块,被配置为将用户交互数据提供至服务器。
在一些实施例中,消除第一回声数据后的主播交互数据包括主播的语音交互数据,嘉宾交互数据包括嘉宾的语音交互数据;
合流模块,被配置为:
对主播的语音交互数据和嘉宾的语音交互数据进行语音合流处理,得到用户的语音交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
时间戳确定模块,被配置为确定消除第一回声数据后的主播交互数据的第一时间戳;确定嘉宾交互数据的第二时间戳;
数据获取模块,被配置为基于第一时间戳和第二时间戳对消除第一回声数据后的主播交互数据和嘉宾交互数据进行对齐处理,得到用户交互数据;
发送模块,被配置为将用户交互数据提供至服务器。
在一些实施例中,该基于播放对象的交互装置还包括:
录制模块,被配置为对播放的嘉宾交互数据和主播交互数据进行录制,得到用户交互数据;
回声处理模块,被配置为基于直播数据对用户交互数据中的主播交互数据进行回声消除处理,得到消除第一回声数据后的用户交互数据;
发送模块,被配置为将消除第一回声数据后的用户交互数据提供至服务器。
在一些实施例中,第一回声数据是由声音采集装置采集直播数据生成的。
根据本公开实施例的第六方面,提供一种基于播放对象的交互装置,包括:
显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
第一播放模块,被配置为响应于检测到虚拟空间对应的直播数据,调用直播播放器在虚拟空间中播放直播数据;
第二播放模块,被配置为响应于检测到主播端发送的主播交互数据,在虚拟空间中播放主播交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到嘉宾交互数据,基于直播数据对嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据;
发送模块,被配置为执行将消除第二回声数据后的嘉宾交互数据发送至服务器。
在一些实施例中,第二回声数据是由声音采集装置采集直播数据生成的。
根据本公开实施例的第七方面,提供一种电子设备,包括:处理器;用于存储处理器可执行指令的存储器;其中,处理器被配置为执行指令,以实现如上述第一方面、第二方面或者第三方面中任一项的方法。
根据本公开实施例的第八方面,提供一种非易失性计算机可读存储介质,当计算机可读存储介质中的指令由电子设备的处理器执行时,使得电子设备能够执行本公开实施例的第一方面、第二方面或者第三方面中任一项的方法。
根据本公开实施例的第九方面,提供一种计算机程序产品,计算机程序产品包括计算机程序,计算机程序存储在可读存储介质中,计算机设备的至少一个处理器从可读存储介质读取并执行计算机程序,使得计算机设备执行本公开实施例的第一方面、第二方面或者第三方面中任一项的方法。
在播放页面显示基于第一用户账号创建的虚拟空间，响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据，在虚拟空间中播放用户交互数据；用户交互数据是基于主播和嘉宾的交互行为生成的数据，调用直播播放器在虚拟空间中播放直播数据，其中，直播数据和用户交互数据保持同步播放。如此，本申请实施例通过用户交互数据可以使得用户之间的交互效率得到提升，同时，通过同步播放的直播数据和用户交互数据避免了两个数据之间的延时。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开。
附图说明
此处的附图被并入说明书中并构成本说明书的一部分,示出了符合本公开的实施例,并与说明书一起用于解释本公开的原理,并不构成对本公开的不当限定。
图1是根据一示例性实施例示出的一种应用环境的示意图;
图2是根据一示例性实施例示出的一种应用环境的示意图；
图3是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图;
图4是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图;
图5是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图;
图6是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图;
图7是根据一示例性实施例示出的一种基于播放对象的交互装置的框图;
图8是根据一示例性实施例示出的一种基于播放对象的交互装置的框图;
图9是根据一示例性实施例示出的一种基于播放对象的交互装置的框图;
图10是根据一示例性实施例示出的一种用于推荐的电子设备的框图。
具体实施方式
为了使本领域普通技术人员更好地理解本公开的技术方案，下面将结合附图，对本公开实施例中的技术方案进行清楚、完整地描述。
需要说明的是，本公开的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象，而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换，以便这里描述的本公开的实施例能够以除了在这里图示或描述的那些以外的顺序实施。以下示例性实施例中所描述的实施方式并不代表与本公开相一致的所有实施方式。相反，它们仅是与如所附权利要求书中所详述的、本公开的一些方面相一致的装置和方法的例子。
本申请中有关用户的所有数据均是用户授权后的数据。
请参阅图1,图1是根据一示例性实施例示出的一种基于播放对象的交互方法的应用环境的示意图,如图1所示,该应用环境可以包括观众客户端011、主播客户端021,嘉宾客户端031和服务器041。
在一些实施例中，上述的观众客户端011在播放页面显示基于第一用户账号创建的虚拟空间；响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据，在虚拟空间中播放用户交互数据；用户交互数据是基于主播和嘉宾的交互行为生成的数据；调用直播播放器在虚拟空间中播放直播数据；其中，直播数据和用户交互数据保持同步播放。其中，上述的观众客户端011可以包括但不限于智能手机、台式计算机、平板电脑、笔记本电脑、智能音箱、数字助理、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、智能可穿戴设备等类型的电子设备。也可以为运行于上述电子设备的软体，例如应用程序、小程序等。在一些实施例中，电子设备上运行的操作系统可以包括但不限于安卓系统、IOS系统、linux、windows、Unix等。
在一些实施例中，上述的主播客户端021在播放页面显示基于第一用户账号创建的虚拟空间；虚拟空间包含有直播数据播放控件；响应于检测到直播数据播放控件触发的直播数据播放指令，调用直播播放器在虚拟空间中播放直播数据；响应于检测到嘉宾端发送的嘉宾交互数据，在虚拟空间中播放嘉宾交互数据。其中，上述的主播客户端021可以包括但不限于智能手机、台式计算机、平板电脑、笔记本电脑、智能音箱、数字助理、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、智能可穿戴设备等类型的电子设备。也可以为运行于上述电子设备的软体，例如应用程序、小程序等。在一些实施例中，电子设备上运行的操作系统可以包括但不限于安卓系统、IOS系统、linux、windows、Unix等。
在一些实施例中，上述的嘉宾客户端031在播放页面显示基于第一用户账号创建的虚拟空间；响应于检测到虚拟空间对应的直播数据，调用直播播放器在虚拟空间中播放直播数据；响应于检测到主播端发送的主播交互数据，在虚拟空间中播放主播交互数据。其中，上述的嘉宾客户端031可以包括但不限于智能手机、台式计算机、平板电脑、笔记本电脑、智能音箱、数字助理、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、智能可穿戴设备等类型的电子设备。也可以为运行于上述电子设备的软体，例如应用程序、小程序等。在一些实施例中，电子设备上运行的操作系统可以包括但不限于安卓系统、IOS系统、linux、windows、Unix等。
在一些实施例中，服务器041可以是构建在上述客户端之间的内容分发设备，服务器041可以是独立的物理服务器，也可以是多个物理服务器构成的服务器集群或者分布式系统，还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、CDN(Content Delivery Network,内容分发网络)、以及大数据和人工智能平台等基础云计算服务的云服务器。
此外，需要说明的是，图1所示的仅仅是本公开提供的基于播放对象的交互方法的一种应用环境，在实际应用中，还可以包括其他应用环境，例如，图2是根据一示例性实施例示出的一种基于播放对象的交互方法的应用环境的示意图，如图2所示，该应用环境可以包括观众客户端012、主播客户端022,嘉宾客户端032。
图3是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图,如图3所示,基于播放对象的交互方法可以应用于观众客户端,包括以下步骤S301,在播放页面显示基于第一用户账号创建的虚拟空间;步骤S303,响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在虚拟空间中播放用户交互数据;用户交互数据是基于主播和嘉宾的交互行为生成的数据;步骤S305,调用直播播放器在虚拟空间中播放直播数据;其中,直播数据和用户交互数据保持同步播放。
在步骤S301中,在播放页面显示基于第一用户账号创建的虚拟空间。
本申请实施例中,步骤S301至S305是以观众客户端为执行主语阐述的。
本申请实施例中,文中的虚拟空间可以是某个直播平台上的直播间。主播可以在主播所有的主播客户端(比如手机)上,使用第一用户账号登录该直播平台对应的应用程序,并在该应用程序中创建第一用户账号对应的虚拟空间(也就是直播间),如此,响应于观众客户端同样通过第二用户账号启动观众所有的观众客户端上的应用程序,可以通过查询第一用户账号进入该虚拟空间(直播间)。观众客户端也可以在虚拟空间集合展示页面,基于检测到的针对该虚拟空间的进入指示进入该虚拟空间(直播间)。观众客户端还可以基于接收到的虚拟空间分享信息进入该虚拟空间(直播间)。如此,观众客户端就可以在播放页面上显示基于第一用户账号创建的虚拟空间。
在一些实施例中,该直播平台对应的应用程序可以是音乐应用程序,短视频应用程序,社交应用程序等等。
在一些实施例中，上述的第一用户账号可以是主播在该直播平台上的账号信息，比如平台分配的唯一标识码，手机号，邮箱号，主播的昵称。上述的第二用户账号可以是观众在该直播平台上的账号信息，比如平台分配的唯一标识码，手机号，邮箱号，观众的昵称。嘉宾的用户账号可以被称为第三用户账号，是嘉宾在该直播平台上的账号信息，包括平台分配的唯一标识码，手机号，邮箱号，嘉宾的昵称。由于该虚拟空间是主播通过自己的主播客户端创建的，因此，该虚拟空间只有一位主播或者说主持。在一些实施例中，可以在播放页面上包含有主播用户的预设身份标识信息。
在一些实施例中,上述的预设身份标识信息可以用于区分第一用户账号、邀请的嘉宾的第三用户账号和进入该虚拟空间中的普通观众的第二用户账号。在一些实施例中,该预设身份标识信息可以用来表示主播身份的标识信息,还可以用来表示主持身份的标识信息。
在步骤S303中,响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在虚拟空间中播放用户交互数据;用户交互数据是基于主播和嘉宾的交互行为生成的数据。
在一些实施例中，上述的直播数据可以是当前外界中实时的直播数据。在一些实施例中，可以是当前正在进行的赛事直播数据，也就是实时的赛事直播数据（比如足球比赛的直播数据，篮球比赛的直播数据，马拉松比赛的直播数据等），可以是当前正在进行的文艺类直播数据（演唱会的直播数据，舞蹈教学的直播数据，某个大型文艺演出的直播数据等）。其中，当前正在进行的赛事直播数据和当前正在进行的文艺类直播数据是本申请列出的一些实施例，当然还可以包括其他当前正在进行直播的直播数据。
在一些实施例中,上述的直播数据可以是当前外界中非实时的播放数据。在一些实施例中,可以是录播的赛事播放数据(足球比赛的播放数据,篮球比赛的播放数据,马拉松比赛的播放数据等),可以是录播的文艺类播放数据(演唱会的播放数据,舞蹈教学的播放数据,某个大型文艺演出的播放数据等)。
为了对全文内容进行清晰阐述,本申请实施例将以实时的赛事直播数据为例进行说明,其他直播数据的实施方式可以参考该实时的赛事直播数据的实施方式,这里不再赘述。
本申请实施例中，用户交互数据可以是基于主播和嘉宾的交互行为生成的数据。在一些实施例中，主播和嘉宾的交互数据可以通过文字形式的交互行为生成，可以通过表情形式的交互行为生成，可以通过语音形式的交互行为生成。
在步骤S305中,调用直播播放器在虚拟空间中播放直播数据;其中,直播数据和用户交互数据保持同步播放。
本申请实施例中,主播客户端的播放页面上可以包括直播数据播放控件,响应于主播客户端检测到该直播数据播放控件触发的直播数据播放指令,可以调用直播播放器在主播客户端的虚拟空间中播放该直播数据,同时,可以将播放数据推送到观众客户端。对应的,响应于观众客户端检测到虚拟空间对应的直播数据,调用直播播放器在观众客户端的虚拟空间中播放直播数据。其中,直播数据和用户交互数据保持同步播放。
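为便于理解，下面给出一个示意性的代码草图，示意观众客户端在检测到播放数据中同时包含直播数据和用户交互数据时的分发播放逻辑。该草图仅用于说明，其中的类名、方法名（如PlayData、ViewerClient、play等）均为假设性示例，并非本公开限定的实现。

```python
# 示意性草图：观众客户端对播放数据的分发处理（所有名称均为假设）。
from dataclasses import dataclass
from typing import Optional


@dataclass
class PlayData:
    live_stream: Optional[bytes]         # 直播数据
    interaction_stream: Optional[bytes]  # 基于主播和嘉宾交互行为生成的用户交互数据


class ViewerClient:
    def __init__(self, live_player, interaction_player):
        self.live_player = live_player            # 直播播放器
        self.interaction_player = interaction_player

    def on_play_data(self, data: PlayData) -> None:
        # 对应步骤S303：播放数据中包含用户交互数据时，在虚拟空间中播放用户交互数据
        if data.interaction_stream is not None:
            self.interaction_player.play(data.interaction_stream)
        # 对应步骤S305：调用直播播放器在虚拟空间中播放直播数据；
        # 两者的同步可参见后文的播放速度调节草图
        if data.live_stream is not None:
            self.live_player.play(data.live_stream)
```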
在一些实施例中,观众客户端调用直播播放器在虚拟空间中播放的直播数据和用户交互数据可以是不同步的。图4是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图,如图4所示,该基于播放对象的交互方法还包括:
步骤S401,响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放。
在一些实施例中,播放页面包括播放速度调节控件,响应于观众客户端检测到播放速度调节控件触发的播放速度调节指令,可以根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,观众客户端接收服务器发送的播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
如此,本申请实施例中,直播数据和用户交互数据保持同步播放,可以通过不同来源的播放速度调节指令调节直播数据的播放速度,增加了方案的灵活多变性。
本申请实施例中,用户交互数据包括用户的语音交互数据,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放可以体现为:观众客户端可以根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户的语音交互数据保持同步播放。
一般来说，由于用户交互数据是基于嘉宾交互数据和主播交互数据合流得到的，为了保证嘉宾交互数据和主播交互数据之间的交互不会有太长时延，因此，嘉宾客户端和主播客户端可以是基于低延时音频服务进行交流。此种前提会导致用户交互数据延后于直播数据被观众客户端接收到，从而造成用户交互数据和直播数据的不同步，因此，播放速度调节指令中携带的速度调节信息可以是延后n微秒，n可以是基于经验值得到，也可以基于调试值得到。在一些实施例中，上述播放速度调节指令可以是基于可操作的延迟直播流(Manipulable Delayed Live Stream,MDLS)技术来保证直播内容和用户交互数据同步播放的。
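作为一个仅用于说明的草图，下面示意根据速度调节信息调节直播数据播放速度、使其与用户的语音交互数据保持同步的一种可能实现；其中的偏差阈值、倍速上下限均为假设值，并非本公开限定。

```python
# 示意性草图：根据直播数据与用户交互数据的播放位置偏差计算直播数据的播放倍速。
# 阈值40ms及倍速上下限均为假设值，实际可基于经验值或调试值确定。

def speed_factor(live_pos_ms: int, interaction_pos_ms: int,
                 slow: float = 0.95, fast: float = 1.05) -> float:
    """返回直播数据的播放倍速，使其向用户交互数据的播放位置收敛。"""
    offset_ms = live_pos_ms - interaction_pos_ms
    if abs(offset_ms) < 40:                    # 偏差足够小，按正常速度播放
        return 1.0
    return slow if offset_ms > 0 else fast     # 直播数据超前则放慢，滞后则加快


def on_speed_command(player, command: dict) -> None:
    # 播放速度调节指令可以由播放页面上的播放速度调节控件触发，
    # 也可以由服务器下发，其中携带速度调节信息（此处以字典字段示意）。
    player.set_speed(command.get("speed", 1.0))
```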
图5是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图,如图5所示,基于播放对象的交互方法可以应用于主播客户端,包括以下步骤S501,在播放页面显示基于第一用户账号创建的虚拟空间;虚拟空间包含有直播数据播放控件;步骤S503,响应于检测到直播数据播放控件触发的直播数据播放指令,调用直播播放器在虚拟空间中播放直播数据;步骤S505,响应于检测到嘉宾端发送的嘉宾交互数据,在虚拟空间中播放嘉宾交互数据。
在步骤S501中,在播放页面显示基于第一用户账号创建的虚拟空间;虚拟空间包含有直播数据播放控件。
本申请实施例中,步骤S501至S505是以主播客户端为执行主语阐述的。
本申请实施例中,文中的虚拟空间可以是某个直播平台上的直播间。主播可以在主播所有的主播客户端(比如手机)上,使用第一用户账号登录该直播平台对应的应用程序,并在该应用程序中创建第一用户账号对应的虚拟空间(也就是直播间),如此,主播客户端就可以在播放页面上显示基于第一用户账号的虚拟空间。
在一些实施例中，虚拟空间中可以包含有直播数据播放控件。在一些实施例中，该直播数据播放控件可以直接显示在播放页面上。在另一些实施例中，该直播数据播放控件是隐藏于播放页面的，响应于主播客户端检测到该直播数据播放控件的显示指令（比如，检测到主播客户端的屏幕被触碰），可以在播放页面上显示直播数据播放控件。
在步骤S503中,响应于检测到直播数据播放控件触发的直播数据播放指令,调用直播播放器在虚拟空间中播放直播数据。
本申请实施例中,响应于主播客户端检测到该直播数据播放控件触发的直播数据播放指令,可以调用直播播放器在虚拟空间中播放该直播数据,如此,直播数据可以展示在播放页面上。
在一些实施例中,上述的直播播放器可以是内置在应用程序中的,可以被调用的播放器。还可以是主播客户端中其他播放器。
在一些实施例中，检测到直播数据播放控件触发的直播数据播放指令可以体现为，在主播客户端上的直播数据播放控件被触碰的情况下，可以跳转至直播数据展示页面，该直播数据展示页面中可以包括多个直播数据，响应于主播客户端检测到某个直播数据被选择，可以调用直播播放器，并跳转回播放页面，在虚拟空间中播放该直播数据。
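下面的草图示意主播客户端检测到直播数据播放控件触发的直播数据播放指令后，选择某个直播数据并调用直播播放器播放的大致流程；其中的类名、控件与播放器接口均为假设性示例，并非本公开限定的实现。

```python
# 示意性草图：主播客户端响应直播数据播放控件的处理流程（名称均为假设）。

class AnchorClient:
    def __init__(self, live_player, live_catalog):
        self.live_player = live_player    # 内置于应用程序、可被调用的直播播放器
        self.live_catalog = live_catalog  # 直播数据展示页面中可供选择的多个直播数据

    def on_play_control_triggered(self) -> None:
        # 直播数据播放控件被触碰：跳转至直播数据展示页面，等待某个直播数据被选择
        selected = self.live_catalog.pick()
        if selected is not None:
            # 跳转回播放页面，在虚拟空间中播放被选择的直播数据
            self.live_player.play(selected)
```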
本申请实施例中，该直播数据可以是当前外界中实时的直播数据。在一些实施例中，可以是当前正在进行的赛事直播数据，也就是实时的赛事直播数据（比如足球比赛的直播数据，篮球比赛的直播数据，马拉松比赛的直播数据等），可以是当前正在进行的文艺类直播数据（演唱会的直播数据，舞蹈教学的直播数据，某个大型文艺演出的直播数据等）。其中，当前正在进行的赛事直播数据和当前正在进行的文艺类直播数据是本申请列出的一些实施例，当然还可以包括其他当前正在进行直播的直播数据。
在一些实施例中,上述的直播数据可以是当前外界中非实时的播放数据。在一些实施例中,可以是录播的赛事播放数据(足球比赛的播放数据,篮球比赛的播放数据,马拉松比赛的播放数据等),可以是录播的文艺类播放数据(演唱会的播放数据,舞蹈教学的播放数据,某个大型文艺演出的播放数据等)。
在步骤S505中,响应于检测到嘉宾端发送的嘉宾交互数据,在虚拟空间中播放嘉宾交互数据。
由于本申请实施例主要是基于虚拟空间中播放的直播数据,嘉宾和主播进行交互的应用场景。因此,响应于主播客户端检测到嘉宾客户端发送的嘉宾交互数据,可以在虚拟空间中播放嘉宾交互数据。
本申请实施例中,除了上述的嘉宾交互数据之外,主播客户端可以接收到主播交互数据。由于主播交互数据是主播基于虚拟空间中播放的直播数据的交互行为得到的,因此,携带有直播数据。换句话说,响应于检测到主播交互数据,主播客户端可以基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据。这是因为第一回声数据可以是由声音采集装置采集直播数据生成的。在一些实施例中,声音采集装置可以是麦克风。随后,主播客户端可以对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据。并且将上述用户交互数据提供至服务器,并由服务器发送至观众端(观众客户端)。其中,嘉宾交互数据可以是嘉宾客户端消除第二回声数据后的嘉宾交互数据。
在一些实施例中,主播客户端还可以将消除第一回声数据后的主播交互数据发送至嘉宾客户端,或者通过服务器发送至嘉宾客户端。
如此，就可以避免推送到嘉宾客户端的主播交互数据以及发送至观众客户端的用户交互数据（包括主播交互数据和嘉宾交互数据）中混入直播数据，从而避免在嘉宾客户端和观众客户端播放的直播数据产生重复。
在一些实施例中，除了直播数据可以展示在播放页面上，还可以将主播和嘉宾的身影展示在播放页面上。在一些实施例中，也可以只将直播数据展示在播放页面上，其他内容通过语音形式呈现。消除第一回声数据后的主播交互数据包括主播的语音交互数据，嘉宾交互数据包括嘉宾的语音交互数据，因此，主播客户端可以对主播的语音交互数据和嘉宾的语音交互数据进行语音合流处理，得到用户的语音交互数据。
如此,本申请实施例可以选择直播数据在主播客户端,嘉宾客户端和观众客户端的播放页面上进行播放。使得大家的注意点更关注于直播数据,用户交互数据可以通过语音的方式传输,从而达到满足用户的交互需求的目的。
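下面给出一个示意性的草图，示意主播客户端基于直播数据对主播交互数据做回声消除、再与嘉宾交互数据合流得到用户交互数据的处理思路。其中按样本相减、相加的做法只是对真实回声消除算法与混音处理的简化示意（实际可由自适应滤波的音频处理模块完成），函数名与参数均为假设，并非本公开限定的实现。

```python
# 示意性草图：主播客户端的回声消除与语音合流（均为简化示意，非真实AEC算法）。

def cancel_echo(mic_frame: list[float], live_frame: list[float],
                leakage: float = 0.6) -> list[float]:
    """基于直播数据估计并消除声音采集装置采集到的第一回声数据（简化示意）。"""
    return [m - leakage * x for m, x in zip(mic_frame, live_frame)]


def merge_voices(anchor_frame: list[float], guest_frame: list[float]) -> list[float]:
    """对消除第一回声数据后的主播语音与嘉宾语音做合流，得到用户的语音交互数据。"""
    return [max(-1.0, min(1.0, a + g)) for a, g in zip(anchor_frame, guest_frame)]


def process_frame(mic_frame, live_frame, guest_frame, push_to_server) -> None:
    clean_anchor = cancel_echo(mic_frame, live_frame)   # 消除第一回声数据后的主播交互数据
    user_interaction = merge_voices(clean_anchor, guest_frame)
    push_to_server(user_interaction)                    # 提供至服务器，供观众客户端拉取
```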
除了上述的对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据之外,在一些实施例中,主播客户端在虚拟空间中播放嘉宾交互数据之后,还可以包括:响应于主播客户端检测到主播交互数据,可以基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据。随后,主播客户端可以确定消除第一回声数据后的主播交互数据的第一时间戳,并确定嘉宾交互数据的第二时间戳,基于第一时间戳和第二时间戳对消除第一回声数据后的主播交互数据和嘉宾交互数据进行对齐处理,得到用户交互数据,将用户 交互数据提供至服务器,服务器可以将用户交互数据发送至观众客户端。
如此，主播客户端不需要做合流处理，只需要简单地根据第一时间戳和第二时间戳确定消除第一回声数据后的主播交互数据和嘉宾交互数据的前后顺序，得到用户交互数据。
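作为一种示意，下面的草图示意按第一时间戳和第二时间戳对两路交互数据进行对齐；其中的数据结构与函数名均为假设性示例，并非本公开限定的实现。

```python
# 示意性草图：按时间戳对消除第一回声数据后的主播交互数据与嘉宾交互数据排序对齐。
from heapq import merge


def align_by_timestamp(anchor_frames, guest_frames):
    """anchor_frames / guest_frames: 形如 (时间戳毫秒, 数据帧) 的可迭代对象，
    假设两路数据各自已按时间戳有序；按时间戳先后合并为一路用户交互数据。"""
    return list(merge(anchor_frames, guest_frames, key=lambda item: item[0]))
```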
上述的消除第一回声数据后的主播数据和嘉宾交互数据可以是分别由主播客户端和嘉宾客户端分开录制的。在一些实施例中,消除第一回声数据后的主播数据和嘉宾交互数据可以是一起录制的。在一些实施例中,主播客户端可以对播放的嘉宾交互数据和主播交互数据进行录制,得到用户交互数据,并基于直播数据对用户交互数据中的主播交互数据进行回声消除处理,得到消除第一回声数据后的用户交互数据。将消除第一回声数据后的用户交互数据提供至服务器。
在一些实施例中，主播客户端选择直播数据后，主播客户端，嘉宾客户端和观众客户端分别可以调用直播播放器在播放页面上进行播放。其次，主播客户端可以将主播发出的语音推到直播流中，主播客户端还可以接收到嘉宾客户端发送的嘉宾的语音。此外，主播客户端可以启动音频处理模块(Audio Process Module,APM)接管直播数据和主播交互数据，基于直播数据对主播交互数据进行回声消除处理，得到消除第一回声数据后的主播交互数据。从而避免麦克风采集到直播数据回播给嘉宾客户端和观众客户端。随后，主播客户端可以将消除第一回声数据后的主播的语音和嘉宾的语音进行话音合流，推送给服务器，供观众客户端拉取。
图6是根据一示例性实施例示出的一种基于播放对象的交互方法的流程图,如图6所示,基于播放对象的交互方法可以应用于嘉宾客户端,包括以下步骤S601,在播放页面显示基于第一用户账号创建的虚拟空间;步骤S603,响应于检测到虚拟空间对应的直播数据,调用直播播放器在虚拟空间中播放直播数据;步骤S605,响应于检测到主播端发送的主播交互数据,在虚拟空间中播放主播交互数据。
在步骤S601中,在播放页面显示基于第一用户账号创建的虚拟空间。
本申请实施例中,步骤S601至S605是以嘉宾客户端为执行主语阐述的。
本申请实施例中,文中的虚拟空间可以是某个直播平台上的直播间。主播可以在主播所有的主播客户端(比如手机)上,使用第一用户账号登录该直播平台对应的应用程序,并在该应用程序中创建第一用户账号对应的虚拟空间(也就是直播间),如此,嘉宾客户端就可以在播放页面上显示基于第一用户账号的虚拟空间。
在步骤S603中,响应于检测到虚拟空间对应的直播数据,调用直播播放器在虚拟空间中播放直播数据。
本申请实施例中,主播客户端的播放页面上可以包括直播数据播放控件,响应于主播客户端检测到该直播数据播放控件触发的直播数据播放指令,可以调用直播播放器在主播客户端的虚拟空间中播放该直播数据,同时,可以将播放数据推送到嘉宾客户端。对应的,响应于嘉宾客户端检测到虚拟空间对应的直播数据,调用直播播放器在嘉宾客户端的虚拟空间中播放直播数据。
在步骤S605中,响应于检测到主播端发送的主播交互数据,在虚拟空间中播放主播交互数据。
由于本申请实施例主要是基于虚拟空间中播放的直播数据,嘉宾和主播进行交互的应用场景。因此,响应于嘉宾客户端检测到主播客户端发送的主播交互数据,可以在虚拟空间中播放主播交互数据。
本申请实施例中,除了上述的主播交互数据之外,嘉宾客户端可以接收到嘉宾发出的嘉宾交互数据。由于嘉宾交互数据是嘉宾基于虚拟空间中播放的直播数据的交互行为得到的,因此,携带有直播数据。换句话说,响应于检测到嘉宾交互数据,嘉宾客户端可以基于直播数据对嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据。这是因为第二回声数据可以是由声音采集装置采集直播数据生成的。在一些实施例中,声音采集装置可以是麦克风。随后,嘉宾客户端可以将上述消除第二回声数据后的嘉宾交互数据发送至服务器,并通过服务器发送主播客户端。
如此，就可以避免推送到主播客户端的嘉宾交互数据中混入直播数据，从而避免在主播客户端播放的直播数据产生重复。
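类似地，下面的草图示意嘉宾客户端在检测到嘉宾交互数据时，基于直播数据消除第二回声数据后再发送至服务器的处理；其中按样本相减只是对真实回声消除算法的简化示意，函数名与参数均为假设。

```python
# 示意性草图：嘉宾客户端对嘉宾交互数据的回声消除与上行发送（简化示意）。

def cancel_second_echo(guest_mic_frame: list[float], live_frame: list[float],
                       leakage: float = 0.6) -> list[float]:
    """基于直播数据消除声音采集装置采集到的第二回声数据（简化示意）。"""
    return [g - leakage * x for g, x in zip(guest_mic_frame, live_frame)]


def on_guest_interaction(guest_mic_frame, live_frame, send_to_server) -> None:
    clean_guest = cancel_second_echo(guest_mic_frame, live_frame)
    send_to_server(clean_guest)   # 消除第二回声数据后的嘉宾交互数据，经服务器转发至主播客户端
```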
在一些实施例中,嘉宾交互数据包括嘉宾的语音交互数据,主播交互数据可以包括主播的语音交互数据。
在一些实施例中，除了直播数据可以展示在播放页面上，还可以将主播和嘉宾的身影展示在嘉宾客户端的播放页面上。在一些实施例中，也可以只将直播数据展示在播放页面上，其他内容只通过语音的形式播放出来。
如此,本申请实施例可以选择直播数据在主播客户端,嘉宾客户端和观众客户端的播放页面上进行播放。使得大家的注意点更关注于直播数据,用户交互数据可以通过语音的方式传输,从而达到满足用户的交互需求的目的。
在一些实施例中,嘉宾客户端在接收到直播数据后,可以调用直播播放器在播放页面上播放直播数据,并接收主播客户端发送的主播交互数据。此外,嘉宾客户端可以启动音频处理模块(Audio Process Module,APM)接管直播数据和嘉宾交互数据,基于直播数据对嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据。从而避免麦克风采集到直播数据回播给嘉宾客户端。
在一些实施例中,嘉宾客户端和主播客户端可以是基于低延时音频服务进行交流。
综上，主播客户端，嘉宾客户端和观众客户端可以共同在虚拟直播间观看直播数据，并基于该直播数据，使得嘉宾和主播进行实时交流。此外，嘉宾的个数可以是多个且可变的，观众可以被主播调整至嘉宾的身份。同时，用户交互数据能够在观众客户端与直播数据保持良好的同步，进而保证交流能够良好地进行，同时比相关技术中的文字交流更有效率。
图7是根据一示例性实施例示出的一种基于播放对象的交互装置框图。参照图7,该装置包括显示模块701、第一播放模块702和第二播放模块703。
显示模块701,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
第一播放模块702,被配置为响应于检测到虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在虚拟空间中播放用户交互数据;用户交互数据是基于主播和嘉宾的交互行为生成的数据;
第二播放模块703,被配置为调用直播播放器在虚拟空间中播放直播数据;
其中,直播数据和用户交互数据保持同步播放。
在一些实施例中,该基于播放对象的交互装置还包括:
速度调节模块,被配置为响应于检测到播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户交互数据保持同步播放。
在一些实施例中,播放页面包括播放速度调节控件,速度调节模块,被配置为:
响应于检测到播放速度调节控件触发的播放速度调节指令,根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,速度调节模块,被配置为:
接收服务器发送的播放速度调节指令;
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度。
在一些实施例中,用户交互数据包括用户的语音交互数据,速度调节模块,被配置为:
根据播放速度调节指令中携带的速度调节信息调节直播数据的播放速度,使得直播数据和用户的语音交互数据保持同步播放。
关于上述实施例中的装置,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。图8是根据一示例性实施例示出的一种基于播放对象的交互装置框图。参照图8,该装置包括显示模块801、第一播放模块802和第二播放模块803。
显示模块801,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;虚拟空间包含有直播数据播放控件;
第一播放模块802,被配置为响应于检测到直播数据播放控件触发的直播数据播放指令,调用直播播放器在虚拟空间中播放直播数据;
第二播放模块803,被配置为响应于检测到嘉宾端发送的嘉宾交互数据,在虚拟空间中播放嘉宾交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
合流模块,被配置为对消除第一回声数据后的主播交互数据和嘉宾交互数据进行合流处理,得到用户交互数据;
发送模块，被配置为将用户交互数据提供至服务器。
在一些实施例中，消除第一回声数据后的主播交互数据包括主播的语音交互数据，嘉宾交互数据包括嘉宾的语音交互数据；
合流模块,被配置为:
对主播的语音交互数据和嘉宾的语音交互数据进行语音合流处理,得到用户的语音交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到主播交互数据,基于直播数据对主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
时间戳确定模块,被配置为确定消除第一回声数据后的主播交互数据的第一时间戳;确定嘉宾交互数据的第二时间戳;
数据获取模块,被配置为基于第一时间戳和第二时间戳对消除第一回声数据后的主播交互数据和嘉宾交互数据进行对齐处理,得到用户交互数据;
发送模块,被配置为将用户交互数据提供至服务器。
在一些实施例中,该基于播放对象的交互装置还包括:
录制模块,被配置为对播放的嘉宾交互数据和主播交互数据进行录制,得到用户交互数据;
回声处理模块,被配置为基于直播数据对用户交互数据中的主播交互数据进行回声消除处理,得到消除第一回声数据后的用户交互数据;
发送模块,被配置为将消除第一回声数据后的用户交互数据提供至服务器。
在一些实施例中,第一回声数据是由声音采集装置采集直播数据生成的。
关于上述实施例中的装置,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
图9是根据一示例性实施例示出的一种基于播放对象的交互装置框图。参照图9,该装置包括显示模块901、第一播放模块902和第二播放模块903。
显示模块901,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
第一播放模块902,被配置为响应于检测到虚拟空间对应的直播数据,调用直播播放器在虚拟空间中播放直播数据;
第二播放模块903,被配置为响应于检测到主播端发送的主播交互数据,在虚拟空间中播放主播交互数据。
在一些实施例中,该基于播放对象的交互装置还包括:
回声处理模块,被配置为响应于检测到嘉宾交互数据,基于直播数据对嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据;
发送模块,被配置为将消除第二回声数据后的嘉宾交互数据发送至服务器。
在一些实施例中,第二回声数据是由声音采集装置采集直播数据生成的。
关于上述实施例中的装置,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
图10是根据一示例性实施例示出的一种用于推荐的电子设备1000的框图。
该电子设备可以是主播客户端,嘉宾客户端或者观众客户端,其内部结构图可以如图10所示。该电子设备包括通过***总线连接的处理器、存储器和网络接口。其中,该电子设备的处理器用于提供计算和控制能力。该电子设备的存储器包括非易失性存储介质、内存储器。该非易失性存储介质存储有操作***和计算机程序。该内存储器为非易失性存储介质中的操作***和计算机程序的运行提供环境。该电子设备的网络接口用于与外部的终端通过网络连接通信。该计算机程序被处理器执行时以实现一种基于播放对象的交互方法。
本领域技术人员可以理解,图10中示出的结构,仅仅是与本公开方案相关的部分结构的框图,并不构成对本公开方案所应用于其上的电子设备的限定,具体的电子设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
在示例性实施例中,还提供了一种电子设备,包括:处理器;用于存储该处理器可执行指令的存储器;其中,该处理器被配置为执行该指令,以实现如本公开实施例中的基于播放对象的交互方法。
在示例性实施例中,还提供了一种非易失性计算机可读存储介质,当该计算机可读存储介质中的指令由电子设备的处理器执行时,使得电子设备能够执行本公开实施例中的基于播放对象的交互方法。
在示例性实施例中,还提供了一种计算机程序产品,计算机程序产品包括计算机程序,计算机程序存储在可读存储介质中,计算机设备的至少一个处理器从可读存储介质读取并执行计算机程序,使得计算机设备执行本公开实施例的基于播放对象的交互方法。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,该计算机程序可存储于一非易失性计算机可读取存储介质中,该计算机程序在执行时,可包括如上述各方法的实施例的流程。其中,本申请所提供的各实施例中所使用的对存储器、存储、数据库或其它介质的任何引用,均可包括非易失性和/或易失性存储器。非易失性存储器可包括只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM)或者外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDRSDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)等。
本公开所有实施例均可以单独被执行,也可以与其他实施例相结合被执行,均视为本公开要求的保护范围。

Claims (31)

  1. 一种基于播放对象的交互方法,包括:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在所述虚拟空间中播放所述用户交互数据;所述用户交互数据是基于主播和嘉宾的交互行为生成的数据;
    调用直播播放器在所述虚拟空间中播放所述直播数据;
    其中,所述直播数据和所述用户交互数据保持同步播放。
  2. 根据权利要求1所述的基于播放对象的交互方法,还包括:
    响应于检测到播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度,使得所述直播数据和所述用户交互数据保持同步播放。
  3. 根据权利要求2所述的基于播放对象的交互方法,其中,所述播放页面包括播放速度调节控件,所述响应于检测到播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度包括:
    响应于检测到所述播放速度调节控件触发的所述播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度。
  4. 根据权利要求2所述的基于播放对象的交互方法,其中,所述响应于检测到播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度包括:
    接收服务器发送的所述播放速度调节指令;
    根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度。
  5. 根据权利要求2至4任一所述的基于播放对象的交互方法,其中,所述用户交互数据包括用户的语音交互数据,所述根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度,使得所述直播数据和所述用户交互数据保持同步播放包括:
    根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度,使得所述直播数据和所述用户的语音交互数据保持同步播放。
  6. 一种基于播放对象的交互方法,包括:
    在播放页面显示基于第一用户账号创建的虚拟空间;所述虚拟空间包含有直播数据播放控件;
    响应于检测到所述直播数据播放控件触发的直播数据播放指令,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到嘉宾端发送的嘉宾交互数据,在所述虚拟空间中播放所述嘉宾交互数据。
  7. 根据权利要求6所述的基于播放对象的交互方法,还包括:
    响应于检测到主播交互数据,基于所述直播数据对所述主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
    对所述消除第一回声数据后的主播交互数据和所述嘉宾交互数据进行合流处理,得到用户交互数据;
    将所述用户交互数据提供至服务器。
  8. 根据权利要求7所述的基于播放对象的交互方法,其中,所述消除第一回声数据后的主播交互数据包括所述主播的语音交互数据,所述嘉宾交互数据包括嘉宾的语音交互数据;
    所述对所述消除第一回声数据后的主播交互数据和所述嘉宾交互数据进行合流处理,得到用户交互数据包括:
    对所述主播的语音交互数据和所述嘉宾的语音交互数据进行语音合流处理,得到用户的语音交互数据。
  9. 根据权利要求6所述的基于播放对象的交互方法,还包括:
    响应于检测到主播交互数据,基于所述直播数据对所述主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
    确定所述消除第一回声数据后的主播交互数据的第一时间戳;
    确定所述嘉宾交互数据的第二时间戳;
    基于所述第一时间戳和所述第二时间戳对所述消除第一回声数据后的主播交互数据和所述嘉宾交互数据进行对齐处理,得到用户交互数据;
    将所述用户交互数据提供至服务器。
  10. 根据权利要求6所述的基于播放对象的交互方法,还包括:
    对播放的所述嘉宾交互数据和主播交互数据进行录制,得到用户交互数据;
    基于所述直播数据对所述用户交互数据中的所述主播交互数据进行回声消除处理，得到消除第一回声数据后的用户交互数据；
    将所述消除第一回声数据后的用户交互数据提供至服务器。
  11. 根据权利要求7-10任一所述的基于播放对象的交互方法,其中,所述第一回声数据是由声音采集装置采集所述直播数据生成的。
  12. 一种基于播放对象的交互方法,包括:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的直播数据,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到主播端发送的主播交互数据,在所述虚拟空间中播放所述主播交互数据。
  13. 根据权利要求12所述的基于播放对象的交互方法,还包括:
    响应于检测到嘉宾交互数据,基于所述直播数据对所述嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据;
    将所述消除第二回声数据后的嘉宾交互数据发送至服务器。
  14. 根据权利要求13所述的基于播放对象的交互方法,其中,所述第二回声数据是由声音采集装置采集所述直播数据生成的。
  15. 一种基于播放对象的交互装置,包括:
    显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
    第一播放模块,被配置为响应于检测到所述虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在所述虚拟空间中播放所述用户交互数据;所述用户交互数据是基于主播和嘉宾的交互行为生成的数据;
    第二播放模块,被配置为调用直播播放器在所述虚拟空间中播放所述直播数据;
    其中,所述直播数据和所述用户交互数据保持同步播放。
  16. 根据权利要求15所述的基于播放对象的交互装置,还包括:
    速度调节模块,被配置为响应于检测到播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度,使得所述直播数据和所述用户交互数据保持同步播放。
  17. 根据权利要求16所述的基于播放对象的交互装置,其中,所述播放页面包括播放速度调节控件,所述速度调节模块,被配置为:
    响应于检测到所述播放速度调节控件触发的所述播放速度调节指令,根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度。
  18. 根据权利要求16所述的基于播放对象的交互装置,其中,所述速度调节模块,被配置为:
    接收服务器发送的所述播放速度调节指令;
    根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度。
  19. 根据权利要求16至18任一所述的基于播放对象的交互装置,其中,所述用户交互数据包括用户的语音交互数据,所述速度调节模块,被配置为:
    根据所述播放速度调节指令中携带的速度调节信息调节所述直播数据的播放速度,使得所述直播数据和所述用户的语音交互数据保持同步播放。
  20. 一种基于播放对象的交互装置,包括:
    显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;所述虚拟空间包含有直播数据播放控件;
    第一播放模块,被配置为响应于检测到所述直播数据播放控件触发的直播数据播放指令,调用直播播放器在所述虚拟空间中播放所述直播数据;
    第二播放模块,被配置为响应于检测到嘉宾端发送的嘉宾交互数据,在所述虚拟空间中播放所述嘉宾交互数据。
  21. 根据权利要求20所述的基于播放对象的交互装置,还包括:
    回声处理模块,被配置为响应于检测到主播交互数据,基于所述直播数据对所述主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
    合流模块,被配置为对所述消除第一回声数据后的主播交互数据和所述嘉宾交互数据进行合流处理,得到用户交互数据;
    发送模块,被配置为将所述用户交互数据提供至服务器。
  22. 根据权利要求21所述的基于播放对象的交互装置,其中,所述消除第一回声数据后的主播交互数据包括所述主播的语音交互数据,所述嘉宾交互数据包括嘉宾的语音交互数据;
    所述合流模块,被配置为:
    对所述主播的语音交互数据和所述嘉宾的语音交互数据进行语音合流处理,得到用户的语音交互数据。
  23. 根据权利要求20所述的基于播放对象的交互装置,还包括:
    回声处理模块,被配置为响应于检测到主播交互数据,基于所述直播数据对所述主播交互数据进行回声消除处理,得到消除第一回声数据后的主播交互数据;
    时间戳确定模块,被配置为确定所述消除第一回声数据后的主播交互数据的第一时间戳;确定所述嘉宾交互数据的第二时间戳;
    数据获取模块,被配置为基于所述第一时间戳和所述第二时间戳对所述消除第一回声数据后的主播交互数据和所述嘉宾交互数据进行对齐处理,得到用户交互数据;
    发送模块,被配置为将所述用户交互数据提供至服务器。
  24. 根据权利要求20所述的基于播放对象的交互装置,还包括:
    录制模块,被配置为对播放的所述嘉宾交互数据和主播交互数据进行录制,得到用户交互数据;
    回声处理模块,被配置为基于所述直播数据对所述用户交互数据中的所述主播交互数据进行回声消除处理,得到消除第一回声数据后的用户交互数据;
    发送模块,被配置为将所述消除第一回声数据后的用户交互数据提供至服务器。
  25. 根据权利要求21至24任一所述的基于播放对象的交互装置,其中,所述第一回声数据是由声音采集装置采集所述直播数据生成的。
  26. 一种基于播放对象的交互装置,包括:
    显示模块,被配置为在播放页面显示基于第一用户账号创建的虚拟空间;
    第一播放模块,被配置为响应于检测到所述虚拟空间对应的直播数据,调用直播播放器在所述虚拟空间中播放所述直播数据;
    第二播放模块,被配置为响应于检测到主播端发送的主播交互数据,在所述虚拟空间中播放所述主播交互数据。
  27. 根据权利要求26所述的基于播放对象的交互装置,还包括:
    回声处理模块,被配置为响应于检测到嘉宾交互数据,基于所述直播数据对所述嘉宾交互数据进行回声消除处理,得到消除第二回声数据后的嘉宾交互数据;
    发送模块,被配置为将所述消除第二回声数据后的嘉宾交互数据发送至服务器。
  28. 根据权利要求27所述的基于播放对象的交互装置，其中，所述第二回声数据是由声音采集装置采集所述直播数据生成的。
  29. 一种电子设备,包括:
    处理器;
    用于存储所述处理器可执行指令的存储器;
    其中,所述处理器被配置为执行所述指令,以实现以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在所述虚拟空间中播放所述用户交互数据;所述用户交互数据是基于主播和嘉宾的交互行为生成的数据;
    调用直播播放器在所述虚拟空间中播放所述直播数据;
    其中,所述直播数据和所述用户交互数据保持同步播放;
    或者实现以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;所述虚拟空间包含有直播数据播放控件;
    响应于检测到所述直播数据播放控件触发的直播数据播放指令,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到嘉宾端发送的嘉宾交互数据,在所述虚拟空间中播放所述嘉宾交互数据;
    或者实现以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的直播数据,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到主播端发送的主播交互数据,在所述虚拟空间中播放所述主播交互数据。
  30. 一种非易失性计算机可读存储介质,其中,当所述计算机可读存储介质中的指令由电子设备的处理器执行时,使得所述电子设备能够执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在所述虚拟空间中播放所述用户交互数据;所述用户交互数据是基于主播和嘉宾的交互行为生成的数据;
    调用直播播放器在所述虚拟空间中播放所述直播数据;
    其中,所述直播数据和所述用户交互数据保持同步播放;
    或者执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;所述虚拟空间包含有直播数据播放控件;
    响应于检测到所述直播数据播放控件触发的直播数据播放指令,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到嘉宾端发送的嘉宾交互数据,在所述虚拟空间中播放所述嘉宾交互数据;
    或者执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的直播数据,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到主播端发送的主播交互数据,在所述虚拟空间中播放所述主播交互数据。
  31. 一种计算机程序产品,其中,所述计算机程序产品包括计算机程序,所述计算机程序存储在可读存储介质中,计算机设备的至少一个处理器从所述可读存储介质读取并执行所述计算机程序,使得所述计算机设备执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的播放数据中包含有直播数据和用户交互数据,在所述虚拟空间中播放所述用户交互数据;所述用户交互数据是基于主播和嘉宾的交互行为生成的数据;
    调用直播播放器在所述虚拟空间中播放所述直播数据;
    其中,所述直播数据和所述用户交互数据保持同步播放;
    或者执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;所述虚拟空间包含有直播数据播放控件;
    响应于检测到所述直播数据播放控件触发的直播数据播放指令,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到嘉宾端发送的嘉宾交互数据,在所述虚拟空间中播放所述嘉宾交互数据;
    或者执行以下步骤:
    在播放页面显示基于第一用户账号创建的虚拟空间;
    响应于检测到所述虚拟空间对应的直播数据,调用直播播放器在所述虚拟空间中播放所述直播数据;
    响应于检测到主播端发送的主播交互数据,在所述虚拟空间中播放所述主播交互数据。
PCT/CN2022/079584 2021-07-05 2022-03-07 基于播放对象的交互方法及装置 WO2023279745A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110757594.0 2021-07-05
CN202110757594.0A CN113645472B (zh) 2021-07-05 2021-07-05 一种基于播放对象的交互方法、装置、电子设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023279745A1 true WO2023279745A1 (zh) 2023-01-12

Family

ID=78416680

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/079584 WO2023279745A1 (zh) 2021-07-05 2022-03-07 基于播放对象的交互方法及装置

Country Status (2)

Country Link
CN (1) CN113645472B (zh)
WO (1) WO2023279745A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113645472B (zh) * 2021-07-05 2023-04-28 北京达佳互联信息技术有限公司 一种基于播放对象的交互方法、装置、电子设备及存储介质
CN114125492B (zh) * 2022-01-24 2022-07-15 阿里巴巴(中国)有限公司 直播内容生成方法以及装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130017891A1 (en) * 2011-07-15 2013-01-17 FunGoPlay LLC Systems and methods for providing virtual incentives for real-world activities
CN107547947A (zh) * 2017-08-24 2018-01-05 北京小米移动软件有限公司 直播间中虚拟礼物的赠送方法及装置
CN107750014A (zh) * 2017-09-25 2018-03-02 迈吉客科技(北京)有限公司 一种连麦直播方法和***
CN109874021A (zh) * 2017-12-04 2019-06-11 腾讯科技(深圳)有限公司 直播互动方法、装置及***
CN110910860A (zh) * 2019-11-29 2020-03-24 北京达佳互联信息技术有限公司 线上ktv实现方法、装置、电子设备及存储介质
CN112135159A (zh) * 2020-09-18 2020-12-25 湖南联盛网络科技股份有限公司 公屏演播方法、装置、智能终端及储存介质
CN113645472A (zh) * 2021-07-05 2021-11-12 北京达佳互联信息技术有限公司 一种基于播放对象的交互方法、装置、电子设备及存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140195675A1 (en) * 2013-01-09 2014-07-10 Giga Entertainment Media Inc. Simultaneous Content Data Streaming And Interaction System
CN106131583A (zh) * 2016-06-30 2016-11-16 北京小米移动软件有限公司 一种直播处理方法、装置、终端设备及***
CN109040851B (zh) * 2018-08-06 2021-02-12 广州方硅信息技术有限公司 基于直播进行游戏的延迟处理方法、***、服务器及计算机可读存储介质
CN110267064B (zh) * 2019-06-12 2021-11-12 百度在线网络技术(北京)有限公司 音频播放状态处理方法、装置、设备及存储介质
CN110650353B (zh) * 2019-09-25 2020-12-04 广州华多网络科技有限公司 多人连麦混画方法及装置、存储介质及电子设备
US11265607B2 (en) * 2019-10-01 2022-03-01 Synchronicity Finance Llc Systems, methods, and apparatuses for implementing a broadcast integration platform with real-time interactive content synchronization
CN110958464A (zh) * 2019-12-11 2020-04-03 北京达佳互联信息技术有限公司 直播数据处理方法、装置、服务器、终端及存储介质
CN111343476A (zh) * 2020-03-06 2020-06-26 北京达佳互联信息技术有限公司 视频共享方法、装置、电子设备及存储介质
CN111355973B (zh) * 2020-03-09 2021-10-15 北京达佳互联信息技术有限公司 数据播放方法、装置、电子设备及存储介质
CN111787353A (zh) * 2020-05-13 2020-10-16 北京达佳互联信息技术有限公司 多方音频的处理方法、装置、电子设备及存储介质
CN111918090B (zh) * 2020-08-10 2023-03-28 广州繁星互娱信息科技有限公司 直播画面显示方法、装置、终端及存储介质
CN111970524B (zh) * 2020-08-14 2022-03-04 北京字节跳动网络技术有限公司 交互类直播连麦的控制方法、装置、***、设备及介质
CN112337104A (zh) * 2020-11-05 2021-02-09 北京字节跳动网络技术有限公司 直播数据处理方法、装置、电子设备及可读介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130017891A1 (en) * 2011-07-15 2013-01-17 FunGoPlay LLC Systems and methods for providing virtual incentives for real-world activities
CN107547947A (zh) * 2017-08-24 2018-01-05 北京小米移动软件有限公司 直播间中虚拟礼物的赠送方法及装置
CN107750014A (zh) * 2017-09-25 2018-03-02 迈吉客科技(北京)有限公司 一种连麦直播方法和***
CN109874021A (zh) * 2017-12-04 2019-06-11 腾讯科技(深圳)有限公司 直播互动方法、装置及***
CN110910860A (zh) * 2019-11-29 2020-03-24 北京达佳互联信息技术有限公司 线上ktv实现方法、装置、电子设备及存储介质
CN112135159A (zh) * 2020-09-18 2020-12-25 湖南联盛网络科技股份有限公司 公屏演播方法、装置、智能终端及储存介质
CN113645472A (zh) * 2021-07-05 2021-11-12 北京达佳互联信息技术有限公司 一种基于播放对象的交互方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN113645472B (zh) 2023-04-28
CN113645472A (zh) 2021-11-12

Similar Documents

Publication Publication Date Title
US11616818B2 (en) Distributed control of media content item during webcast
US10897637B1 (en) Synchronize and present multiple live content streams
WO2023279745A1 (zh) 基于播放对象的交互方法及装置
US9497416B2 (en) Virtual circular conferencing experience using unified communication technology
US8791977B2 (en) Method and system for presenting metadata during a videoconference
US11025967B2 (en) Method for inserting information push into live video streaming, server, and terminal
US20120242695A1 (en) Augmented Reality System for Public and Private Seminars
WO2019114330A1 (zh) 一种视频播放方法、装置和终端设备
US20220385712A1 (en) Method and device for operating virtual space
WO2023098011A1 (zh) 视频播放方法及电子设备
CN111918705A (zh) 将会话内容同步到外部内容
US10698744B2 (en) Enabling third parties to add effects to an application
US10739944B1 (en) System and method for generating user interface data to provide indicators associated with content
JP2021534606A (ja) デジタルコンテンツ消費の同期
CN112565657B (zh) 通话互动方法、装置、设备及存储介质
US9264655B2 (en) Augmented reality system for re-casting a seminar with private calculations
CN111475801A (zh) 权限管理方法及***
WO2022143255A1 (zh) 实时信息交互方法及装置、设备、存储介质
CN112004100B (zh) 将多路音视频源集合成单路音视频源的驱动方法
CN114554231A (zh) 一种信息显示方法、装置、电子设备及存储介质
US11134310B1 (en) Custom content service
CN115277650B (zh) 投屏显示控制方法、电子设备及相关装置
US20220021863A1 (en) Methods and systems for facilitating population of a virtual space around a 2d content
US20120185890A1 (en) Synchronized video presentation
TW202236845A (zh) 視頻顯示方法、裝置、設備和儲存媒體

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22836505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE