CN118138824A - Interaction method, mobile terminal, playing terminal and interaction system - Google Patents

Interaction method, mobile terminal, playing terminal and interaction system

Info

Publication number
CN118138824A
CN118138824A (application number CN202410199134.4A)
Authority
CN
China
Prior art keywords
mobile terminal
terminal
metadata information
playing
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410199134.4A
Other languages
Chinese (zh)
Inventor
庞超
焦健波
周帆
张南鹏
曹徐洋
万玉鹏
陈芳
顿子振
杨娜
刘晓惠
吴雅文
张东辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Media Group
Original Assignee
China Media Group
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Media Group
Priority to CN202410199134.4A
Publication of CN118138824A
Legal status: Pending

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application provides an interaction method, a mobile terminal, a playing terminal, and an interaction system. The mobile terminal obtains the audio metadata information of a program stream and displays the interaction control content contained in that information; the user's operations on the mobile terminal then set the audio parameters of the playing terminal, so that audio rendering is controlled while the program stream is playing and the user's audio-visual experience is improved.

Description

Interaction method, mobile terminal, playing terminal and interaction system
Technical Field
The present application relates to the field of information interaction control, and in particular, to an interaction method, a mobile terminal, a playing terminal, and an interaction system.
Background
With the rapid development of mobile media and the emergence of audio technologies represented by Audio Vivid three-dimensional sound, video media companies have followed this trend and successively integrated new, user-oriented content services such as screen casting into various kinds of video playing media. Users' demands when using screen-casting services have diversified, and with the adoption of advanced audio technologies the rendering effect needs to be controlled through metadata. Existing screen-casting technologies, however, do not consider using metadata information to control and operate audio playback, so advanced audio technologies cannot be applied to them, rendering control of three-dimensional sound cannot be realized, and the user's experience while watching cast content is reduced.
Disclosure of Invention
In order to solve one of the above technical defects, the embodiments of the present application provide an interaction method, a mobile terminal, a playing terminal, and an interaction system.
A first aspect of an embodiment of the present application provides an interaction method, where the method is applied to a mobile terminal and includes:
connecting a playing terminal and pushing a program stream to the playing terminal;
acquiring audio metadata information from the program stream, which the playing terminal obtains by parsing while playing the program stream, wherein the audio metadata information contains interactive control content;
displaying the interactive control content in a display interface; and
receiving an operation instruction input by a user through the display interface, and sending the operation instruction to the playing terminal, so that the playing terminal sets audio parameters in the program stream according to the operation instruction.
A second aspect of an embodiment of the present application provides an interaction method, where the method is applied to a playing terminal, the playing terminal includes a player and a screen-casting application installed in the player, and the method includes:
the screen-casting application connects to the mobile terminal and receives the program stream pushed by the mobile terminal;
the player plays the program stream and parses it to obtain audio metadata information in the program stream, wherein the audio metadata information contains interactive control content;
the player synchronizes the audio metadata information to the screen-casting application, and the screen-casting application sends the audio metadata information to the mobile terminal;
the screen-casting application receives an operation instruction, sent by the mobile terminal and input by the user according to the interactive control content, converts the operation instruction into metadata setting information, and sends the metadata setting information to the player; and
the player sets the audio parameters in the program stream according to the metadata setting information.
A third aspect of an embodiment of the present application provides a mobile terminal, where the mobile terminal includes a processor with built-in operation instructions executable by the processor to perform the interaction method according to the first aspect of the embodiment of the present application.
A fourth aspect of an embodiment of the present application provides a playing terminal, where the playing terminal includes a processor with built-in operation instructions executable by the processor to perform the interaction method according to the second aspect of the embodiment of the present application.
A fifth aspect of an embodiment of the present application provides an interaction system, where the interaction system includes the mobile terminal according to the third aspect and the playing terminal according to the fourth aspect of the embodiment of the present application.
By adopting the interaction method provided by the embodiments of the present application, the mobile terminal obtains the audio metadata information of the program stream, the interactive control content in the audio metadata information is displayed on the mobile terminal, and the user's operations on the mobile terminal then set the audio parameters of the playing terminal, so that audio rendering is controlled while the program stream is playing and the user's audio-visual experience is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
Fig. 1 is a flow chart of the interaction method provided in embodiment 1 of the present application;
Fig. 2 is a schematic diagram of the display interface of the mobile terminal according to embodiment 1 of the present application;
Fig. 3 is a schematic diagram of the interaction method according to embodiment 2 of the present application;
Fig. 4 is a schematic diagram of the interaction system according to embodiment 5 of the present application.
Detailed Description
In order to make the technical solutions and advantages of the embodiments of the present application more apparent, exemplary embodiments of the present application are described in detail below in conjunction with the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present application. It should be noted that, where no conflict arises, the embodiments of the present application and the features in the embodiments may be combined with each other.
Example 1
As shown in Fig. 1, this embodiment proposes an interaction method, which is applied to a mobile terminal and includes:
S101, connecting the playing terminal and pushing the program stream to the playing terminal.
Specifically, in this embodiment, before using the screen-casting function, the user first plays the video content to be watched on the mobile terminal, and a corresponding program stream is generated while the video content plays. The video content may be played through video application software installed on the mobile terminal; this embodiment does not limit the type of video application software, which mainly depends on the user's habits. When the mobile terminal is connected with the playing terminal, the mobile terminal pushes the program stream of the video content being played to the playing terminal.
In this process, the mobile terminal needs to be connected to the playing terminal in advance so that it can push the program stream to it. In this embodiment, the user may search for the playing terminal through the mobile terminal. It should be noted that, since the mobile terminal and the playing terminal generally connect wirelessly, both must be located in the same network; in the case of home use, they are in the same home network environment, which makes the connection between them possible. Meanwhile, the connection relationship between the mobile terminal and the playing terminal needs to be identified, because it cannot be ruled out that the playing terminal is a device that does not support screen casting, or that the playing terminal does not allow certain mobile terminals to connect. Therefore, only when the connection relationship between the playing terminal and the mobile terminal is identified as connectable can the mobile terminal connect to the playing terminal and push the program stream to it.
In this embodiment, the mobile terminal performs the whole sequence of searching for the playing terminal, connecting to it, and pushing the program stream to it based on the DLNA protocol; using DLNA makes interconnection and media sharing between the mobile terminal and the playing terminal in the home network environment more convenient.
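By way of illustration only, the following Python sketch shows one possible way to perform the two DLNA steps just described: discovering a media renderer with an SSDP search and pushing a program-stream URL to it with the standard UPnP AVTransport action SetAVTransportURI. The control URL, the stream URL, and the helper names are assumptions made for the example and are not specified by this application.

```python
# Hypothetical sketch: discover a DLNA renderer via SSDP and push a stream URL
# to it with the standard UPnP AVTransport "SetAVTransportURI" action.
import socket
import urllib.request

SSDP_ADDR = ("239.255.255.250", 1900)
MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: urn:schemas-upnp-org:device:MediaRenderer:1\r\n\r\n"
)

def discover_renderers(timeout: float = 2.0) -> list[str]:
    """Broadcast an SSDP M-SEARCH and collect the LOCATION headers of renderers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(MSEARCH.encode(), SSDP_ADDR)
    locations = []
    try:
        while True:
            data, _ = sock.recvfrom(4096)
            for line in data.decode(errors="ignore").splitlines():
                if line.lower().startswith("location:"):
                    locations.append(line.split(":", 1)[1].strip())
    except socket.timeout:
        pass
    return locations

def push_stream(control_url: str, stream_url: str) -> None:
    """Send SetAVTransportURI so the renderer starts fetching the program stream."""
    body = f"""<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
      <CurrentURI>{stream_url}</CurrentURI>
      <CurrentURIMetaData></CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>"""
    req = urllib.request.Request(
        control_url,
        data=body.encode(),
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"',
        },
    )
    urllib.request.urlopen(req, timeout=5)
```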
S102, acquiring the audio metadata information in the program stream, which the playing terminal obtains by parsing while playing the program stream.
Specifically, in this embodiment, after the playing terminal obtains the program stream pushed by the mobile terminal, it can play the video content corresponding to the program stream in its player. The playing terminal parses the program stream while playing it and thereby obtains the audio metadata information. The audio metadata information includes, but is not limited to, encoding information, sound objects, rendering settings, and the like. The audio metadata information in this embodiment may be Audio Vivid three-dimensional sound metadata information. Audio Vivid is a three-dimensional sound technology independently developed in China; it supports mainstream three-dimensional sound coding, is compatible with mono, stereo, surround sound, and three-dimensional sound, can accurately place and move a sound at any position in three-dimensional space, accurately describes the position, size, trajectory, time, and duration of each sound, and can be applied to many scenarios such as home environments, cinemas, concerts, sports events, personal listening, AR/VR, and in-vehicle use.
After the playing terminal parses out the audio metadata information, it sends the information to the mobile terminal whenever the mobile terminal requests it. In this embodiment, this procedure is implemented under the DLNA protocol over the home network. First, this embodiment defines the acquisition of audio metadata information, including the extended DLNA standard interface (getPositionInfo), a newly added extension tag (audioData), and custom parameters. The mobile terminal then actively requests the audio metadata information from the playing terminal through the extended DLNA standard interface (getPositionInfo) at a preset request frequency. The request frequency defaults to one request per second; of course, this parameter can be adjusted according to the actual situation. Finally, whenever the mobile terminal sends a request, the playing terminal returns the audio metadata information through the extended DLNA standard interface (getPositionInfo).
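A minimal sketch of the mobile-terminal side of this polling, assuming a hypothetical AVTransport control URL and assuming that the playing terminal returns the newly added audioData extension field inside the GetPositionInfo response as described above:

```python
# Minimal sketch: poll the extended GetPositionInfo action about once per second
# and pull the custom <audioData> field out of the response. The control URL is
# hypothetical; <audioData> is the extension of this embodiment, not standard DLNA.
import json
import time
import urllib.request
import xml.etree.ElementTree as ET

GET_POSITION_INFO = """<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:GetPositionInfo xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
    </u:GetPositionInfo>
  </s:Body>
</s:Envelope>"""

def poll_audio_metadata(control_url: str, interval: float = 1.0):
    """Yield the parsed audioData JSON at the preset request frequency."""
    while True:
        req = urllib.request.Request(
            control_url,
            data=GET_POSITION_INFO.encode(),
            headers={
                "Content-Type": 'text/xml; charset="utf-8"',
                "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#GetPositionInfo"',
            },
        )
        xml_reply = urllib.request.urlopen(req, timeout=5).read().decode()
        root = ET.fromstring(xml_reply)
        # The extended tag added by this embodiment; absent on ordinary renderers.
        node = root.find(".//audioData")
        if node is not None and node.text:
            yield json.loads(node.text)
        time.sleep(interval)
```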
To further explain the audio metadata acquisition process of this embodiment, an example of the audio metadata acquisition definition is given below (an illustrative JSON sketch follows the parameter list):
Parameter name: audio metadata information (audioData);
Parameter type: String;
Parameter data structure: JSON;
Parameter details: commentary (explanation), position (audioLocation), sound effect (soundEffect), volume (volume).
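By way of illustration only, the audioData string might carry a JSON payload of the following shape. The four field names follow the definition above, and the 0 to 2 and 0 to 100 ranges follow the value definitions given later in this embodiment; the concrete option lists and default values are assumptions made for the example.

```python
# Illustrative shape of the audioData payload (parameter type String, data
# structure JSON). Options and defaults below are assumptions for illustration.
audio_data_example = {
    "explanation": {            # commentary: available commentary tracks
        "options": ["original", "narrated", "none"],
        "current": 0,
    },
    "audioLocation": 50,        # position: channel/object position, e.g. 0-100
    "soundEffect": {            # sound effect: selectable rendering presets
        "options": ["standard", "theater", "stadium"],
        "current": 0,
    },
    "volume": 80,               # volume: current level, e.g. 0-100
}
```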
The above definition contains the interactive control content in the audio metadata information, that is, the parameters whose setting requires the user's participation: commentary, position, sound effect, and volume. The interactive control content can be adjusted, extended, or reduced according to the actual video content being played and the user's actual needs. When the playing terminal sends the audio metadata information to the mobile terminal, the interactive control content contained in it is sent along with it, so that the user can subsequently participate in audio control.
And S103, displaying the interaction control content in a display interface.
Specifically, after the mobile terminal obtains the audio metadata information sent by the playing terminal, it displays the interactive control content included in the audio metadata information in the display interface of the terminal device, as shown in Fig. 2; in the above example this content comprises commentary, position, sound effect, and volume. These items may be presented in the display interface with visual effects to facilitate viewing and selection by the user.
S104, receiving an operation instruction input by a user through a display interface, and sending the operation instruction to a playing terminal so that the playing terminal can set audio parameters in a program stream according to the operation instruction.
Specifically, in this embodiment, the mobile terminal is a device capable of providing an input path for the user, including but not limited to a touch-screen device, a key-operated device, a voice-controlled device, and the like, which is not particularly limited. Taking a touch-screen device as an example, the user can input text, voice, and other information through the display interface of the mobile terminal, and can also adjust the relevant content on the display interface by tapping it.
In this embodiment, when the mobile terminal displays the interactive control content in the display interface, the user may adjust it according to his or her own needs. Taking Fig. 2 as an example, the user may adjust the commentary audio, including switching the commentary type and adjusting its volume level and channel position, and may also adjust the sound effect, including switching the sound-effect type and adjusting its volume level. The mobile terminal converts the user's adjustments of the interactive control content into an operation instruction and sends the operation instruction to the playing terminal.
When the mobile terminal sends an operation instruction to the playing terminal, this embodiment also defines the interaction for the operation instruction, including extending the DLNA standard interface (SetAVTransportURI) and customizing parameters in its existing tag (<CurrentURIMetaData>). Taking the setting of commentary as an example, this embodiment provides the following data format (an illustrative sketch follows the parameter list):
Parameter name: event; type: Object.
Parameter name: type; type: String. Custom instruction information whose instruction type corresponds to the audio metadata information, for example: selecting a commentary (explanation), setting the position (audioLocation), setting the sound effect (soundEffect), setting the volume (volume).
Parameter name: value; type: String. Custom instruction information corresponding to the type, for example: selection of the commentary (0, 1, 2), setting of the position (a value between 0 and 100), setting of the sound effect (0, 1, 2), setting of the volume (a value between 0 and 100).
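An illustrative sketch of such an operation instruction, built as the event/type/value structure defined above; the helper function name is an assumption for the example.

```python
# Illustrative sketch of an operation instruction: an "event" object with a
# "type" naming the instruction and a "value" string, as defined above.
import json

def build_instruction(instruction_type: str, value: str) -> str:
    """Serialize one user operation, e.g. build_instruction("volume", "80")."""
    event = {"event": {"type": instruction_type, "value": value}}
    return json.dumps(event, ensure_ascii=False)

# Examples matching the types listed in this embodiment:
select_commentary = build_instruction("explanation", "1")      # commentary 0/1/2
set_position      = build_instruction("audioLocation", "35")   # position 0-100
set_sound_effect  = build_instruction("soundEffect", "2")      # effect 0/1/2
set_volume        = build_instruction("volume", "80")          # volume 0-100
```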
After the mobile terminal captures the user's operation on the display interface and converts it into an operation instruction, it sends the operation instruction to the playing terminal through the extended DLNA standard interface (SetAVTransportURI). The playing terminal can then render different sound objects and configure the audio-visual environment according to the operation instruction, thereby realizing different audio rendering modes for screen casting in a home environment.
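Under the same illustrative assumptions as the earlier SOAP sketches, the operation instruction could be carried in the <CurrentURIMetaData> argument of the extended SetAVTransportURI action; how the JSON is escaped into the XML argument is an implementation detail that this application does not fix.

```python
# Sketch (assumption): carry the instruction JSON inside <CurrentURIMetaData>
# of the extended SetAVTransportURI action.
import urllib.request
from xml.sax.saxutils import escape

def send_instruction(control_url: str, instruction_json: str) -> None:
    body = f"""<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
      <CurrentURI></CurrentURI>
      <CurrentURIMetaData>{escape(instruction_json)}</CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>"""
    req = urllib.request.Request(
        control_url,
        data=body.encode(),
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"',
        },
    )
    urllib.request.urlopen(req, timeout=5)
```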
In this embodiment, the mobile terminal obtains the audio metadata information of the program stream, the interactive control content in the audio metadata information is displayed on the mobile terminal, and the user's operations on the mobile terminal then set the audio parameters of the playing terminal, so that audio rendering is controlled while the program stream is playing and the user's audio-visual experience is improved.
Example 2
As shown in Fig. 3, this embodiment proposes an interaction method applied to a playing terminal, where the playing terminal includes a player and a screen-casting application installed in the player. The method comprises the following steps:
S201, the screen-casting application connects to the mobile terminal and receives the program stream pushed by the mobile terminal.
Specifically, in this embodiment, the playing terminal includes a player and a screen-casting application installed in the player. The player is mainly responsible for playing and processing the video content. The screen-casting application is mainly responsible for data transmission between the playing terminal and the mobile terminal. The screen-casting application can be pre-installed in the player or provided for online download; this embodiment is not particularly limited in this respect. The screen-casting application can be used together with the video application software on the mobile terminal. When the mobile terminal searches for a connectable playing terminal, the screen-casting application in the playing terminal can connect to the mobile terminal using the DLNA protocol, so that the program stream of the video content being played on the mobile terminal is sent to the player.
S202, the player plays the program stream and analyzes the program stream to obtain audio metadata information in the program stream.
Specifically, the player plays the program stream sent by the mobile terminal and parses it at the same time, so as to obtain the audio metadata information. The audio metadata information includes, but is not limited to, encoding information, sound objects, rendering settings, and the like. The audio metadata information in this embodiment may be Audio Vivid three-dimensional sound metadata information.
In this embodiment, the audio metadata information contains the interactive control content, that is, the parameters whose setting requires the user's participation. The interactive control content can be adjusted, extended, or reduced according to the actual video content being played and the user's actual needs.
S203, the player synchronizes the audio metadata information to the screen-casting application, and the screen-casting application sends the audio metadata information to the mobile terminal.
Specifically, in this embodiment, when the player obtains the audio metadata information, it synchronizes the information to the screen-casting application through an interface service, and when the mobile terminal requests the audio metadata information, the screen-casting application sends it to the mobile terminal. In this embodiment, this procedure is implemented under the DLNA protocol over the home network. First, the audio metadata information is defined; then the mobile terminal actively requests the audio metadata information from the screen-casting application through the extended DLNA standard interface at a preset request frequency; finally, whenever the mobile terminal sends a request, the screen-casting application returns the audio metadata information through the extended DLNA standard interface. For the specific content, reference may be made to embodiment 1, which is not repeated here.
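By way of illustration only, the synchronization inside the playing terminal might be organized around a small shared cache: the player writes freshly parsed metadata into it through the interface service, and the screen-casting application reads it whenever it answers a request from the mobile terminal. The class and method names are assumptions for the example.

```python
# Hypothetical sketch of the player-to-application synchronization inside the
# playing terminal. All names are illustrative, not defined by this application.
import json
import threading

class AudioMetadataCache:
    """Shared between the player (writer) and the screen-casting app (reader)."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._audio_data: dict = {}

    def update_from_player(self, parsed_metadata: dict) -> None:
        """Called by the player's interface service after each parse."""
        with self._lock:
            self._audio_data = parsed_metadata

    def as_audio_data_string(self) -> str:
        """Value placed into the extended <audioData> tag of the DLNA response."""
        with self._lock:
            return json.dumps(self._audio_data, ensure_ascii=False)
```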
S204, the screen-casting application receives an operation instruction, sent by the mobile terminal and input by the user according to the interactive control content, converts the operation instruction into metadata setting information, and sends the metadata setting information to the player.
Specifically, in this embodiment, after the user adjusts the interactive control content on the mobile terminal according to his or her own needs, the mobile terminal sends the user's operation instruction to the screen-casting application. The screen-casting application first receives the operation instruction, then converts it into metadata setting information and sends the metadata setting information to the player so that the player can perform the subsequent settings.
In this process, this embodiment defines the interaction of the operation instruction, including the extended DLNA standard interface and the custom parameters in its existing tag; for the specific content, reference may be made to embodiment 1, which is not repeated here.
S205, the player sets the audio parameters in the program stream according to the metadata setting information.
Specifically, after the mobile terminal captures the user's operation on the display interface and converts it into an operation instruction, it sends the operation instruction to the screen-casting application through the extended DLNA standard interface. The screen-casting application converts the operation instruction into metadata setting information and sends it to the player. The player can then render different sound objects and configure the audio-visual environment according to the metadata setting information, thereby realizing different audio rendering modes for screen casting in a home environment.
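By way of illustration only, the following sketch shows one possible mapping from an incoming operation instruction to metadata setting information on the screen-casting application side, and how the player might hand each setting to its audio renderer; the mapping table and the Renderer interface are assumptions for the example, not structures defined by this application.

```python
# Illustrative only: operation instruction -> metadata setting information ->
# renderer parameters. The renderer interface is hypothetical.
import json

INSTRUCTION_TO_METADATA_KEY = {
    "explanation": "commentary_track",
    "audioLocation": "object_position",
    "soundEffect": "effect_preset",
    "volume": "output_volume",
}

def to_metadata_setting(instruction_json: str) -> dict:
    """Screen-casting application side: operation instruction -> metadata setting info."""
    event = json.loads(instruction_json)["event"]
    key = INSTRUCTION_TO_METADATA_KEY[event["type"]]
    return {key: event["value"]}

def apply_metadata_setting(renderer, setting: dict) -> None:
    """Player side: hand each setting to the (hypothetical) audio renderer."""
    for key, value in setting.items():
        renderer.set_parameter(key, value)   # e.g. ("output_volume", "80")
```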
In this embodiment, likewise, the mobile terminal obtains the audio metadata information of the program stream, the interactive control content in the audio metadata information is displayed on the mobile terminal, and the user's operations on the mobile terminal then set the audio parameters of the playing terminal, so that audio rendering is controlled while the program stream is playing and the user's audio-visual experience is improved.
Example 3
This embodiment proposes a mobile terminal, which includes a processor with built-in operation instructions executable by the processor to perform the following steps:
connecting a playing terminal and pushing a program stream to the playing terminal;
acquiring audio metadata information from the program stream, which the playing terminal obtains by parsing while playing the program stream, wherein the audio metadata information contains interactive control content;
displaying the interactive control content in a display interface; and
receiving an operation instruction input by a user through the display interface, and sending the operation instruction to the playing terminal, so that the playing terminal sets audio parameters in the program stream according to the operation instruction.
The specific working process of the mobile terminal according to the embodiment may refer to the content of embodiment 1, and this embodiment will not be described in detail.
Example 4
This embodiment provides a playing terminal, which includes a player, a screen-casting application installed in the player, and a processor with built-in operation instructions executable by the processor to perform the following steps:
the screen-casting application connects to the mobile terminal and receives the program stream pushed by the mobile terminal;
the player plays the program stream and parses it to obtain audio metadata information in the program stream, wherein the audio metadata information contains interactive control content;
the player synchronizes the audio metadata information to the screen-casting application, and the screen-casting application sends the audio metadata information to the mobile terminal;
the screen-casting application receives an operation instruction, sent by the mobile terminal and input by the user according to the interactive control content, converts the operation instruction into metadata setting information, and sends the metadata setting information to the player; and
the player sets the audio parameters in the program stream according to the metadata setting information.
The specific working process of the playing terminal provided in this embodiment may refer to the content of embodiment 2, and this embodiment will not be described in detail.
Example 5
As shown in Fig. 4, this embodiment proposes an interaction system, which includes a mobile terminal and a playing terminal; the mobile terminal and the playing terminal can be connected under the same home network, and the video content of the mobile terminal can be cast to the playing terminal for playing.
In this system, the mobile terminal obtains the audio metadata information of the program stream, the interactive control content in the audio metadata information is displayed on the mobile terminal, and the user's operations on the mobile terminal then set the audio parameters of the playing terminal, so that audio rendering is controlled while the program stream is playing and the user's audio-visual experience is improved.
In the present application, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral formation; it may be a mechanical connection, an electrical connection, or mutual communication; it may be a direct connection or an indirect connection through an intermediate medium, and it may be an internal communication between two elements or an interaction relationship between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (14)

1. An interaction method, characterized in that the method is applied to a mobile terminal, the method comprising:
connecting a playing terminal and pushing a program stream to the playing terminal;
acquiring audio metadata information from the program stream, which the playing terminal obtains by parsing while playing the program stream, wherein the audio metadata information contains interactive control content;
displaying the interactive control content in a display interface; and
receiving an operation instruction input by a user through the display interface, and sending the operation instruction to the playing terminal, so that the playing terminal sets audio parameters in the program stream according to the operation instruction.
2. The method of claim 1, wherein the process of acquiring the audio metadata information in the program stream obtained while the playing terminal plays the program stream comprises:
defining audio metadata information acquisition parameters;
presetting a request frequency; and
requesting the playing terminal to acquire the audio metadata information through a DLNA standard interface according to the acquisition parameters and the request frequency, wherein the audio metadata information is Audio Vivid three-dimensional sound metadata information.
3. The method of claim 2, wherein the defined acquisition parameters comprise: an extended DLNA standard interface, a newly added extension tag, and custom parameters.
4. The method of claim 1, wherein the process of sending the operation instruction to the playing terminal comprises:
defining interaction parameters of the operation instruction; and
sending the operation instruction to the playing terminal through a DLNA standard interface according to the interaction parameters.
5. The method of claim 4, wherein the defined interaction parameters comprise: an extended DLNA standard interface and custom parameters in its existing tag.
6. The method according to claim 1, wherein the method further comprises:
searching for a playing terminal in a network and identifying the connection relationship between the playing terminal and the mobile terminal, wherein the playing terminal and the mobile terminal are in the same network; and
when the connection relationship between the playing terminal and the mobile terminal is connectable, connecting the playing terminal with the mobile terminal, and pushing, by the mobile terminal, the program stream to the playing terminal.
7. An interaction method, wherein the method is applied to a playing terminal, the playing terminal comprises a player and a screen-casting application installed in the player, and the method comprises the following steps:
the screen-casting application connects to the mobile terminal and receives the program stream pushed by the mobile terminal;
the player plays the program stream and parses it to obtain audio metadata information in the program stream, wherein the audio metadata information contains interactive control content;
the player synchronizes the audio metadata information to the screen-casting application, and the screen-casting application sends the audio metadata information to the mobile terminal;
the screen-casting application receives an operation instruction, sent by the mobile terminal and input by the user according to the interactive control content, converts the operation instruction into metadata setting information, and sends the metadata setting information to the player; and
the player sets the audio parameters in the program stream according to the metadata setting information.
8. The method of claim 7, wherein the step of the screen-casting application sending the audio metadata information to the mobile terminal comprises:
defining audio metadata information acquisition parameters; and
receiving an audio metadata information acquisition request sent by the mobile terminal, and sending the audio metadata information to the mobile terminal through a DLNA standard interface according to the acquisition parameters and the acquisition request, wherein the audio metadata information is Audio Vivid three-dimensional sound metadata information.
9. The method of claim 8, wherein the defined acquisition parameters comprise: an extended DLNA standard interface, a newly added extension tag, and custom parameters.
10. The method of claim 7, wherein the process in which the screen-casting application receives the operation instruction, sent by the mobile terminal and input by the user according to the interactive control content, comprises:
defining interaction parameters of the operation instruction; and
acquiring the operation instruction sent by the mobile terminal through a DLNA standard interface according to the interaction parameters.
11. The method of claim 10, wherein the defined interaction parameters comprise: an extended DLNA standard interface, a newly added extension tag, and custom parameters.
12. A mobile terminal, characterized in that it comprises a processor having built-in processor-executable operating instructions to perform the interaction method according to any of claims 1 to 6.
13. A playback terminal, characterized in that the playback terminal comprises a processor with processor-executable operating instructions built in to perform the interaction method as claimed in any one of claims 7 to 11.
14. An interactive system, characterized in that it comprises a mobile terminal according to claim 12 and a playing terminal according to claim 13.
CN202410199134.4A 2024-02-22 2024-02-22 Interaction method, mobile terminal, playing terminal and interaction system Pending CN118138824A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410199134.4A CN118138824A (en) 2024-02-22 2024-02-22 Interaction method, mobile terminal, playing terminal and interaction system

Publications (1)

Publication Number Publication Date
CN118138824A true CN118138824A (en) 2024-06-04

Family

Family ID: 91233847

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410199134.4A Pending CN118138824A (en) 2024-02-22 2024-02-22 Interaction method, mobile terminal, playing terminal and interaction system

Country Status (1)

Country Link
CN (1) CN118138824A (en)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination