CN114584809A - Data transmission method and system and electronic equipment


Info

Publication number: CN114584809A
Application number: CN202011391638.4A
Authority: CN (China)
Prior art keywords: image frame, data packet, frame, electronic device, transmission channel
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 易立
Applicant and current assignee: Huawei Technologies Co Ltd

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/24 Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

Embodiments of this application provide a data transmission method, a data transmission system, and an electronic device. In the method, a first electronic device transmits image frames on a first transmission channel and, in a delayed redundant transmission mode, repeats those image frames on a second transmission channel and a third transmission channel, where the frame types of the image frames repeated on the second transmission channel differ from those repeated on the third transmission channel. After receiving an image frame on the first transmission channel, the second electronic device can therefore also receive the same image frame on another channel, which reduces the influence of interference on image frame transmission and improves the user experience.

Description

Data transmission method and system and electronic equipment
Technical Field
Embodiments of this application relate to the field of terminal devices, and in particular to a data transmission method, a data transmission system, and an electronic device.
Background
Current smart televisions come in two forms: in one, the host and the screen are an integrated design; in the other, the host and the screen are designed as separate units. Because wireless image transmission technology for consumer scenarios has not yet achieved a breakthrough, the split design still relies on a cable connection to transfer information between the host and the screen. As wireless technology develops and matures further, transmitting information between the host and the screen wirelessly is expected to become a typical scenario for future smart televisions.
However, when the data transmitted between the host and the screen is disturbed by an interference source, the display quality of the wirelessly transmitted image degrades and the user experience suffers.
Disclosure of Invention
To solve this technical problem, the present application provides a data transmission method, a data transmission system, and an electronic device. In the method, the first electronic device can transmit redundant image frames of designated types on different transmission channels in a delayed redundant transmission mode, so as to overcome the influence of interference on image frame transmission and improve the user experience.
In a first aspect, an embodiment of the present application provides a data transmission system. The system includes a first electronic device and a second electronic device. The first electronic device transmits a first image frame on a first transmission channel. After the first image frame has been completely transmitted, the first electronic device transmits a second image frame on the first transmission channel and transmits a first redundant image frame on a second transmission channel. The image data of the first redundant image frame is the same as the image data of the first image frame, and the frame types of the first image frame and the second image frame differ. After the second image frame has been transmitted, the first electronic device transmits a second redundant image frame on a third transmission channel, where the image data of the second redundant image frame is the same as the image data of the second image frame. The second electronic device receives the first image frame and the second image frame from the first transmission channel. After receiving the first image frame, the second electronic device receives the first redundant image frame from the second transmission channel; after receiving the second image frame, it receives the second redundant image frame from the third transmission channel. The second electronic device may then discard the first redundant image frame and the second redundant image frame. In this way, the first electronic device transmits redundant image frames of designated types on different transmission channels in a delayed redundant transmission manner. After receiving an original image frame (for example, the first image frame) on the first transmission channel, the second electronic device can also receive the corresponding redundant image frame (for example, the first redundant image frame) with a delay on another transmission channel. If any image frame on the first transmission channel is disturbed, the second electronic device can still obtain the corresponding image frame from another channel, which overcomes the influence of interference on image frame transmission and improves the user experience.
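For illustration only, the following Python sketch outlines the sender-side scheduling described in the first aspect. The channel numbers, the frame-type-to-channel mapping, and the send() callable are assumptions made for the example rather than details taken from the embodiments; a real implementation would typically drive the channels in parallel rather than sequentially.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class ImageFrame:
    frame_id: int
    frame_type: str  # e.g. "I", "P" or "B"
    data: bytes

def redundancy_channel(frame_type: str) -> int:
    # Hypothetical mapping: redundant copies of each frame type are carried on
    # their own transmission channel, leaving the original stream on channel 1.
    return {"I": 2, "P": 3, "B": 4}[frame_type]

def send_with_delayed_redundancy(frames: Iterable[ImageFrame],
                                 send: Callable[[int, ImageFrame], None]) -> None:
    """send(channel, frame) is assumed to return once the frame is fully transmitted."""
    for frame in frames:
        send(1, frame)                                     # original frame on the first channel
        send(redundancy_channel(frame.frame_type), frame)  # delayed redundant copy
```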
Illustratively, the first electronic device may be a host and the second electronic device may be a television.
For example, the second image frame may be transmitted before the first redundant image frame, after the first redundant image frame, or at the same time as the first redundant image frame.
Correspondingly, the second electronic device may receive the second image frame and the first redundant image frame at the same time, may receive the second image frame before the first redundant image frame, or may receive the first redundant image frame before the second image frame.
According to the first aspect, the first electronic device is further configured to transmit a third image frame on the first transmission channel after the second image frame has been transmitted, where the frame type of the third image frame differs from that of the first image frame and the second image frame. After the third image frame has been transmitted, the first electronic device transmits a third redundant image frame on a fourth transmission channel, where the image data of the third redundant image frame is the same as the image data of the third image frame. Correspondingly, the second electronic device is further configured to detect whether the third image frame has been completely received from the first transmission channel. When the second electronic device detects that the third image frame has not been completely received, it displays the image data of the third redundant image frame received from the fourth transmission channel. In this way, if the third image frame suffers packet loss or data loss because of interference during transmission, the second electronic device can still obtain the third redundant image frame, whose image data is the same as that of the third image frame, from the fourth transmission channel. Other image frames that rely on the third image frame for inter-frame prediction can then be correctly decoded based on the third redundant image frame, which overcomes the influence of interference on image display and improves the user experience.
According to the first aspect or any one of the foregoing implementation manners of the first aspect, the second electronic device is further configured to display the image data of the third image frame when it detects that the third image frame has been completely received, and then to discard the third redundant image frame received from the fourth transmission channel. In this way, once the second electronic device detects that the image frame has been completely received, it can directly display the image data of the third image frame and discard the third redundant image frame, which reduces the load on the device and the buffer occupancy and improves resource utilization.
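A minimal sketch, with hypothetical helper callables, of the receiver-side decision described in the two implementations above: if the original frame arrived completely on the first transmission channel it is displayed and the redundant copy is discarded; otherwise the redundant copy received on the redundancy channel is displayed instead.

```python
def handle_received_frame(frame_id: int,
                          receive_original,   # assumed: returns frame data, or None if incomplete
                          receive_redundant,  # assumed: returns the delayed redundant copy
                          display,
                          discard) -> None:
    original = receive_original(frame_id)
    redundant = receive_redundant(frame_id)
    if original is not None:
        display(original)    # the original frame was completely received
        discard(redundant)   # the redundant copy is no longer needed; free the buffer
    else:
        display(redundant)   # fall back to the copy from the redundancy channel
```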
According to the first aspect or any one of the above implementation manners of the first aspect, the third image frame is carried in a first data packet and a second data packet. Accordingly, the second electronic device is configured to receive the first data packet from the first transmission channel. When the second data packet is not received within a set first duration, the second electronic device determines that the third image frame has not been completely received. In this way, if the second data packet does not arrive within the set first duration, the second electronic device can conclude that the third image frame was not completely transmitted and can immediately obtain the corresponding third redundant image frame from the fourth transmission channel, reducing the impact of packet loss on data processing latency.
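As one possible realization of the first-duration check, the sketch below treats the frame as incomplete when its remaining packets do not arrive before a deadline. The receive_packet(timeout) primitive, which returns None when nothing arrives in time, is an assumption.

```python
import time

def frame_complete_within(receive_packet, expected_packets: int, first_duration_s: float) -> bool:
    """Return True only if all packets of the frame arrive before the deadline."""
    deadline = time.monotonic() + first_duration_s
    received = 0
    while received < expected_packets:
        remaining = deadline - time.monotonic()
        if remaining <= 0 or receive_packet(remaining) is None:
            return False   # the set first duration elapsed: frame not completely received
        received += 1
    return True
```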
According to the first aspect or any one of the above implementation manners of the first aspect, the third image frame is carried in the first data packet and the second data packet. Correspondingly, the second electronic device is configured to receive the first data packet and the second data packet from the first transmission channel and to perform an integrity check on the first data packet and the second data packet according to a cyclic redundancy check (CRC) field carried by the second data packet. When the integrity check of the first data packet and the second data packet fails, the second electronic device determines that the third image frame has not been completely received. In this way, the second electronic device can also determine whether the third image frame has been completely received based on the integrity check, and if it has not, immediately obtain the corresponding third redundant image frame from the fourth transmission channel, reducing the impact of packet loss on data processing latency.
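A sketch of the integrity check, using CRC-32 from Python's zlib module as a stand-in for whatever CRC variant the CRC field of the second data packet actually carries:

```python
import zlib

def frame_passes_crc(packet_payloads, crc_from_last_packet: int) -> bool:
    """packet_payloads: payloads of the first and second data packets, in order.
    crc_from_last_packet: value of the CRC field carried by the second data packet."""
    computed = zlib.crc32(b"".join(packet_payloads)) & 0xFFFFFFFF
    return computed == crc_from_last_packet
```

If the check returns False, the receiver treats the third image frame as not completely received and turns to the fourth transmission channel for the third redundant image frame.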
According to the first aspect or any one of the foregoing implementation manners of the first aspect, the third image frame is carried in a first data packet and a second data packet that carry first number information, and the third redundant image frame is carried in a third data packet and a fourth data packet that carry the same first number information. In this way, the second electronic device can use the number information carried in the data packets to match frames: the redundant image frame corresponding to the third image frame is the third redundant image frame, carried in data packets whose number information is consistent with that of the third image frame.
According to the first aspect or any one of the foregoing implementation manners of the first aspect, the first electronic device is further configured to transmit a fourth image frame on the first transmission channel after the third image frame has been transmitted, where the fourth image frame has the same frame type as the first image frame, the fourth image frame is carried in a fifth data packet, and the fifth data packet carries second number information that is different from the first number information. After the fourth image frame has been transmitted, the first electronic device transmits a fourth redundant image frame on the second transmission channel, where the image data of the fourth redundant image frame is the same as the image data of the fourth image frame. Correspondingly, the second electronic device is further configured to receive the first data packet and the fifth data packet from the first transmission channel, and to determine that the third image frame has not been completely received based on the first number information carried by the first data packet and the second number information carried by the fifth data packet. In this way, the second electronic device can determine whether an image frame has been completely received by examining the number information in the data packets, and when it determines that the third image frame has not been completely received, it can immediately obtain the corresponding third redundant image frame from the fourth transmission channel, reducing the impact of packet loss on data processing latency.
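The sketch below illustrates both uses of the number information: a frame is treated as incomplete when a packet carrying a new frame number arrives before all packets of the current frame, and the yielded frame numbers can then be used to look up the redundant frame whose packets carry the same number. The tuple layout of the packets is an assumption.

```python
from typing import Iterable, Iterator, Tuple

def incomplete_frames(packets: Iterable[Tuple[int, int, int]]) -> Iterator[int]:
    """packets: (frame_number, packet_index, packet_count) tuples in arrival order
    on the first transmission channel (field layout assumed). Yields the numbers
    of frames whose packets did not all arrive."""
    current, seen, total = None, 0, 0
    for frame_number, _packet_index, packet_count in packets:
        if frame_number != current:
            if current is not None and seen < total:
                yield current           # a packet with a new number arrived too early
            current, seen, total = frame_number, 0, packet_count
        seen += 1
    if current is not None and seen < total:
        yield current

# e.g. the second packet of frame 7 is lost and a packet of frame 8 arrives next:
assert list(incomplete_frames([(7, 0, 2), (8, 0, 1)])) == [7]
```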
According to the first aspect or any implementation manner of the first aspect, the first data packet carries first quantity information indicating that two data packets carry the third image frame and that the first data packet is the first of those two data packets. The second data packet carries second quantity information indicating that two data packets carry the third image frame and that the second data packet is the second of those two data packets. In this way, the second electronic device can determine, based on the quantity information carried in the data packets, whether all data packets of the third image frame have been received.
According to the first aspect, or any implementation manner of the first aspect, the first data packet and the second data packet carry channel identification information of the first transmission channel. In this way, the second electronic device may determine the frame type of the data packet based on the channel identification information in the data packet.
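One way to picture the header fields named in the preceding implementations (number information, quantity information, channel identification information and the CRC field) is the packed layout below. The field widths, order and names are assumptions for illustration; the embodiments do not prescribe a byte layout.

```python
import struct

# Assumed layout: frame number, packet count, packet index, channel id,
# CRC-32 of the whole frame payload (meaningful only in the last packet).
HEADER = struct.Struct("!IHHBI")

def pack_header(frame_number: int, packet_count: int, packet_index: int,
                channel_id: int, crc: int = 0) -> bytes:
    return HEADER.pack(frame_number, packet_count, packet_index, channel_id, crc)

def unpack_header(raw: bytes) -> dict:
    frame_number, packet_count, packet_index, channel_id, crc = HEADER.unpack_from(raw)
    return {"frame_number": frame_number, "packet_count": packet_count,
            "packet_index": packet_index, "channel_id": channel_id, "crc": crc}
```

With such a header, the first data packet of the third image frame would carry packet_count=2 and packet_index=0, and the second data packet packet_count=2 and packet_index=1.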
According to the first aspect, or any implementation manner of the first aspect, the interval between the transmission time of the first redundant image frame and the transmission completion time of the first image frame is a set second duration, where the second duration is greater than or equal to 0 and less than the inter-frame interval between the first image frame and the second image frame. In this way, the first electronic device may transmit the first redundant image frame immediately after the first image frame has been transmitted, that is, the transmission time of the first redundant image frame is aligned with the transmission completion time of the first image frame. Alternatively, the first electronic device may transmit the first redundant image frame a second duration after the first image frame has been completely transmitted, that is, the transmission time of the first redundant image frame lags the transmission completion time of the first image frame by the second duration, which reserves processing time for the first electronic device and the second electronic device.
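A small numeric illustration of this timing constraint; the values, including the 60 fps inter-frame interval, are assumed for the example.

```python
def redundant_send_time(original_done_at_ms: float,
                        second_duration_ms: float,
                        frame_interval_ms: float) -> float:
    # Constraint from the text: 0 <= second duration < inter-frame interval.
    assert 0 <= second_duration_ms < frame_interval_ms
    return original_done_at_ms + second_duration_ms

# At an assumed 60 fps the inter-frame interval is about 16.67 ms, so a second
# duration of, say, 2 ms still leaves processing headroom on both devices.
print(redundant_send_time(100.0, 2.0, 1000 / 60))  # -> 102.0
```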
According to the first aspect or any one of the above implementation manners of the first aspect, the system further includes a third electronic device, which is configured to receive the first image frame and the second image frame from the first transmission channel, to receive the first redundant image frame from the second transmission channel after receiving the first image frame, and to receive the second redundant image frame from the third transmission channel after receiving the second image frame. The third electronic device then discards the first redundant image frame and the second redundant image frame. In this way, the data transmission method in the embodiments of this application can be applied to a multi-device interaction scenario: the first electronic device sends the image frames and the redundant image frames in a broadcast manner, and the second electronic device and the third electronic device each listen on the corresponding transmission channels to obtain the image frames and the redundant image frames.
According to the first aspect, or any implementation manner of the first aspect above, the first electronic device is further configured to establish a first connection with the second electronic device and to send indication information to the second electronic device over the first connection. The indication information indicates that the first transmission channel is to be used for transmitting an original image group, the second transmission channel for transmitting image frames of a first frame type, and the third transmission channel for transmitting image frames of a second frame type, where the original image group includes the first image frame and the second image frame. In this way, the first electronic device can indicate its delayed redundant transmission scheme to the second electronic device in advance, so that the second electronic device can listen on the corresponding transmission channels to obtain the image frames and the redundant image frames.
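A sketch of what the indication information exchanged over the first connection could look like. The JSON encoding and the field names are assumptions; the embodiments only require that the channel-to-content mapping be conveyed before transmission starts.

```python
import json

# Assumed content: channel 1 carries the original image group, channel 2 carries
# redundant image frames of the first frame type, channel 3 those of the second type.
indication = {
    "original_image_group_channel": 1,
    "redundancy_channels": {"first_frame_type": 2, "second_frame_type": 3},
}

def encode_indication(info: dict) -> bytes:
    return json.dumps(info).encode("utf-8")

def decode_indication(raw: bytes) -> dict:
    return json.loads(raw.decode("utf-8"))
```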
According to the first aspect or any one of the above implementation manners of the first aspect, the first frame type is an I-frame, and the second frame type is a P-frame or a B-frame.
According to the first aspect, or any implementation manner of the first aspect above, the first transmission channel and the second transmission channel operate on the same channel.
Illustratively, the first transmission channel, the second transmission channel, the third transmission channel and the fourth transmission channel may operate in the same frequency band, for example, a 2.4GHz frequency band or a 5GHz frequency band.
Illustratively, the first transmission channel, the second transmission channel, the third transmission channel, and the fourth transmission channel may operate on the same channel.
For example, the first transmission channel, the second transmission channel, the third transmission channel and the fourth transmission channel may use orthogonal frequency division multiple access (OFDMA) for data transmission.
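Purely as an assumption about how four transmission channels could coexist on one channel, the sketch below separates them as logical sub-channels (for example, OFDMA resource units) of a single shared Wi-Fi channel.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LogicalChannel:
    role: str       # what the transmission channel carries
    ru_index: int   # assumed OFDMA resource-unit index within the shared channel

# All four logical transmission channels share one physical channel
# (e.g. in the 5 GHz band) and are distinguished only by resource unit.
CHANNEL_PLAN = (
    LogicalChannel("original image frames", 0),
    LogicalChannel("redundant I frames", 1),
    LogicalChannel("redundant P frames", 2),
    LogicalChannel("redundant B frames", 3),
)
```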
In a second aspect, an embodiment of the present application provides a data transmission method. The method comprises the following steps: the first electronic device transmits a first image frame on a first transmission channel. The first electronic device sends a second image frame on a first transmission channel after the first image frame is sent, and sends a first redundant image frame on a second transmission channel, wherein the image data of the first redundant image frame is the same as the image data of the first image frame, and the frame types of the first image frame and the second image frame are different; after the second image frame is sent, the first electronic device sends a second redundant image frame on a third transmission channel, wherein the image data of the second redundant image frame is the same as the image data of the second image frame; the second electronic equipment receives the first image frame and the second image frame from the first transmission channel; after receiving the first image frame, the second electronic device receives a first redundant image frame from the second transmission channel; after receiving the second image frame, the second electronic device receives a second redundant image frame from the third transmission channel; the second electronic device discards the first redundant image frame and the second redundant image frame.
According to a second aspect, the method further comprises: the first electronic equipment sends a third image frame on the first transmission channel after the second image frame is sent; wherein the third image frame is of a different frame type than the first image frame and the second image frame; after the third image frame is sent, the first electronic device sends a third redundant image frame on a fourth transmission channel, wherein the image data of the third redundant image frame is the same as the image data of the third image frame; the second electronic equipment detects whether the third image frame is completely received from the first transmission channel; when detecting that the third image frame is not completely received, the second electronic device displays the image data of the third redundant image frame received from the fourth transmission channel.
According to a second aspect, or any implementation manner of the second aspect above, the method further includes: when detecting that the third image frame is completely received, the second electronic equipment displays the image data of the third image frame; the second electronic device discards the third redundant image frame received from the fourth transmission channel.
According to a second aspect or any implementation manner of the second aspect, the third image frame is carried in the first data packet and the second data packet; the second electronic device detects whether the third image frame is completely received from the first transmission channel, and the method comprises the following steps: the second electronic equipment receives a first data packet from the first transmission channel; when the second data packet is not received within the set first duration, the second electronic device determines that the third image frame is not completely received.
According to a second aspect or any implementation manner of the second aspect above, the third image frame is carried in the first data packet and the second data packet; the second electronic device detects whether the third image frame is completely received from the first transmission channel, and the method comprises the following steps: the second electronic device receives the first data packet and the second data packet from the first transmission channel; the second electronic equipment carries out integrity check on the first data packet and the second data packet according to a Cyclic Redundancy Check (CRC) field carried by the second data packet; when the integrity check of the first data packet and the second data packet fails, the second electronic device determines that the third image frame is not completely received.
According to the second aspect, or any implementation manner of the second aspect, the third image frame is carried in the first data packet and the second data packet, the first data packet and the second data packet carry the first number information, the third redundant image frame is carried in the third data packet and the fourth data packet, and the third data packet and the fourth data packet carry the first number information.
According to the second aspect, or any implementation manner of the second aspect above, the method further includes: after the third image frame is sent, the first electronic device sends a fourth image frame on the first transmission channel, wherein the fourth image frame is of the same frame type as the first image frame, the fourth image frame is carried in a fifth data packet, and the fifth data packet carries second number information that is different from the first number information; after the fourth image frame is sent, the first electronic device sends a fourth redundant image frame on the second transmission channel, wherein the image data of the fourth redundant image frame is the same as the image data of the fourth image frame; the second electronic device receives the first data packet from the first transmission channel; the second electronic device receives the fifth data packet from the first transmission channel; and the second electronic device determines that the third image frame is not completely received according to the first number information carried by the first data packet and the second number information carried by the fifth data packet.
According to the second aspect, or any implementation manner of the second aspect above, the first data packet carries first quantity information, where the first quantity information is used to indicate that there are two data packets carrying the third image frame, and the first data packet is a first one of the two data packets carrying the third image frame; the second data packet carries second quantity information, the second quantity information is used for indicating that the number of the data packets carrying the third image frame is two, and the second data packet is the second of the two data packets carrying the third image frame.
According to the second aspect, or any implementation manner of the second aspect above, the first data packet and the second data packet carry channel identification information of the first transmission channel.
According to the second aspect, or any implementation manner of the second aspect, an interval between a transmission time of the first redundant image frame and a transmission completion time of the first image frame is a set second time length, and the second time length is greater than or equal to 0 and is less than an inter-frame interval between the first image frame and the second image frame.
According to a second aspect, or any implementation manner of the second aspect above, the method further includes: the third electronic equipment receives the first image frame and the second image frame from the first transmission channel; after receiving the first image frame, the third electronic device receives a first redundant image frame from the second transmission channel; after receiving the second image frame, the third electronic device receives a second redundant image frame from the third transmission channel; the third electronic device discards the first redundant image frame and the second redundant image frame.
According to a second aspect or any one of the above implementation manners of the second aspect, before the first electronic device sends the first image frame on the first transmission channel, the method further comprises: the method comprises the steps that first connection is established between first electronic equipment and second electronic equipment; the first electronic device sends indication information to the second electronic device through the first connection, wherein the indication information is used for indicating a first transmission channel to be used for transmitting an original image group, a second transmission channel is used for transmitting image frames of a first frame type, and a third transmission channel is used for transmitting image frames of a second frame type, and the original image group comprises the first image frame and the second image frame.
According to a second aspect or any implementation manner of the second aspect above, the first frame type is an I-frame, and the second frame type is a P-frame or a B-frame.
According to a second aspect, or any implementation manner of the second aspect above, the first transmission channel and the second transmission channel operate on the same channel.
Any one implementation manner of the second aspect and the second aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the second aspect and the second aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not repeated here.
In a third aspect, an embodiment of the present application provides a data transmission method. The method comprises the following steps: the first electronic device transmits a first image frame on a first transmission channel. The first electronic device sends a second image frame on a first transmission channel after the first image frame is sent, and sends a first redundant image frame on a second transmission channel, wherein the image data of the first redundant image frame is the same as the image data of the first image frame, and the frame types of the first image frame and the second image frame are different; and the first electronic equipment transmits a second redundant image frame on the third transmission channel after the second image frame is transmitted, wherein the image data of the second redundant image frame is the same as the image data of the second image frame.
According to a third aspect, the method further comprises: the first electronic equipment sends a third image frame on the first transmission channel after the second image frame is sent; wherein the third image frame is of a different frame type than the first image frame and the second image frame; and after the third image frame is sent, the first electronic equipment sends a third redundant image frame on a fourth transmission channel, wherein the image data of the third redundant image frame is the same as the image data of the third image frame.
According to the third aspect, or any implementation manner of the third aspect, the third image frame is carried in the first data packet and the second data packet, the first data packet and the second data packet carry the first number information, the third redundant image frame is carried in the third data packet and the fourth data packet, and the third data packet and the fourth data packet carry the first number information.
According to the third aspect, or any one of the above implementation manners of the third aspect, the method further includes: after the third image frame is sent, the first electronic device sends a fourth image frame on the first transmission channel, wherein the fourth image frame is of the same frame type as the first image frame, the fourth image frame is carried in a fifth data packet, and the fifth data packet carries second number information that is different from the first number information; and after the fourth image frame is sent, the first electronic device sends a fourth redundant image frame on the second transmission channel, wherein the image data of the fourth redundant image frame is the same as the image data of the fourth image frame.
According to the third aspect, or any implementation manner of the third aspect, the first data packet carries first quantity information, where the first quantity information is used to indicate that there are two data packets carrying the third image frame, and the first data packet is a first one of the two data packets carrying the third image frame; the second data packet carries second quantity information, the second quantity information is used for indicating that the number of the data packets carrying the third image frame is two, and the second data packet is the second of the two data packets carrying the third image frame.
According to the third aspect, or any implementation manner of the third aspect above, the first data packet and the second data packet carry channel identification information of the first transmission channel.
According to the third aspect, or any implementation manner of the third aspect, an interval between a transmission time of the first redundant image frame and a transmission completion time of the first image frame is a set second time duration, and the second time duration is greater than or equal to 0 and is less than an inter-frame interval between the first image frame and the second image frame.
According to the third aspect, or any implementation manner of the third aspect above, before the first electronic device transmits the first image frame on the first transmission channel, the method further includes: the method comprises the steps that first connection is established between first electronic equipment and second electronic equipment; the first electronic device sends indication information to the second electronic device through the first connection, wherein the indication information is used for indicating a first transmission channel to be used for transmitting an original image group, a second transmission channel is used for transmitting image frames of a first frame type, and a third transmission channel is used for transmitting image frames of a second frame type, and the original image group comprises the first image frame and the second image frame.
According to the third aspect, or any implementation manner of the third aspect above, the first frame type is an I frame, and the second frame type is a P frame or a B frame.
According to the third aspect, or any implementation manner of the third aspect above, the first transmission channel and the second transmission channel operate on the same channel.
Any one implementation manner of the third aspect corresponds to any one implementation manner of the first aspect. For technical effects corresponding to any one implementation manner of the third aspect and the third aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not repeated here.
In a fourth aspect, an embodiment of the present application provides a data transmission method. The method comprises the following steps: the second electronic device receives a first image frame and a second image frame sent by the first electronic device from the first transmission channel; after receiving the first image frame, the second electronic device receives a first redundant image frame sent by the first electronic device from a second transmission channel; the image data of the first redundant image frame is the same as the image data of the first image frame, and the frame types of the first image frame and the second image frame are different; after receiving the second image frame, the second electronic device receives a second redundant image frame sent by the first electronic device from the third transmission channel; the second electronic device discards the first redundant image frame and a second redundant image frame, wherein image data of the second redundant image frame is the same as image data of the second image frame.
According to a fourth aspect, the method further comprises: the second electronic equipment detects whether the third image frame is completely received from the first transmission channel; when detecting that the third image frame is not completely received, the second electronic equipment displays the image data of the third redundant image frame received from the fourth transmission channel; wherein the third image frame is of a different frame type than the first image frame and the second image frame; the image data of the third redundant image frame is the same as the image data of the third image frame.
According to a fourth aspect, or any implementation manner of the fourth aspect above, the method further comprises: when detecting that the third image frame is completely received, the second electronic equipment displays the image data of the third image frame; the second electronic device discards the third redundant image frame received from the fourth transmission channel.
According to a fourth aspect or any implementation manner of the fourth aspect above, the third image frame is carried in the first data packet and the second data packet; the second electronic device detects whether the third image frame is completely received from the first transmission channel, and the method comprises the following steps: the second electronic equipment receives a first data packet from the first transmission channel; and when the second data packet is not received within the set first time length, the second electronic equipment determines that the third image frame is not completely received.
According to a fourth aspect or any implementation manner of the fourth aspect above, the third image frame is carried in the first data packet and the second data packet; the second electronic device detects whether the third image frame is completely received from the first transmission channel, and the method comprises the following steps: the second electronic device receives the first data packet and the second data packet from the first transmission channel; the second electronic equipment carries out integrity check on the first data packet and the second data packet according to a Cyclic Redundancy Check (CRC) field carried by the second data packet; when the integrity check of the first data packet and the second data packet fails, the second electronic device determines that the third image frame is not completely received.
According to a fourth aspect, or any implementation manner of the fourth aspect, the third image frame is carried in the first data packet and the second data packet, the first data packet and the second data packet carry the first number information, the third redundant image frame is carried in the third data packet and the fourth data packet, and the third data packet and the fourth data packet carry the first number information.
According to the fourth aspect, or any implementation manner of the fourth aspect above, the method further comprises: the second electronic device receives the first data packet from the first transmission channel; the second electronic device receives a fifth data packet from the first transmission channel; and the second electronic device determines that the third image frame is not completely received according to the first number information carried by the first data packet and the second number information carried by the fifth data packet. The fourth image frame is of the same frame type as the first image frame; the fourth image frame is carried in the fifth data packet, which carries the second number information, and the second number information is different from the first number information; and the image data of the fourth redundant image frame is the same as the image data of the fourth image frame.
According to the fourth aspect, or any implementation manner of the fourth aspect above, the first data packet carries first quantity information, where the first quantity information is used to indicate that there are two data packets carrying the third image frame, and the first data packet is a first one of the two data packets carrying the third image frame; the second data packet carries second quantity information, the second quantity information is used for indicating that the number of the data packets carrying the third image frame is two, and the second data packet is the second of the two data packets carrying the third image frame.
According to a fourth aspect, or any implementation manner of the fourth aspect above, the first data packet and the second data packet carry channel identification information of the first transmission channel.
According to a fourth aspect, or any implementation manner of the fourth aspect, an interval between a transmission time of the first redundant image frame and a transmission completion time of the first image frame is a set second time length, and the second time length is greater than or equal to 0 and is less than an inter-frame interval between the first image frame and the second image frame.
According to a fourth aspect, or any implementation manner of the fourth aspect above, the method further comprises: the second electronic equipment establishes a first connection with the first electronic equipment; the second electronic device receives indication information sent by the first electronic device through the first connection, wherein the indication information is used for indicating a first transmission channel to be used for transmitting an original image group, a second transmission channel is used for transmitting an image frame of a first frame type, and a third transmission channel is used for transmitting an image frame of a second frame type, and the original image group comprises the first image frame and the second image frame.
According to a fourth aspect or any implementation manner of the fourth aspect above, the first frame type is an I-frame, and the second frame type is a P-frame or a B-frame.
According to a fourth aspect, or any implementation manner of the fourth aspect above, the first transmission channel and the second transmission channel operate on the same channel.
Any one implementation manner of the fourth aspect and the fourth aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the fourth aspect and the fourth aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not repeated here.
In a fifth aspect, an embodiment of the present application provides an electronic device. The electronic device includes a memory and a processor, the memory being coupled to the processor. The memory stores program instructions that, when executed by the processor, cause the electronic device to perform the data transmission method performed by the first electronic device or the second electronic device in the second aspect or any possible implementation manner of the second aspect.
Any one implementation manner of the fifth aspect and the fifth aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one of the implementation manners of the fifth aspect and the fifth aspect, reference may be made to the technical effects corresponding to any one of the implementation manners of the first aspect and the first aspect, and details are not repeated here.
In a sixth aspect, embodiments of the present application provide a chip. The chip includes one or more interface circuits and one or more processors; the interface circuit is used for receiving signals from a memory of the electronic equipment and sending the signals to the processor, and the signals comprise computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform the data transmission method performed by the first electronic device or the second electronic device in the second aspect or any possible implementation manner of the second aspect.
Any one implementation form of the sixth aspect and the sixth aspect corresponds to any one implementation form of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the sixth aspect and the sixth aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not described here again.
In a seventh aspect, the present application provides a computer-readable medium for storing a computer program that includes instructions for executing the method in the second aspect or any possible implementation manner of the second aspect.
Any one implementation manner of the seventh aspect and the seventh aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the seventh aspect and the seventh aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not described here again.
In an eighth aspect, the present application provides a computer program including instructions for executing the method of the second aspect or any possible implementation manner of the second aspect.
Any one implementation manner of the eighth aspect and the eighth aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the eighth aspect and the eighth aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not described here again.
Drawings
FIG. 1 is a schematic structural diagram of an exemplary electronic device;
FIG. 2 is a schematic structural diagram of an exemplary communication module;
FIG. 3 is a schematic diagram of a software structure of an exemplary electronic device;
FIG. 4 is a schematic diagram of an exemplary application scenario;
FIGS. 5a-5b are schematic diagrams of exemplary split-screen combinations;
FIG. 6 is a schematic diagram illustrating a process for establishing a connection between a host and a television;
FIG. 7 is an exemplary resource profile;
FIG. 8 is a schematic diagram of an exemplary image group;
FIG. 9 is a schematic diagram illustrating the inter-frame interval and data amount of image frames;
FIG. 10 is a schematic diagram illustrating image frame processing at the host side;
FIGS. 11a-11c are schematic diagrams illustrating exemplary data packet formats;
FIGS. 12a-12b are schematic diagrams illustrating the transmission of image frames;
FIG. 13 is a schematic diagram illustrating image frame processing at the television side;
FIG. 14 is an exemplary illustration of data transmission;
FIGS. 15a-15e are exemplary module interaction diagrams;
FIG. 16 is an exemplary illustration of data transmission;
FIG. 17 is a schematic diagram of an exemplary transmission channel.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone.
The terms "first" and "second," and the like, in the description and in the claims of the embodiments of the present application are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first target object and the second target object, etc. are specific sequences for distinguishing different target objects, rather than describing target objects.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the embodiments of the present application, the meaning of "a plurality" means two or more unless otherwise specified. For example, a plurality of processing units refers to two or more processing units; the plurality of systems refers to two or more systems.
The data transmission method in the embodiments of this application can be applied to data interaction between an electronic device such as a mobile phone, a tablet, a notebook computer, a wearable device, or a television host, and one or more electronic devices that have a display screen, such as a mobile phone, a tablet, a notebook computer, or a television (specifically, a television screen).
Fig. 1 shows a schematic structural diagram of an electronic device 100 in an embodiment of the present application. It should be understood that the electronic device 100 shown in fig. 1 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
As shown in fig. 2, which is a schematic diagram of a part of the circuit structure in the wireless communication module 160, the Wi-Fi chip 202 optionally includes a Wi-Fi CPU for processing Wi-Fi related operation functions, such as congestion control, carrier aggregation, frame filtering, key control, and management frame transceiving. It is understood that the Wi-Fi CPU may also be replaced by a digital signal processor (DSP) or an independent field programmable gate array (FPGA) chip; the specific form of the processor implementing the Wi-Fi processing may be flexible and variable, and the number and layout of the devices in fig. 2 are only for reference and are not limiting.
Optionally, the Wi-Fi chip 202 integrates a set of independent Media Access Control (MAC) modules. The MAC module mainly has the functions of channel access, framing and deframing, data transceiving, encryption and decryption and energy-saving control.
Optionally, the Wi-Fi chip 202 is coupled to the radio frequency component 203, and the radio frequency component 203 is configured to convert a baseband signal into a radio frequency signal for transmission, and to convert a radio frequency signal received from the antenna into a baseband signal for subsequent processing by the Wi-Fi chip 202. Optionally, in this embodiment of the present application, the number of radio frequency components 203 may be two or more, for example, 4, to implement the multichannel-based data transmission scheme in this embodiment of the present application. Optionally, the radio frequency component 203 may be integrated on the Wi-Fi chip 202, or may be external to the chip, which is not limited in this application. Optionally, the radio frequency component may be a component supporting the 2.4GHz band or a component supporting the 5GHz band, which is not limited in this application. For example, if the radio frequency component is a component supporting the 2.4GHz band, the radio frequency component may perform corresponding processing on data received or transmitted on the 2.4GHz band.
Continuing to refer to fig. 2, the Wi-Fi chip 202 is coupled to an antenna through the radio frequency component 203. In this embodiment, an example is given in which the electronic device has 4 antennas and correspondingly integrates 4 radio frequency components 203; in other embodiments, the number of antennas may be any number of 2 or more, and this application is not limited.
Wi-Fi chip 202 is coupled to processor 201. Optionally, the processor 201 and the Wi-Fi chip 202 may be integrated on the same chip, or may be on different chips and connected through a bus, which is not limited in this application.
With continued reference to FIG. 1, the electronic device 100 implements display functionality via the GPU, the display screen 194, and the application processor, among other things. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the instructions stored in the internal memory 121, so that the electronic device 100 executes the data transmission method in the embodiment of the present application. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a hierarchical architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 3 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a codec (library), and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll bar text in the top status bar of the system, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window. For example, it may prompt text information in the status bar, sound a prompt tone, vibrate the electronic device, flash an indicator light, and the like.
A codec, which may be referred to as a codec library, is used to encode or decode image frames. For example, a codec may encode an image to generate an image frame or decode an image frame to obtain an image.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part provides the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a Wi-Fi driver, an audio driver, a sensor driver, a redundancy module, and the like. In this embodiment of the application, the redundancy module may be configured to copy image frames, determine whether a packet is lost, and the like. It should be noted that the names and numbers of the modules in the application layer, the application framework layer, and the kernel layer are schematic illustrations, and the present application is not limited thereto. It should be further noted that the codec and the redundancy module involved in the embodiments of the present application may also be located in other layers, and the present application is not limited thereto.
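As an aid to understanding only, the following C sketch outlines what a kernel-layer redundancy module of the kind described above might expose: one routine that copies an image frame so that a redundant copy can be sent on another channel, and one routine that decides whether a frame was lost. The structure and function names are assumptions made for this illustration and do not describe the actual driver interface.

```c
/* Hypothetical sketch of a kernel-layer redundancy module interface.
 * Names and fields are illustrative assumptions, not the actual driver API. */
#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

struct image_frame {
    uint8_t  frame_type;   /* e.g. I frame, P frame or B frame */
    uint32_t length;       /* payload length in bytes */
    uint8_t *data;
};

/* Copy an image frame so that one copy can be sent on the main channel
 * and the other on the corresponding redundant channel. */
struct image_frame *redundancy_copy_frame(const struct image_frame *src)
{
    struct image_frame *dup = malloc(sizeof(*dup));
    if (!dup)
        return NULL;
    dup->frame_type = src->frame_type;
    dup->length = src->length;
    dup->data = malloc(src->length);
    if (!dup->data) {
        free(dup);
        return NULL;
    }
    memcpy(dup->data, src->data, src->length);
    return dup;
}

/* Treat a frame as lost when not all of the expected data packets
 * carrying it have been received. */
bool redundancy_frame_lost(uint32_t packets_received, uint32_t packets_expected)
{
    return packets_received < packets_expected;
}
```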
Referring to fig. 4, an application scenario is schematically illustrated in fig. 4, for example, the application scenario includes a host (which may also be referred to as a tv host) and a split screen combination, where the split screen combination includes a tv 1, a tv 2, a tv 3, and a tv 4. It should be noted that the television may also be referred to as a screen, a smart screen, or a large screen, and the present application is not limited thereto.
It is understood that the application scene shown in fig. 4 may be a game scene, a movie scene, etc., and the host may include, but is not limited to, a game machine, a set-top box, a Digital Television (DTV), a home theater device, etc., by way of example.
Optionally, the application scenario may further include other output devices or input devices, such as a sound box and a microphone (not shown in the figure), and the other output devices or input devices may perform data interaction with the host computer in a wireless or wired manner, which is not limited in the present application.
It should be noted that the type, number, and layout manner of the electronic devices in the application scenario shown in fig. 4 are only schematic examples, and the application is not limited thereto. For example, as shown in fig. 5a, which is a schematic diagram of an exemplary split screen combination, referring to fig. 5a, in a game scene, three screens may be combined to display the game picture in the game scene, such as: game screen (left), game screen (middle), and game screen (right). In addition, fig. 5b shows a schematic diagram of another split screen combination; referring to fig. 5b, in a large-screen movie viewing scene, four screens may be combined to display the movie pictures in that scene, such as: movie picture (top left), movie picture (top right), movie picture (bottom left), and movie picture (bottom right).
It should be noted that, in fig. 4 and fig. 5a to fig. 5b, each screen in the split-type screen combination is taken as an example to display different images, in other embodiments, a part of screens or all screens in the split-type screen combination may also display the same image, and the application is not limited thereto.
In the application scenario shown in fig. 4, for example, the host may be configured to acquire an image, encode the image, and transmit the encoded image frame to a television (e.g., televisions 1 to 4) through a wireless electromagnetic wave. The television can decode the received image frames to obtain corresponding images and display the images on a screen. Optionally, the host may interact with a router (not shown) via a Wi-Fi connection to download images or data from the server side via the router. Optionally, the host may also perform data interaction with other electronic devices, such as a cell phone, via a Wi-Fi connection (e.g., a P2P connection), to receive images or data from the cell phone side.
For example, if the televisions 1 to 4 are currently playing videos, the user may adjust the video playing progress through the remote controller. The host computer can respond to the received user operation and execute corresponding actions, namely adjusting the video playing progress, for example, sending images corresponding to the adjusted progress to the televisions 1 to 4.
For example, if the time delay in the wireless transmission process between the host and the televisions 1 to 4 is large, the delay between the image displayed on the television screen and the actual operation time of the user is also large, and the user can obviously perceive the operation delay. For example, if the transmission delay between the host and the television is large, after the user indicates an adjustment of the playing progress through the remote controller, the televisions 1 to 4 may display the picture corresponding to the adjusted progress only after an interval of 1s, thereby affecting the user experience.
In addition, if there is a problem of packet loss during wireless transmission between the host and the television, due to the characteristics of the streaming media encoding algorithm (which will be specifically described below in conjunction with image frames), the packet loss during transmission will cause the television to be blurred or frozen, which affects the viewing experience of the user.
The embodiment of the application provides a data transmission method, and an electronic device can transmit image frames to other electronic devices in a delayed redundancy transmission mode through a plurality of transmission channels, so that the influence of transmission delay and packet loss on displayed video pictures is effectively reduced, and the user experience is further improved.
The following describes the data transmission method in the embodiment of the present application in detail with reference to specific embodiments.
With reference to the application scenario shown in fig. 4, for example, before the host performs data interaction with televisions (including televisions 1 to 4), the host establishes Wi-Fi connections with the televisions 1 to 4, and transmits control information to the televisions 1 to 4 through the established Wi-Fi connections, where the control information is used to indicate a transmission mode of image frames of the host on multiple channels.
In other embodiments, the host may also establish a Wi-Fi connection or a bluetooth connection with the television based on other protocols, for example, a private protocol or a NAN (neighbor awareness network) protocol, a bluetooth protocol, and the like, which is not limited in this application.
In the embodiment of the present application, when the host establishes a Wi-Fi connection with the television based on the WLAN protocol, an STA-AP mode is adopted. For example, the host functions similarly to a station (STA) in an STA-AP architecture, where a station refers to a terminal device (electronic device) that has a Wi-Fi communication function and is connected to a wireless network. The station may support multiple WLAN standards of the 802.11 family, such as 802.11be, 802.11ax, 802.11ac, 802.11n, 802.11g, 802.11b, and 802.11a.
The television functions like an access point in a STA-AP, optionally a terminal device (electronic device) with a Wi-Fi chip. For example, the access point may be a device supporting 802.11be system. The access point may also be a device supporting multiple WLAN standards of 802.11 families, such as 802.11be, 802.11ax, 802.11ac, 802.11n, 802.11g, 802.11b, and 802.11 a.
It should be noted that the electronic device in the embodiment of the present application is generally an end product supporting the 802.11 series standards. In the evolution from 802.11a to 802.11g, 802.11n, 802.11ac, and 802.11ax, the available frequency bands include 2.4 gigahertz (GHz) and 5GHz. With more and more frequency bands being opened, the maximum channel bandwidth supported by 802.11 has extended from 20 megahertz (MHz) to 40MHz and then to 160MHz. In 2017, the Federal Communications Commission (FCC) opened a new license-free band of 6GHz (5925-7125MHz), and in the 802.11ax project authorization request (PAR), the 802.11ax standard workers expanded the operating range of 802.11ax devices from 2.4GHz and 5GHz to 2.4GHz, 5GHz, and 6GHz.
For example, in the embodiments of the present application, the description takes the case where the host and the televisions 1 to 4 support the 5GHz band as an example, that is, communications between the host and the televisions 1 to 4 are performed on a channel of the 5GHz band. In other examples, the host and the televisions 1 to 4 may also support the 2.4GHz band, or both the 2.4GHz band and the 5GHz band; the methods are similar to those in the embodiments of the present application and are not described herein again.
Fig. 6 is a schematic flowchart illustrating a process of establishing a Wi-Fi connection between the host and the televisions 1 and 2 based on the WLAN protocol and transmitting control information, and with reference to fig. 6, the process specifically includes:
S101, the host sends a Probe Request message to the television 1.
S102, the television 1 sends a Probe Response message to the host.
For example, in the embodiment of the present application, the host and the television may operate on a specific channel, for example, channel 36 in the 5GHz band. That is, in the scanning phase, the host sends a Probe Request message on channel 36, and the television 1 may return a Probe Response message on channel 36 without performing a full channel scan, so as to save time. A full channel scan means that the host scans all channels in the 2.4GHz band and/or the 5GHz band, that is, sends the Probe Request message on all channels.
In order to make the reader better understand the correspondence between the frequency bands and the channels in the present application, the correspondence between the frequency bands and the channels is briefly described below with reference to fig. 7. Fig. 7 is an exemplary 5GHz resource distribution diagram, wherein the frequency band of 5GHz is 5170MHz to 5835MHz, the available channels include channels 36 to 165, the channel bandwidth of each channel is 20MHz, and the shaded portion in fig. 7 is an exemplary 5GHz channel available in china, which includes channel 36 to 64 and channel 149 to 165. It should be noted that the available channels at 2.4GHz are channels 1 to 13, and the resource distribution is similar to that of the channels at 5GHz, which is not described herein again.
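For reference, the usual correspondence between a channel number and its center frequency can be computed with the helper below. This sketch follows the common IEEE 802.11 numbering rules (2.4GHz channels 1 to 13 and 5GHz channels 36 to 165); it is provided for illustration only and is not part of the embodiment.

```c
#include <stdio.h>

/* Center frequency (in MHz) of a 20MHz Wi-Fi channel, following the usual
 * 802.11 numbering rules. Illustrative helper only. */
static int channel_center_mhz(int channel)
{
    if (channel >= 1 && channel <= 13)
        return 2407 + 5 * channel;   /* e.g. channel 1 -> 2412 MHz */
    if (channel >= 36 && channel <= 165)
        return 5000 + 5 * channel;   /* e.g. channel 36 -> 5180 MHz */
    return -1;                       /* channel not covered by this sketch */
}

int main(void)
{
    printf("channel 36 -> %d MHz\n", channel_center_mhz(36));   /* 5180 */
    printf("channel 40 -> %d MHz\n", channel_center_mhz(40));   /* 5200 */
    printf("channel 149 -> %d MHz\n", channel_center_mhz(149)); /* 5745 */
    return 0;
}
```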
S103, the host and the television 1 perform authentication processing.
S104, the host and the television 1 perform association processing.
S105, the host and the television 1 perform a four-way handshake phase.
For example, after the host selects the television 1 in the network selection stage, S103 to S105 may be executed, and actually, in S103 to S105, the host and the television 1 perform signaling interaction for many times, and specific details may refer to detailed description in the 802.11 protocol, and are not described in detail in this application.
So far, the host and the television 1 have successfully established a Wi-Fi connection. It should be noted that, in this embodiment, the description takes as an example that the Wi-Fi connection is regarded as established once the host and the television complete the four-way handshake; in other embodiments, after the host receives the Probe Response message sent by the television, the host and the television may already be considered to have established the connection, and this application is not limited.
S106, the host transmits control information to the television 1.
Illustratively, after the host and the tv 1 perform a four-way handshake phase, it is determined that the Wi-Fi connection is successfully established with the tv 1, and control information is sent to the tv 1 based on the Wi-Fi connection. Illustratively, the control information includes, but is not limited to: identification information of the transmission channel, the channel to which the transmission channel belongs, the type of the image frame transmitted on the transmission channel, and the like.
Illustratively, in this embodiment of the present application, the host implements delayed redundant transmission in a broadcast manner. The broadcast channel of the host includes 4 physical channels (i.e., transmission channels), and each physical channel corresponds to identification information, namely physical channel 1, physical channel 2, physical channel 3, and physical channel 4. Physical channel 1 is used for transmitting the original video stream (also referred to as the original group of pictures), that is, the video stream including all the image frames generated by the host; physical channel 2 is used for delayed redundant transmission of the data packets corresponding to I frames; physical channel 3 is used for delayed redundant transmission of the data packets corresponding to B frames; physical channel 4 is used for delayed redundant transmission of the data packets corresponding to P frames; and the 4 physical channels operate on channel 40. Correspondingly, the control information in the embodiment of the present application includes the information described above, namely: the identification information of physical channel 1 (i.e., physical channel 1) and the correspondence of physical channel 1 to the original image frames (i.e., indicating that physical channel 1 is used for transmitting all types of frames, including I frames, P frames, and B frames); the identification information of physical channel 2 (i.e., physical channel 2) and the correspondence of physical channel 2 to I frames (i.e., indicating that physical channel 2 is used for delayed redundant transmission of I frames); the identification information of physical channel 3 (i.e., physical channel 3) and the correspondence of physical channel 3 to B frames (i.e., indicating that physical channel 3 is used for delayed redundant transmission of B frames); the identification information of physical channel 4 (i.e., physical channel 4) and the correspondence of physical channel 4 to P frames (i.e., indicating that physical channel 4 is used for delayed redundant transmission of P frames); and the channel sequence number of the operating channel (channel 40) (which may also be the center frequency information of channel 40).
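For illustration only, the control information described above could be represented by a structure like the following minimal C sketch, in which physical channel 1 carries the original stream and physical channels 2 to 4 carry the delayed redundant I frames, B frames, and P frames on channel 40. The field names and widths are assumptions made for this example and do not describe the actual signaling format.

```c
#include <stdint.h>

/* Type of image frame whose data packets a physical channel carries. */
enum frame_type { FRAME_ALL, FRAME_I, FRAME_B, FRAME_P };

/* One entry per physical (transmission) channel. */
struct channel_entry {
    uint8_t         channel_id;   /* identification information of the physical channel */
    enum frame_type carries;      /* type of image frame transmitted on it */
};

/* Control information sent over the Wi-Fi connection before streaming starts. */
struct control_info {
    uint8_t              operating_channel;   /* e.g. channel 40 */
    uint8_t              num_channels;
    struct channel_entry entries[4];
};

/* Example matching the embodiment: physical channel 1 carries the original
 * video stream, channels 2/3/4 carry delayed redundant I/B/P frames. */
static const struct control_info example_control_info = {
    .operating_channel = 40,
    .num_channels      = 4,
    .entries = {
        { 1, FRAME_ALL }, { 2, FRAME_I }, { 3, FRAME_B }, { 4, FRAME_P },
    },
};
```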
Optionally, the host may send the control information to the tv 1 two or more times to determine that the tv 1 successfully receives the control information, and after sending the control information multiple times, disconnect the Wi-Fi connection with the tv 1. Optionally, the host may disconnect the Wi-Fi connection with the tv 1 after receiving an Acknowledgement Character (ACK) message sent by the tv 1, where the ACK message is used to indicate that the tv 1 successfully receives the control information.
Illustratively, after receiving the control information, the tv 1 may listen to the channel 40 to receive the data packets broadcast by the host on the 4 physical channels of the channel 40.
After the host is disconnected from the television 1, the host can establish Wi-Fi connection with the television 2, and with continued reference to fig. 6, the specific process includes:
S201, the host sends a Probe Request message to the television 2.
S202, the tv 2 sends a Probe Response message to the host.
S203, the host and the tv 2 perform authentication processing.
S204, the host and the television 2 perform association processing.
S205, the host and the tv 2 perform a four-way handshake phase.
S206, the host transmits control information to the television 2.
Optionally, the tv 2 may operate on the same channel as the tv 1, or may operate on a different channel, for example, the tv 2 operates on the channel 44, and accordingly, when the host establishes a Wi-Fi connection with the tv 2, the host sends a Probe Request message on the channel 44.
The description of the host establishing the Wi-Fi connection with the tv 2 and the transmission control information may refer to the description of the host establishing the connection with the tv 1 and the transmission control information, and will not be described herein again.
For example, after the host and the tv 2 transmit the control information, the host disconnects Wi-Fi connection with the tv 2, establishes connection with the tv 3 and transmits the control information, and then disconnects Wi-Fi connection with the tv 3, establishes connection with the tv 4 and transmits the control information, and disconnects Wi-Fi connection with the tv 4 after the control information is transmitted. Connection establishment and data interaction between the host and the televisions 3 and 4 can refer to fig. 6, and details are not repeated here.
It should be noted that, in the embodiments of the present application, the host is taken as an STA, that is, a connection establishment initiating terminal, and in other embodiments, the host may also be taken as an AP, and the tv is taken as an STA, that is, the tv is taken as a connection establishment initiating terminal, for example, a Probe Request message is sent, which is not limited in the present application.
Specifically, after the host transmits the control information to the televisions 1 to 4, the host can perform time-delay redundant transmission on the image frames of the specified type on the corresponding channels according to the transmission mode indicated by the control information.
In order to make the person skilled in the art better understand the image frame transmission method in the embodiment of the present application, before describing the specific transmission method, a brief description will be given to the related concepts of the image frame.
In the embodiment of the present application, after the host acquires the image (which may be acquired from a server or an electronic device), the image may be encoded to increase the compression ratio. The encoding method may include, but is not limited to: h.264 or HEVC (High Efficiency Video Coding), etc.
The encoded plurality of image frames may constitute a GOP (Group of Pictures). Optionally, one or more I-frames, and one or more B-frames and P-frames may be included in the GOP. Referring to fig. 8, which is a schematic diagram of an exemplary GOP, the GOP illustratively includes I1 frames, B1 frames, B2 frames, P1 frames, B3 frames, B4 frames, P2 frames, B5 frames, B6 frames, I2 frames, B7 frames, B8 frames, and P3 frames. It should be noted that the type, number and sequence of the image frames in the GOP in fig. 8 are only illustrative examples, and the present application is not limited thereto.
Continuing with fig. 8, an I frame may be referred to as a full frame, or an independently decodable frame, that is, an I frame can be decoded independently without relying on other frames. Typically, the first frame of a GOP is an I frame. B frames and P frames may be referred to as inter-predicted frames. The decoding of a B frame depends on the nearest I frame or P frame before and after it; in the case of the B1 frame, its decoding needs to depend on the I1 frame and the P1 frame. The decoding of a P frame depends on the nearest I frame or P frame before it; taking the P1 frame as an example, its decoding needs to depend on the I1 frame, and taking the P2 frame as an example, its decoding needs to depend on the P1 frame. Because inter-predicted frames need to depend on the previous frame, or the previous and subsequent frames, to complete decoding, if an I frame or a P frame in the GOP is lost in the transmission process, the frames that depend on that I frame or P frame for decoding, as well as the subsequent frames, cannot be decoded correctly, and the picture played by the television shows screen artifacts. If a B frame in the GOP is lost, the image corresponding to the B frame is lost, resulting in a frozen picture.
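To make the dependency rules concrete, the short sketch below marks which frames become undecodable once a given frame of the GOP in fig. 8 is lost. It applies a deliberately simplified rule (a lost B frame affects only itself; a lost I frame or P frame affects every frame up to the next I frame, ignoring the backward dependency of earlier B frames); it is an illustration of the behaviour described above, not part of the claimed method.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Simplified dependency rule:
 * - a lost B frame only affects itself (a briefly frozen picture),
 * - a lost I or P frame makes every frame up to the next I frame
 *   undecodable (screen artifacts until the next full frame). */
static void mark_undecodable(const char *gop, int lost, bool *bad, int n)
{
    memset(bad, 0, n * sizeof(bool));
    bad[lost] = true;
    if (gop[lost] == 'B')
        return;
    for (int i = lost + 1; i < n && gop[i] != 'I'; i++)
        bad[i] = true;
}

int main(void)
{
    const char gop[] = "IBBPBBPBBIBBP";   /* frame types of the GOP in fig. 8 */
    int n = (int)strlen(gop);
    bool bad[32];

    mark_undecodable(gop, 3, bad, n);     /* suppose the P1 frame (index 3) is lost */
    for (int i = 0; i < n; i++)
        printf("%c%s ", gop[i], bad[i] ? "(x)" : "");
    printf("\n");                         /* I B B P(x) B(x) B(x) P(x) B(x) B(x) I B B P */
    return 0;
}
```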
Referring to fig. 9, which is a schematic diagram illustrating the inter-frame intervals and data amounts of image frames, for example, the inter-frame intervals of the partial image frames in fig. 8 are the same, wherein the inter-frame intervals of the I1 frame, the B1 frame, the B2 frame, the P1 frame, the I2 frame, and the P2 frame are the same. Illustratively, the inter-frame interval T is 16.7ms. It should be noted that the size of the inter-frame interval depends on the sampling rate of the electronic device; for example, if the sampling rate of the electronic device is 60fps (frames per second), that is, the electronic device acquires 60 images every 1s, the inter-frame interval of each image is 16.7ms. It should be further noted that the inter-frame interval may also be understood as a display duration, that is, when the television displays the images corresponding to the image frames, the display duration of each image is 16.7ms. It should be noted that the inter-frame interval in the embodiment of the present application is only an illustrative example; the inter-frame interval differs based on different sampling rates of the electronic device, and the present application is not limited thereto.
Illustratively, the data amount (in bytes) contained in each image frame in the GOP may be the same or different. For example, with continued reference to FIG. 9, the data size of the I1 frame is 500KB, the data size of the B1 frame is 100KB, the data size of the B2 frame is 80KB, the data size of the P1 frame is 200KB, the data size of the I2 frame is 600KB, and the data size of the P2 frame is 300 KB. The above data amount is only an exemplary example, and the present application is not limited thereto.
The following describes in detail the processing and transmission process of the image frame by the host:
After the host encodes the images, the GOP shown in fig. 8 is obtained. The host may encapsulate the image frames in the GOP to generate data packets. Referring to fig. 10, which is a schematic diagram of the processing of image frames at the host end, taking the I1 frame in the GOP as an example, the host generates a data packet 1 and a data packet 1' based on the I1 frame. For example, the host may copy the I1 frame, encapsulate one copy of the I1 frame to generate data packet 1, and encapsulate the other copy of the I1 frame to generate data packet 1'.
Referring to fig. 11a, which schematically shows the format of a data packet, the data packet illustratively includes a frame control field, a frame body field, and fields such as a cyclic redundancy check (CRC) field. Illustratively, the frame control field includes, but is not limited to, a channel identification field and a packet sequence number/number field. The channel identification field is used for carrying the channel identification of the channel to which the data packet belongs. The packet sequence number/number field is used for carrying the total number of data packets corresponding to the image frame and the sequence number of the current data packet. For example, as described above, an image frame may be encapsulated in a plurality of data packets; if the I1 frame is encapsulated in 3 data packets, i.e., data packet 1a, data packet 1b, and data packet 1c carry the I1 frame, then the packet sequence number/number field of data packet 1a includes "1/3", which indicates that there are 3 data packets carrying the I1 frame in total, and the current data packet (i.e., data packet 1a) is the 1st of the 3 data packets. The frame body field includes fields such as a number field and an image frame field, wherein the number field includes the data packet number, which indicates the generation order (or transmission order) of the data packet, and the image frame field includes the image data corresponding to the image frame. Note that the names and positions of the fields in fig. 11a are merely exemplary, and the present application is not limited thereto.
For example, taking the data packet 1 and the data packet 1' in fig. 10 as an example, fig. 11b schematically shows the format of data packet 1. Referring to fig. 11b, the channel identification field in the frame control field of data packet 1 carries "1", indicating that the channel for transmitting data packet 1 is physical channel 1. The number field in the frame body field of data packet 1 includes "1", indicating that the number of the data packet is 1, and the image frame field includes the image data corresponding to the I1 frame. Fig. 11c schematically shows the format of data packet 1'. Referring to fig. 11c, the channel identification field in the frame control field of data packet 1' carries "2", indicating that the channel for transmitting data packet 1' is physical channel 2. The number field in the frame body field of data packet 1' includes "1", indicating that the number of the data packet is 1, and the image frame field includes the image data corresponding to the I1 frame.
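As an aid to reading figs. 11a to 11c, the layout of such a data packet could be sketched in C as below. The field names, widths, and ordering are assumptions made for this illustration and do not reproduce the actual frame format (for example, the CRC field that follows the frame body is omitted).

```c
#include <stdint.h>

/* Illustrative layout of the data packet fields described with fig. 11a. */
struct frame_control {
    uint8_t channel_id;   /* channel identification, e.g. 1 for data packet 1, 2 for data packet 1' */
    uint8_t pkt_seq;      /* sequence number of this packet within the image frame, e.g. 1 */
    uint8_t pkt_total;    /* total number of packets carrying the image frame, e.g. 3 */
};

struct data_packet {
    struct frame_control fc;   /* frame control field */
    uint32_t number;           /* number field: generation (transmission) order of the packet */
    uint32_t image_len;        /* length of the image data that follows */
    uint8_t  image_data[];     /* image frame field: image data of the carried frame */
    /* a cyclic redundancy check (CRC) would follow the frame body on the air */
};
```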
It should be noted that, in the embodiments of the present application, the description takes as an example that one image frame is encapsulated into one data packet, that is, one data packet includes one image frame. In other embodiments, the host may perform encapsulation based on the data amount of the image frame; for example, one image frame may be encapsulated into a plurality of data packets, or a plurality of image frames may be encapsulated into the same data packet, which is not limited in the present application.
As shown in fig. 12a, which is an exemplary transmission diagram of part of the image frames in the GOP of fig. 8, referring to fig. 12a, the host and the television 1 perform data interaction in a multiple-input multiple-output (MIMO) manner. For example, the host includes antennas 1 to 4 and broadcasts the image frames through antennas 1 to 4, where antenna 1 corresponds to physical channel 1 (which may also be referred to as transmission channel 1), antenna 2 corresponds to physical channel 2, antenna 3 corresponds to physical channel 3, and antenna 4 corresponds to physical channel 4. As described above, antennas 1 to 4 operate on the same channel (e.g., channel 40), that is, the host may transmit data through different physical channels on the same channel. Correspondingly, the television 1 includes antennas 5 to 8, and can receive the data sent by antennas 1 to 4 through antennas 5 to 8. It should be noted that, in the embodiment of the present application, only the image frame transmission method between the host and the television 1 is described as an example; the transmission method between the host and the televisions 2 to 4 is the same as that between the host and the television 1. Since the host sends the data packets on the 4 physical channels in a broadcast manner, the televisions 1 to 4 can all receive, through their respective antennas, the data packets sent by the 4 antennas (i.e., the 4 physical channels) of the host.
Continuing to refer to fig. 12a, for example, the host encapsulates each image frame in the GOP before transmitting the image frame in the GOP, and the detailed description may refer to fig. 10 and is not repeated herein. Illustratively, the host encapsulates the I1 frames to obtain a data packet 1 and a data packet 1 ', encapsulates the B1 frames and the B2 frames to obtain a data packet 2 and a data packet 2 ', encapsulates the P1 frames to obtain a data packet 3 and a data packet 3 ', encapsulates the B3 frames and the B4 frames to obtain a data packet 4 and a data packet 4 ', and encapsulates the P2 frames to obtain a data packet 5 and a data packet 5 '. In order to distinguish the original data packet from the data packet transmitted in a redundant manner (for example, the data packet 1'), the data packet transmitted in a redundant manner is referred to as a redundant data packet in the embodiment of the present application.
For example, the host uses the physical channel 1 corresponding to the antenna 1 as a main physical channel, that is, the original video stream is transmitted on the physical channel 1, that is, the data packets corresponding to all the image frames generated by the host are sequentially transmitted on the physical channel 1, for example, referring to fig. 12a, the host transmits the data packets 1 to 5 on the physical channel 1. It should be noted that, each physical channel corresponds to a corresponding sending time slot and a corresponding receiving time slot, and when the host sends a data packet, the host sends the data packet on the corresponding sending time slot, and the specific details may refer to a time slot rule in the communication protocol, which is not limited in the present application.
Illustratively, physical channel 2 may be used for delayed redundant transmission of I frames. Illustratively, the transmission time of a redundant data packet of an I frame (e.g., data packet 1') on physical channel 2 is aligned with the transmission completion time of the data packet of that I frame (e.g., data packet 1) on physical channel 1. For example, referring to fig. 12a, the host sends data packet 1 corresponding to the I1 frame at time t1, and the transmission duration corresponding to the data amount of the I1 frame is (t2-t1), that is, the transmission of the I1 frame is completed at time t2. The host detects at time t2 that the transmission of the I1 frame (i.e., data packet 1) is completed, and at time t2 the host sends data packet 1' corresponding to the I1 frame on physical channel 2, thereby implementing delayed redundant transmission of the I1 frame.
For example, physical channel 3 may be used for delayed redundant transmission of B frames. Illustratively, the transmission time of a redundant data packet of a B frame (e.g., data packet 2') on physical channel 3 is aligned with the transmission completion time of the data packet of that B frame (e.g., data packet 2) on physical channel 1. For example, referring to fig. 12a, the host sends data packet 2 corresponding to the B1 frame and the B2 frame at time t2, and the transmission duration corresponding to the data amount of the B1 frame and the B2 frame is (t4-t2), that is, the sending of the B1 frame and the B2 frame is completed at time t4. The host detects at time t4 that the sending of the B1 frame and the B2 frame (i.e., data packet 2) is completed, and at time t4 the host sends data packet 2' corresponding to the B1 frame and the B2 frame on physical channel 3, thereby implementing delayed redundant transmission of the B1 frame and the B2 frame. It should be noted that, in this embodiment, the B1 frame and the B2 frame are encapsulated in the same data packet (i.e., data packet 2 or data packet 2') as an example. In other embodiments, the B1 frame and the B2 frame may be encapsulated in different data packets; for example, after the host finishes sending the data packet corresponding to the B1 frame at time t3, the host sends, at time t3, another data packet corresponding to the B1 frame (hereinafter referred to as a redundant data packet) on physical channel 3, and at time t3 the host sends the data packet corresponding to the B2 frame on physical channel 1. In one example, if the data amount (i.e., the transmission duration) of the B1 frame is greater than that of the B2 frame, that is, the transmission of the redundant data packet of the B1 frame is not yet completed when the transmission of the data packet corresponding to the B2 frame is completed, the host may optionally monitor the sending of the redundant data packet of the B1 frame and send the redundant data packet of the B2 frame on physical channel 3 after the sending of the redundant data packet of the B1 frame is completed. Optionally, a physical channel 5 may further be provided in this embodiment of the present application, which may be configured to transmit consecutive image frames of the same type (which may be consecutive P frames and/or B frames). Taking the B1 frame and the B2 frame as an example, the host may encapsulate the B1 frame and the B2 frame in different data packets, transmit the data packet of the B1 frame on physical channel 3, and transmit the data packet of the B2 frame on physical channel 5.
Illustratively, physical channel 4 may be used for delayed redundant transmission of P frames. Illustratively, the transmission time of a redundant data packet of a P frame (e.g., data packet 3') on physical channel 4 is aligned with the transmission completion time of the data packet of that P frame (e.g., data packet 3) on physical channel 1. Referring to fig. 12a, for example, the host sends data packet 3 corresponding to the P1 frame at time t4, and the transmission duration corresponding to the data amount of the P1 frame is (t5-t4), that is, the transmission of the P1 frame is completed at time t5. The host detects at time t5 that the transmission of the P1 frame (i.e., data packet 3) is completed, and at time t5 the host sends data packet 3' corresponding to the P1 frame on physical channel 4, thereby implementing delayed redundant transmission of the P1 frame.
Illustratively, with continued reference to FIG. 12a, at time t5 to time t7, the host sends packet 4 corresponding to B3 frame and B4 frame on physical channel 1, and at time t7, the host sends packet 4' corresponding to B3 frame and B4 frame on physical channel 3. At times t 7-t 8, the host transmits packet 5 corresponding to P2 frame on physical channel 1, and at time t8, the host transmits packet 5' corresponding to P2 frame on physical channel 4.
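The timing illustrated in fig. 12a can be summarized as follows: the next data packet of the original stream starts on physical channel 1 as soon as the previous one finishes, and the redundant copy of a data packet starts on the redundant channel of its frame type at the moment the original finishes on physical channel 1. The sketch below computes such a schedule for data packets 1 to 5; the air rate and part of the data amounts are assumed values for illustration only (only the amounts given with fig. 9 are taken from the description).

```c
#include <stdio.h>

/* One data packet of the original stream (data amount in KB; the I1/B1/B2/P1/P2
 * amounts follow fig. 9, the B3+B4 amount is an assumed value). */
struct pkt { const char *name; char type; double kbytes; };

/* Physical channel used for the delayed redundant copy of a frame type. */
static int redundant_channel(char type)
{
    switch (type) {
    case 'I': return 2;
    case 'B': return 3;
    case 'P': return 4;
    default:  return -1;
    }
}

int main(void)
{
    struct pkt pkts[] = {
        { "packet 1 (I1)",    'I', 500.0 },
        { "packet 2 (B1+B2)", 'B', 180.0 },
        { "packet 3 (P1)",    'P', 200.0 },
        { "packet 4 (B3+B4)", 'B', 180.0 },   /* assumed data amount */
        { "packet 5 (P2)",    'P', 300.0 },
    };
    const double rate_kb_per_ms = 100.0;      /* assumed air rate, illustration only */
    double t = 0.0;                           /* current time on physical channel 1 */

    for (unsigned i = 0; i < sizeof(pkts) / sizeof(pkts[0]); i++) {
        double dur = pkts[i].kbytes / rate_kb_per_ms;
        printf("%-17s ch1: %6.2f..%6.2f ms, redundant copy on ch%d from %6.2f ms\n",
               pkts[i].name, t, t + dur, redundant_channel(pkts[i].type), t + dur);
        t += dur;   /* the redundant copy starts when the original transmission completes */
    }
    return 0;
}
```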
It should be noted that the number of channels in the embodiment of the present application is only an illustrative example, and in other embodiments, the number of channels may be set according to practical situations, for example, if an image frame includes N types of frames, such as I-frames and P-frames (i.e., two types of frames), the number of transmission channels of the host may be N +1, i.e., 3, where one transmits an original image frame, another transmits an I-frame in a delayed redundant manner, and the third transmits a P-frame in a delayed redundant manner.
It should be further noted that the type of the image frame transmitted on the transmission channel in the embodiment of the present application is only an illustrative example, for example, the physical channel 2 may be used as a main transmission channel for transmitting an original video stream, that is, the data packet 1 to the data packet 5, the physical channel 1 may be used for transmitting a redundant data packet of an I frame, the physical channel 3 may be used for transmitting a redundant data packet of a P frame, and the physical channel 4 may be used for transmitting a redundant data packet of a B frame, which is not limited in the present application.
In one possible implementation, the host may transmit the original video stream through only one physical channel (e.g., physical channel 1), and transmit the image frames in the manner shown in fig. 12a after a set redundant transmission condition is satisfied. Optionally, the set redundant transmission condition may be that a communication quality parameter of the physical channel is less than a threshold (which may be set according to actual conditions), where the communication quality parameter includes, but is not limited to, at least one of the following: a signal to noise ratio (SNR), a reference signal received power (RSRP), a reference signal received quality (RSRQ), a received signal strength indication (RSSI), and the like. For example, at least one of the televisions 1 to 4 may periodically send a reference signal to the host, and the host may obtain the current communication quality parameter based on the received reference signal; the specific obtaining manner may refer to the manner in an existing standard, and is not described in detail in this application. Illustratively, when the host detects that the obtained communication quality parameter of physical channel 1 is less than the threshold, the host and the televisions 1 to 4 execute the Wi-Fi connection establishment procedure shown in fig. 6 again, so that after the connection is successfully established, updated control information is sent to the televisions 1 to 4 respectively. The updated control information is used for instructing the host to perform delayed redundant transmission in the transmission manner shown in fig. 12a; the information or parameters included in the updated control information may refer to the description above and are not repeated here. A dynamic transmission mode is thereby implemented, so that the image frames are transmitted in the delayed redundant transmission manner only after the set redundant transmission condition is met, which further reduces the power consumption of the device. Optionally, after detecting that the obtained communication quality parameter of physical channel 1 is less than the threshold, the host may also send the updated control information to the televisions 1 to 4 through the current physical channel 1, that is, in a broadcast manner, which is not limited in the present application.
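Expressed as a C sketch, the decision just described might look as follows. The threshold values and the way the parameters are combined are assumptions made for this illustration; in practice they would be set according to actual conditions as stated above.

```c
#include <stdbool.h>

/* Communication quality parameters obtained from the reference signals that
 * the televisions periodically send to the host (units: dB / dBm). */
struct link_quality {
    double snr;
    double rsrp;
    double rssi;
};

/* Illustrative thresholds only; real products would choose their own values. */
#define SNR_MIN_DB    20.0
#define RSRP_MIN_DBM -90.0
#define RSSI_MIN_DBM -75.0

/* Returns true when the set redundant transmission condition is met, i.e. the
 * quality of the main physical channel has fallen below the threshold and the
 * host should switch to the delayed redundant transmission of fig. 12a
 * (and send updated control information to the televisions). */
static bool redundant_transmission_needed(const struct link_quality *q)
{
    return q->snr < SNR_MIN_DB || q->rsrp < RSRP_MIN_DBM || q->rssi < RSSI_MIN_DBM;
}
```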
In one possible implementation, referring to fig. 12a, the time periods in which a physical channel does not send data packets (which may be referred to as idle periods), for example, times t1 to t2 on physical channel 2 and times t1 to t5 on physical channel 3, may be used for transmitting other data or signaling. For example, if, in the process of transmitting according to the redundant transmission manner in fig. 12a, the host detects that the communication quality parameter of each physical channel has been greater than or equal to the threshold for a set time period (e.g., 30 minutes), the host may cancel one or more redundant channels (e.g., cancel the transmission channel corresponding to B frames, which have a smaller data amount, i.e., physical channel 3). It should be noted that canceling a redundant channel may be understood as canceling data packet transmission on that redundant channel. Optionally, the host may send updated control information in an idle period of physical channel 2 (or an idle period on another physical channel) to indicate that the designated redundant channel is canceled.
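The reverse decision, cancelling a redundant channel once the quality of every physical channel has stayed at or above the threshold for the set time period, could be tracked as in the sketch below. The 30 minute window follows the example above; everything else is an assumption for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

#define GOOD_PERIOD_MS (30u * 60u * 1000u)   /* set time period: 30 minutes */

struct redundancy_state {
    bool     good;            /* quality has stayed >= threshold since good_since_ms */
    uint32_t good_since_ms;
};

/* Called periodically with the latest quality verdict; returns true once a
 * redundant channel (e.g. physical channel 3 for B frames) may be cancelled. */
static bool may_cancel_redundant_channel(struct redundancy_state *s,
                                         bool quality_good, uint32_t now_ms)
{
    if (!quality_good) {
        s->good = false;      /* quality dropped below the threshold: restart the window */
        return false;
    }
    if (!s->good) {
        s->good = true;
        s->good_since_ms = now_ms;
    }
    return now_ms - s->good_since_ms >= GOOD_PERIOD_MS;
}
```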
Fig. 12b is a schematic diagram illustrating the effect of interference on data transmission. In the process in which the host transmits the video stream in the manner shown in fig. 12a, an interference signal generated by an interference source (shown as the shaded area 1201 in fig. 12b) interferes with data packet 3 corresponding to the P1 frame on physical channel 1, data packet 1' corresponding to the I1 frame on physical channel 2, and data packet 2' corresponding to the B1 frame and the B2 frame on physical channel 3, which may cause packet loss or damage the integrity of the data packets. It should be noted that the interference range of the interference signal is only an exemplary example, and the application is not limited thereto.
For example, in this embodiment of the application, take the case in which data packet 3 corresponding to the P1 frame is interfered with by the interference signal and lost during transmission: referring to fig. 12b, after the receiving end (e.g., the television 1) detects that data packet 3 is lost, it may still obtain data packet 3' corresponding to the P1 frame from physical channel 3.
The processing flow of the receiving end is described in detail below with reference to the schematic processing flow diagram of the receiving end (e.g., the television 1) shown in fig. 13. Referring to fig. 13, the flow specifically includes the following steps:
S301, the television receives the data packet carrying the I frame transmitted on channel 1.
Illustratively, taking an I frame as an example, the television may receive the data packet carrying the I frame from channel 1; there may be one or more such data packets.
S302, the television judges whether the I frame is received correctly.
Illustratively, the television may determine whether the I frame is correctly received, optionally according to the following: whether all data packets (one or more) carrying the I frame have been received, and whether each data packet is complete, i.e., no data or field is lost.
If the television determines that the I frame is not correctly received, S303 is executed; if the television determines that the I frame is correctly received, S304 is executed.
S303, the television acquires the redundant data packet carrying the I frame transmitted on the channel 2.
For example, when the television determines that the I frame is not correctly received, it may obtain the redundant data packet carrying the I frame from the channel used for transmitting the redundant data packet of the I frame, for example, channel 2.
S304, the television displays the image of the I frame.
Illustratively, the television may retrieve the I frame from the data packet(s) received from channel 1 or from channel 2, decode the I frame, and display the resulting image.
It should be noted that the embodiment of the present application only illustrates, by way of example, the processing of an I frame by the receiving end. In other embodiments, when a data packet of any data frame on channel 1 is interfered with, the receiving end may adopt the method described in the embodiment of the present application, and the description is not repeated here.
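The flow of S301 to S304 can be sketched roughly as follows; the channel-reading and display callbacks are hypothetical placeholders, not a real Wi-Fi driver or codec API.

```python
# Rough sketch of the receiving-end flow in fig. 13 (S301-S304) under assumed helpers.

def frame_received_correctly(packets):
    """S302: all data packets present and each one complete (no data or field lost)."""
    return bool(packets) and all(p["complete"] for p in packets)

def handle_i_frame(read_channel, decode_and_display):
    packets = read_channel(1)                  # S301: I-frame packets from channel 1
    if not frame_received_correctly(packets):  # S302: judge whether the I frame is correct
        packets = read_channel(2)              # S303: fall back to the redundant packets on channel 2
    decode_and_display(packets)                # S304: decode and display the I frame

# Example with fake channels: channel 1 delivered an incomplete frame, channel 2 is intact.
fake_channels = {1: [{"complete": True}, {"complete": False}],
                 2: [{"complete": True}, {"complete": True}]}
handle_i_frame(lambda ch: fake_channels[ch],
               lambda pkts: print("displaying frame from", len(pkts), "packets"))
```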
The steps in fig. 13 are described in detail below in conjunction with several embodiments.
Fig. 14 shows an exemplary data transmission manner. It should be noted that, in order to better describe the processing method of the receiving end, the following embodiments describe only the data packets carrying the I1 frame, the data packets carrying the B1 frame, and the redundant data packets carrying the I1 frame that are sent by the host on the transmission channels. The processing of the data packets corresponding to other image frames may refer to the processing of the I1 frame and is not repeated in this application.
For example, as described above, an image frame may be carried in one data packet or in a plurality of data packets; in this embodiment, the I1 frame is carried in a plurality of data packets. Referring to fig. 14, illustratively, at time t1 the host transmits data packet 1a on physical channel 1 (hereinafter referred to as channel 1), followed in sequence by data packet 1b and data packet 1c. Each of data packets 1a to 1c contains part of the data of the I1 frame. The packet sequence number/number field in data packet 1a carries "1/3", indicating that there are 3 data packets carrying the I1 frame in total and that data packet 1a is the 1st of the 3; the packet sequence number/number field in data packet 1b carries "2/3", and that in data packet 1c carries "3/3". It should be noted that, due to processing delay and/or transmission delay, there may be a certain delay between data packets during transmission. That is, the host may send data packet 1b immediately after finishing sending data packet 1a, but at the receiving end, for example due to transmission delay, the television may receive data packet 1b only after a certain delay (for example, 3 ms) following data packet 1a. The delay between data packets in fig. 14 is only an illustrative example, and the application is not limited thereto.
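As an illustration of the packet sequence number/number field described above, the following sketch splits a frame into packets carrying a "sequence/total" marker; the field names and payload size are assumptions, not the patent's actual packet format.

```python
# Minimal sketch: split one image frame into data packets that carry a
# "sequence number / total number" field like the "1/3", "2/3", "3/3" above.

def packetize(frame_number, frame_data, max_payload=1200):
    chunks = [frame_data[i:i + max_payload]
              for i in range(0, len(frame_data), max_payload)] or [b""]
    total = len(chunks)
    return [
        {
            "frame_number": frame_number,        # e.g. "1" for the I1 frame and its redundant copies
            "seq_over_total": f"{idx}/{total}",  # e.g. "2/3" = 2nd of 3 packets of this frame
            "payload": chunk,
        }
        for idx, chunk in enumerate(chunks, start=1)
    ]

packets = packetize(frame_number=1, frame_data=b"\x00" * 3000)
print([p["seq_over_total"] for p in packets])   # ['1/3', '2/3', '3/3']
```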
Illustratively, with continued reference to fig. 14, after the host sends data packets 1a to 1c carrying the I1 frame on physical channel 1, it sends redundant data packets 1'a to 1'c carrying the I1 frame on channel 2 at time t2. Data packet 1'a corresponds to data packet 1a, that is, the data contained in the frame body field and the information carried in the packet sequence number/number field are the same; for example, the data carried in the image frame field is the same. Data packet 1'b corresponds to data packet 1b, and data packet 1'c corresponds to data packet 1c. Illustratively, data packets 1a to 1c and data packets 1'a to 1'c carry the same number, e.g., the number "1". For example, the embodiment of the present application is described by taking the case in which data packet 1c and data packet 1'c include a CRC field while data packets 1a, 1b, 1'a, and 1'b do not.
Continuing with fig. 14, illustratively, after sending data packet 1c, the host sends data packet 2 carrying the B1 frame; as described above, there may be a delay (t3 − t2) between data packet 1c and data packet 2 during transmission due to processing delay and transmission delay. It should be noted that the present application only takes the case in which data packet 1'a reaches the television before data packet 2 as an example; in other embodiments, because processing delay and transmission delay differ, data packet 2 and data packet 1'a may arrive at the television at the same time, or data packet 2 may arrive before data packet 1'a, which is not limited in the present application.
Fig. 15a is an exemplary schematic diagram of module interaction. Referring to fig. 15a, the Wi-Fi driver receives data packet 1a, data packet 1b, and data packet 1c from channel 1, and outputs data packets 1a to 1c to the redundancy module.
Illustratively, the redundancy module starts a timer after determining that the last frame was correctly received. In the embodiment of the present application, the timer timing duration is 16.7ms, and in other embodiments, the timer timing duration may be longer or shorter, which is not limited in the present application.
For example, after the redundancy module starts the timer, it may receive data packets 1a to 1c input by the Wi-Fi driver. The redundancy module decapsulates data packets 1a to 1c to obtain the data or information carried by each data packet.
Referring to fig. 15a, as described above, the packet sequence number/number field of each data packet carries the total number of data packets and the sequence number of the data packet. The redundancy module receives data packets 1a to 1c in sequence and, after decapsulating each received data packet, acquires the information carried in its packet sequence number/number field. Based on this information, the redundancy module can determine that no packet of the I1 frame has been lost; that is, after receiving data packet 1c, the redundancy module determines that all 3 data packets carrying the I1 frame have been received, and the timer stops timing.
Illustratively, the redundancy module performs an integrity check based on the CRC field of the data packet 1c, e.g., determining whether data in each received data packet is missing or erroneous, etc.
For example, taking the case in which the redundancy module successfully checks integrity based on the CRC field of data packet 1c, the redundancy module may output the I frame data carried in the decapsulated data packets to the codec. For example, the codec may be provided with a buffer area, which may be used to store the image frames input by the redundancy module. For example, the image frames input by the redundancy module may be buffered in the buffer area in the form of a queue; assume the queue includes: the I1 frame, the B1 frame, and the B2 frame. The codec may sequentially extract the required image frames from the queue based on the coding order of the image frames, decode them, and display the decoded images.
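The buffering behaviour described above can be pictured with the following sketch; the queue structure and method names are assumed for illustration and are not the codec's real interface.

```python
# Illustrative sketch: frames handed over by the redundancy module are buffered,
# and the decoder pulls them out in coding order rather than arrival order.
import heapq

class FrameBuffer:
    def __init__(self):
        self._heap = []   # (coding_order_index, frame_type, payload)

    def push(self, order_index, frame_type, payload):
        heapq.heappush(self._heap, (order_index, frame_type, payload))

    def pop_next(self):
        """Return the next frame in coding order, e.g. I1 before B1 even if B1 arrived first."""
        return heapq.heappop(self._heap) if self._heap else None

buf = FrameBuffer()
buf.push(2, "B1", b"...")     # B1 happened to arrive first (compare fig. 15e)
buf.push(1, "I1", b"...")
print(buf.pop_next()[1])      # -> "I1"
```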
With continued reference to fig. 15a, for example, after the redundancy module performs the integrity check on data packets 1a to 1c, it may receive data packets 1'a to 1'c acquired from channel 2 by the Wi-Fi driver. Illustratively, since the redundancy module has already determined that the I1 frame is correctly received, i.e., data packets 1a to 1c have been received completely (no packet loss and a successful integrity check), it may directly discard the redundant data packets of the I1 frame, i.e., data packets 1'a to 1'c. Optionally, the redundancy module may instead buffer the redundant data packets of the I1 frame and set an aging duration (e.g., 30 ms), i.e., the buffer is cleared every 30 ms.
Optionally, if the redundancy module receives data packets 1'a to 1'c input by the Wi-Fi driver during the integrity check, it caches data packets 1'a to 1'c and discards them after the integrity check succeeds.
Optionally, if the redundancy module receives data packet 1'a input by the Wi-Fi driver during the integrity check of data packets 1a to 1c, then after the integrity check succeeds, the redundancy module may discard data packet 1'a and indicate to the Wi-Fi driver that data packet 1'b and data packet 1'c are no longer to be received.
Optionally, after the redundancy module successfully checks the integrity of data packets 1a to 1c, it may instruct the Wi-Fi driver to stop monitoring channel 2, that is, to no longer receive data packets 1'a to 1'c, and to resume monitoring channel 2 only when the next I frame is to be received.
Fig. 15b is an exemplary schematic diagram of module interaction. With reference to fig. 14, for example, the Wi-Fi driver receives data packet 1a and data packet 1c from channel 1, data packet 1b having been lost after being interfered with during transmission. Illustratively, the Wi-Fi driver outputs data packet 1a and data packet 1c to the redundancy module.
Illustratively, the redundancy module starts a timer after determining that the last frame was correctly received. In the embodiment of the present application, the timer timing duration is 16.7ms, and in other embodiments, the timer timing duration may be longer or shorter, which is not limited in the present application.
For example, after the redundancy module starts the timer, it may receive data packet 1a and data packet 1c input by the Wi-Fi driver. The redundancy module decapsulates data packet 1a and data packet 1c to obtain the data or information carried by each data packet.
As described above with reference to fig. 15a, the packet sequence number/number field in each data packet carries the total number of data packets and the sequence number of the data packet. The redundancy module receives data packet 1a, decapsulates it, and obtains the data and information it carries, including the "1/3" carried in its packet sequence number/number field. Illustratively, the redundancy module receives data packet 1c, decapsulates it, and obtains the data and information it carries, including the "3/3" carried in its packet sequence number/number field. The redundancy module can thus determine that the 2nd of the 3 data packets corresponding to the I1 frame (i.e., data packet 1b) has been lost, and the timer stops timing.
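A sketch of this kind of completeness check follows, reusing the assumed "sequence/total" field from the earlier sketch; it is illustrative only, not the redundancy module's actual implementation.

```python
# Sketch: detect lost packets of a frame from the "sequence number / total" fields alone.

def missing_packets(received_fields):
    """received_fields: iterable of 'seq/total' strings, e.g. ['1/3', '3/3']."""
    seqs, totals = set(), set()
    for field in received_fields:
        seq, total = (int(x) for x in field.split("/"))
        seqs.add(seq)
        totals.add(total)
    total = max(totals)
    return sorted(set(range(1, total + 1)) - seqs)

print(missing_packets(["1/3", "3/3"]))   # [2] -> the 2nd packet (data packet 1b) is missing
```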
For example, the redundancy module starts a timer; optionally, the duration of the timer may be 16.7 ms, or may be another shorter or longer duration, which is not limited in this application. Illustratively, the Wi-Fi driver receives data packets 1'a to 1'c from channel 2 and outputs them to the redundancy module. It should be noted that the Wi-Fi driver may receive data packets 1'a to 1'c before or after the redundancy module starts the timer, and the application is not limited thereto.
Illustratively, after receiving data packets 1'a to 1'c, the redundancy module decapsulates them and caches the decapsulated data packets 1'a to 1'c.
Continuing to refer to fig. 15b, for example, after the redundancy module receives data packet 1'c, it determines that the 3 redundant data packets corresponding to the I1 frame, i.e., data packets 1'a to 1'c, have all been received, and the timer stops timing. The redundancy module may perform an integrity check based on the CRC field of data packet 1'c; for details, refer to the related description of fig. 15a, which is not repeated here. For example, taking the case in which the integrity check of data packet 1'c succeeds, the redundancy module may output the acquired I1 frame, that is, the I1 frame carried in the redundant data packets 1'a to 1'c, to the codec; other undescribed parts may refer to the related description of fig. 15a and are not repeated here.
In a possible implementation manner, the redundancy module may instead extract, from the buffered data packets 1'a to 1'c, only the data packet corresponding to the lost data packet 1b (i.e., data packet 1'b), acquire the I1 frame based on data packet 1a, data packet 1'b, and data packet 1c, and output the I1 frame to the codec.
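The alternative just described can be sketched as follows; the packet dictionaries reuse the assumed fields from the earlier sketches and do not reflect the embodiment's real data structures.

```python
# Sketch: keep the correctly received original packets (1a, 1c) and take only the
# substitute for the lost one (1'b) from the buffered redundant packets.

def reassemble(original, redundant, total):
    """Return the frame's packets indexed 1..total, preferring originals where present."""
    by_seq = {p["seq"]: p for p in redundant}        # redundant copies, e.g. 1'a to 1'c
    by_seq.update({p["seq"]: p for p in original})   # originals override where available
    if set(by_seq) != set(range(1, total + 1)):
        return None                                  # frame still incomplete
    return [by_seq[i] for i in range(1, total + 1)]

original = [{"seq": 1, "src": "1a"}, {"seq": 3, "src": "1c"}]            # 1b was lost
redundant = [{"seq": 1, "src": "1'a"}, {"seq": 2, "src": "1'b"}, {"seq": 3, "src": "1'c"}]
print([p["src"] for p in reassemble(original, redundant, total=3)])      # ['1a', "1'b", '1c']
```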
Fig. 15c is an exemplary schematic diagram of module interaction. Referring to fig. 15c, for example, the Wi-Fi driver receives data packet 1a and data packet 1b from channel 1, data packet 1c having been lost after being interfered with during transmission. Illustratively, the Wi-Fi driver outputs data packet 1a and data packet 1b to the redundancy module.
Illustratively, the redundancy module starts a timer after determining that the last frame was correctly received. In the embodiment of the present application, the timer timing duration is 16.7ms, and in other embodiments, the timer timing duration may be longer or shorter, which is not limited in the present application.
For example, after the redundancy module starts the timer, it may receive data packet 1a and data packet 1b input by the Wi-Fi driver. The redundancy module decapsulates data packet 1a and data packet 1b to obtain the data or information carried by each data packet.
Referring to fig. 15c, for example, since data packet 1c is lost, the redundancy module receives only data packet 1a and data packet 1b before the timer expires. Based on the information carried in the packet sequence number/number fields of data packet 1a and data packet 1b, it can determine that the 1st and 2nd of the 3 data packets corresponding to the I1 frame have been received, while the 3rd data packet, i.e., data packet 1c, has not. The redundancy module determines that the transmission of the I1 frame is incomplete, namely, a packet has been lost, and the timer expires.
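The timeout path can be sketched as follows; timing is simulated rather than driven by real I/O, and the 16.7 ms value is simply the example duration used above.

```python
# Sketch: if the remaining packets of a frame do not arrive before the timer expires,
# the frame is treated as incomplete and the redundant channel is consulted.

FRAME_TIMER_S = 0.0167   # 16.7 ms, as in the example above

def frame_complete_before_timeout(arrival_times, total, start_time):
    """arrival_times: {seq: time_received}. True only if every packet 1..total
    arrived within FRAME_TIMER_S of the timer start."""
    deadline = start_time + FRAME_TIMER_S
    on_time = {seq for seq, t in arrival_times.items() if t <= deadline}
    return on_time == set(range(1, total + 1))

# Packets 1a and 1b arrived, 1c never did: the timer expires and the frame is incomplete.
print(frame_complete_before_timeout({1: 0.001, 2: 0.004}, total=3, start_time=0.0))  # False
```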
Illustratively, the redundancy module starts another timer, and optionally, the duration of the timer may be 16.7ms, or may be other shorter or longer durations, which is not limited in this application.
Illustratively, the Wi-Fi driver receives data packets 1'a to 1'c from channel 2 and outputs them to the redundancy module. It should be noted that the Wi-Fi driver may receive data packets 1'a to 1'c before or after the redundancy module starts the timer, and the application is not limited thereto.
Illustratively, after receiving data packets 1'a to 1'c, the redundancy module decapsulates them and caches the decapsulated data packets 1'a to 1'c.
Continuing to refer to fig. 15c, for example, after the redundancy module receives data packet 1'c, it determines that the 3 redundant data packets corresponding to the I1 frame, i.e., data packets 1'a to 1'c, have all been received, and the timer stops timing. The redundancy module may perform an integrity check based on the CRC field of data packet 1'c; for details, refer to the related description of fig. 15a, which is not repeated here. For example, taking the case in which the integrity check of data packet 1'c succeeds, the redundancy module may output the acquired I1 frame, that is, the I1 frame carried in the redundant data packets 1'a to 1'c, to the codec; other undescribed parts may refer to the related description of fig. 15a and are not repeated here.
Fig. 15d is an exemplary schematic diagram of module interaction. In conjunction with fig. 14, for example, the Wi-Fi driver receives data packet 1a, data packet 1b, and data packet 1c from channel 1, and outputs data packets 1a to 1c to the redundancy module.
Illustratively, the redundancy module starts a timer after determining that the last frame was correctly received. In the embodiment of the present application, the timer timing duration is 16.7ms, and in other embodiments, the timer timing duration may be longer or shorter, which is not limited in the present application.
For example, after the redundancy module starts the timer, it may receive data packets 1a to 1c input by the Wi-Fi driver. The redundancy module decapsulates data packets 1a to 1c to obtain the data or information carried by each data packet.
Referring to fig. 15d, as described above, the packet sequence number/number field of each data packet carries the total number of data packets and the sequence number of the data packet. The redundancy module receives data packets 1a to 1c in sequence and decapsulates each received data packet to obtain the information carried in its packet sequence number/number field. Based on this information, that is, after receiving data packet 1c, the redundancy module can determine that no packet of the I1 frame has been lost, i.e., that all 3 data packets carrying the I1 frame have been received, and the timer stops timing.
For example, the redundancy module performs an integrity check based on the CRC field of data packet 1c. In this embodiment, the integrity check of data packet 1c fails, for example because part of the data in data packet 1c (or in data packet 1a and/or data packet 1b) is erroneous.
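The check can be illustrated with the sketch below. CRC-32 from Python's zlib is used purely for illustration; the actual CRC polynomial and field layout are not specified in this excerpt.

```python
# Sketch: the last packet of a frame carries a CRC over the frame's data, and the
# receiver recomputes it to detect corruption.
import zlib

def append_crc(payloads):
    """Sender side: return the payloads plus a CRC computed over their concatenation."""
    return payloads, zlib.crc32(b"".join(payloads))

def integrity_ok(payloads, crc_field):
    """Receiver side: recompute the CRC and compare it with the carried CRC field."""
    return zlib.crc32(b"".join(payloads)) == crc_field

chunks, crc = append_crc([b"part-1a", b"part-1b", b"part-1c"])
print(integrity_ok(chunks, crc))                                 # True
print(integrity_ok([b"part-1a", b"part-1X", b"part-1c"], crc))   # False -> fall back to channel 2
```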
Illustratively, the redundancy module starts another timer, and optionally, the duration of the timer may be 16.7ms, or may be other shorter or longer durations, which is not limited in this application.
Illustratively, the Wi-Fi driver receives data packets 1'a to 1'c from channel 2 and outputs them to the redundancy module. It should be noted that the Wi-Fi driver may receive data packets 1'a to 1'c before or after the redundancy module starts the timer, and the application is not limited thereto.
Illustratively, after receiving data packets 1'a to 1'c, the redundancy module decapsulates them and caches the decapsulated data packets 1'a to 1'c.
Continuing to refer to fig. 15d, for example, after the redundancy module receives data packet 1'c, it determines that the 3 redundant data packets corresponding to the I1 frame, i.e., data packets 1'a to 1'c, have all been received, and the timer stops timing. The redundancy module may perform an integrity check based on the CRC field of data packet 1'c; for details, refer to the related description of fig. 15a, which is not repeated here. For example, taking the case in which the integrity check of data packet 1'c succeeds, the redundancy module may output the acquired I1 frame, that is, the I1 frame carried in the redundant data packets 1'a to 1'c, to the codec; other undescribed parts may refer to the related description of fig. 15a and are not repeated here.
Fig. 15e is an exemplary schematic diagram of module interaction. Referring to fig. 15e, the Wi-Fi driver receives data packet 1a and data packet 2 from channel 1, data packet 1b and data packet 1c having been lost after being interfered with during transmission. Illustratively, the Wi-Fi driver outputs data packet 1a and data packet 2 to the redundancy module.
Illustratively, the redundancy module receives data packet 1a and data packet 2 input by the Wi-Fi driver. The redundancy module decapsulates data packet 1a and data packet 2 to obtain the data or information carried by each data packet, including the number carried by each data packet; for example, data packet 1a carries the number "1" and data packet 2 carries the number "2".
Referring to fig. 15e, for example, the redundancy module obtains the numbers carried by the data packets and can determine that data packet 2 carries a number different from that of data packet 1a; the redundancy module can thus further determine that part of the data packets of the I1 frame have been lost.
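This detection by frame number can be sketched as follows; the field names are the same assumed ones as in the earlier sketches.

```python
# Sketch of the fig. 15e case: a packet carrying a different frame number ("2", the
# B1 frame) arrives while the current frame ("1", the I1 frame) is still incomplete,
# so the receiver concludes that part of the I1 frame was lost.

def next_frame_started(current_frame_number, incoming_packet, packets_so_far, expected_total):
    """True if the incoming packet belongs to a newer frame while the current frame
    is still missing packets."""
    return (incoming_packet["frame_number"] != current_frame_number
            and len(packets_so_far) < expected_total)

packets_so_far = [{"frame_number": 1, "seq": 1}]              # only data packet 1a arrived
incoming = {"frame_number": 2, "seq": 1}                      # data packet 2 (B1 frame)
print(next_frame_started(1, incoming, packets_so_far, 3))     # True -> fetch 1'a to 1'c from channel 2
```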
Illustratively, the Wi-Fi driver receives data packets 1'a to 1'c from channel 2 and outputs them to the redundancy module. It should be noted that, in the example shown in fig. 15e, the redundancy module also receives data packet 2 and processes it in parallel with the other steps of fig. 15e. For example, the redundancy module performs an integrity check on data packet 2; optionally, the redundancy module determines that the integrity check of data packet 2 succeeds, that is, the B1 frame is correctly received, and outputs the B1 frame to the codec.
Illustratively, after receiving data packets 1'a to 1'c, the redundancy module decapsulates them and caches the decapsulated data packets 1'a to 1'c.
Continuing to refer to fig. 15e, for example, after the redundancy module receives data packet 1'c, it determines that the 3 redundant data packets corresponding to the I1 frame, i.e., data packets 1'a to 1'c, have all been received, and the timer stops timing. The redundancy module may perform an integrity check based on the CRC field of data packet 1'c; for details, refer to the related description of fig. 15a, which is not repeated here. For example, taking the case in which the integrity check of data packet 1'c succeeds, the redundancy module may output the acquired I1 frame, that is, the I1 frame carried in the redundant data packets 1'a to 1'c, to the codec; other undescribed parts may refer to the related description of fig. 15a and are not repeated here.
It should be noted that, as described above, the codec may be provided with a buffer queue. For example, in the example shown in fig. 15e, the codec may receive the B1 frame before the I1 frame, in which case the image frames in the queue are the B1 frame followed by the I1 frame. The order in the queue does not affect the decoding order of the codec; that is, the codec decodes and outputs to an upper-layer application (e.g., a video application) based on the order of the image frames (i.e., the codec may extract the I1 frame from the queue first based on the image frame order, and extract the B1 frame from the queue after the I1 frame has been processed accordingly).
In summary, in the embodiment of the present application, because data packets 1'a to 1'c are transmitted with a delay relative to data packets 1a to 1c, there is a time difference between their transmission times; therefore, even if at least one of data packets 1a to 1c is interfered with, data packets 1'a to 1'c are normally not affected. Thus, even if any of data packets 1a to 1c is lost or its data is missing, the redundancy module can still obtain data packets 1'a to 1'c carrying the I1 frame from another physical channel (for example, physical channel 2) and thereby obtain the I1 frame. In addition, because the maximum difference between the receiving time of any redundant data packet and that of the corresponding original data packet is only the transmission duration of one I frame, even if the original data packet is lost or damaged, the delay from acquiring the redundant data packet to displaying the corresponding image is very small and imperceptible to the user, thereby improving the user experience.
It should be noted that, as described above, the images displayed by the televisions 1 to 4 may be different. For example, taking the television 1 as an example, after receiving the image input by the codec, the upper-layer application of the television 1 may perform corresponding processing, for example cropping the image, so as to display the cropped partial image.
In one possible implementation manner, the host may set a delayed transmission interval Td for the delayed redundant transmission. Fig. 16 is an exemplary diagram of a data transmission manner with a set delayed transmission interval Td. Referring to fig. 16, for example, after the host finishes transmitting data packet 1 at time t2, it may start a timer that counts for a duration Td (i.e., from time t2 to time t3). Optionally, the timer duration Td is greater than 0 and less than the inter-frame space (16.7 ms). For example, when the timer expires (i.e., at time t3), the host sends the redundant data packet of the I1 frame, i.e., data packet 1', at time t3; the redundant data packets of other frames may likewise be delayed by Td in the same manner as the data packets of the I1 frame, and this is not repeated here. It should be noted that the size of the delayed transmission interval Td can be set according to the network conditions, the processing capability of the host, and the processing capability of the receiving end. For example, after the host sends data packet 1, there may be a processing delay due to the limitation of its processing capability; for example, the redundant data packet 1' may be sent after a delay of 10 ms to 20 ms. This delay is indeterminate for the receiving end, which may therefore misjudge that the data packet has been lost; that is, because of the indeterminacy of the network or of the sending end's processing delay, the receiving end may be unable to determine when the redundant data packet should be received on the redundant channel. Accordingly, in the embodiment of the present application, the host and the receiving end may negotiate Td; for example, Td may be carried in the control information, and the receiving end may receive the redundant data packet on the corresponding redundant channel after the interval Td following reception of the data packet on physical channel 1.
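A minimal sketch of this sender-side behaviour, assuming a negotiated Td of 5 ms and using a generic thread timer in place of whatever scheduling the host actually uses:

```python
# Sketch: after the original packets of a frame finish sending, wait Td
# (0 < Td < the 16.7 ms inter-frame space) before sending the redundant copies.
import threading

INTER_FRAME_SPACE_S = 0.0167
TD_S = 0.005                     # assumed negotiated Td, carried in the control information

def send_with_delayed_redundancy(send_original, send_redundant, td=TD_S):
    assert 0 < td < INTER_FRAME_SPACE_S, "Td must lie inside the inter-frame space"
    send_original()                                   # e.g. data packets 1a to 1c on channel 1
    timer = threading.Timer(td, send_redundant)       # e.g. data packets 1'a to 1'c on channel 2
    timer.start()
    return timer

t = send_with_delayed_redundancy(lambda: print("original I1 packets sent"),
                                 lambda: print("redundant I1 packets sent"))
t.join()
```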
In another possible implementation manner, taking the I1 frame as an example, the host may send the redundant data packets of the I1 frame before the original data packets of the I1 frame have finished being sent; that is, the transmission time of the redundant data packets may partially overlap with that of the original data packets. Optionally, this may be used to offset the transmission delay: the host sends the redundant data packets in advance so that the time at which the receiving end receives the original data packets is the same as the arrival time of the redundant data packets.
In yet another possible implementation manner, the electronic device (e.g., the host) may also transmit the image frames and the redundant image frames in the OFDMA (Orthogonal Frequency Division Multiple Access) manner of 802.11ax. Fig. 17 is a schematic diagram of transmission channels in the OFDMA mode. Referring to fig. 17, the host may use OFDMA on channel 40 (for the concept of the channel, refer to the above) and divide channel 40 into 4 sub-channels, including OFDMA sub-channel 1, OFDMA sub-channel 2, OFDMA sub-channel 3, and OFDMA sub-channel 4; the bandwidth occupied by each sub-channel may be the same or different, and the present application is not limited thereto. The host may send the image frames to the televisions 1 to 4 in a broadcast manner on the 4 sub-channels according to the delayed redundant transmission manner described in the embodiment of the present application; the specific transmission manner is similar to the description of fig. 12a and is not repeated here.
It should be noted that the data transmission method in the embodiment of the present application may also be applied to a screen projection scenario; for example, a mobile phone may transmit screen projection images to a television, a tablet, and a notebook according to the delayed redundant transmission manner described in the embodiment of the present application. Optionally, the television, the tablet, and the notebook may selectively monitor the corresponding physical channels. For example, the mobile phone transmits the screen projection image frames according to the manner of fig. 12a, where the image corresponding to an image frame is at least part of the interface displayed by the mobile phone; the specific transmission manner and the processing manner of the receiving end may refer to the above and are not repeated here. Optionally, the television, the tablet, and the notebook may monitor different redundant channels (e.g., physical channels 2 to 4); for example, the television may monitor 4 physical channels (physical channels 1 to 4) to receive the data packets and redundant data packets transmitted on them, while the tablet and the notebook may monitor physical channels 1 to 3 to receive the data packets and redundant data packets transmitted on physical channels 1 to 3.
In a possible implementation manner, taking a mobile phone and a television as an example, if the chip capabilities of both devices support it, the mobile phone and the television may exchange data using D2D (Device-to-Device) communication; that is, the mobile phone and the television establish and maintain a multi-path Wi-Fi connection, and the mobile phone may transmit image frames to the television over the multi-path Wi-Fi connection between them according to the delayed redundant transmission manner described in the embodiments of the present application.
In another possible implementation, if any of the following conditions occurs, such as the host being powered off, the television being powered off, or the host entering standby, the host no longer transmits image frames and/or the television no longer listens on the broadcast channel.
It should be noted that, in the embodiment of the present application, only the transmission mode of the image frame is taken as an example for description, in other embodiments, other types of data or signaling may also be transmitted according to the delayed redundant transmission mode in the embodiment of the present application, so as to ensure stability of data transmission.
It will be appreciated that, in order to implement the above-described functions, the electronic device comprises corresponding hardware and/or software modules for performing each function. In combination with the exemplary algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application in conjunction with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment also provides a computer storage medium, where computer instructions are stored, and when the computer instructions are run on an electronic device, the electronic device executes the above related method steps to implement the data transmission method in the above embodiment.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the data transmission method in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the data transmission method in the above-mentioned method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the foregoing embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division into the functional modules described above is used for illustration. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the modules or units is only a division by logical function, and other divisions are possible in actual implementation, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
Any of the various embodiments of the present application, and any features within the same embodiment, may be freely combined. Any such combination is within the scope of the present application.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (30)

1. A data transmission system, comprising a first electronic device and a second electronic device;
the first electronic device to:
transmitting a first image frame on a first transmission channel;
after the first image frame is sent, sending a second image frame on the first transmission channel, and sending a first redundant image frame on a second transmission channel, wherein the image data of the first redundant image frame is the same as the image data of the first image frame, and the frame type of the first image frame is different from that of the second image frame;
after the second image frame is sent, sending a second redundant image frame on a third transmission channel, wherein the image data of the second redundant image frame is the same as the image data of the second image frame;
the second electronic device to:
receiving the first image frame and the second image frame from the first transmission channel;
receiving the first redundant image frame from the second transmission channel after receiving the first image frame;
receiving the second redundant image frame from the third transmission channel after receiving the second image frame;
discarding the first redundant image frame and the second redundant image frame.
2. The system of claim 1,
the first electronic device is further configured to:
after the second image frame is sent, sending a third image frame on the first transmission channel; wherein the third image frame is of a different frame type than the first image frame and the second image frame;
after the third image frame is sent, sending a third redundant image frame on a fourth transmission channel, wherein the image data of the third redundant image frame is the same as the image data of the third image frame;
the second electronic device is further configured to:
detecting whether the third image frame is completely received from the first transmission channel;
when detecting that the third image frame is not completely received, displaying the image data of the third redundant image frame received from the fourth transmission channel.
3. The system of claim 2, wherein the second electronic device is further configured to:
when the third image frame is detected to be completely received, displaying the image data of the third image frame;
discarding the third redundant image frame received from the fourth transmission channel.
4. The system of claim 2, wherein the third image frame is carried in a first data packet and a second data packet;
the second electronic device to:
receiving the first data packet from the first transmission channel;
and when the second data packet is not received within a set first time length, determining that the third image frame is not completely received.
5. The system of claim 2, wherein the third image frame is carried in a first data packet and a second data packet;
the second electronic device to:
receiving the first data packet and the second data packet from the first transmission channel;
performing an integrity check on the first data packet and the second data packet according to a Cyclic Redundancy Check (CRC) field carried in the second data packet;
when the integrity check of the first data packet and the second data packet fails, it is determined that the third image frame is not completely received.
6. The system of claim 2, wherein the third image frame is carried in a first data packet and a second data packet, the first data packet and the second data packet carrying first number information, wherein the third redundant image frame is carried in a third data packet and a fourth data packet, and wherein the third data packet and the fourth data packet carrying the first number information.
7. The system of claim 6,
the first electronic device is further configured to:
after the third image frame is sent, sending a fourth image frame on the first transmission channel; wherein the fourth image frame is of the same frame type as the first image frame; the fourth image frame is carried in a fifth data packet, the fifth data packet carries second number information, and the second number information is different from the first number information;
after the fourth image frame is sent, sending a fourth redundant image frame on the second transmission channel, wherein the image data of the fourth redundant image frame is the same as the image data of the fourth image frame;
the second electronic device is further configured to:
receiving the first data packet from the first transmission channel;
receiving the fifth data packet from the first transmission channel;
and determining that the third image frame is not completely received according to the first number information carried by the first data packet and the second number information carried by the fifth data packet.
8. The system according to any one of claims 4 to 7, wherein the first data packet carries first quantity information, the first quantity information indicating that there are two data packets carrying the third image frame in total, and the first data packet is the first of the two data packets carrying the third image frame;
the second data packet carries second quantity information, where the second quantity information is used to indicate that there are two data packets carrying the third image frame, and the second data packet is a second of the two data packets carrying the third image frame.
9. The system according to any one of claims 4 to 7, wherein the first data packet and the second data packet carry channel identification information of the first transmission channel.
10. The system of any of claims 1 to 9, wherein the transmission time of the first redundant image frame is separated from the transmission completion time of the first image frame by a set second time period, the second time period being greater than or equal to 0 and less than the inter-frame space between the first image frame and the second image frame.
11. The system of claim 1, further comprising a third electronic device;
the third electronic device to:
receiving the first image frame and the second image frame from the first transmission channel;
receiving the first redundant image frame from the second transmission channel after receiving the first image frame;
receiving the second redundant image frame from the third transmission channel after receiving the second image frame;
discarding the first redundant image frame and the second redundant image frame.
12. The system of claim 1,
the first electronic device is further configured to:
establishing a first connection with the second electronic device;
and sending indication information to the second electronic device through the first connection, wherein the indication information is used for indicating that the first transmission channel is used for transmitting an original image group, the second transmission channel is used for transmitting image frames of a first frame type, and the third transmission channel is used for transmitting image frames of a second frame type, and the original image group comprises the first image frame and the second image frame.
13. The system of claim 12, wherein the first frame type is an I-frame and the second frame type is a P-frame or a B-frame.
14. The system according to any one of claims 1 to 13, wherein the first transmission channel and the second transmission channel operate on the same channel.
15. A method of data transmission, comprising:
the first electronic device transmits a first image frame on a first transmission channel;
the first electronic device sends a second image frame on the first transmission channel after the first image frame is sent, and sends a first redundant image frame on a second transmission channel, wherein the image data of the first redundant image frame is the same as the image data of the first image frame, and the frame types of the first image frame and the second image frame are different;
the first electronic device sends a second redundant image frame on a third transmission channel after the second image frame is sent, wherein the image data of the second redundant image frame is the same as the image data of the second image frame;
a second electronic device receives the first image frame and the second image frame from the first transmission channel;
the second electronic device receives the first redundant image frame from the second transmission channel after receiving the first image frame;
the second electronic device receives the second redundant image frame from the third transmission channel after receiving the second image frame;
the second electronic device discards the first redundant image frame and the second redundant image frame.
16. The method of claim 15, further comprising:
the first electronic device sends a third image frame on the first transmission channel after the second image frame is sent; wherein the third image frame is of a different frame type than the first image frame and the second image frame;
the first electronic device sends a third redundant image frame on a fourth transmission channel after the third image frame is sent, wherein the image data of the third redundant image frame is the same as the image data of the third image frame;
the second electronic device detecting whether the third image frame is completely received from the first transmission channel;
when detecting that the third image frame is not completely received, the second electronic device displays the image data of the third redundant image frame received from the fourth transmission channel.
17. The method of claim 16, further comprising:
when the third image frame is detected to be completely received, the second electronic equipment displays the image data of the third image frame;
the second electronic device discards the third redundant image frame received from the fourth transmission channel.
18. The method of claim 16, wherein the third image frame is carried in a first data packet and a second data packet; the second electronic device detecting whether the third image frame is completely received from the first transmission channel, including:
the second electronic device receiving the first data packet from the first transmission channel;
when the second data packet is not received within the set first duration, the second electronic device determines that the third image frame is not completely received.
19. The method of claim 16, wherein the third image frame is carried in a first data packet and a second data packet; the second electronic device detecting whether the third image frame is completely received from the first transmission channel, including:
the second electronic device receiving the first data packet and the second data packet from the first transmission channel;
the second electronic equipment carries out integrity check on the first data packet and the second data packet according to a Cyclic Redundancy Check (CRC) field carried by the second data packet;
when the integrity check of the first data packet and the second data packet fails, the second electronic device determines that the third image frame is not completely received.
20. The method of claim 16, wherein the third image frame is carried in a first data packet and a second data packet, the first data packet and the second data packet carrying first number information, wherein the third redundant image frame is carried in a third data packet and a fourth data packet, and wherein the third data packet and the fourth data packet carrying the first number information.
21. The method of claim 20, further comprising:
the first electronic device sends a fourth image frame on the first transmission channel after the third image frame is sent; wherein the fourth image frame is of the same frame type as the first image frame; the fourth image frame is carried in a fifth data packet, the fifth data packet carries second number information, and the second number information is different from the first number information;
after the fourth image frame is sent, the first electronic device sends a fourth redundant image frame on the second transmission channel, wherein the image data of the fourth redundant image frame is the same as the image data of the fourth image frame;
the second electronic device receiving the first data packet from the first transmission channel;
the second electronic device receiving the fifth data packet from the first transmission channel;
and the second electronic equipment determines that the third image frame is not completely received according to the first number information carried by the first data packet and the second number information carried by the fifth data packet.
22. The method according to any one of claims 18 to 21, wherein the first data packet carries first quantity information, the first quantity information indicating that there are two data packets carrying the third image frame in total, and the first data packet is the first of the two data packets carrying the third image frame;
the second data packet carries second quantity information, the second quantity information is used for indicating that two data packets carrying the third image frame are total, and the second data packet is the second of the two data packets carrying the third image frame.
23. The method according to any one of claims 18 to 21, wherein the first data packet and the second data packet carry channel identification information of the first transmission channel.
24. The method of any of claims 15 to 23, wherein the interval between the transmission time of the first redundant image frame and the transmission completion time of the first image frame is a set second time duration, wherein the second time duration is greater than or equal to 0 and less than the inter-frame space between the first image frame and the second image frame.
25. The method of claim 15, further comprising:
a third electronic device receives the first image frame and the second image frame from the first transmission channel;
the third electronic device receives the first redundant image frame from the second transmission channel after receiving the first image frame;
the third electronic device receives the second redundant image frame from the third transmission channel after receiving the second image frame;
the third electronic device discards the first redundant image frame and the second redundant image frame.
26. The method of claim 15, wherein prior to the first electronic device transmitting the first image frame on the first transmission channel, the method further comprises:
the first electronic equipment and the second electronic equipment establish first connection;
the first electronic device sends indication information to the second electronic device through the first connection, wherein the indication information is used for indicating that the first transmission channel is used for transmitting an original image group, the second transmission channel is used for transmitting image frames of a first frame type, and the third transmission channel is used for transmitting image frames of a second frame type, and the original image group comprises the first image frame and the second image frame.
27. The method of claim 26, wherein the first frame type is an I-frame and the second frame type is a P-frame or a B-frame.
28. The method according to any of claims 15 to 27, wherein the first transmission channel and the second transmission channel operate on the same channel.
29. An electronic device, comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the data transmission method of any one of claims 15 to 28 performed by the first electronic device or the second electronic device.
30. A chip comprising one or more interface circuits and one or more processors; the interface circuit is configured to receive signals from a memory of an electronic device and to transmit the signals to the processor, the signals including computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform the data transmission method of any one of claims 15 to 28 performed by the first electronic device or the second electronic device.
CN202011391638.4A 2020-12-02 2020-12-02 Data transmission method and system and electronic equipment Pending CN114584809A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011391638.4A CN114584809A (en) 2020-12-02 2020-12-02 Data transmission method and system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011391638.4A CN114584809A (en) 2020-12-02 2020-12-02 Data transmission method and system and electronic equipment

Publications (1)

Publication Number Publication Date
CN114584809A true CN114584809A (en) 2022-06-03

Family

ID=81768145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011391638.4A Pending CN114584809A (en) 2020-12-02 2020-12-02 Data transmission method and system and electronic equipment

Country Status (1)

Country Link
CN (1) CN114584809A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116389352A (en) * 2023-03-17 2023-07-04 深圳市同行者科技有限公司 Method, system and related equipment for transmitting vehicle-mounted interactive screen-throwing data

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489093A (en) * 2009-01-19 2009-07-22 Shenzhen Huawei Communication Technologies Co., Ltd. Code stream transmission method, terminal and system for conference communication
CN101695134A (en) * 2009-10-15 2010-04-14 ZTE Corporation Terminal, system and method for improving play performance of terminal in weak signal environment
CN102006479A (en) * 2010-11-30 2011-04-06 North China University of Technology Scene-switching-oriented multiple description video coding method
US10091553B1 (en) * 2014-01-10 2018-10-02 Sprint Communications Company L.P. Video content distribution system and method
CN105704580A (en) * 2016-01-21 2016-06-22 Shenzhen Bite New Technology Co., Ltd. Video transmission method
WO2019109252A1 (en) * 2017-12-05 2019-06-13 Huawei Technologies Co., Ltd. Method for transmitting and receiving data in PON system, network device, and system
CN110247955A (en) * 2019-05-21 2019-09-17 Cainiao Smart Logistics Holding Limited Unmanned vehicle communication method and unmanned vehicle
CN111698386A (en) * 2020-05-26 2020-09-22 Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences Multi-channel image data synchronous transmitting device, receiving device and transmission system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116389352A (en) * 2023-03-17 2023-07-04 Shenzhen Tongxingzhe Technology Co., Ltd. Method, system and related equipment for transmitting vehicle-mounted interactive screen projection data

Similar Documents

Publication Publication Date Title
CN114009055B (en) Screen-throwing display method and electronic equipment
CN114390337B (en) Screen projection method and system and electronic equipment
US20230069398A1 (en) Method for Implementing Wi-Fi Peer-To-Peer Service and Related Device
CN110730448A (en) Method for establishing connection between devices and electronic device
US10499200B2 (en) Information processing device and information processing method
WO2022022019A1 (en) Screen projection data processing method and apparatus
EP4247031A1 (en) Access method and system and electronic device
EP4216501A1 (en) Method for switching channels, electronic device, and storage medium
US10085068B2 (en) Information processing apparatus and information processing method
CN109274407A (en) Data transmission method, device, electronic equipment and storage medium
US20220408281A1 (en) Method for Adjusting Quantity of Data Streams, Terminal, and MIMO System
CN114584809A (en) Data transmission method and system and electronic equipment
WO2024093521A1 (en) Handover exception processing method, device, and storage medium
WO2024093521A9 (en) Handover exception processing method, device, and storage medium
CN114205336A (en) Cross-device audio playing method, mobile terminal, electronic device and storage medium
CN115278825A (en) WiFi connection method and device
CN114765831A (en) Method and related equipment for pre-applying uplink resources
CN113556805A (en) Method and terminal for reducing power consumption
EP4354917A1 (en) Data processing method and electronic device
CN113810928B (en) Method for adjusting number of data streams, terminal and MIMO system
WO2021190277A1 (en) Uplink data split method and terminal
WO2023039890A1 (en) Video transmission method and electronic device
CN115086924A (en) Bluetooth communication method, system and electronic equipment
CN117560741A (en) Communication method, electronic device, and storage medium
CN115988424A (en) Data transmission method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220603