CN112929713B - Data synchronization method, device, terminal and storage medium - Google Patents


Info

Publication number
CN112929713B
Authority
CN
China
Prior art keywords
media stream
video
audio
frame
frames
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110168898.3A
Other languages
Chinese (zh)
Other versions
CN112929713A (en)
Inventor
李克
余兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110168898.3A priority Critical patent/CN112929713B/en
Publication of CN112929713A publication Critical patent/CN112929713A/en
Application granted granted Critical
Publication of CN112929713B publication Critical patent/CN112929713B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242Synchronization processes, e.g. processing of PCR [Program Clock References]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a data synchronization method, a data synchronization device, a terminal and a storage medium. The method comprises the following steps: acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized; determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer; synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.

Description

Data synchronization method, device, terminal and storage medium
Technical Field
The present invention relates to the field of multimedia technologies, and in particular, to a data synchronization method, apparatus, terminal, and storage medium.
Background
With the rapid development of terminal technology, a player that supports playing video pictures can be installed on a terminal. In practical applications, the player on the terminal may start playing the video picture slowly. A fast-send server may therefore be deployed to send the media stream used for playing the video picture to the terminal in advance, thereby avoiding the problem of the player on the terminal playing the video picture slowly. However, if the video frames and audio frames in the media stream sent to the terminal by the fast-send server are not synchronized, the player on the terminal still cannot play the video picture smoothly, which affects the user experience.
Disclosure of Invention
In view of this, embodiments of the present invention aim to provide a data synchronization method, apparatus, terminal and storage medium.
The technical scheme of the invention is realized as follows:
the embodiment of the invention provides a data synchronization method, which comprises the following steps:
acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized;
determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp;
storing the preset number of video frames in a first buffer;
synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
In the above aspect, the determining, based on the first timestamp and the second timestamp, a preset number of video frames from the media stream includes:
recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame;
subtracting the recorded second timestamp from the recorded first timestamp to obtain a difference value;
starting from the first video frame, a preset number of video frames equal to the difference value is determined from the media stream.
In the above solution, for each video frame in the first buffer, synchronizing the corresponding video frame with the audio frame in the media stream includes:
determining a video frame with the smallest time stamp from the first buffer;
comparing the first time stamp of the video frame with the minimum time stamp with the second time stamp of the audio frame in the media stream to obtain a comparison result;
and when the comparison result represents that the second time stamp of the audio frame in the media stream is larger than or equal to the first time stamp of the video frame with the smallest time stamp, synchronizing the audio frame with the video frame in the first buffer memory.
In the above solution, after each video frame in the first buffer and the synchronized audio frames are stored in the second buffer, the method further includes:
emptying the first buffer;
re-determining a preset number of video frames from the media stream;
and storing the re-determined preset number of video frames in the first buffer through a pointer-moving operation.
In the above solution, the obtaining the media stream includes:
obtaining the media stream from a fast-send server.
In the above solution, the obtaining the media stream from the fast-send server includes:
sending a media stream acquisition request to the fast-send server; the media stream acquisition request is sent to the fast-send server after the terminal receives a channel switching instruction;
and receiving the media stream sent by the fast-send server in response to the media stream acquisition request.
In the above solution, as applied to the fast-send server, the method further includes:
receiving a media stream acquisition request sent by a terminal;
determining the speed multiple at which the media stream is to be sent according to the media stream acquisition request;
and sending the media stream to the terminal at the determined speed multiple.
An embodiment of the present invention provides a data synchronization device, including:
an acquisition unit configured to acquire a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized;
a first processing unit configured to determine a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer;
a second processing unit configured to synchronize, for each video frame in the first buffer, the corresponding video frame with the audio frame in the media stream, and store the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
In the above solution, the first processing unit is specifically configured to:
recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame;
subtracting the recorded second timestamp from the recorded first timestamp to obtain a difference value;
starting from the first video frame, a preset number of video frames equal to the difference value is determined from the media stream.
In the above solution, the second processing unit is specifically configured to:
determining a video frame with the smallest time stamp from the first buffer;
comparing the first time stamp of the video frame with the minimum time stamp with the second time stamp of the audio frame in the media stream to obtain a comparison result;
and when the comparison result represents that the second time stamp of the audio frame in the media stream is larger than or equal to the first time stamp of the video frame with the smallest time stamp, synchronizing the audio frame with the video frame in the first buffer memory.
In the above solution, the first processing unit is further configured to:
storing each video frame in the first buffer and the synchronized audio frames in the second buffer, and then emptying the first buffer;
re-determining a preset number of video frames from the media stream;
and storing the re-determined preset number of video frames in the first buffer through a pointer-moving operation.
In the above solution, the obtaining unit is specifically configured to:
obtain the media stream from a fast-send server.
In the above solution, the obtaining unit is specifically configured to:
send a media stream acquisition request to the fast-send server; the media stream acquisition request is sent to the fast-send server after the terminal receives a channel switching instruction;
and receive the media stream sent by the fast-send server in response to the media stream acquisition request.
In the above solution, the apparatus further includes:
a transmission unit configured to receive a media stream acquisition request sent by the terminal, determine the speed multiple at which the media stream is to be sent according to the media stream acquisition request, and send the media stream to the terminal at the determined speed multiple.
An embodiment of the present invention provides a terminal, including: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is configured to implement the steps of any one of the methods described above when executing the computer program.
An embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the methods described above.
The data synchronization method, apparatus, terminal and storage medium provided by the embodiments of the present invention acquire a media stream, the media stream including a plurality of video frames and a plurality of audio frames, where the video frames carry a first timestamp, the audio frames carry a second timestamp, and the video frames and audio frames are not synchronized; determine a preset number of video frames from the media stream based on the first timestamp and the second timestamp; store the preset number of video frames in a first buffer; synchronize, for each video frame in the first buffer, the corresponding video frame with the audio frame in the media stream; and store the synchronized video frames and audio frames in a second buffer, the synchronized video frames and audio frames being used to generate video pictures. According to this technical solution, a preset number of video frames are selected from the media stream based on the timestamp difference between the video frames and the audio frames and stored in the first buffer; after the audio frames synchronized with the video frames in the first buffer are determined from the media stream, the video frames and the corresponding audio frames are stored in the second buffer to be decoded into video pictures, so that the user sees a smooth video picture.
Drawings
FIG. 1 is a schematic diagram of an implementation flow of a data synchronization method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a media stream according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an application scenario of a data synchronization method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a process for implementing synchronization of unsynchronized video frames and audio frames in a media stream by a terminal according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of implementing synchronization of unsynchronized video frames and audio frames in a media stream according to an embodiment of the present invention;
FIG. 6 is a second schematic diagram of a process for implementing synchronization of unsynchronized video frames and audio frames in a media stream by a terminal according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a data synchronization device according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a data synchronization system according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a composition structure of a terminal according to an embodiment of the present invention.
Detailed Description
Before the technical scheme of the embodiment of the invention is described in detail, the related technology is described first.
In the related art, in the field of multimedia technology, when media streams are played by playing devices such as boxes, set-top boxes, and televisions, the playback picture sometimes starts very slowly, especially during live channel switching, and a slow start leads to a poor user experience. Therefore, in order to accelerate channel switching, interactive network television (IPTV) systems generally adopt a fast-send scheme: a fast-send server is deployed, and the media stream used for playing the video picture is sent to the terminal in advance by the fast-send server, which avoids a slow start of the player on the terminal and achieves the purpose of fast channel switching. However, if the video frames and audio frames in the media stream sent to the terminal by the fast-send server are not synchronized, the player on the terminal still cannot play the video picture smoothly, which affects the user experience. For example, when the audio-video gap is 5 s or longer, a traditional fast-send scheme can only send the I frame first, while the corresponding audio frame still lags by 5 s; because the player on the terminal decodes, displays, and plays the video picture according to the Program Clock Reference (PCR) value of the code stream, synchronized audio-video output still takes more than 5 s, whereas displaying the I frame immediately without audio-video synchronization makes the lip-sync mismatch obvious.
Based on this, in various embodiments of the invention, a media stream is acquired; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized; determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer; synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
It should be noted that, in the embodiment of the present invention, a preset number of video frames are selected from the media stream based on the timestamp difference between the video frames and the audio frames in the media stream and stored in the first buffer, and after the audio frames synchronized with the video frames in the first buffer are determined from the media stream, the video frames and the corresponding audio frames are stored in the second buffer, so as to obtain the video pictures through the decoding operation.
The invention will be described in further detail with reference to the accompanying drawings and specific examples.
An embodiment of the present invention provides a data synchronization method, and fig. 1 is a schematic diagram of an implementation flow of the data synchronization method according to the embodiment of the present invention; as shown in fig. 1, the method includes:
step 101: acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized;
step 102: determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer;
step 103: synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
Here, in step 101, in actual application, the video frames and the audio frames in the media stream may alternate; for example, the first frame is an audio frame, the second is a video frame, and thereafter audio frames and video frames alternate one by one.
Here, in step 102, in the case where the video frames and the audio frames alternate in the media stream, taking adjacent video and audio frames as an example, if the first timestamp of the video frame differs from the second timestamp of the audio frame, the video frame and the audio frame are not synchronized.
Here, in step 103, in actual application, the second buffer may refer to a buffer in a bottom layer decoder of the terminal, and the first buffer may refer to a buffer in an upper layer application of the terminal. Because the video frames and the audio frames in the second buffer are synchronized, the bottom layer decoder of the terminal can obtain a smooth video picture only by decoding the synchronized video frames and audio frames.
Here, the terminal may be any of various embedded players, including but not limited to IP set-top boxes, IP boxes, smart TVs, and other playing devices.
A detailed description of how the synchronization of unsynchronized video and audio frames in a media stream is achieved is provided below.
In practical applications, in order to avoid the terminal's bottom-layer decoder performing both the decoding operation and the synchronization operation on unsynchronized video frames and audio frames, the two operations can be separated; that is, only unsynchronized video frames are stored in the first buffer, and only synchronized video frames and audio frames are stored in the second buffer, so that the decoder corresponding to the second buffer performs the decoding operation on synchronized video frames and audio frames without performing the synchronization operation.
Based on this, in an embodiment, the determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp includes:
recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame;
subtracting the recorded second timestamp from the recorded first timestamp to obtain a difference value;
starting from the first video frame, a preset number of video frames equal to the difference value is determined from the media stream.
Here, the video frame may refer to an I frame, i.e., a key frame; and may also refer to video frames consisting of I frames, B frames, and P frames.
Here, recording the first timestamp of the first video frame in the media stream may specifically refer to recording the presentation timestamp (PTS, Presentation Time Stamp) of the first video frame, where the PTS characterizes the time at which the video picture corresponding to the video frame is displayed to the user. Recording the second timestamp of the audio frame adjacent to the first video frame in the media stream may specifically refer to recording the PTS of that audio frame, where the PTS characterizes the time at which the audio corresponding to the audio frame is played to the user.
Here, when the first data frame in the media stream is an audio frame and the second data frame is a video frame, a first timestamp of the first video frame in the media stream may be recorded, and a second timestamp of the audio frame preceding the first video frame in the media stream may be recorded; alternatively, when the first data frame in the media stream is a video frame and the second data frame is an audio frame, a first timestamp of the first video frame in the media stream may be recorded, and a second timestamp of the audio frame in the media stream that follows the first video frame may be recorded.
Here, whether a data frame is an audio frame or a video frame may be determined according to the identifier of the data frame in the media stream; for example, if the identifier carried by the data frame is V, the data frame may be determined to be a video frame, and if the identifier carried by the data frame is A, the data frame may be determined to be an audio frame.
For example, FIG. 2 is a schematic diagram of a media stream. As shown in FIG. 2, assume that the recorded first timestamp of the first video frame in the media stream is V(5), representing the video data of the 5th second, and the recorded second timestamp of the audio frame adjacent to the first video frame is A(0), representing the audio data of the 0th second. Subtracting the recorded second timestamp from the recorded first timestamp gives a difference value of 5, so starting from the first video frame corresponding to V(5), 5 video frames are selected from the media stream, namely the video frames corresponding to V(5), V(6), V(7), V(8), and V(9).
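The frame-selection step in this example can be sketched in a few lines of code. This is an illustrative Python sketch, not part of the patent: the `Frame` type, the whole-second timestamps, and the in-memory list standing in for the media stream are all simplifying assumptions.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    kind: str  # 'V' for a video frame, 'A' for an audio frame
    pts: int   # presentation timestamp, in whole seconds for simplicity

def preset_video_frames(stream):
    """Determine the preset number of video frames: record the first
    video frame's PTS and the adjacent audio frame's PTS, take their
    difference, and select that many video frames from the stream."""
    first_v = next(i for i, f in enumerate(stream) if f.kind == 'V')
    # adjacent audio frame: the one just before the first video frame
    # (or just after it, when the stream starts with a video frame)
    adjacent = stream[first_v - 1] if first_v > 0 else stream[first_v + 1]
    diff = stream[first_v].pts - adjacent.pts
    videos = [f for f in stream[first_v:] if f.kind == 'V']
    return videos[:diff]

# The media stream of FIG. 2: A(0) V(5) A(1) V(6) ... A(4) V(9)
stream = [Frame(k, p) for a in range(5) for k, p in (('A', a), ('V', a + 5))]
selected = preset_video_frames(stream)
# selected holds the five video frames V(5)..V(9), matching the PTS gap of 5
```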
In practical applications, after the unsynchronized video frames are stored in the first buffer, each audio frame synchronized with a buffered video frame can, once received, be injected into the second buffer together with that video frame.
Based on this, in an embodiment, for each video frame in the first buffer, the synchronizing the corresponding video frame with the audio frame in the media stream includes:
determining a video frame with the smallest time stamp from the first buffer;
comparing the first time stamp of the video frame with the minimum time stamp with the second time stamp of the audio frame in the media stream to obtain a comparison result;
and when the comparison result represents that the second time stamp of the audio frame in the media stream is larger than or equal to the first time stamp of the video frame with the smallest time stamp, synchronizing the audio frame with the video frame in the first buffer memory.
Here, the first buffer may specifically be a first-in first-out (FIFO, First In First Out) data buffer.
Here, in order to make the display time of a video frame the same as that of its corresponding audio frame, the video frame with the smallest timestamp in the first buffer is taken, and an audio frame whose timestamp is greater than or equal to that of the video frame is selected from the media stream and synchronized with that video frame in the first buffer; the synchronized video frame and audio frame are then injected simultaneously into the second buffer of the terminal's bottom-layer decoder.
For example, assume the media stream is represented by A(0) V(5) A(1) V(6) A(2) V(7) A(3) V(8) … A(5) V(10). The data packets corresponding to the media stream are parsed to find the first timestamp, e.g., the PTS value, of the first video frame (e.g., an I frame), represented by V(5), i.e., the video data of the 5th second. A first buffer, such as a FIFO, is initialized, and the first video frame and subsequent video frames are stored in the FIFO until the second timestamp (PTS value) of the audio data in the media stream is greater than or equal to the first timestamp of the first video frame; the video frames stored in the FIFO are thus represented by V(5), V(6), V(7), V(8), V(9). After an audio frame whose second timestamp is greater than or equal to the first timestamp of the first video frame is received from the media stream, i.e., after A(5) is received, A(5) and V(5) are injected simultaneously into the second buffer of the bottom-layer decoder, and so on for A(6), A(7), A(8), and A(9) selected from the media stream. After every video frame in the FIFO has been injected into the second buffer of the bottom-layer decoder, the FIFO is emptied and its pointer is moved to store new unsynchronized video data.
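The buffering-and-pairing procedure in this example can be sketched as follows. This is a minimal Python sketch under simplifying assumptions (frames as `(kind, pts)` tuples with whole-second PTS values; the real scheme injects the pairs into a bottom-layer decoder buffer rather than a Python list):

```python
from collections import deque

def synchronize(stream):
    """The first buffer (a FIFO) holds unsynchronized video frames; each
    is paired with the first audio frame whose PTS is greater than or
    equal to its own, and the pair goes to the second (decoder) buffer."""
    fifo = deque()   # first buffer: unsynchronized video frames
    decoder = []     # second buffer: synchronized (audio, video) pairs
    for kind, pts in stream:
        if kind == 'V':
            fifo.append(('V', pts))
        elif fifo and pts >= fifo[0][1]:
            # audio has caught up with the oldest buffered video frame:
            # inject the audio/video pair into the decoder buffer together
            decoder.append((('A', pts), fifo.popleft()))
    return decoder

stream = [('A', 0), ('V', 5), ('A', 1), ('V', 6), ('A', 2), ('V', 7),
          ('A', 3), ('V', 8), ('A', 4), ('V', 9), ('A', 5), ('V', 10),
          ('A', 6), ('V', 11)]
pairs = synchronize(stream)
# A(5) is the first audio frame with PTS >= V(5), so it pairs with V(5)
```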
In practical applications, since the length of the first buffer is determined according to the timestamp difference between the unsynchronized video frames and audio frames in the media stream, after one round of first-buffer storage and second-buffer storage is completed, the first buffer can be emptied and the unsynchronized video frames re-determined, so that the first-buffer and second-buffer storage processes are repeated.
Based on this, in an embodiment, after each video frame in the first buffer and the synchronized audio frames are stored in the second buffer, the method further comprises:
emptying the first buffer;
re-determining a preset number of video frames from the media stream;
and storing the re-determined preset number of video frames in the first buffer through a pointer-moving operation.
Here, when a preset number of video frames are re-determined from the media stream, a first video frame is selected from the video frames in the media stream other than those already stored in the second buffer, the first timestamp of that video frame is recorded, and the second timestamp of the audio frame adjacent to that video frame in the media stream is recorded; the recorded second timestamp is subtracted from the recorded first timestamp to obtain a difference value; and starting from that first video frame, a preset number of video frames equal to the difference value is determined from the media stream.
For example, assume the media stream is represented by A(0) V(5) A(1) V(6) A(2) V(7) A(3) V(8) … A(9) V(14), and the video frames stored in the first buffer are represented by V(5), V(6), V(7), V(8), V(9). After each video frame in the first buffer and its corresponding audio frame have been stored in the second buffer, V(10) in the media stream may be taken as the first video frame, and the first timestamp of that video frame and the second timestamp of the adjacent audio frame A(5) are recorded; the recorded second timestamp is subtracted from the recorded first timestamp to obtain a difference value; and starting from V(10), a preset number of video frames equal to the difference value, namely V(10), V(11), V(12), V(13), V(14), is determined from the media stream.
In practical applications, a fast-send server can be deployed, and the media stream used for playing the video picture can be sent to the terminal in advance by the fast-send server, so that the problem of the player on the terminal playing the video picture slowly is avoided.
Based on this, in an embodiment, the acquiring the media stream includes:
the media stream is obtained from a fast-send server.
In an embodiment, the obtaining the media stream from the fast-send server includes:
sending a media stream acquisition request to the fast-send server; the media stream acquisition request is sent to the fast-send server after the terminal receives a channel switching instruction;
and receiving the media stream sent by the fast-send server in response to the media stream acquisition request.
In practical applications, in order to shorten the time needed to synchronize video frames and audio frames, the fast-send server can adjust the fast-send multiple, i.e., the sending speed of the media stream, and send the media stream to the terminal in a shorter time at the adjusted multiple, thereby shortening the time the terminal spends synchronizing the video frames and audio frames.
Based on this, in an embodiment, as applied to the fast-send server, the method further comprises:
receiving a media stream acquisition request sent by a terminal;
determining the speed multiple at which the media stream is to be sent according to the media stream acquisition request;
and sending the media stream to the terminal at the determined speed multiple.
Here, the multiple speed refers to an accelerated rate of sending the media stream. For example, if the timestamp difference between the unsynchronized video frames and audio frames in the media stream is 8 seconds (s) and the determined multiple speed corresponds to a sending time of 3 seconds, the fast-sending server sends those 8 seconds of unsynchronized video frames and audio frames to the terminal within 3 seconds.
Here, in order to further improve fast-sending stability, the fast-send multiple, that is, the multiple speed of the fast-sending server, may be adaptive. In practical applications, the fast-send multiple can be dynamically adjusted according to the code rate, the frame rate and the bandwidth.
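As a sketch of what such an adaptive adjustment might look like, the following Python fragment picks a multiple fast enough to deliver the unsynchronized span in a target time but no faster than the downlink allows. The function, its parameters and the bandwidth cap are assumptions for illustration, not part of the patent.

```python
# Hypothetical sketch: choosing the fast-send multiple adaptively from
# the timestamp gap, the target sending time, the stream code rate and
# the available bandwidth. All names and numbers are assumed.

def fast_send_multiple(gap_s, target_send_s, bitrate_bps, bandwidth_bps):
    wanted = gap_s / target_send_s         # 8 s of data in 3 s -> ~2.67x
    ceiling = bandwidth_bps / bitrate_bps  # fastest rate the link sustains
    return min(wanted, ceiling)

multiple = fast_send_multiple(gap_s=8, target_send_s=3,
                              bitrate_bps=4_000_000, bandwidth_bps=20_000_000)
```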
Fig. 3 is a schematic view of an application scenario of the data synchronization method. As shown in Fig. 3, the devices included in the application scenario are a remote controller, a terminal and a fast-sending server; wherein,
the remote controller is used for sending a channel switching instruction to the terminal when a user presses a channel switching button of the remote controller;
the terminal is used for receiving the channel switching instruction, and, in response to the channel switching instruction, sending a media stream acquisition request to the fast-sending server;
the fast-sending server is used for receiving the media stream acquisition request, determining the multiple speed at which to send the media stream according to the media stream acquisition request, and sending the media stream to the terminal at the determined multiple speed.
It should be noted that, after receiving the channel switching instruction, the terminal may establish a unicast or multicast connection with the fast-sending server and send a media stream acquisition request to the fast-sending server. After receiving the media stream acquisition request, the fast-sending server determines the fast-send multiple, that is, the multiple speed at which the media stream is sent, and sends the media stream to the terminal in the form of data packets at the determined multiple speed. After receiving the media stream, if the video frames and audio frames in the media stream are not synchronized, the terminal performs synchronization processing on them using the steps of the data synchronization method provided by the embodiment of the present invention; once the terminal catches up with the target multicast stream, it stops receiving from the fast-sending server and switches to the current multicast stream for playing, thereby ensuring that the user sees a smooth video picture, that is, the switching process appears smooth and free of stutter.
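The switching flow in this scenario can be summarized as a small state machine; the state and event names below are illustrative assumptions, not terms from the patent.

```python
# Illustrative state flow for the channel-switch scenario of Fig. 3:
# request the fast-sent stream, synchronize, then switch over to the
# multicast stream once the terminal has caught up.

TRANSITIONS = {
    ("IDLE", "channel_switch"): "REQUESTING_FAST_SEND",
    ("REQUESTING_FAST_SEND", "fast_stream_received"): "SYNCHRONIZING",
    ("SYNCHRONIZING", "caught_up_with_multicast"): "PLAYING_MULTICAST",
}

def run(events, state="IDLE"):
    for ev in events:
        # Unknown events leave the state unchanged.
        state = TRANSITIONS.get((state, ev), state)
    return state

final_state = run(["channel_switch", "fast_stream_received",
                   "caught_up_with_multicast"])
```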
In an example, as shown in Fig. 4, a process in which a terminal synchronizes unsynchronized video frames and audio frames in a media stream is described, including:
Step 401: the terminal obtains the media stream from the fast-sending server; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized.
Here, in the channel switching application scenario, the terminal may receive a channel switching instruction sent by the user through the remote controller, send a media stream acquisition request to the fast-sending server in response to the channel switching instruction, and receive the media stream returned by the fast-sending server for the media stream acquisition request.
Here, in the on-demand application scenario, the terminal may receive an on-demand instruction triggered by the user, send a media stream acquisition request to the fast-sending server in response to the on-demand instruction, and receive the media stream returned by the fast-sending server for the media stream acquisition request.
Here, in the online video switching application scenario, the terminal may receive a video switching instruction triggered by the user, send a media stream acquisition request to the fast-sending server in response to the video switching instruction, and receive the media stream returned by the fast-sending server for the media stream acquisition request.
Step 402: recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame; and differencing the recorded first time stamp and the recorded second time stamp to obtain a difference value.
Step 403: starting from the first video frame, determining a preset number of video frames equal to the difference value from the media stream, and storing the preset number of video frames in a first buffer.
Here, Fig. 5 is a schematic diagram of synchronizing unsynchronized video frames and audio frames in a media stream. As shown in Fig. 5, taking the media stream as a Transport Stream (TS) and assuming that the TS stream corresponds to a television program transmitted in real time, the audio and video data packets of the television program are interleaved and packed together, the audio data packets and video data packets being identified by A and V respectively; for example, A(5) represents the audio data of second 5, and V(5) represents the video data of second 5.
Assuming that the second timestamp, such as the PTS value, of the audio data in the TS stream lags behind that of the video data by 5 seconds, the sequence of data frames in the media stream received by the terminal is: A(0) | V(5) | A(1) | V(6) …
Here, according to A(0) and V(5), the terminal determines that the 5 video frames are V(5), V(6), V(7), V(8) and V(9), and stores them in a newly added first buffer such as a FIFO: specifically, V(5) is stored at the V(n) position in the FIFO, V(6) at the V(n+1) position, V(7) at the V(n+2) position, V(8) at the V(n+3) position, and V(9) at the V(n+4) position; meanwhile, the unsynchronized audio data before A(5), namely A(0), A(1), A(2), A(3) and A(4), are discarded.
It should be noted that the length of the FIFO is dynamically adjusted according to the difference between the PTS of the video frames and the PTS of the audio frames in the TS stream; for example, if the two PTS values differ by 6 seconds, the FIFO is a buffer capable of storing 6 seconds of video data.
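A minimal sketch of such a dynamically sized FIFO, assuming a fixed frame rate; the class and its methods are illustrative, not from the patent.

```python
from collections import deque

# Sketch (assumed details): a FIFO whose capacity tracks the current
# PTS gap, holding one second of video frames per second of difference.

class VideoFifo:
    def __init__(self, fps):
        self.fps = fps
        self.buf = deque()
        self.capacity = 0

    def resize_for_gap(self, gap_seconds):
        # A gap of 6 s yields room for 6 s of video frames.
        self.capacity = gap_seconds * self.fps

    def push(self, frame):
        if len(self.buf) < self.capacity:
            self.buf.append(frame)
            return True
        return False  # buffer full; caller must wait or drop

fifo = VideoFifo(fps=25)
fifo.resize_for_gap(6)  # capacity becomes 6 s * 25 fps = 150 frames
```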
Step 404: for each video frame in the first buffer, the terminal synchronizes the corresponding video frame with the audio frame in the media stream, and stores the synchronized video frames and audio frames in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
Here, for the video frame corresponding to V(5) in the FIFO, after the audio frame corresponding to A(5) in the media stream is received, the video data corresponding to V(5) in the FIFO (the data at the V(5) position in the FIFO) and the audio data corresponding to A(5) are injected simultaneously into the second buffer in the terminal's underlying hardware decoder; by analogy, V(6), V(7), V(8) and V(9) are processed in the same manner as V(5), which is not described again here.
Here, after the synchronized video frames and audio frames are stored in the second buffer, the pointer of the FIFO may also be moved to store subsequently received unsynchronized video frames.
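The per-frame synchronization of step 404 can be sketched as follows; frame objects are reduced to bare PTS values and all names are illustrative assumptions.

```python
# Sketch of the per-frame synchronization: the video frame with the
# smallest timestamp in the first buffer waits until an audio frame with
# an equal-or-greater timestamp arrives, then both are injected into the
# decoder's second buffer; older audio frames are dropped.

def synchronize(first_buffer, audio_stream):
    """first_buffer: video PTS values, ascending.
    audio_stream: audio PTS values in arrival order.
    Returns the (video_pts, audio_pts) pairs injected into the
    second buffer of the hardware decoder."""
    second_buffer = []
    pending = list(first_buffer)
    for audio_pts in audio_stream:
        if not pending:
            break
        if audio_pts >= pending[0]:  # the comparison described in step 404
            second_buffer.append((pending.pop(0), audio_pts))
        # audio frames older than every buffered video frame are discarded
    return second_buffer

pairs = synchronize([5, 6, 7, 8, 9], [3, 4, 5, 6, 7, 8, 9])
# pairs → [(5, 5), (6, 6), (7, 7), (8, 8), (9, 9)]
```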
Here, using the first buffer and the second buffer to synchronize the unsynchronized video frames and the audio frames in the media stream has the following advantages:
(1) Video data and audio data in the film source whose timestamps differ by n seconds, such as 5 seconds, are sent to the terminal through the fast-sending server; the terminal stores the unsynchronized video data in a first buffer belonging to the upper-layer application, and after an audio frame synchronized with a video frame in the first buffer is received from the media stream, the two are injected simultaneously into the second buffer of the underlying hardware decoder; the underlying hardware decoder decodes the synchronized video frames and audio frames to obtain the video picture and then displays it, so that the user sees a smooth video picture.
(2) Compared with the approach in the related art of re-encoding the low-rate media stream sent by the Fast Channel Change (FCC) server to reduce the time interval, in the embodiment of the present invention the decoding operation and the synchronization operation are separated: the first buffer in the upper layer of the terminal stores the unsynchronized video frames, and the second buffer in the underlying decoder of the terminal stores the synchronized video frames and audio frames, so that the underlying decoder of the terminal can obtain a smooth video picture directly by performing the decoding operation.
(3) The cost and risk of implementing the modification are small
There is no need to extensively modify the links, algorithms, storage and the like of the fast-sending server, no need to upgrade servers on a large scale, and no need to adapt or modify the terminal player, which reduces the risk brought by upgrading; only the algorithm logic needs to be added on the terminal, which avoids the high cost of development, upgrading and maintenance as well as the impact and uncontrollable risk caused by upgrading the existing network.
(4) The scheme is highly portable
The scheme can be decoupled from the decoder chip, packaged into a module, and conveniently ported to various embedded player platforms.
In an example, as shown in Fig. 6, a process in which a terminal synchronizes unsynchronized video frames and audio frames in a media stream is described, including:
Step 601: the terminal sends a media stream acquisition request to the fast-sending server; the media stream acquisition request is sent to the fast-sending server after the terminal receives a channel switching instruction.
Here, the fast-sending server may be a streaming-media fast-sending server.
Step 602: the fast-sending server receives the media stream acquisition request sent by the terminal, and determines the multiple speed at which to send the media stream according to the media stream acquisition request.
Step 603: the fast-sending server sends the media stream to the terminal at the determined multiple speed.
Step 604: the terminal obtains the media stream from the fast-sending server; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized.
Step 605: the terminal determines a preset number of video frames from the media stream based on the first timestamp and the second timestamp, and stores the preset number of video frames in a first buffer.
Step 606: synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
Here, an experiment was performed on a film source whose audio and video PTS values differ by 8 seconds. With a fast-sending server deployed and a fast-send multiple in use, for example with the fast-sending server sending the 8 seconds of unsynchronized data in 3 seconds, the time from receiving the media stream sent in the form of data packets to filling the unsynchronized video frames of the media stream into the FIFO is less than 3 seconds, and the time interval from starting to inject the synchronized video frames and audio frames into the underlying decoder to the synchronous output of the video picture is 300 ms to 400 ms, so that the time from the terminal receiving the media stream to the terminal starting to output the video picture is much less than 8 seconds.
It should be noted that the time actually taken by the underlying decoder to output synchronized video pictures is less than 400 ms, close to the synchronization time theoretically required for the code stream, achieving the expected effect.
Here, when an unsynchronized film source is played, the fast-send multiple of the fast-sending server may be adjusted. For example, if the fast-sending server can send 8 seconds of unsynchronized data in 3 seconds, the scheme of the data synchronization method of the embodiment of the present invention is about 5 seconds faster than the traditional synchronization scheme.
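The timing comparison above works out as follows; the numbers are taken from the experiment described, with the 300-400 ms decoder interval rounded up to 0.4 s.

```python
# Worked numbers: total start-up time with fast-sending versus simply
# waiting out the 8 s audio/video gap.

gap_s = 8.0            # audio/video PTS difference of the film source
fast_send_s = 3.0      # time for the fast-sending server to deliver it
decoder_start_s = 0.4  # injection-to-first-picture interval of the decoder
total_s = fast_send_s + decoder_start_s  # well under the 8 s gap
saving_s = gap_s - fast_send_s           # roughly 5 s faster than waiting
```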
Here, when the audio-video gap of individual channels is large, this scheme can keep the time interval from the terminal receiving the media stream data to displaying the video picture of the next channel within 1 second, so that the user clearly perceives the switching as very fast, giving a better experience.
Here, the terminal synchronizing the unsynchronized video frames and audio frames in the media stream in this way has the following advantages:
(1) The channel switching speed can be improved
Based on the FCC streaming-media fast-sending server, the switching speed can be improved overall by adjusting the fast-send multiple and modifying the upper-layer synchronization algorithm of the player on the terminal device.
(2) The fast-switching effect is better for code streams with a larger audio-video gap
For the case of a large audio-video gap, the switching time can be shortened to achieve fast switching simply by adjusting the multiple of the fast-sending server, which the traditional scheme cannot achieve.
(3) Compared with the approach in the related art of replacing the audio packets in the channel multicast messages to reduce the time interval between video packets and audio packets, in the embodiment of the present invention the fast-sending server can shorten the synchronization time by adjusting the fast-send multiple, and the terminal can achieve fast switching by executing the synchronization algorithm logic.
According to the technical scheme of the embodiment of the present invention, a preset number of video frames are selected from the media stream based on the timestamps of the video frames and audio frames in the media stream and stored in the first buffer; after an audio frame synchronized with a video frame in the first buffer is determined from the media stream, the video frame and the corresponding audio frame are stored in the second buffer so that a video picture is obtained through decoding processing, and the user sees a smooth video picture.
In order to realize the data synchronization method of the embodiment of the invention, the embodiment of the invention also provides a data synchronization device. FIG. 7 is a schematic diagram of a data synchronization device according to an embodiment of the present invention; as shown in fig. 7, the apparatus includes:
an acquisition unit 71 for acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized;
a first processing unit 72 for determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer;
a second processing unit 73, configured to synchronize, for each video frame in the first buffer, a corresponding video frame with an audio frame in the media stream, and store the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
In one embodiment, the first processing unit 72 is specifically configured to:
recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame;
The recorded first time stamp and the recorded second time stamp are subjected to difference to obtain a difference value;
starting from the first video frame, a preset number of video frames equal to the difference value is determined from the media stream.
In an embodiment, the second processing unit 73 is specifically configured to:
determining a video frame with the smallest time stamp from the first buffer;
comparing the first time stamp of the video frame with the minimum time stamp with the second time stamp of the audio frame in the media stream to obtain a comparison result;
and when the comparison result represents that the second time stamp of the audio frame in the media stream is larger than or equal to the first time stamp of the video frame with the smallest time stamp, synchronizing the audio frame with the video frame in the first buffer memory.
In an embodiment, the first processing unit 72 is further configured to:
storing each video frame and the synchronized audio frame in the first buffer memory in the second buffer memory, and then emptying the first buffer memory;
redefining a preset number of video frames from the media stream;
and storing the redetermined preset number of video frames in the first buffer memory through a pointer moving operation.
In an embodiment, the obtaining unit 71 is specifically configured to:
obtain the media stream from the fast-sending server.
In an embodiment, the obtaining unit 71 is specifically configured to:
send a media stream acquisition request to the fast-sending server; the media stream acquisition request is sent to the fast-sending server after the terminal receives a channel switching instruction;
and receive the media stream sent by the fast-sending server for the media stream acquisition request.
In an embodiment, the device further comprises:
a transmission unit, configured to receive a media stream acquisition request sent by the terminal, determine the multiple speed at which to send the media stream according to the media stream acquisition request, and send the media stream to the terminal at the determined multiple speed.
In practical applications, the acquisition unit 71 may be implemented by a communication interface in the device; the first processing unit 72, the second processing unit 73 may be implemented by a processor in the device; the processor may be a central processing unit (CPU, central Processing Unit), a digital signal processor (DSP, digital Signal Processor), a micro control unit (MCU, microcontroller Unit) or a programmable gate array (FPGA, field-Programmable Gate Array).
It should be noted that: in the apparatus provided in the above embodiment, only the division of each program module is used for illustration when data synchronization is performed, and in practical application, the processing allocation may be performed by different program modules according to needs, that is, the internal structure of the terminal is divided into different program modules, so as to complete all or part of the processing described above. In addition, the apparatus provided in the foregoing embodiments and the data synchronization method embodiment belong to the same concept, and specific implementation processes of the apparatus and the data synchronization method embodiment are detailed in the method embodiment, which is not described herein again.
In order to realize the data synchronization method of the embodiment of the invention, the embodiment of the invention also provides a data synchronization system. FIG. 8 is a schematic diagram of a data synchronization system according to an embodiment of the present invention; as shown in fig. 8, includes:
a fast-sending server 81, configured to receive a media stream acquisition request sent by a terminal; determine the multiple speed at which to send the media stream according to the media stream acquisition request; and send the media stream to the terminal at the determined multiple speed;
A terminal 82 for acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized; determining a preset number of video frames from the media stream based on the first timestamp and the second timestamp; storing the preset number of video frames in a first buffer; synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used to generate video pictures.
It should be noted that the execution processes of the terminal and the fast-sending server are described above and will not be repeated here.
Based on the hardware implementation of the above device, the embodiment of the present invention further provides a terminal, and fig. 9 is a schematic diagram of the hardware composition structure of the terminal according to the embodiment of the present invention, as shown in fig. 9, the terminal 90 includes a memory 93, a processor 92, and a computer program stored in the memory 93 and capable of running on the processor 92; the processor 92, when executing the program, implements the methods provided by one or more of the above-described aspects.
It should be noted that, specific steps implemented when the processor 92 executes the program are described in detail above, and will not be described herein.
It will be appreciated that the terminal 90 further comprises a communication interface 91, said communication interface 91 being adapted for information interaction with other devices; at the same time, the various components in terminal 90 are coupled together by bus system 94. It is to be appreciated that the bus system 94 is configured to enable connected communication between these components. The bus system 94 includes a power bus, a control bus, a status signal bus, and the like in addition to the data bus.
It will be appreciated that the memory 93 in this embodiment may be either volatile memory or nonvolatile memory, and may include both volatile and nonvolatile memory. The nonvolatile memory may be Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Ferroelectric Random Access Memory (FRAM), Flash Memory, magnetic surface memory, an optical disc, or Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk memory or tape memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory described in the embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The method disclosed in the above embodiments of the present invention may be applied to the processor 92 or implemented by the processor 92. The processor 92 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above methods may be completed by integrated logic circuits in hardware in the processor 92 or by instructions in the form of software. The processor 92 may be a general-purpose processor, a DSP, or another programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like. The processor 92 may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present invention may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium; the processor 92 reads the information from the memory and completes the steps of the above methods in combination with its hardware.
The embodiment of the present invention also provides a storage medium, specifically a computer storage medium, and more specifically a computer-readable storage medium, on which computer instructions, that is, a computer program, are stored; when the instructions are executed by a processor, the method provided by one or more of the above technical solutions is implemented.
In the several embodiments provided by the present invention, it should be understood that the disclosed method and intelligent device may be implemented in other manners. The device embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
The units described as separate units may or may not be physically separate, and units displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware associated with program instructions, where the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs steps including the above method embodiments; and the aforementioned storage medium includes: a removable storage device, ROM, RAM, magnetic or optical disk, or other medium capable of storing program code.
Alternatively, the above-described integrated units of the present invention may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a terminal, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, ROM, RAM, magnetic or optical disk, or other medium capable of storing program code.
It should be noted that: "first," "second," etc. are used to distinguish similar objects and not necessarily to describe a particular order or sequence.
In addition, the embodiments of the present invention may be arbitrarily combined without any collision.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention.

Claims (8)

1. A data synchronization method, applied to a terminal, the method comprising:
acquiring a media stream; the media stream includes a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and audio frames are not synchronized; video frames and audio frames in the media stream alternate; the first timestamp characterizes the time of displaying a video picture corresponding to the video frame to a user; the second timestamp characterizes the time of displaying the audio corresponding to the audio frame to the user;
Recording a first timestamp of a first video frame in the media stream and recording a second timestamp of an audio frame in the media stream adjacent to the first video frame;
the recorded first time stamp and the recorded second time stamp are subjected to difference to obtain a difference value;
determining a preset number of video frames equal to the difference value from the media stream from the first video frame;
storing the preset number of video frames in a first buffer; the first cache refers to a cache in an upper application program of the terminal;
synchronizing the corresponding video frame with the audio frame in the media stream for each video frame in the first buffer, and storing the synchronized video frame and audio frame in a second buffer; the synchronized video frames and audio frames are used for generating video pictures; the second buffer refers to a buffer in a bottom layer decoder of the terminal;
the synchronizing, for each video frame in the first buffer, a corresponding video frame with an audio frame in the media stream includes:
determining a video frame with the smallest time stamp from the first buffer;
comparing the first time stamp of the video frame with the minimum time stamp with the second time stamp of the audio frame in the media stream to obtain a comparison result;
And when the comparison result represents that the second time stamp of the audio frame in the media stream is larger than or equal to the first time stamp of the video frame with the smallest time stamp, synchronizing the audio frame with the video frame in the first buffer memory.
2. The method of claim 1, wherein after each video frame in the first buffer and the audio frame synchronized with it are stored in the second buffer, the method further comprises:
emptying the first buffer;
re-determining a preset number of video frames from the media stream;
and storing the re-determined preset number of video frames in the first buffer through a pointer-moving operation.
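Claim 2's refill by pointer movement can be illustrated as below; the class and field names are hypothetical, and the point of the sketch is that refilling advances a read pointer over the stream's video frames rather than copying the stream.

```python
class FirstBuffer:
    """Illustrative application-layer cache refilled by moving a read
    pointer over the stream's video frames instead of copying them."""

    def __init__(self, video_frames, preset):
        self.video_frames = video_frames  # video frames of the media stream
        self.preset = preset              # window size (the preset number)
        self.ptr = 0                      # read pointer into video_frames
        self.window = []
        self.refill()

    def refill(self):
        self.window.clear()  # empty the first buffer
        end = self.ptr + self.preset
        self.window.extend(self.video_frames[self.ptr:end])  # re-determined frames
        self.ptr = min(end, len(self.video_frames))          # pointer-moving operation
```

Each `refill()` call empties the window and slides it forward by up to `preset` frames.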
3. The method of claim 1, wherein the obtaining the media stream comprises:
the media stream is retrieved from the fast-forwarding server.
4. The method according to claim 3, wherein the obtaining the media stream from the fast-forwarding server comprises:
sending a media stream acquisition request to the fast-forwarding server; the media stream acquisition request being sent to the fast-forwarding server after the terminal receives a channel switching instruction;
and receiving the media stream sent by the fast-forwarding server in response to the media stream acquisition request.
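The request/response exchange of claim 4 might look like the following; `FakeServerLink`, the request fields, and the dictionary-based transport are placeholders invented for illustration, not anything defined by the patent.

```python
class FakeServerLink:
    """Stand-in for the terminal's connection to the fast-forwarding server."""

    def __init__(self, streams_by_channel):
        self.streams_by_channel = streams_by_channel
        self.last_request = None

    def send(self, request):
        self.last_request = request

    def receive(self):
        # The server answers the most recent acquisition request.
        return self.streams_by_channel[self.last_request["channel"]]

def on_channel_switch(link, channel_id):
    """Channel-switching instruction -> media stream acquisition request."""
    link.send({"type": "acquire_media_stream", "channel": channel_id})
    return link.receive()  # media stream sent for this request
```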
5. The method of claim 4, as applied to the fast-forwarding server, the method further comprising:
receiving a media stream acquisition request sent by a terminal;
determining, according to the media stream acquisition request, a speed multiplier at which to send the media stream;
and sending the media stream to the terminal at the determined speed multiplier.
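Claim 5 leaves the speed-multiplier policy open. One plausible policy, assuming the acquisition request advertises the client's available bandwidth and the stream's nominal bitrate (both fields invented for illustration), is:

```python
def speed_multiplier(request, max_speed=4.0):
    """Choose the multiple of real time at which to burst the stream.

    Assumes the acquisition request carries the client's available
    bandwidth and the stream's nominal bitrate; clamps the burst so it
    never exceeds max_speed and never drops below real time.
    """
    headroom = request["bandwidth_kbps"] / request["bitrate_kbps"]
    return max(1.0, min(headroom, max_speed))

def paced_send(frames, speed, frame_interval_s=0.04):
    """Yield frames with inter-frame gaps shortened by the multiplier."""
    gap = frame_interval_s / speed
    for frame in frames:
        yield frame, gap  # a real server would sleep(gap) between sends
```

Bursting faster than real time after a channel switch lets the terminal fill its buffers quickly and start playback sooner.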
6. A data synchronization device, comprising:
an acquisition unit configured to acquire a media stream; the media stream comprises a plurality of video frames and a plurality of audio frames; the video frames carry a first timestamp and the audio frames carry a second timestamp; the video frames and the audio frames are not synchronized; the video frames and the audio frames in the media stream alternate; the first timestamp indicates the time at which a video picture corresponding to the video frame is displayed to a user; the second timestamp indicates the time at which audio corresponding to the audio frame is played to the user;
a first processing unit configured to record a first timestamp of a first video frame in the media stream, and record a second timestamp of an audio frame in the media stream adjacent to the first video frame; calculate a difference value between the recorded first timestamp and the recorded second timestamp; determine, from the media stream starting from the first video frame, a preset number of video frames, the preset number being equal to the difference value; and store the preset number of video frames in a first buffer; the first buffer being a buffer in an upper-layer application of the terminal;
a second processing unit configured to synchronize, for each video frame in the first buffer, the corresponding video frame with an audio frame in the media stream, and store the synchronized video frames and audio frames in a second buffer; the synchronized video frames and audio frames being used for generating a video picture; the second buffer being a buffer in an underlying decoder of the terminal;
wherein the second processing unit is specifically configured to determine, from the first buffer, a video frame with the smallest timestamp; compare the first timestamp of the video frame with the smallest timestamp with the second timestamp of an audio frame in the media stream to obtain a comparison result; and when the comparison result indicates that the second timestamp of the audio frame in the media stream is greater than or equal to the first timestamp of the video frame with the smallest timestamp, synchronize the audio frame with the video frame in the first buffer.
7. A terminal, comprising: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is configured to perform the steps of the method of any one of claims 1 to 5 when running the computer program.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
CN202110168898.3A 2021-02-07 2021-02-07 Data synchronization method, device, terminal and storage medium Active CN112929713B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110168898.3A CN112929713B (en) 2021-02-07 2021-02-07 Data synchronization method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110168898.3A CN112929713B (en) 2021-02-07 2021-02-07 Data synchronization method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN112929713A CN112929713A (en) 2021-06-08
CN112929713B true CN112929713B (en) 2024-04-02

Family

ID=76171099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110168898.3A Active CN112929713B (en) 2021-02-07 2021-02-07 Data synchronization method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112929713B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113382210B (en) * 2021-08-12 2021-11-16 深圳市有为信息技术发展有限公司 Processing method of multi-channel monitoring video data, streaming media server and electronic equipment
CN114302169B (en) * 2021-12-24 2023-03-07 威创集团股份有限公司 Picture synchronous recording method, device, system and computer storage medium
CN115589450B (en) * 2022-09-01 2024-04-05 荣耀终端有限公司 Video recording method and device

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815634A (en) * 1994-09-30 1998-09-29 Cirrus Logic, Inc. Stream synchronization method and apparatus for MPEG playback system
CN101394469A (en) * 2008-10-29 2009-03-25 北京创毅视讯科技有限公司 Audio and video synchronization method, device and a digital television chip
CN101466044A (en) * 2007-12-19 2009-06-24 康佳集团股份有限公司 Method and system for synchronously playing stream medium audio and video
CN101996662A (en) * 2010-10-22 2011-03-30 深圳市万兴软件有限公司 Method and device for connecting and outputting video files
CN103414957A (en) * 2013-07-30 2013-11-27 广东工业大学 Method and device for synchronization of audio data and video data
CN103648011A (en) * 2013-11-29 2014-03-19 乐视致新电子科技(天津)有限公司 Audio and video synchronization device and method based on HLS protocol
CN106791271A (en) * 2016-12-02 2017-05-31 福建星网智慧科技股份有限公司 A kind of audio and video synchronization method
CN108055566A (en) * 2017-12-26 2018-05-18 郑州云海信息技术有限公司 Method, apparatus, equipment and the computer readable storage medium of audio-visual synchronization
CN108449617A (en) * 2018-02-11 2018-08-24 浙江大华技术股份有限公司 A kind of method and device of control audio-visual synchronization
CN110035311A (en) * 2019-04-04 2019-07-19 网宿科技股份有限公司 A kind of methods, devices and systems that message flow and audio/video flow is played simultaneously
CN110519635A (en) * 2019-08-07 2019-11-29 河北远东通信***工程有限公司 A kind of audio-video frequency media stream interflow method and system of wireless clustered system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9571901B2 (en) * 2007-12-28 2017-02-14 Intel Corporation Synchronizing audio and video frames

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815634A (en) * 1994-09-30 1998-09-29 Cirrus Logic, Inc. Stream synchronization method and apparatus for MPEG playback system
CN101466044A (en) * 2007-12-19 2009-06-24 康佳集团股份有限公司 Method and system for synchronously playing stream medium audio and video
CN101394469A (en) * 2008-10-29 2009-03-25 北京创毅视讯科技有限公司 Audio and video synchronization method, device and a digital television chip
CN101996662A (en) * 2010-10-22 2011-03-30 深圳市万兴软件有限公司 Method and device for connecting and outputting video files
CN103414957A (en) * 2013-07-30 2013-11-27 广东工业大学 Method and device for synchronization of audio data and video data
CN103648011A (en) * 2013-11-29 2014-03-19 乐视致新电子科技(天津)有限公司 Audio and video synchronization device and method based on HLS protocol
CN106791271A (en) * 2016-12-02 2017-05-31 福建星网智慧科技股份有限公司 A kind of audio and video synchronization method
CN108055566A (en) * 2017-12-26 2018-05-18 郑州云海信息技术有限公司 Method, apparatus, equipment and the computer readable storage medium of audio-visual synchronization
CN108449617A (en) * 2018-02-11 2018-08-24 浙江大华技术股份有限公司 A kind of method and device of control audio-visual synchronization
WO2019153960A1 (en) * 2018-02-11 2019-08-15 Zhejiang Dahua Technology Co., Ltd. Systems and methods for synchronizing audio and video
CN110035311A (en) * 2019-04-04 2019-07-19 网宿科技股份有限公司 A kind of methods, devices and systems that message flow and audio/video flow is played simultaneously
CN110519635A (en) * 2019-08-07 2019-11-29 河北远东通信***工程有限公司 A kind of audio-video frequency media stream interflow method and system of wireless clustered system

Also Published As

Publication number Publication date
CN112929713A (en) 2021-06-08

Similar Documents

Publication Publication Date Title
CN112929713B (en) Data synchronization method, device, terminal and storage medium
US7613381B2 (en) Video data processing method and video data processing apparatus
US8521009B2 (en) Systems and methods to modify playout or playback
JP2021083116A (en) Playback method and playback device
US8111971B2 (en) Systems and methods of reducing media stream delay through independent decoder clocks
CN102215429B (en) Recording method for mobile TV
WO2018183095A1 (en) Low-latency mobile device audiovisual streaming
KR102469142B1 (en) Dynamic playback of transition frames while transitioning between media stream playbacks
CN111601136B (en) Video data processing method and device, computer equipment and storage medium
CN113225598A (en) Method, device and equipment for synchronizing audio and video of mobile terminal and storage medium
EP1978521B9 (en) System for random access to content
KR102640151B1 (en) Method for fast channel change and corresponding device
EP2545708B1 (en) Method and system for inhibiting audio-video synchronization delay
JPH11275519A (en) Data recording method and data recorder
US8750389B2 (en) Video data decoder and method for decoding video data
CN114079813A (en) Picture synchronization method, coding method, video playing device and video coding device
CN114339353B (en) Audio/video synchronization method and device, electronic equipment and computer readable storage medium
US7263275B2 (en) System and method of manipulating a system time clock in an audio/video decoding system
CN103581730A (en) Method for achieving synchronization of audio and video on digital set top box
JPH11275524A (en) Data recording method, data reproduction method, data recorder and data reproduction device
CN103139641A (en) Method and device for achieving audio/video seamless switching in real-time digital television time shifting playing
JP2013532432A (en) Receiver capable of channel change with a single decoder and method at the receiver
CN102413335A (en) Manual adjustment device and method for program audio and video synchronization
WO2021111988A1 (en) Video playback device, video playback system, and video playback method
CN115914708A (en) Media audio and video synchronization method and system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant