CN111478914A - Timestamp processing method, device, terminal and storage medium - Google Patents

Info

Publication number
CN111478914A
Authority
CN
China
Prior art keywords
timestamp
frames
video data
display
encoding
Prior art date
Legal status
Granted
Application number
CN202010290959.9A
Other languages
Chinese (zh)
Other versions
CN111478914B (en)
Inventor
葛向东
谢导
Current Assignee
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202010290959.9A
Publication of CN111478914A
Application granted
Publication of CN111478914B
Legal status: Active (granted)

Classifications

    (H: Electricity; H04L: transmission of digital information; H04N: pictorial communication, e.g. television)
    • H04L 65/70: Media network packetisation
    • H04L 65/762: Media network packet handling at the source
    • H04L 65/764: Media network packet handling at the destination
    • H04N 19/577: Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
    • H04N 21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present application provide a timestamp processing method, apparatus, terminal, and storage medium. The method is applied to a stream pushing terminal and includes: acquiring m frames of video data, where the m frames of video data respectively correspond to m display timestamps and m is an integer greater than 1; encoding the m frames of video data to obtain m frames of encoded data respectively corresponding to the m frames of video data, the m frames of encoded data respectively corresponding to m encoding timestamps; adjusting a target display timestamp among the m display timestamps to obtain an adjusted target display timestamp; adjusting a target encoding timestamp among the m encoding timestamps to obtain an adjusted target encoding timestamp; and sending the m frames of encoded data, the adjusted target display timestamp, and the adjusted target encoding timestamp to a server. The technical solution provided by the embodiments of the present application can reduce the probability of audio-video desynchronization.

Description

Timestamp processing method, device, terminal and storage medium
Technical Field
The embodiments of the present application relate to the field of Internet technologies, and in particular, to a timestamp processing method, apparatus, terminal, and storage medium.
Background
B-frame coding is a bidirectionally predictive inter-frame compression technique: when a frame is compressed into a B frame, it is compressed according to the differences between the adjacent previous frame, the current frame, and the next frame; that is, only the differences between the current frame and its previous and next frames are recorded.
In the related art, when multiple frames of video data are compression-encoded with B frames, the display timestamp of a frame of video data may be smaller than the corresponding encoding timestamp. Referring to fig. 1, a stream pushing terminal acquires 5 frames of video data whose display timestamps are 1, 2, 3, 4, and 5, respectively, and encodes the 5 frames of video data in display-timestamp order to obtain 5 frames of encoded data whose encoding timestamps are 1, 3, 2, 5, and 4, respectively.
In the related art, when the stream pulling terminal acquires the encoded data from the server, the fact that a display timestamp of video data is smaller than the corresponding encoding timestamp can cause frames to be decoded and played out of order, so that the audio and the video become unsynchronized.
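To make the mismatch concrete, here is a minimal illustration of the figure-1 numbers (hypothetical code, not part of the patent); pts stands for the display timestamps and dts for the encoding timestamps:

```python
# Related-art example from fig. 1: with b frames, a frame's display
# timestamp (pts) can fall below its encoding timestamp (dts).
pts = [1, 2, 3, 4, 5]  # capture/display order
dts = [1, 3, 2, 5, 4]  # order in which the encoder emits the frames
print([i + 1 for i in range(5) if pts[i] < dts[i]])
# -> [2, 4]: these frames risk out-of-order decoding at the pull side
```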
Disclosure of Invention
The embodiments of the present application provide a timestamp processing method, apparatus, terminal, and storage medium, which can reduce the probability of audio-video desynchronization. The technical solutions are as follows:
in one aspect, an embodiment of the present application provides a timestamp processing method, where the method is applied to a stream pushing terminal, and the method includes:
acquiring m frames of video data, wherein the m frames of video data respectively correspond to m display time stamps, and m is an integer greater than 1;
coding m frames of video data to obtain m frames of coded data corresponding to the m frames of video data respectively, wherein the m frames of coded data correspond to m coding time stamps respectively, and at least one b frame exists in the m frames of coded data;
adjusting a target display timestamp in the m display timestamps to obtain the adjusted target display timestamp;
adjusting a target encoding timestamp in the m encoding timestamps to obtain an adjusted target encoding timestamp, wherein the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp;
and sending the m frames of encoded data, the adjusted target display timestamp and the adjusted target encoding timestamp to a server.
On the other hand, an embodiment of the present application provides a timestamp processing method, where the method is applied to a stream pulling terminal, and the method includes:
receiving m frames of encoded data, an adjusted target display timestamp and an adjusted target encoding timestamp sent by a server, wherein the m frames of encoded data have at least one b frame, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display timestamps, the adjusted target display timestamp is obtained by adjusting a target display timestamp in the m display timestamps, the adjusted target encoding timestamp is obtained by adjusting a target encoding timestamp in the m encoding timestamps, the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp, and m is an integer greater than 1;
decoding the m frames of encoded data to obtain m frames of video data;
determining the target display timestamp according to the adjusted target display timestamp;
determining the target coding time stamp according to the adjusted target coding time stamp;
and displaying a playing picture according to the m frames of video data, the target display time stamp and the target coding time stamp.
In another aspect, an embodiment of the present application provides a timestamp processing apparatus, including:
the data acquisition module is used for acquiring m frames of video data, wherein the m frames of video data respectively correspond to m display time stamps, and m is an integer greater than 1;
the data coding module is used for coding m frames of video data to obtain m frames of coded data corresponding to the m frames of video data respectively, the m frames of coded data correspond to m coding time stamps respectively, and at least one b frame exists in the m frames of coded data;
a first adjusting module, configured to adjust a target display timestamp of the m display timestamps to obtain an adjusted target display timestamp;
a second adjusting module, configured to adjust a target encoding timestamp of the m encoding timestamps to obtain an adjusted target encoding timestamp, where the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp;
and the data sending module is used for sending the m frames of encoded data, the adjusted target display timestamp and the adjusted target encoding timestamp to a server.
In another aspect, an embodiment of the present application provides a timestamp processing apparatus, where the apparatus includes:
a data receiving module, configured to receive m frames of encoded data, an adjusted target display timestamp, and an adjusted target encoding timestamp, where the m frames of encoded data include at least one b frame, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display timestamps, the adjusted target display timestamp is obtained by adjusting a target display timestamp in the m display timestamps, the adjusted target encoding timestamp is obtained by adjusting a target encoding timestamp in the m encoding timestamps, the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp, and m is an integer greater than 1;
the data decoding module is used for decoding the m frames of encoded data to obtain m frames of video data;
a third adjusting module, configured to determine the target display timestamp according to the adjusted target display timestamp;
a fourth adjusting module, configured to determine the target encoding timestamp according to the adjusted target encoding timestamp;
and the playing module is used for displaying a playing picture according to the m frames of video data, the target display time stamp and the target coding time stamp.
In yet another aspect, a terminal is provided that includes a processor and a memory storing a computer program, the computer program being loaded and executed by the processor to implement the timestamp processing method on the stream pushing terminal side or the timestamp processing method on the stream pulling terminal side.
In still another aspect, a computer-readable storage medium is provided, in which a computer program is stored, the computer program being loaded and executed by a processor to implement the timestamp processing method on the stream pushing terminal side or the timestamp processing method on the stream pulling terminal side.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the video data are coded through the stream pushing terminal, if the obtained coded data have b frames, the display time stamp of a certain frame of video data is adjusted, the coding time stamp of the corresponding coded data is adjusted, so that the display time stamp of each frame of video data is always larger than or equal to the coding time stamp of the corresponding coded data, the adjusted display time stamp and the adjusted coding time stamp are pushed to the server together, and because the display time stamp of each frame of video data is always larger than or equal to the coding time stamp of the corresponding coded data, when the coded data are pulled and decoded by the follow-up stream pulling terminal, the disorder condition can not occur, and the occurrence probability of audio and video asynchronization is reduced.
Drawings
FIG. 1 is a diagram of a comparison of display time stamps and encoding time stamps provided by the related art;
FIG. 2 is a schematic illustration of an implementation environment shown in an exemplary embodiment of the present application;
FIG. 3 is a flow diagram illustrating a method of timestamp processing according to an exemplary embodiment of the present application;
FIG. 4 is a flow diagram illustrating a method of timestamp processing according to another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a timestamp process shown in an exemplary embodiment of the present application;
FIG. 6 is a flow diagram illustrating a method of timestamp processing according to another exemplary embodiment of the present application;
FIG. 7 is a schematic illustration of a homogenization process shown in another exemplary embodiment of the present application;
fig. 8 is a block diagram illustrating a structure of a time stamp processing apparatus according to an exemplary embodiment of the present application;
fig. 9 is a block diagram illustrating a structure of a time stamp processing apparatus according to an exemplary embodiment of the present application;
fig. 10 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The following describes terms related to the embodiments of the present application.
Stream pushing: transmitting the content packaged in the acquisition stage to a server.
Stream pushing terminal: the terminal that transmits the content packaged in the acquisition stage to the server. Taking a live-streaming scene as an example, the stream pushing terminal is the terminal used by the anchor user (i.e., the anchor terminal).
Stream pulling: pulling the live content already on the server by using a specified address.
Stream pulling terminal: the terminal that pulls the live content from the server by using a specified address. Taking a live-streaming scene as an example, the stream pulling terminal is a terminal used by an audience user (i.e., an audience terminal).
B frame: a frame compressed according to the differences between the adjacent previous frame, the current frame, and the next frame; only the differences between the current frame and its previous and next frames are recorded. For example, for the 1st, 2nd, and 3rd frames of video data, a b frame records the difference between the 1st and 2nd frames and the difference between the 2nd and 3rd frames.
Refer to fig. 2, which illustrates a schematic diagram of an implementation environment according to an embodiment of the present application. The implementation environment comprises a stream pushing terminal 21, a stream pulling terminal 22, and a server 23.
The stream pushing terminal 21 is configured to transmit the content packaged in the acquisition stage to the server 23, and has functions such as data acquisition, data processing, data encoding, and data encapsulation. In this embodiment of the present application, when the stream pushing terminal 21 performs data encoding and the resulting encoded data contains b frames, the stream pushing terminal 21 further needs to adjust the display timestamp of the video data and the encoding timestamp of the corresponding encoded data, so as to ensure that the display timestamp of the video data is greater than or equal to the encoding timestamp of the corresponding encoded data and thereby ensure that stream pushing succeeds. In some embodiments, the stream pushing terminal 21 includes a preset encoder for implementing the data encoding function and the timestamp processing function; the preset encoder may be an iOS hardware encoder. The stream pushing terminal 21 may be a smartphone, a tablet computer, a personal computer (PC), or the like. In a live-streaming scene, the stream pushing terminal 21 is also the anchor terminal.
The stream pulling terminal 22 is configured to pull live content from the server 23 by using a specified address, and has functions such as decoding, decapsulation, rendering, and display. Decoding is the inverse of encoding; decapsulation is the inverse of encapsulation. In the embodiment of the present application, a live-streaming application is installed in the stream pulling terminal 22 and implements the above functions. The stream pulling terminal 22 may be a smartphone, a tablet computer, a PC, or the like. In a live-streaming scene, the stream pulling terminal 22 is also a viewer terminal.
The server 23 may be a backend server corresponding to the live-streaming application. In the embodiment of the present application, the server 23 is configured to exchange data with a plurality of terminals (e.g., the stream pushing terminal 21 and the stream pulling terminal 22). The server 23 may be one server, a server cluster formed by a plurality of servers, or a cloud computing service center.
The stream pushing terminal 21 and the server 23 establish a communication connection through a wireless or wired network, as do the stream pulling terminal 22 and the server 23.
The network is typically the Internet, but may be any other network, including but not limited to a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wired, or wireless network, a private network, or any combination of virtual private networks. In some embodiments, data exchanged over the network is represented using techniques and/or formats including HyperText Markup Language (HTML), Extensible Markup Language (XML), and the like.
It should be noted that a single terminal may act as either the stream pushing terminal 21 or the stream pulling terminal 22: when a terminal implements the timestamp processing method on the stream pushing terminal 21 side, it is a stream pushing terminal; when it implements the timestamp processing method on the stream pulling terminal 22 side, it is a stream pulling terminal.
Referring to fig. 3, a flowchart of a timestamp processing method according to an embodiment of the present application is shown. The method can be applied to the stream pushing terminal of the embodiment shown in fig. 2. The method comprises the following steps:
step 301, m frames of video data are acquired.
The m frames of video data respectively correspond to m display timestamps, and m is an integer greater than 1. The display timestamp corresponding to the ith frame of video data in the m frames of video data indicates the time at which the ith frame of video data was acquired. The m display timestamps are arranged in chronological order. The value of m may be determined according to the acquisition duration, which is not limited in the embodiment of the present application.
In some embodiments, the stream pushing terminal has a camera assembly through which the m frames of video data are captured; the camera assembly may be built into the stream pushing terminal or be independent of it. In other embodiments, the stream pushing terminal is provided with screen-recording software, and the m frames of video data are obtained by recording the screen of the stream pushing terminal. In still other embodiments, the stream pushing terminal collects data through the camera assembly and the screen-recording software simultaneously, and the data collected by both are fused to obtain the m frames of video data.
Step 302, encoding the m frames of video data to obtain m frames of encoded data corresponding to the m frames of video data, respectively.
The m frames of encoded data respectively correspond to m encoding timestamps, and at least one b frame exists in the m frames of encoded data. The encoding timestamp corresponding to the ith frame of encoded data in the m frames of encoded data indicates the time at which the ith frame of encoded data was obtained, and the m encoding timestamps are arranged in chronological order.
A B frame is obtained by compressing a frame according to the differences between the adjacent previous frame, the current frame, and the next frame; only the differences between the current frame and its previous and next frames are recorded. When b frames exist in the m frames of encoded data, the encoder has not encoded the m frames of video data strictly in display-timestamp order, so the display timestamps of the m frames of video data and the encoding timestamps of the m frames of encoded data no longer correspond one to one.
In some embodiments, step 302 may include the following sub-steps:
step 302a, determining a coding sequence according to the sequence of m display timestamps corresponding to m frames of video data respectively.
The terminal may encode video data with an earlier display timestamp first and video data with a later display timestamp afterwards. In other possible embodiments, the terminal may instead encode video data with a later display timestamp first and video data with an earlier display timestamp afterwards.
And step 302b, coding the m frames of video data by adopting a preset coder according to the coding sequence to obtain m frames of coded data.
In the embodiment of the present application, the preset encoder refers to an iOS hardware encoder. The terminal encodes each frame of video data to obtain the encoded data corresponding to that frame.
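For illustration, the following is a minimal sketch (not the patent's implementation) of steps 302a and 302b, assuming frames are simple (pts, data) records and that encode_frame stands in for the preset encoder (e.g., the iOS hardware encoder), whose real API differs:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    pts: float   # display timestamp of the captured frame
    data: bytes  # raw picture data

def encode_in_display_order(frames: List[Frame],
                            encode_frame: Callable[[Frame], tuple]) -> list:
    # Step 302a: derive the coding order from the m display timestamps.
    ordered = sorted(frames, key=lambda f: f.pts)
    # Step 302b: encode frame by frame; the encoder assigns each output an
    # encoding timestamp, which may be reordered once b frames appear.
    return [encode_frame(f) for f in ordered]
```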
Step 303, adjusting a target display timestamp in the m display timestamps to obtain an adjusted target display timestamp.
The number of target display time stamps may be one or more, the number of target display time stamps being less than or equal to m.
Step 304, adjusting a target encoding timestamp in the m encoding timestamps to obtain the adjusted target encoding timestamp.
The number of target encoding timestamps may be one or more, the number of target encoding timestamps being less than or equal to m. The number of target display timestamps is the same as the number of target encoding timestamps.
The adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp. Because the m frames of encoded data contain b frames, the display timestamps of the m frames of video data and the encoding timestamps of the m frames of encoded data cannot correspond one to one, so a display timestamp of some video data may be smaller than the encoding timestamp of the corresponding encoded data; the stream pulling terminal may then decode out of order, causing phenomena such as audio-video desynchronization when the video data is played.
In the embodiment of the present application, the execution sequence of step 303 and step 304 is not limited, and the terminal may execute step 303 first and then execute step 304, may execute step 304 first and then execute step 303, and may execute step 303 and step 304 simultaneously.
Step 305, sending the m frames of encoded data, the adjusted target display timestamp, and the adjusted target encoding timestamp to the server.
The terminal sends the encoded data, the display timestamp of each frame of video data, and the encoding timestamp of each frame of encoded data to the server together. Where a display timestamp of video data and the encoding timestamp of the corresponding encoded data have been adjusted, the adjusted target display timestamp and the adjusted target encoding timestamp are the ones that are sent.
To sum up, in the technical solution provided by this embodiment of the present application, the stream pushing terminal encodes the video data; if the resulting encoded data contains b frames, it adjusts the display timestamp of the affected frame of video data and the encoding timestamp of the corresponding encoded data so that the display timestamp of each frame of video data is always greater than or equal to the encoding timestamp of its corresponding encoded data, and then pushes the encoded data, the adjusted display timestamp, and the adjusted encoding timestamp to the server together. Because each display timestamp is always greater than or equal to the corresponding encoding timestamp, no out-of-order condition occurs when the stream pulling terminal subsequently pulls and decodes the encoded data, which reduces the probability of audio-video desynchronization.
Referring to fig. 4, a flowchart of a timestamp processing method according to another embodiment of the present application is shown. The method can be applied to the stream pushing terminal of the embodiment shown in fig. 2.
Step 401, m frames of video data are acquired.
The m frames of video data respectively correspond to m display time stamps, and m is an integer greater than 1.
Referring collectively to fig. 5, a schematic diagram of the timestamp processing provided by one embodiment of the present application is shown. The stream pushing terminal acquires 5 frames of video data, and display timestamps corresponding to the 5 frames of video data are respectively 1, 2, 3, 4 and 5.
Step 402, encoding the m frames of video data to obtain m frames of encoded data respectively corresponding to the m frames of video data.
The m frames of encoded data respectively correspond to the m encoding time stamps, and at least one b frame exists in the m frames of encoded data. With reference to fig. 5, the streaming terminal encodes the 5 frames of video data to obtain 5 frames of encoded data, where the encoding timestamps corresponding to the 5 frames of encoded data are 1, 3, 2, 5, and 4, respectively.
Step 403, for the ith frame of video data in the m frames of video data, detecting whether the display timestamp corresponding to the ith frame of video data is smaller than the encoding timestamp corresponding to the ith frame of encoded data.
The ith frame of encoded data is obtained by encoding the ith frame of video data, and i is a positive integer less than or equal to m. If the display timestamp corresponding to the ith frame of video data is smaller than the encoding timestamp corresponding to the ith frame of encoded data, step 404 is executed; if it is greater than or equal to that encoding timestamp, no adjustment is needed.
Step 404, determining the display timestamp corresponding to the ith frame of video data as the target display timestamp, and determining the encoding timestamp corresponding to the ith frame of encoded data as the target encoding timestamp.
And the terminal determines the display time stamp corresponding to the ith frame of video data as a target display time stamp and determines the coding time stamp corresponding to the ith frame of coded data as a target coding time stamp.
Step 405, determining a first adjustment amount according to a time interval between two adjacent frames of video data in at least two frames of video data.
In some embodiments, the terminal determines half of a time interval between two adjacent frames of video data among the at least two frames of video data as the first adjustment amount.
Step 406, determining a second adjustment amount according to a time interval between two adjacent frames of video data in the at least two frames of video data.
The first adjustment amount and the second adjustment amount may be the same or different. When the first adjustment amount is different from the second adjustment amount, the first adjustment amount is usually larger than the second adjustment amount. In some embodiments, the terminal determines half of a time interval between two adjacent frames of video data among the at least two frames of video data as the second adjustment amount.
Step 407, the target display timestamp is increased by a first adjustment amount to obtain an adjusted target display timestamp.
And the terminal increases the target display timestamp by a first adjustment amount to obtain the adjusted target display timestamp.
Step 408, the target encoding timestamp is decreased by a second adjustment amount to obtain an adjusted target encoding timestamp.
And the terminal reduces the target coding time stamp by a second adjustment amount to obtain the adjusted target coding time stamp.
With reference to fig. 5, the display timestamp of the 2nd frame of video data is smaller than the encoding timestamp of the 2nd frame of encoded data, so the terminal adjusts the display timestamp of the 2nd frame of video data from 2 to 2.6 and the encoding timestamp of the 2nd frame of encoded data from 3 to 2.4; likewise, the display timestamp of the 4th frame of video data is smaller than the encoding timestamp of the 4th frame of encoded data, so the terminal adjusts the display timestamp of the 4th frame of video data from 4 to 4.6 and the encoding timestamp of the 4th frame of encoded data from 5 to 4.4.
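A minimal sketch of steps 403 to 408, assuming timestamps are plain numbers and taking half the inter-frame interval as both the first and second adjustment amounts (one option the embodiment describes; the figure-5 example uses 0.6 rather than 0.5). The function and variable names are illustrative, not the patent's code:

```python
def adjust_timestamps(pts_list, dts_list, frame_interval):
    """Return adjusted (display, encoding) timestamp lists with pts >= dts."""
    half = frame_interval / 2  # first and second adjustment amounts
    adj_pts, adj_dts = list(pts_list), list(dts_list)
    for i in range(len(adj_pts)):
        # Step 403: detect a display timestamp below its encoding timestamp.
        if adj_pts[i] < adj_dts[i]:
            adj_pts[i] += half  # step 407: raise the target display timestamp
            adj_dts[i] -= half  # step 408: lower the target encoding timestamp
    return adj_pts, adj_dts

# Figure-5 style input (interval 1): frames 2 and 4 are the targets.
print(adjust_timestamps([1, 2, 3, 4, 5], [1, 3, 2, 5, 4], 1))
# -> ([1, 2.5, 3, 4.5, 5], [1, 2.5, 2, 4.5, 4])
```

With half the interval, the adjusted display and encoding timestamps of a target frame meet exactly, which still satisfies the required "greater than or equal to" relation.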
Step 409, sending m frames of encoded data, the adjusted target display timestamp, and the adjusted target encoding timestamp to the server.
To sum up, the technical scheme that this application embodiment provided, encode video data through pushing stream terminal, if there is b frames in the data after the coding that obtains, then to the display time stamp of certain frame video data, and the coding time stamp of corresponding data after the coding adjusts, in order to ensure that the display time stamp of each frame video data is greater than or equal to the coding time stamp of corresponding data after the coding all the time, later with the data after the coding, the display time stamp after the adjustment, the coding time stamp after the adjustment pushes away the stream to the server together, because the display time stamp of each frame video data is greater than or equal to the coding time stamp of corresponding data after the coding all the time, follow-up pulls stream terminal when pulling the data after the coding and decoding, the out of order condition can not appear, reduce the emergence probability of audio video asynchronization.
Referring to fig. 6, a flowchart of a timestamp processing method provided in an embodiment of the present application is shown, where the method may be applied to a stream pulling terminal in the embodiment shown in fig. 2. The method comprises the following steps:
step 601, receiving m frames of encoded data, an adjusted target display timestamp and an adjusted target encoding timestamp sent by a server.
The method comprises the steps that at least one b frame exists in m frames of encoded data, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display time stamps, the adjusted target display time stamps are obtained by adjusting the target display time stamps in the m display time stamps, the adjusted target encoding time stamps are obtained by adjusting the target encoding time stamps in the m encoding time stamps, the adjusted target display time stamps are larger than or equal to the adjusted target encoding time stamps, and m is an integer larger than 1.
Step 602, decoding the m frames of encoded data to obtain m frames of video data.
Step 603, determining a target display time stamp according to the adjusted target display time stamp.
Step 603 is the inverse of step 303 (specifically, of step 407): the target display timestamp can be recovered by decreasing the adjusted target display timestamp by the first adjustment amount.
And step 604, determining a target encoding time stamp according to the adjusted target encoding time stamp.
Step 604 is the inverse of step 304 (specifically, of step 408): the target encoding timestamp can be recovered by increasing the adjusted target encoding timestamp by the second adjustment amount.
Step 605, displaying the playing picture according to the m frames of video data, the target display timestamp, and the target encoding timestamp.
Optionally, the stream pulling terminal renders the m frames of video data in order of their corresponding display timestamps and displays the rendered pictures.
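As a hedged sketch of steps 603 to 605: the patent does not specify how the pull terminal knows which timestamps were adjusted or by how much, so the snippet below simply assumes that information is available, applies the inverses of steps 407 and 408, and then renders in display-timestamp order (all names are illustrative):

```python
def restore_and_render(decoded, first_adj, second_adj, render_frame):
    # decoded: list of (adj_pts, adj_dts, was_adjusted, picture) tuples,
    # where was_adjusted marks a frame whose timestamps were adjusted
    # (assumed signalled alongside the stream; the patent leaves this open).
    restored = []
    for adj_pts, adj_dts, was_adjusted, picture in decoded:
        pts = adj_pts - first_adj if was_adjusted else adj_pts   # inverse of 407
        dts = adj_dts + second_adj if was_adjusted else adj_dts  # inverse of 408
        restored.append((pts, picture))   # dts governed decode order upstream
    # Step 605: display the playing pictures in ascending display-timestamp order.
    for pts, picture in sorted(restored, key=lambda t: t[0]):
        render_frame(pts, picture)
```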
To sum up, in the technical solution provided by this embodiment of the present application, the stream pulling terminal pulls the encoded data, the adjusted target display timestamp, and the adjusted target encoding timestamp from the server and displays the playing pictures accordingly, which avoids the audio-video desynchronization that would occur if the display and encoding timestamps were left unadjusted when the encoded data contains b frames.
In some embodiments, before step 605, the timestamp processing method further comprises: performing homogenization processing on the m display timestamps respectively corresponding to the m frames of video data to obtain m homogenized display timestamps.
The time interval between two adjacent display timestamps among the m homogenized display timestamps meets a preset condition. In some embodiments, the preset condition is that the time intervals between adjacent display timestamps are all the same. In this way, the rendering intervals are kept consistent, which improves the rendering effect. Referring to fig. 7, which shows a schematic diagram of the homogenization processing provided by one embodiment of the present application, the time interval between two adjacent homogenized display timestamps is 42.
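A minimal sketch of the homogenization step, assuming the preset condition is a constant gap between adjacent display timestamps (42 time units in figure 7): keep the first display timestamp and respace the rest evenly. The input values are hypothetical:

```python
def homogenize(pts_list, interval=42):
    # Keep the first display timestamp; place every later one exactly
    # `interval` after its predecessor, so adjacent gaps are all equal.
    first = pts_list[0]
    return [first + i * interval for i in range(len(pts_list))]

print(homogenize([0, 40, 85, 126]))  # -> [0, 42, 84, 126]
```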
In the following, embodiments of the apparatus of the present application are described, and for portions of the embodiments of the apparatus not described in detail, reference may be made to technical details disclosed in the above-mentioned method embodiments.
Referring to fig. 8, a block diagram of a timestamp processing apparatus according to an exemplary embodiment of the present application is shown. The time stamp processing means may be implemented as all or a part of the terminal by software, hardware or a combination of both. The device includes:
a data obtaining module 801, configured to obtain m frames of video data, where the m frames of video data correspond to m display timestamps, and m is an integer greater than 1.
The data encoding module 802 is configured to encode m frames of video data to obtain m frames of encoded data corresponding to the m frames of video data, where the m frames of encoded data correspond to m encoding timestamps, and there is at least one b frame in the m frames of encoded data.
A first adjusting module 803, configured to adjust a target display timestamp in the m display timestamps, to obtain the adjusted target display timestamp.
A second adjusting module 804, configured to adjust a target encoding timestamp of the m encoding timestamps to obtain an adjusted target encoding timestamp, where the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp.
A data sending module 805, configured to send the m frames of encoded data, the adjusted target display timestamp, and the adjusted target encoding timestamp to a server.
In an alternative embodiment provided based on the embodiment shown in fig. 8, the apparatus further comprises a determination module (not shown in fig. 8), configured to:
for the ith frame of video data in the m frames of video data, detecting whether a display time stamp corresponding to the ith frame of video data is smaller than an encoding time stamp corresponding to encoded data of the ith frame, wherein the encoded data of the ith frame is obtained by encoding the video data of the ith frame, and i is a positive integer smaller than or equal to m;
and if the display time stamp corresponding to the ith frame of video data is smaller than the encoding time stamp corresponding to the ith frame of encoded data, determining the display time stamp corresponding to the ith frame of video data as the target display time stamp, and determining the encoding time stamp corresponding to the ith frame of encoded data as the target encoding time stamp.
Optionally, the first adjusting module 803 is configured to:
determining a first adjustment amount according to a time interval between two adjacent frames of video data in the at least two frames of video data;
and increasing the target display timestamp by the first adjustment amount to obtain the adjusted target display timestamp.
Optionally, the second adjusting module 804 is configured to:
determining a second adjustment amount according to a time interval between two adjacent frames of video data in the at least two frames of video data;
and reducing the target coding timestamp by the second adjustment amount to obtain the adjusted target coding timestamp.
In an alternative embodiment provided based on the embodiment shown in fig. 8, the data encoding module is configured to:
determining a coding sequence according to the sequence of m display timestamps corresponding to the m frames of video data respectively;
and coding the m frames of video data by adopting a preset coder according to the coding sequence to obtain the m frames of coded data.
Referring to fig. 9, a block diagram of a timestamp processing apparatus according to an exemplary embodiment of the present application is shown. The time stamp processing means may be implemented as all or a part of the terminal by software, hardware or a combination of both. The device includes:
a data receiving module 901, configured to receive m frames of encoded data, an adjusted target display timestamp, and an adjusted target encoding timestamp, where the m frames of encoded data include at least one b frame, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display timestamps, the adjusted target display timestamp is obtained by adjusting a target display timestamp in the m display timestamps, the adjusted target encoding timestamp is obtained by adjusting a target encoding timestamp in the m encoding timestamps, the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp, and m is an integer greater than 1.
A data decoding module 902, configured to decode the m frames of encoded data to obtain the m frames of video data.
A third adjusting module 903, configured to determine the target display timestamp according to the adjusted target display timestamp.
A fourth adjusting module 904, configured to determine the target encoding timestamp according to the adjusted target encoding timestamp.
A playing module 905, configured to display a playing picture according to the m frames of video data, the target display timestamp, and the target encoding timestamp.
In an alternative embodiment provided based on the embodiment shown in fig. 9, the apparatus further comprises: a homogenization treatment module (not shown in fig. 9).
The homogenization processing module is used for carrying out homogenization processing on m display timestamps corresponding to the m frames of video data respectively to obtain the m display timestamps after the homogenization processing, and the time interval between two adjacent display timestamps in the m display timestamps after the homogenization processing meets the preset condition.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 10 is a block diagram illustrating a terminal 1000 according to an exemplary embodiment of the present application. The terminal 1000 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1000 may also be referred to as a user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1000 can include: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA). The processor 1001 may also include a main processor and a coprocessor: the main processor, also known as a Central Processing Unit (CPU), processes data in the wake-up state; the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1001 may be integrated with a Graphics Processing Unit (GPU) responsible for rendering and drawing the content to be shown on the display screen.
The memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1002 stores a computer program to be executed by the processor 1001 to implement the timestamp processing method provided by the method embodiments herein.
In some embodiments, the terminal 1000 may further optionally include a peripheral interface 1003 and at least one peripheral. The processor 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or signal lines, and each peripheral may be connected to the peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripherals include at least one of a radio frequency circuit 1004, a touch display screen 1005, a camera assembly 1006, an audio circuit 1007, a positioning component 1008, and a power supply 1009.
Peripheral interface 1003 may be used to connect at least one Input/Output (I/O) related peripheral to processor 1001 and memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used to receive and transmit Radio Frequency (RF) signals, also known as electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or Wireless Fidelity (WiFi) networks. In some embodiments, the rf circuit 1004 may further include Near Field Communication (NFC) related circuits, which are not limited in this application.
The display screen 1005 is used to display a User Interface (UI). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, it also has the ability to capture touch signals on or over its surface; the touch signals may be input to the processor 1001 as control signals for processing.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal and the rear camera on the rear surface. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, Virtual Reality (VR) shooting, or other fusion shooting functions. In some embodiments, the camera assembly 1006 may further include a flash, which may be a monochrome-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For stereo sound collection or noise reduction purposes, multiple microphones can be provided, each at a different location of terminal 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic position of the terminal 1000 to implement navigation or Location Based Services (LBS). The positioning component 1008 may be based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1009 is used to supply power to various components in terminal 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1000 can also include one or more sensors 1010, including but not limited to an acceleration sensor 1011, a gyro sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
Acceleration sensor 1011 can detect acceleration magnitudes on three coordinate axes of a coordinate system established with terminal 1000. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1001 may control the touch display screen 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to acquire a 3D motion of the user on the terminal 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1013 may be disposed on a side frame of terminal 1000 and/or on a lower layer of touch display 1005. When pressure sensor 1013 is disposed on a side frame of terminal 1000, a user's grip signal on terminal 1000 can be detected, and processor 1001 performs left-right hand recognition or shortcut operation according to the grip signal collected by pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the touch display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a user's fingerprint. The processor 1001 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user's identity according to the collected fingerprint. When the identity is recognized as trusted, the processor 1001 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display screen 1005 according to the intensity of the ambient light collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is turned down. In another embodiment, the processor 1001 can also dynamically adjust the shooting parameters of the camera assembly 10010 according to the intensity of the ambient light collected by the optical sensor 1015.
The proximity sensor 1016, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1000 and is used to capture the distance between the user and the front face of the terminal 1000. In one embodiment, when the proximity sensor 1016 detects that this distance is gradually decreasing, the processor 1001 controls the touch display screen 1005 to switch from the screen-on state to the screen-off state; when the proximity sensor 1016 detects that the distance is gradually increasing, the processor 1001 controls the touch display screen 1005 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in FIG. 10 is not intended to be limiting and that terminal 1000 can include more or fewer components than shown, or some components can be combined, or a different arrangement of components can be employed.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein a computer program, which is loaded and executed by a processor of a terminal to implement the time stamp processing method in the above-described method embodiments.
Alternatively, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, which, when executed, is adapted to implement the timestamp processing method provided in the above-described method embodiments.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. As used herein, the terms "first," "second," and the like, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (11)

1. A timestamp processing method, applied to a stream pushing terminal, the method comprising:
acquiring m frames of video data, wherein the m frames of video data respectively correspond to m display timestamps, and m is an integer greater than 1;
encoding the m frames of video data to obtain m frames of encoded data corresponding to the m frames of video data respectively, wherein the m frames of encoded data respectively correspond to m encoding timestamps, and at least one B frame exists in the m frames of encoded data;
adjusting a target display timestamp in the m display timestamps to obtain the adjusted target display timestamp;
adjusting a target encoding timestamp in the m encoding timestamps to obtain an adjusted target encoding timestamp, wherein the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp;
and sending the m frames of encoded data, the adjusted target display timestamp and the adjusted target encoding timestamp to a server.
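Taken together, the steps of claim 1 handle the case, introduced by B frames, where a frame's display timestamp (PTS) falls below its encoding timestamp (DTS). As an illustration only, the push-side flow can be sketched in Python as follows; the Frame record, its field names, the millisecond units and the two adjustment amounts are assumptions made for the sketch, not part of the claim:

    from dataclasses import dataclass

    @dataclass
    class Frame:              # illustrative record; the claim defines no such type
        pts: int              # display timestamp, in milliseconds
        dts: int              # encoding (decoding) timestamp, in milliseconds
        payload: bytes = b""

    def adjust_targets(frames, first_amount, second_amount):
        # For each target frame (PTS below DTS), raise the PTS and lower the
        # DTS so that adjusted PTS >= adjusted DTS holds before the m frames
        # of encoded data are sent to the server. Assumes the two amounts
        # together cover the largest DTS-PTS gap, as with one-frame B-frame
        # reordering and amounts of one frame interval.
        for f in frames:
            if f.pts < f.dts:
                f.pts += first_amount
                f.dts -= second_amount
        return frames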
2. The method according to claim 1, wherein before adjusting the target display timestamp of the m display timestamps to obtain the adjusted target display timestamp, the method further comprises:
for the ith frame of video data in the m frames of video data, detecting whether a display timestamp corresponding to the ith frame of video data is less than an encoding timestamp corresponding to the ith frame of encoded data, wherein the ith frame of encoded data is obtained by encoding the ith frame of video data, and i is a positive integer less than or equal to m;
and if the display timestamp corresponding to the ith frame of video data is less than the encoding timestamp corresponding to the ith frame of encoded data, determining the display timestamp corresponding to the ith frame of video data as the target display timestamp, and determining the encoding timestamp corresponding to the ith frame of encoded data as the target encoding timestamp.
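Claim 2 fixes how the target timestamps are found: a per-frame comparison of the display timestamp against the encoding timestamp. A one-function sketch, reusing the illustrative Frame record from the sketch after claim 1:

    def find_target_indices(frames):
        # Indices i (0-based here; the claim counts from 1) at which the
        # display timestamp is less than the encoding timestamp of the
        # corresponding frame of encoded data.
        return [i for i, f in enumerate(frames) if f.pts < f.dts]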
3. The method according to claim 2, wherein the adjusting a target display timestamp of the m display timestamps to obtain the adjusted target display timestamp comprises:
determining a first adjustment amount according to a time interval between two adjacent frames of video data in the m frames of video data;
and increasing the target display timestamp by the first adjustment amount to obtain the adjusted target display timestamp.
4. The method according to claim 2, wherein the adjusting a target encoding timestamp of the m encoding timestamps to obtain the adjusted target encoding timestamp comprises:
determining a second adjustment amount according to a time interval between two adjacent frames of video data in the m frames of video data;
and reducing the target coding timestamp by the second adjustment amount to obtain the adjusted target coding timestamp.
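Claims 3 and 4 derive both adjustment amounts from the time interval between two adjacent frames. One plausible reading, sketched below with the same illustrative Frame record, takes that interval from the display timestamps of the first two frames; the claims do not fix which adjacent pair is used, so that choice is an assumption:

    def frame_interval(frames):
        # Time interval between two adjacent frames, read from their display
        # timestamps (assumed uniform capture rate).
        return frames[1].pts - frames[0].pts

    def adjust_one_target(frame, interval):
        frame.pts += interval   # claim 3: increase the target display timestamp
        frame.dts -= interval   # claim 4: decrease the target encoding timestamp
        return frame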
5. The method according to any one of claims 1 to 4, wherein the encoding of the m frames of video data to obtain the m frames of encoded data corresponding to the m frames of video data respectively comprises:
determining an encoding order according to the order of the m display timestamps respectively corresponding to the m frames of video data;
and encoding the m frames of video data with a preset encoder according to the encoding order to obtain the m frames of encoded data.
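Claim 5 ties the encoder's input order to the display-timestamp order: the raw frames are fed to the preset encoder sorted by display timestamp, and any reordering that produces B frames and their encoding timestamps happens inside the encoder. A sketch, where encoder is an illustrative callable that maps one raw frame to one frame of encoded data:

    def encode_in_display_order(frames, encoder):
        # Feed the m frames of video data to the preset encoder in ascending
        # display-timestamp order.
        return [encoder(f) for f in sorted(frames, key=lambda f: f.pts)]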
6. A timestamp processing method, applied to a stream pulling terminal, the method comprising:
receiving m frames of encoded data, an adjusted target display timestamp and an adjusted target encoding timestamp sent by a server, wherein at least one B frame exists in the m frames of encoded data, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display timestamps, the m frames of encoded data respectively correspond to m encoding timestamps, the adjusted target display timestamp is obtained by adjusting a target display timestamp in the m display timestamps, the adjusted target encoding timestamp is obtained by adjusting a target encoding timestamp in the m encoding timestamps, the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp, and m is an integer greater than 1;
decoding the m frames of encoded data to obtain the m frames of video data;
determining the target display timestamp according to the adjusted target display timestamp;
determining the target encoding timestamp according to the adjusted target encoding timestamp;
and displaying a playing picture according to the m frames of video data, the target display timestamp and the target encoding timestamp.
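On the pull side the adjustment is undone before playback. Claim 6 does not say how the receiver recognises adjusted frames or learns the adjustment amounts, so the sketch below assumes the amounts are known and treats a frame as adjusted exactly when undoing the amounts would put its display timestamp back below its encoding timestamp, the condition that triggered the adjustment on the push side:

    def restore_targets(frames, first_amount, second_amount):
        # Inverse of the push-side adjustment: recover the original target
        # display and encoding timestamps before the playing picture is shown.
        for f in frames:
            if f.pts - first_amount < f.dts + second_amount:
                f.pts -= first_amount
                f.dts += second_amount
        return frames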
7. The method according to claim 6, wherein before the displaying of the playing picture according to the m frames of video data, the target display timestamp and the target encoding timestamp, the method further comprises:
performing homogenization processing on the m display timestamps respectively corresponding to the m frames of video data to obtain m homogenized display timestamps, wherein a time interval between two adjacent display timestamps in the m homogenized display timestamps meets a preset condition.
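The homogenization of claim 7 reads as resampling the m display timestamps onto a uniform grid so that adjacent intervals meet the preset condition. A sketch that keeps the first and last timestamps and spaces the rest evenly, which is one possible rule; the claim leaves the condition open:

    def homogenize_display_timestamps(frames):
        # Rewrite the m display timestamps so that adjacent intervals are
        # uniform between the first and last values.
        m = len(frames)
        if m < 2:
            return frames
        start, end = frames[0].pts, frames[-1].pts
        step = (end - start) / (m - 1)
        for i, f in enumerate(frames):
            f.pts = round(start + i * step)
        return frames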
8. A timestamp processing apparatus, characterized in that the apparatus comprises:
a data acquisition module, configured to acquire m frames of video data, wherein the m frames of video data respectively correspond to m display timestamps, and m is an integer greater than 1;
a data encoding module, configured to encode the m frames of video data to obtain m frames of encoded data corresponding to the m frames of video data respectively, wherein the m frames of encoded data respectively correspond to m encoding timestamps, and at least one B frame exists in the m frames of encoded data;
a first adjusting module, configured to adjust a target display timestamp of the m display timestamps to obtain an adjusted target display timestamp;
a second adjusting module, configured to adjust a target encoding timestamp of the m encoding timestamps to obtain an adjusted target encoding timestamp, wherein the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp;
and a data sending module, configured to send the m frames of encoded data, the adjusted target display timestamp and the adjusted target encoding timestamp to a server.
9. A timestamp processing apparatus, characterized in that the apparatus comprises:
a data receiving module, configured to receive m frames of encoded data, an adjusted target display timestamp and an adjusted target encoding timestamp, wherein at least one B frame exists in the m frames of encoded data, the m frames of encoded data are obtained by encoding m frames of video data, the m frames of video data respectively correspond to m display timestamps, the m frames of encoded data respectively correspond to m encoding timestamps, the adjusted target display timestamp is obtained by adjusting a target display timestamp in the m display timestamps, the adjusted target encoding timestamp is obtained by adjusting a target encoding timestamp in the m encoding timestamps, the adjusted target display timestamp is greater than or equal to the adjusted target encoding timestamp, and m is an integer greater than 1;
a data decoding module, configured to decode the m frames of encoded data to obtain the m frames of video data;
a third adjusting module, configured to determine the target display timestamp according to the adjusted target display timestamp;
a fourth adjusting module, configured to determine the target encoding timestamp according to the adjusted target encoding timestamp;
and a playing module, configured to display a playing picture according to the m frames of video data, the target display timestamp and the target encoding timestamp.
10. A terminal, characterized in that the terminal comprises a processor and a memory, the memory storing a computer program which is loaded and executed by the processor to implement the timestamp processing method according to any one of claims 1 to 5 or the timestamp processing method according to claim 6 or 7.
11. A computer-readable storage medium, in which a computer program is stored, the computer program being loaded and executed by a processor to implement the timestamp processing method according to any one of claims 1 to 5 or the timestamp processing method according to claim 6 or 7.
CN202010290959.9A 2020-04-14 2020-04-14 Timestamp processing method, device, terminal and storage medium Active CN111478914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010290959.9A CN111478914B (en) 2020-04-14 2020-04-14 Timestamp processing method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111478914A true CN111478914A (en) 2020-07-31
CN111478914B CN111478914B (en) 2022-08-16

Family

ID=71751918

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010290959.9A Active CN111478914B (en) 2020-04-14 2020-04-14 Timestamp processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111478914B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6297852B1 (en) * 1998-12-30 2001-10-02 Ati International Srl Video display method and apparatus with synchronized video playback and weighted frame creation
CN101374231A * 2007-04-30 2009-02-25 Vixs Systems Inc System and method for combining a plurality of video streams
CN103535027A * 2010-12-20 2014-01-22 General Instrument Corp Method of processing a sequence of coded video frames
WO2018120557A1 * 2016-12-26 2018-07-05 Shenzhen ZTE Microelectronics Technology Co., Ltd. Method and device for synchronously processing audio and video, and storage medium
CN109120933A * 2018-10-11 2019-01-01 Guangzhou Kugou Computer Technology Co Ltd Method, apparatus, device and storage medium for dynamically adjusting code rate
CN110572722A * 2019-09-26 2019-12-13 Tencent Technology (Shenzhen) Co Ltd Video clipping method, device, equipment and readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GALEN6: https://blog.csdn.net/zhengbin6072/article/details/78902983, 26 December 2017 *
IOT_SHUN: https://blog.csdn.net/IOT_SHUN/article/details/79738230, 29 March 2018 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113542890A (en) * 2021-08-03 2021-10-22 厦门美图之家科技有限公司 Video editing method and related device

Similar Documents

Publication Publication Date Title
CN108093268B (en) Live broadcast method and device
CN108900859B (en) Live broadcasting method and system
CN111093108B (en) Sound and picture synchronization judgment method and device, terminal and computer readable storage medium
CN109348247B (en) Method and device for determining audio and video playing time stamp and storage medium
CN108966008B (en) Live video playback method and device
CN109874043B (en) Video stream sending method, video stream playing method and video stream playing device
CN108769738B (en) Video processing method, video processing device, computer equipment and storage medium
CN108391127B (en) Video encoding method, device, storage medium and equipment
CN110996117B (en) Video transcoding method and device, electronic equipment and storage medium
CN111586431B (en) Method, device and equipment for live broadcast processing and storage medium
CN108600778B (en) Media stream transmitting method, device, system, server, terminal and storage medium
CN112929654B (en) Method, device and equipment for detecting sound and picture synchronization and storage medium
CN112584049A (en) Remote interaction method and device, electronic equipment and storage medium
CN109168032B (en) Video data processing method, terminal, server and storage medium
CN108965711B (en) Video processing method and device
CN111405312A (en) Live broadcast stream pushing method, device, terminal, server and storage medium
CN111010588B (en) Live broadcast processing method and device, storage medium and equipment
CN111586413B (en) Video adjusting method and device, computer equipment and storage medium
CN110049326B (en) Video coding method and device and storage medium
CN111478915B (en) Live broadcast data stream pushing method and device, terminal and storage medium
CN107888975B (en) Video playing method, device and storage medium
CN111586433B (en) Code rate adjusting method, device, equipment and storage medium
CN111478914B (en) Timestamp processing method, device, terminal and storage medium
CN110177275B (en) Video encoding method and apparatus, and storage medium
CN109714628B (en) Method, device, equipment, storage medium and system for playing audio and video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant