CN109040818B - Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting - Google Patents


Info

Publication number
CN109040818B
CN109040818B CN201710439526.3A
Authority
CN
China
Prior art keywords
video
audio
time
video image
capturing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710439526.3A
Other languages
Chinese (zh)
Other versions
CN109040818A (en)
Inventor
杨亮
张文明
陈少杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heilongjiang Jindi Film and Television Media Co.,Ltd.
Original Assignee
Wuhan Douyu Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Douyu Network Technology Co Ltd filed Critical Wuhan Douyu Network Technology Co Ltd
Priority to CN201710439526.3A priority Critical patent/CN109040818B/en
Publication of CN109040818A publication Critical patent/CN109040818A/en
Application granted granted Critical
Publication of CN109040818B publication Critical patent/CN109040818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440236Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by media transcoding, e.g. video is transformed into a slideshow of still pictures, audio is converted into text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses an audio and video synchronization method, a storage medium, an electronic device and a system for live broadcasting, relating to the field of audio and video synchronization of video streams. The method comprises the following steps: during live broadcasting, a client captures video images and audio; the client records the time Tv at which a video image is captured and the time Ta at which the audio is captured, where Ta > Tv; the client encodes the video images and audio to form a video, and during encoding sets a video image timestamp Tvi and an audio timestamp Tai: Tvi = T - Tv, Tai = T - Ta, where T is a user-defined time earlier than Ta; and the client plays the video images according to the video image timestamp Tvi and plays the audio according to the audio timestamp Tai. The invention keeps audio and video images playing in synchronization throughout the live broadcast, markedly improves live broadcast quality, and is well suited for adoption.

Description

Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting
Technical Field
The invention relates to the field of audio and video synchronization of video streams, in particular to an audio and video synchronization method, a storage medium, electronic equipment and a system during live broadcasting.
Background
With the rapid development of the live broadcast industry, more and more users watch live broadcasts, and audio and video synchronization (i.e., synchronization of sound and video images) is an important measure of live broadcast quality. Video and audio are explained below in turn.
The video comprises a video file and a video stream:
Video file: a file in which the original video images and original sound are encoded according to relevant parameters (video resolution, frame rate, and bit rate; audio sampling rate and channel count) using a video encoder and an audio encoder, with the encoded data then stored in a file format such as mp4 or flv.
Video stream: a video stream is similar to a video file in that the original video images and sound are encoded with the same kinds of encoders; however, instead of storing the encoded data as a file, the stream packs the encoded data according to a network protocol into a transmission format and transmits it over the network. Typical protocols are RTMP (Real Time Messaging Protocol) and RTSP (Real Time Streaming Protocol).
Audio:
Audio is a digital signal recognizable by computers; its most important attributes are the sampling rate, the channel count, and the sample format. The sampling rate is the number of samples per second, typically 44100 Hz or 48000 Hz. The channel count is the number of sample values at each sampling instant: mono (one value per instant), stereo (two values), or multichannel (several values). The sample format is the size of each sample value, typically a two-byte 16-bit integer format or a 4-byte floating-point format. Thus for 44100 Hz, mono, 16-bit audio, one second of audio occupies 44100 × 1 × 2 = 88200 bytes; for 44100 Hz, stereo, 16-bit audio, one second occupies 44100 × 2 × 2 = 176400 bytes.
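The byte-rate arithmetic above can be captured in a small helper (an illustrative sketch; the function name is ours, not from the patent):

```python
def audio_bytes_per_second(sample_rate: int, channels: int, bytes_per_sample: int) -> int:
    """Bytes of raw PCM audio produced in one second:
    sample rate x channel count x bytes per sample."""
    return sample_rate * channels * bytes_per_sample

# 44100 Hz, mono, 16-bit (2 bytes per sample) -> 88200 bytes per second
assert audio_bytes_per_second(44100, 1, 2) == 88200
# 44100 Hz, stereo, 16-bit -> 176400 bytes per second
assert audio_bytes_per_second(44100, 2, 2) == 176400
```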
At present, the common method for audio and video synchronization during live broadcasting is as follows: when the live broadcast starts, video images and audio are captured and then encoded to form a video; during encoding the same timestamp is set for both the audio and the video images (for example, 3 s for each). The video is then decoded by a video decoder and an audio decoder to recover the original video images, audio, and timestamps, and the decoded video images and audio are each played after their timestamp elapses. Playing after the timestamp reserves buffer time for capturing video images and audio, ensuring smooth playback.
However, because the audio sources during live broadcasting are the sound card of the live broadcast device (a computer, mobile terminal, etc.) and the microphone, there is a delay in capturing audio, that is, a gap between the time the audio is generated and the time it is captured. Since the audio is captured with a delay while the video image is captured essentially instantly, the audio and video images are not captured synchronously (the delayed audio lags the video). The video formed from the captured audio and video images is therefore itself out of sync; playing it with identical timestamps inevitably produces audio/video desynchronization (the video image plays first and the audio follows).
Disclosure of Invention
In view of the defects of the prior art, the technical problem solved by the invention is how to keep audio and video synchronized during live broadcasting. The method keeps audio and video images playing in synchronization throughout the live broadcast, markedly improves live broadcast quality, and is well suited for adoption.
In order to achieve the above purpose, the audio and video synchronization method during live broadcasting provided by the invention comprises the following steps:
s1: when the client is in live broadcasting, capturing video images and audio, and turning to S2;
s2: the client records the time Tv for capturing the video image and the time Ta for capturing the audio, wherein Ta is more than Tv, and the operation goes to S3;
s3: the client encodes video images and audio to form video, and sets a video image time stamp Tvi and an audio time stamp Tai in the encoding process: tvi ═ T-Tv, Tai ═ T-Ta; wherein T is the user-defined time, go to S4;
s4: and the client plays the video images according to the video image time stamps Tvi and plays the audio according to the audio time stamps Tai.
Based on the above technical solution, T in S3 is a time before the video images and audio are captured in S1.
The storage medium provided by the invention stores a computer program which, when executed by a processor, implements the above audio and video synchronization method during live broadcasting.
The electronic device provided by the invention comprises a memory and a processor; the memory stores a computer program that runs on the processor, and the processor implements the above audio and video synchronization method during live broadcasting when executing the program.
The audio and video synchronization system for live broadcasting provided by the invention comprises a video capture module, a video capture time monitoring module, a video encoding module and a video playing module arranged on a client;
the video capture module is configured to: capture video images and audio while the client is live broadcasting, and send a video capture time monitoring signal to the video capture time monitoring module;
the video capture time monitoring module is configured to: after receiving the video capture time monitoring signal, record the time Tv at which a video image is captured and the time Ta at which the audio is captured, where Ta > Tv, and send a video encoding signal to the video encoding module;
the video encoding module is configured to: after receiving the video encoding signal, encode the video images and audio to form a video, setting during encoding a video image timestamp Tvi and an audio timestamp Tai: Tvi = T - Tv, Tai = T - Ta, where T is a user-defined time, and send a video playing signal to the video playing module;
the video playing module is configured to: after receiving the video playing signal, play the video images according to the video image timestamp Tvi and play the audio according to the audio timestamp Tai.
Compared with the prior art, the invention has the advantages that:
(1) The invention fully accounts for the audio capture delay when setting the timestamps: without this, when the encoded video is played, the video images and audio are out of sync (the audio lags the video images). Referring to S1 to S4, during playback the video timestamp Tvi is larger than the audio timestamp Tai (Tvi - Tai = (T - Tv) - (T - Ta) = Ta - Tv > 0, since Ta > Tv), and this difference Ta - Tv is exactly equal to the audio capture delay.
Therefore, by using different timestamps the invention plays the audio first and the video images afterwards, offsetting the audio capture delay, so that the audio, which would otherwise lag the video images, is finally played in synchronization with them. The invention thus keeps audio and video images playing in synchronization throughout the live broadcast, markedly improves live broadcast quality, and is well suited for adoption.
(2) The reason for setting both a video image timestamp and an audio timestamp is to reserve buffer time for capturing the video images and audio. On this basis, the timestamp reference T is set to a time before the video images and audio to be live broadcast are captured, so that the video image timestamp and the audio timestamp are longer than the time needed to capture the video images and audio. The invention therefore guarantees that the video buffers smoothly during playback, further improving live broadcast quality.
Drawings
Fig. 1 is a flowchart of an audio and video synchronization method during live broadcasting according to an embodiment of the present invention;
fig. 2 is a connection block diagram of an electronic device in an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Referring to fig. 1, the audio and video synchronization method during live broadcasting in the embodiment of the present invention includes the following steps:
s1: when the client is in live broadcasting, capturing video images and audio required to be live broadcasting, wherein the audio comprises sound data generated by a client sound card and a microphone (such as a microphone), and turning to S2;
s2: the client records the time Tv for capturing the video image and the time Ta for capturing the audio, and because the captured sound has delay, Ta is inevitably greater than Tv, and the process goes to S3;
s3: the client encodes video images and audio to form video, and sets a video image time stamp Tvi and an audio time stamp Tai in the encoding process: tvi ═ T-Tv, Tai ═ T-Ta; where T is the custom time, go to S4.
In S3, Tvi - Tai = Ta - Tv, which in this embodiment is 700 ms; this 700 ms is the audio capture delay, the time the invention needs to cancel, and audio and video synchronization is achieved once it is cancelled.
S4: and the client plays the video images according to the video image time stamps Tvi and plays the audio according to the audio time stamps Tai.
The specific flow of S4 is as follows: the client decodes the video images and the audio in the video respectively, then plays the decoded video images after the video image timestamp Tvi and the decoded audio after the audio timestamp Tai.
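The presentation ordering in S4 can be sketched as follows: each decoded frame is held until its own timestamp elapses, so audio frames (smaller Tai) start before the video images captured alongside them. This is an illustrative sketch under the patent's timestamp scheme, not the patent's actual implementation; the data layout is ours:

```python
def playback_order(frames):
    """Sort decoded frames by their presentation timestamps (step S4).

    frames -- list of (timestamp_ms, kind, payload) tuples, where kind is
    'audio' or 'video'.  Because Tai < Tvi (the audio timestamp is smaller
    by exactly the capture delay Ta - Tv), audio frames are presented
    before the corresponding video frames, cancelling the delay.
    """
    return sorted(frames, key=lambda frame: frame[0])

# Audio captured with a 700 ms delay gets the smaller timestamp,
# so it is scheduled first during playback.
frames = [(3000, "video", b"v0"), (2300, "audio", b"a0")]
assert [kind for _, kind, _ in playback_order(frames)] == ["audio", "video"]
```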
When setting the timestamps, the embodiment of the invention fully accounts for the audio capture delay: without this, when the encoded video is played, the video images and audio are out of sync (the audio lags the video images). Referring to S1 to S4, during playback the video timestamp Tvi is larger than the audio timestamp Tai (Tvi - Tai = (T - Tv) - (T - Ta) = Ta - Tv > 0, since Ta > Tv), and this difference Ta - Tv is exactly equal to the audio capture delay. Therefore, through the different timestamps, the embodiment plays the audio first and the video images afterwards, offsetting the audio capture delay, so that the audio, which would otherwise lag the video images, is finally played in synchronization with them.
Further, the reason for setting both a video image timestamp and an audio timestamp is to reserve buffer time for capturing the video images and audio. The embodiment of the invention sets T to a time before the video images and audio to be live broadcast are captured, so that the video image timestamp and the audio timestamp are longer than the time needed to capture the video images and audio. The invention therefore guarantees that the video buffers smoothly during playback, further improving live broadcast quality.
The embodiment of the invention also provides a storage medium storing a computer program which, when executed by a processor, implements the above audio and video synchronization method during live broadcasting. The storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM (Read-Only Memory), a RAM (Random Access Memory), a magnetic disk, or an optical disk.
Referring to fig. 2, an embodiment of the present invention further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program running on the processor, and the processor executes the computer program to implement the audio and video synchronization method during live broadcasting.
The audio and video synchronization system during live broadcasting in the embodiment of the invention comprises a video capture module, a video capture time monitoring module, a video coding module and a video playing module which are arranged on a client.
The video capture module is configured to: capture video images and audio while the client is live broadcasting, the audio comprising sound data generated by the client sound card and a sound pickup device (such as a microphone), and send a video capture time monitoring signal to the video capture time monitoring module.
The video capture time monitoring module is configured to: after receiving the video capture time monitoring signal, record the time Tv at which a video image is captured and the time Ta at which the audio is captured, where Ta > Tv, and send a video encoding signal to the video encoding module.
The video encoding module is configured to: after receiving the video encoding signal, encode the video images and audio to form a video, setting during encoding a video image timestamp Tvi and an audio timestamp Tai: Tvi = T - Tv, Tai = T - Ta, where T is a user-defined time (in this embodiment, a time before the video capture module captures the video images and audio), and send a video playing signal to the video playing module.
The video playing module is configured to: after receiving the video playing signal, play the video images according to the video image timestamp Tvi and play the audio according to the audio timestamp Tai. Its specific workflow is: decode the video images and audio in the video respectively to obtain the video image timestamp Tvi and the audio timestamp Tai, then play the decoded video images after the video image timestamp Tvi and the decoded audio after the audio timestamp Tai.
It should be noted that the division into functional modules in the system provided by the embodiment of the invention is given only as an example; in practical applications, the above functions may be distributed among different functional modules as needed, that is, the internal structure of the system may be divided into different functional modules to complete all or part of the functions described above.
Further, the present invention is not limited to the above-mentioned embodiments, and it will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements are also considered to be within the scope of the present invention. Those not described in detail in this specification are within the skill of the art.

Claims (10)

1. A method for synchronizing audio and video during live broadcasting is characterized by comprising the following steps:
s1: when the client sends out video images and audio simultaneously in live broadcasting, capturing the video images and the audio, and turning to S2;
s2: the client records the time Tv for capturing the video image and the time Ta for capturing the audio, wherein Ta is more than Tv, and the operation goes to S3;
s3: the client encodes video images and audio to form video, and sets a video image time stamp Tvi and an audio time stamp Tai in the encoding process: tvi ═ T-Tv, Tai ═ T-Ta; wherein T is the user-defined time, go to S4;
s4: and the client plays the video images according to the video image time stamps Tvi and plays the audio according to the audio time stamps Tai.
2. The live audio/video synchronization method as claimed in claim 1, wherein: T in S3 is a time before the video images and audio are captured in S1.
3. The live audio and video synchronization method as claimed in claim 1 or 2, wherein the process of S4 includes: the client decodes the video image and the audio in the video respectively to obtain a video image timestamp Tvi and an audio timestamp Tai; and the client plays the decoded video image after the video image timestamp Tvi and plays the decoded audio after the audio timestamp Tai.
4. A storage medium having a computer program stored thereon, characterized in that: the computer program, when executed by a processor, implements the method of any of claims 1 to 3.
5. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program that runs on the processor, characterized in that: a processor implementing the method of any one of claims 1 to 3 when executing the computer program.
6. An audio and video synchronization system during live broadcasting is characterized in that: the system comprises a video capturing module, a video capturing time monitoring module, a video coding module and a video playing module which are arranged on a client;
the video capture module is configured to: when the client emits video images and audio simultaneously during live broadcasting, capture the video images and the audio, and send a video capture time monitoring signal to the video capture time monitoring module;
the video capture time monitoring module is configured to: after receiving the video capture time monitoring signal, record the time Tv at which a video image is captured and the time Ta at which the audio is captured, where Ta > Tv, and send a video encoding signal to the video encoding module;
the video encoding module is configured to: after receiving the video encoding signal, encode the video images and audio to form a video, setting during encoding a video image timestamp Tvi and an audio timestamp Tai: Tvi = T - Tv, Tai = T - Ta, where T is a user-defined time, and send a video playing signal to the video playing module;
the video playing module is configured to: after receiving the video playing signal, play the video images according to the video image timestamp Tvi and play the audio according to the audio timestamp Tai.
7. The live audio/video synchronization system as claimed in claim 6, wherein: T in the video encoding module is a time before the video capture module captures the video images and audio.
8. The live audio/video synchronization system as claimed in claim 6, wherein the workflow of the video playing module comprises: respectively decoding video images and audio in the video to obtain video image time stamps Tvi and audio time stamps Tai; and playing the decoded video image after the video image timestamp Tvi, and playing the decoded audio after the audio timestamp Tai.
9. The live audio/video synchronization system as claimed in any one of claims 6 to 8, wherein: the audio captured by the video capture module includes sound data generated by the client sound card and a sound pickup device.
10. The live audio/video synchronization system as claimed in claim 9, wherein: the sound pickup device is a microphone.
CN201710439526.3A 2017-06-12 2017-06-12 Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting Active CN109040818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710439526.3A CN109040818B (en) 2017-06-12 2017-06-12 Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting


Publications (2)

Publication Number Publication Date
CN109040818A CN109040818A (en) 2018-12-18
CN109040818B true CN109040818B (en) 2021-04-27

Family

ID=64629329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710439526.3A Active CN109040818B (en) 2017-06-12 2017-06-12 Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting

Country Status (1)

Country Link
CN (1) CN109040818B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111385588A (en) * 2018-12-28 2020-07-07 广州市百果园信息技术有限公司 Method, medium and computer equipment for synchronizing audio and video playing and anchor broadcast sending information
CN111654736B (en) * 2020-06-10 2022-05-31 北京百度网讯科技有限公司 Method and device for determining audio and video synchronization error, electronic equipment and storage medium
CN114945105B (en) * 2022-05-13 2024-02-06 宜百科技(深圳)有限公司 Wireless earphone audio hysteresis cancellation method combined with sound compensation
CN115052200B (en) * 2022-06-06 2023-09-29 中国第一汽车股份有限公司 Novel vehicle-mounted video projection method and system and vehicle

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6269122B1 (en) * 1998-01-02 2001-07-31 Intel Corporation Synchronization of related audio and video streams
CN1802851A (en) * 2003-06-12 2006-07-12 索尼株式会社 Device for recording video data and audio data
CN101271720A (en) * 2008-04-22 2008-09-24 中兴通讯股份有限公司 Synchronization process for mobile phone stream media audio and video
CN103167320A (en) * 2011-12-15 2013-06-19 中国电信股份有限公司 Audio and video synchronization method and audio and video synchronization system and mobile phone live broadcast client-side
CN103338386A (en) * 2013-07-10 2013-10-02 航天恒星科技有限公司 Audio and video synchronization method based on simplified timestamps
CN105245976A (en) * 2015-09-30 2016-01-13 合一网络技术(北京)有限公司 Method and system for synchronously playing audio and video
CN106488288A (en) * 2015-08-27 2017-03-08 宏达国际电子股份有限公司 Virtual reality system and its audio/video synchronization method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8179789B2 (en) * 2005-07-01 2012-05-15 Winnov, Lp System and method for timestamps for media stream data


Also Published As

Publication number Publication date
CN109040818A (en) 2018-12-18

Similar Documents

Publication Publication Date Title
CN109040818B (en) Audio and video synchronization method, storage medium, electronic equipment and system during live broadcasting
EP3095247B1 (en) Robust live operation of dash
KR101296059B1 (en) System and method for storing multi­source multimedia presentations
CN112752115B (en) Live broadcast data transmission method, device, equipment and medium
US11895352B2 (en) System and method for operating a transmission network
JP2003114845A (en) Media conversion method and media conversion device
EP3792731A1 (en) Multimedia information transmission method and apparatus, and terminal
CN108111872B (en) Audio live broadcasting system
CN110784718A (en) Video data encoding method, apparatus, device and storage medium
US20100098161A1 (en) Video encoding apparatus and video encoding method
JP2018117259A (en) One-to-many audio video streaming method by audio video synchronous take in
CN103081488A (en) Signaling video samples for trick mode video representations
CN112087642B (en) Cloud guide playing method, cloud guide server and remote management terminal
CN105611395A (en) MP4 format video online play method and system thereof
CN1534503A (en) Method of realizing real time image sound talks in network game, system and storage medium thereof
CN114363648A (en) Method, equipment and storage medium for audio and video alignment in mixed flow process of live broadcast system
CN112153401B (en) Video processing method, communication device and readable storage medium
WO2004086765A1 (en) Data transmission device
CN109218849B (en) Live data processing method, device, equipment and storage medium
CN109600651B (en) Method and system for synchronizing file type live broadcast interactive data and audio and video data
JP4391412B2 (en) Dynamic multiplexing method of digital stream
JP5854208B2 (en) Video content generation method for multistage high-speed playback
CN108124183B (en) Method for synchronously acquiring video and audio to perform one-to-many video and audio streaming
CN102118633A (en) Method, device and system for playing video files
CN113645485A (en) Method and device for realizing conversion from any streaming media protocol to NDI (network data interface)

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20230823

Address after: Room 1643, 16th Floor, Building 1, No. 365 Lianhuahu Road, Jiangnan New City, Dong'an District, Mudanjiang City, Heilongjiang Province, 157000 (Cluster Registration)

Patentee after: Heilongjiang Jindi Film and Television Media Co.,Ltd.

Address before: 430000 East Lake Development Zone, Wuhan City, Hubei Province, No. 1 Software Park East Road 4.1 Phase B1 Building 11 Building

Patentee before: WUHAN DOUYU NETWORK TECHNOLOGY Co.,Ltd.