CN113923197A - Method and system for synchronously playing audio/video and courseware - Google Patents
- Publication number
- CN113923197A (application CN202111157212.7A)
- Authority
- CN
- China
- Prior art keywords
- video
- audio
- courseware
- rtmp
- timestamp
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Security & Cryptography (AREA)
- Multimedia (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The invention discloses a method and a system for synchronously playing audio/video and courseware. The method comprises the following steps: S1, the anchor terminal pushes audio/video signals to an audio/video processing service module of the server; S2, the audio/video processing service module converts the stream data into an RTMP stream and outputs it to the video live broadcast service module of the server; S3, the video live broadcast service module receives the RTMP stream, transcodes the audio/video stream data into TS files based on the HLS protocol, names the TS files with timestamps, and publishes them on a CDN; S4, the HLS player throws the playing timestamp externally as an event; S5, when the anchor operates the courseware, the anchor terminal collects the operation instructions and sends them to an operation instruction queue on the server; S6, on receiving each instruction, the operation instruction queue writes the current timestamp into the instruction and stores the stamped instruction in the queue; and S7, after receiving a timestamp notification thrown by the HLS player, the courseware player of the audience terminal replays the operations on the courseware in instruction order.
Description
Technical Field
The invention belongs to the technical field of computer multimedia, and particularly relates to a method and a system for synchronously playing audio and video and courseware.
Background
The HLS (HTTP Live Streaming) protocol is a streaming media playing protocol built on top of HTTP. Because the mainstream mobile phone operating systems (Android and iOS) both support HTTP, HLS can be adapted to mobile terminal devices of many models.
In a live teaching scenario, synchronous playing of the media stream and the online courseware is the most basic requirement. For example: if the teacher turns a page of the online courseware at 10:10:30 while speaking a sentence, then at the moment a student hears that sentence, the courseware the student is viewing should also turn the page automatically. However, HLS is only an audio/video playing protocol and cannot by itself keep courseware operations synchronized with the audio/video.
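For context, an HLS stream consists of an m3u8 playlist that lists TS media segments. The scheme below names each TS segment after its wall-clock timestamp; the sketch here builds such a playlist to illustrate the idea. The segment names, durations, and helper function are illustrative assumptions, not taken from the patent.

```python
# Sketch of a minimal HLS playlist (.m3u8) whose TS segments are named by
# Unix timestamps, as the scheme described below assumes. All concrete
# values here are illustrative.

def build_playlist(segment_timestamps, duration=1.0):
    """Render an m3u8 playlist for fixed-duration TS segments named by timestamp."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{int(duration + 0.5)}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for ts in segment_timestamps:
        lines.append(f"#EXTINF:{duration:.1f},")
        lines.append(f"{ts}.ts")  # TS file named by its start timestamp
    return "\n".join(lines)

playlist = build_playlist([1632900000, 1632900001, 1632900002])
print(playlist)
```

A player that loads this playlist can recover the absolute start time of each segment directly from the file name, which is what makes the courseware synchronization below possible.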
Disclosure of Invention
The invention aims to provide a method and a system for synchronously playing audio/video and courseware, which keep the courseware and the audio/video synchronized to within one second.
In order to solve the technical problems, the invention adopts the following technical scheme:
the embodiment of the invention provides a method for synchronously playing audio and video and courseware on one hand, which is applied to a system for synchronously playing audio and video and courseware, comprising a main broadcasting terminal, a server and audience terminals, and comprises the following steps:
s1, the anchor terminal pushes the audio and video signals collected from the camera and the microphone to the audio and video processing service of the server through the WebRTC protocol;
s2, the audio and video processing service converts the received streaming data of the WebRTC protocol into RTMP streaming and outputs the RTMP streaming to the video live broadcast service of the server;
S3, the video live broadcast service receives the RTMP stream, transcodes the audio/video stream data into TS files based on the HLS protocol, reads the timestamp from the SEI information of the RTMP stream, names the TS files with the timestamps, and publishes them on the CDN;
S4, the HLS player of the audience terminal plays the video according to the TS file list in the m3u8 file and throws the playing timestamp externally as an event;
s5, when the anchor operates courseware, the courseware operating terminal of the anchor terminal collects the operating instruction and sends the operating instruction to the operating instruction queue of the server;
S6, on receiving an instruction, the operation instruction queue writes the current timestamp into the instruction and stores the stamped instruction in the queue;
and S7, after receiving a timestamp notification thrown by the HLS player, the courseware player of the audience terminal acquires the operation instructions stamped before that timestamp from the operation instruction queue and replays them on the courseware in instruction order.
Preferably, S2 specifically includes:
s201, an audio and video processing service receives audio and video streams pushed by an anchor terminal;
s202, transcoding the audio and video stream into one GoP per second through ffmpeg;
S203, for each GoP, writing the current timestamp into the SEI;
and S204, outputting the transcoded GoP to a video live broadcast service through an RTMP protocol.
Preferably, S4 specifically includes:
s401, loading a TS file from the CDN;
s402, analyzing the time stamp in the TS file name, and taking the time stamp as a reference;
s403, accumulating the time stamp for corresponding time every time when playing a period of time;
s404, the accumulated time stamp is thrown out through an event.
In another aspect, an embodiment of the present invention provides a system for synchronously playing audio/video and courseware, comprising an anchor terminal, a server and an audience terminal, wherein the anchor terminal further comprises an audio/video acquisition module and a courseware operating terminal, the server further comprises an audio/video processing service, a video live broadcast service and an operation instruction queue, and the audience terminal further comprises an HLS player and a courseware player,
the audio and video acquisition module is used for pushing audio and video signals acquired from the camera and the microphone to an audio and video processing service through a WebRTC protocol;
the audio and video processing service is used for converting the received streaming data of the WebRTC protocol into an RTMP stream and outputting the RTMP stream to the video live broadcast service;
the video live broadcast service is used for receiving RTMP (Real-Time Messaging Protocol) streams, transcoding the audio/video stream data into TS (transport stream) files based on the HLS (HTTP Live Streaming) protocol, reading timestamps from the SEI (Supplemental Enhancement Information) of the RTMP streams, naming the TS files with the timestamps and publishing the TS files on a Content Delivery Network (CDN);
the HLS player is used for playing the video according to the TS file list in the m3u8 file and throwing the playing timestamp externally as an event;
the courseware operation end is used for collecting operation instructions when the anchor operates courseware and sending the operation instructions to the operation instruction queue;
the operation instruction queue is used for writing the current timestamp into each instruction on receipt and storing the stamped instruction in the queue;
and the courseware player is used for acquiring, after receiving a timestamp notification thrown by the HLS player, the operation instructions stamped before that timestamp from the operation instruction queue and replaying them on the courseware in instruction order.
Preferably, the process by which the audio/video processing service converts the received WebRTC stream data into an RTMP stream and outputs it to the video live broadcast service specifically includes:
the audio and video processing service receives audio and video streams pushed by the anchor terminal;
the audio and video stream is transcoded into one GoP per second through ffmpeg;
for each GoP, writing the current timestamp into the SEI;
and outputting the transcoded GoP to a video live broadcast service through an RTMP protocol.
Preferably, the process by which the HLS player plays the video according to the TS file list in the m3u8 file and throws the playing timestamp externally as an event specifically includes:
loading a TS file from the CDN;
analyzing the time stamp in the TS file name, and taking the time stamp as a reference;
accumulating the time stamps for corresponding time every time a period of playing is carried out;
and throwing the accumulated time stamp through the event.
The invention has the following beneficial effects:
(1) In the method, the audience terminal acquires the HLS TS files through a CDN; the cost mainly generated in this process is CDN traffic, which offers the best cost-performance ratio.
(2) In terms of terminal compatibility, HLS is based on the HTTP protocol, which is supported on all kinds of terminal devices such as PCs and mobile phones.
Drawings
Fig. 1 is a schematic diagram of a method for synchronously playing audio and video and courseware according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flow chart of steps of a method for synchronously playing audio and video and courseware according to an embodiment of the present invention is shown,
s1, the audio/video acquisition module 101 of the anchor terminal 10 pushes the audio/video signals acquired from the camera and the microphone to the audio/video processing service module 201 of the server 20 through the WebRTC protocol. WebRTC, an abbreviation of Web Real-Time Communication, is an API that supports Web browsers for Real-Time voice conversations or video conversations.
S2, the audio/video processing service module 201 converts the received streaming data of the WebRTC Protocol into a Real Time Messaging Protocol (RTMP) stream, and outputs the stream to the live video service module 202 of the server 20;
S3, the live video service module 202 receives the RTMP stream, transcodes the audio/video stream data into TS files based on the HLS protocol, reads the timestamp from the SEI information of the RTMP stream, names each TS file with its timestamp, and publishes the TS files on a Content Delivery Network (CDN);
s4, the HLS player 301 of the audience terminal 30 plays the video according to the TS file list in the m3u8 file, and throws the playing time stamp to the outside in an event mode;
S5, when the anchor operates the courseware, the courseware operating terminal 102 of the anchor terminal 10 collects the operation instruction and sends it to the operation instruction queue module 303 of the server 20;
S6, after receiving the instruction, the operation instruction queue module 303 writes the current timestamp into the instruction and stores the stamped instruction in the instruction queue;
and S7, after receiving a timestamp notification thrown by the HLS player, the courseware player of the audience terminal acquires the operation instructions stamped before that timestamp from the operation instruction queue and replays them on the courseware in instruction order.
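Steps S5 to S7 can be sketched as a server-side queue that stamps each instruction on arrival, plus a viewer-side player that, on every timestamp event, replays any not-yet-applied instructions stamped at or before the current play time. The class and method names, and the "next_page" instruction, are illustrative assumptions; the patent does not specify an API.

```python
import time

class InstructionQueue:
    """Server side (S6): stamp each courseware instruction on arrival and
    let a viewer fetch all instructions stamped at or before a play time."""

    def __init__(self):
        self._items = []  # (timestamp, instruction), kept in arrival order

    def push(self, instruction, now=None):
        ts = time.time() if now is None else now
        self._items.append((ts, instruction))
        return ts

    def before(self, play_ts):
        """Instructions stamped <= the player's current timestamp, in issue order."""
        return [instr for ts, instr in self._items if ts <= play_ts]

class CoursewarePlayer:
    """Viewer side (S7): on each timestamp event from the HLS player, replay
    any not-yet-applied instructions in sequence."""

    def __init__(self, queue):
        self.queue = queue
        self.applied = 0          # how many instructions were already replayed
        self.state = {"page": 1}

    def on_timestamp(self, play_ts):
        due = self.queue.before(play_ts)
        for instr in due[self.applied:]:  # only instructions not yet replayed
            self._apply(instr)
        self.applied = len(due)

    def _apply(self, instr):
        if instr == "next_page":
            self.state["page"] += 1

q = InstructionQueue()
q.push("next_page", now=100.0)   # anchor turns a page at t=100s
q.push("next_page", now=105.0)   # and again at t=105s
viewer = CoursewarePlayer(q)
viewer.on_timestamp(101.0)       # only the first instruction is due
viewer.on_timestamp(106.0)       # now the second one is replayed too
print(viewer.state["page"])
```

Because the viewer only ever applies instructions whose stamp is not later than the playing timestamp, the courseware can never run ahead of the audio/video, and it catches up as soon as the next timestamp event arrives.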
In the embodiment of the present invention, S2 specifically includes:
s201, the audio/video processing service module 201 receives an audio/video stream pushed by the anchor terminal 10;
s202, transcoding the audio and video stream into a Group of pictures (GoP) per second through ffmpeg;
S203, for each GoP, writing the current timestamp into the SEI;
and S204, outputting the transcoded GoP to the live video service module 202 through an RTMP protocol.
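The one-GoP-per-second transcoding of S201-S204 maps naturally onto ffmpeg's keyframe-interval options; the sketch below builds such a command line. The flag choices, the input source, and the RTMP URL are assumptions for illustration, and note that writing a wall-clock timestamp into an H.264 SEI message is not a stock ffmpeg flag - it would typically require a custom bitstream filter or be done in the transcoding code itself.

```python
# Sketch of the ffmpeg invocation implied by S201-S204: force a keyframe
# (hence a new GoP) every second and push the result over RTMP.
def ffmpeg_gop_per_second(input_url, rtmp_url, fps=25):
    return [
        "ffmpeg",
        "-i", input_url,          # stream received from the WebRTC gateway
        "-c:v", "libx264",
        "-g", str(fps),           # GoP length = fps frames = 1 second
        "-keyint_min", str(fps),  # forbid shorter GoPs
        "-sc_threshold", "0",     # no extra scene-cut keyframes mid-GoP
        "-c:a", "aac",
        "-f", "flv",              # RTMP carries FLV-muxed streams
        rtmp_url,
    ]

cmd = ffmpeg_gop_per_second("pipe:0", "rtmp://live.example.com/app/stream")
print(" ".join(cmd))
```

Fixing the GoP to exactly one second matters here because each GoP becomes one timestamp-named TS segment, so the segment boundaries line up with whole-second timestamps.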
In the embodiment of the present invention, preferably, S4 specifically includes:
s401, loading a TS file from the CDN;
s402, analyzing the time stamp in the TS file name, and taking the time stamp as a reference;
s403, accumulating the time stamp for corresponding time every time when playing a period of time;
s404, the accumulated time stamp is thrown out through an event.
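Steps S401-S404 amount to: take the timestamp parsed from the current TS file's name as a baseline, add the elapsed playback time within that segment, and throw the sum as an event. A minimal sketch, with illustrative names; a real implementation would live inside the HLS player itself (e.g. as a plugin on a web player).

```python
# Sketch of S401-S404: baseline from the TS file name, accumulate playback
# time, and emit the accumulated timestamp through an event callback.
class TimestampEmitter:
    def __init__(self, on_timestamp):
        self.on_timestamp = on_timestamp  # event callback (S404)
        self.base = 0.0

    def load_segment(self, ts_filename):
        # S402: the TS file name is the segment's start timestamp
        self.base = float(ts_filename.rsplit(".", 1)[0])

    def tick(self, played_seconds):
        # S403: add elapsed playback time to the baseline, then throw it (S404)
        self.on_timestamp(self.base + played_seconds)

seen = []
emitter = TimestampEmitter(seen.append)
emitter.load_segment("1632900000.ts")
emitter.tick(0.1)
emitter.tick(0.2)
print(seen)
```

Each new segment re-anchors the baseline to an absolute timestamp, so accumulation error within a segment never exceeds one segment's duration.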
In a specific application example, the frequency at which the HLS player throws timestamp notifications can be set to once every 100 milliseconds, so that the synchronization error between the audio/video and the courseware can be kept within 100 milliseconds. Of course, those skilled in the art will understand that the notification frequency can be adjusted as desired.
It is to be understood that the exemplary embodiments described herein are illustrative and not restrictive. Although one or more embodiments of the present invention have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
Claims (6)
1. A method for synchronously playing audio/video and courseware, characterized in that the method is applied to a system for synchronously playing audio/video and courseware that comprises an anchor terminal, a server and audience terminals, and comprises the following steps:
s1, the anchor terminal pushes the audio and video signals collected from the camera and the microphone to the audio and video processing service of the server through the WebRTC protocol;
s2, the audio and video processing service converts the received streaming data of the WebRTC protocol into RTMP streaming and outputs the RTMP streaming to the video live broadcast service of the server;
S3, the video live broadcast service receives the RTMP stream, transcodes the audio/video stream data into TS files based on the HLS protocol, reads the timestamp from the SEI information of the RTMP stream, names the TS files with the timestamps, and publishes them on the CDN;
S4, the HLS player of the audience terminal plays the video according to the TS file list in the m3u8 file and throws the playing timestamp externally as an event;
s5, when the anchor operates courseware, the courseware operating terminal of the anchor terminal collects the operating instruction and sends the operating instruction to the operating instruction queue of the server;
S6, on receiving an instruction, the operation instruction queue writes the current timestamp into the instruction and stores the stamped instruction in the queue;
and S7, after receiving a timestamp notification thrown by the HLS player, the courseware player of the audience terminal acquires the operation instructions stamped before that timestamp from the operation instruction queue and replays them on the courseware in instruction order.
2. The method for synchronously playing audio/video and courseware as claimed in claim 1, wherein S2 specifically comprises:
s201, an audio and video processing service receives audio and video streams pushed by an anchor terminal;
s202, transcoding the audio and video stream into one GoP per second through ffmpeg;
S203, for each GoP, writing the current timestamp into the SEI;
and S204, outputting the transcoded GoP to a video live broadcast service through an RTMP protocol.
3. The method for synchronously playing audio/video and courseware as claimed in claim 1, wherein S4 specifically comprises:
s401, loading a TS file from the CDN;
s402, analyzing the time stamp in the TS file name, and taking the time stamp as a reference;
s403, accumulating the time stamp for corresponding time every time when playing a period of time;
s404, the accumulated time stamp is thrown out through an event.
4. A system for synchronously playing audio/video and courseware, characterized by comprising an anchor terminal, a server and an audience terminal, wherein the anchor terminal further comprises an audio/video acquisition module and a courseware operating terminal, the server further comprises an audio/video processing service, a video live broadcast service and an operation instruction queue, and the audience terminal further comprises an HLS player and a courseware player,
the audio and video acquisition module is used for pushing audio and video signals acquired from the camera and the microphone to an audio and video processing service through a WebRTC protocol;
the audio and video processing service is used for converting the received streaming data of the WebRTC protocol into an RTMP stream and outputting the RTMP stream to the video live broadcast service;
the video live broadcast service is used for receiving RTMP (Real-Time Messaging Protocol) streams, transcoding the audio/video stream data into TS (transport stream) files based on the HLS (HTTP Live Streaming) protocol, reading timestamps from the SEI (Supplemental Enhancement Information) of the RTMP streams, naming the TS files with the timestamps and publishing the TS files on a Content Delivery Network (CDN);
the HLS player is used for playing the video according to the TS file list in the m3u8 file and throwing the playing timestamp externally as an event;
the courseware operation end is used for collecting operation instructions when the anchor operates courseware and sending the operation instructions to the operation instruction queue;
the operation instruction queue is used for writing the current timestamp into each instruction on receipt and storing the stamped instruction in the queue;
and the courseware player is used for acquiring, after receiving a timestamp notification thrown by the HLS player, the operation instructions stamped before that timestamp from the operation instruction queue and replaying them on the courseware in instruction order.
5. The system for synchronously playing audio/video and courseware as claimed in claim 4, wherein the process by which the audio/video processing service converts the received WebRTC stream data into an RTMP stream and outputs it to the video live broadcast service specifically includes:
the audio and video processing service receives audio and video streams pushed by the anchor terminal;
the audio and video stream is transcoded into one GoP per second through ffmpeg;
for each GoP, writing the current timestamp into the SEI;
and outputting the transcoded GoP to a video live broadcast service through an RTMP protocol.
6. The system for synchronously playing audio/video and courseware as claimed in claim 4, wherein the process by which the HLS player plays the video according to the TS file list in the m3u8 file and throws the playing timestamp externally as an event specifically includes:
loading a TS file from the CDN;
analyzing the time stamp in the TS file name, and taking the time stamp as a reference;
accumulating the time stamps for corresponding time every time a period of playing is carried out;
and throwing the accumulated time stamp through the event.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111157212.7A CN113923197A (en) | 2021-09-29 | 2021-09-29 | Method and system for synchronously playing audio/video and courseware |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113923197A true CN113923197A (en) | 2022-01-11 |
Family
ID=79237342
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111157212.7A Pending CN113923197A (en) | 2021-09-29 | 2021-09-29 | Method and system for synchronously playing audio/video and courseware |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113923197A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201820372U (en) * | 2010-06-29 | 2011-05-04 | 张亚军 | Wireless interactive audio/video system and equipment thereof |
CN102624679A (en) * | 2011-01-28 | 2012-08-01 | 陶祖南 | Realization method for multilevel intelligent multifunctional multimedia information interaction system |
CN106846940A (en) * | 2016-12-29 | 2017-06-13 | 珠海思课技术有限公司 | A kind of implementation method of online live streaming classroom education |
CN110597610A (en) * | 2019-09-19 | 2019-12-20 | 广州华多网络科技有限公司 | Online teaching method and device, storage medium and electronic equipment |
CN113222790A (en) * | 2021-04-26 | 2021-08-06 | 深圳市方直科技股份有限公司 | Online course generation system and equipment based on artificial intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||