CN110536148B - Live broadcasting method and equipment based on video networking - Google Patents

Live broadcasting method and equipment based on video networking

Info

Publication number
CN110536148B
CN110536148B (application number CN201910775817.9A)
Authority
CN
China
Prior art keywords
video
audio
terminal
video data
live broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910775817.9A
Other languages
Chinese (zh)
Other versions
CN110536148A (en)
Inventor
胡倩丽
吴乐乐
王晓燕
亓娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd filed Critical Visionvera Information Technology Co Ltd
Priority to CN201910775817.9A priority Critical patent/CN110536148B/en
Publication of CN110536148A publication Critical patent/CN110536148A/en
Application granted granted Critical
Publication of CN110536148B publication Critical patent/CN110536148B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H04N 21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N 21/25808 Management of client data
    • H04N 21/25816 Management of client data involving client authentication
    • H04N 21/27 Server based end-user applications
    • H04N 21/274 Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N 21/2743 Video hosting of uploaded data from client
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N 21/4113 PC
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/439 Processing of audio elementary streams
    • H04N 21/4398 Processing of audio elementary streams involving reformatting operations of audio signals
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440218 Processing of video elementary streams involving reformatting operations of video signals by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Graphics (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The embodiment of the present application provides a live broadcasting method and device based on the video network. The method comprises the following steps: a PC terminal accesses the video network through a terminal simulator; the PC terminal sends a live broadcast request message to a video networking server through the terminal simulator; the PC terminal receives a live broadcast confirmation message sent by the video networking server through the terminal simulator; the PC terminal collects audio and video data with fast forward compression coding (FFMPEG) through the terminal simulator and processes the audio and video data into audio and video data under the video networking protocol through the terminal simulator; and the PC terminal sends the audio and video data under the video networking protocol to the video networking server through the terminal simulator, so that other terminals in the video network can watch the live broadcast through the video networking server. In this way, the publication of a video networking live broadcast can be controlled from the PC terminal.

Description

Live broadcasting method and equipment based on video networking
Technical Field
The application relates to the technical field of video networking, in particular to a live broadcast method and equipment based on video networking.
Background
With the rapid development of the video network, applications based on it, such as video conferencing and video teaching, have become widespread in users' daily life, work and study.
How to publish live broadcast data in a video networking environment is therefore an urgent problem to be solved.
Disclosure of Invention
In view of the above, embodiments of the present application are proposed to provide a video networking-based live broadcasting method and a corresponding device that overcome, or at least partially solve, the above problems.
In a first aspect, an embodiment of the present application discloses a live broadcast method based on a video network, where the method includes:
the PC terminal is accessed to the video network through the terminal simulator;
the PC terminal sends a live broadcast request message to a video network server through the terminal simulator;
the PC terminal receives a live broadcast confirmation message sent by the video networking server through the terminal simulator;
the PC terminal acquires audio and video data by utilizing fast forward compression coding FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator;
and the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video networking can watch live broadcast through the video networking server.
Optionally, the video network server includes a sub-control server and a main control server;
the live broadcast request message is sent to the main control server by the PC end through the terminal simulator and the sub-control server;
the live broadcast confirmation message is sent to the terminal simulator of the PC end by the main control server through the sub-control server;
the step in which the PC terminal sends the processed audio and video data to the video networking server through the terminal simulator, so that other terminals in the video network watch the live broadcast through the video networking server, includes:
and the PC terminal sends the processed audio and video data to the sub-control servers corresponding to the terminal simulator through the terminal simulator, so that other terminals in the video network can acquire the processed audio and video data through the sub-control servers corresponding to the other terminals.
Optionally, the processing of the audio and video data by the PC terminal through the terminal simulator includes:
analyzing the collected audio and video files through the terminal simulator to obtain audio streams and video streams;
adding, modifying and transcoding the video stream and the audio stream;
and packaging the processed video stream and audio stream into the audio and video data under the video networking protocol.
Optionally, the terminal simulator performs data transmission through N transmission channels, where each transmission channel transmits different audio and video data, and N is an integer greater than or equal to one;
the PC terminal utilizes FFMPEG to collect audio and video data through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator, and the method comprises the following steps:
the PC terminal acquires audio and video data of the N transmission channels by using the FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator respectively;
the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video networking watch live broadcast through the video networking server, and the method comprises the following steps:
and the PC terminal sends the audio and video data of the N transmission channels under the video networking protocol to a video networking server through the terminal simulator, so that other terminals in the video networking can watch live broadcast through the video networking server.
In a second aspect, an embodiment of the present application discloses an apparatus, including:
the network access module is used for accessing the PC terminal to the video network through the terminal simulator;
the message sending module is used for sending a live broadcast request message to the video network server by the PC terminal through the terminal simulator;
the message receiving module is used for receiving the live broadcast confirmation message sent by the video networking server through the terminal simulator by the PC terminal;
the acquisition processing module is used for acquiring audio and video data by the PC end through the terminal simulator by utilizing fast forward compression coding FFMPEG and processing the audio and video data into audio and video data under a video networking protocol through the terminal simulator;
and the live broadcast release module is used for sending the audio and video data under the video networking protocol to a video networking server by the PC terminal through the terminal simulator so that other terminals in the video networking can watch live broadcast through the video networking server.
Optionally, the video network server includes a sub-control server and a main control server;
the live broadcast request message is sent to the main control server by the PC end through the terminal simulator and the sub-control server;
the live broadcast confirmation message is sent to the terminal simulator of the PC end by the main control server through the sub-control server;
the live broadcast release module is specifically used for:
and the PC terminal sends the processed audio and video data to the sub-control servers corresponding to the terminal simulator through the terminal simulator, so that other terminals in the video network can acquire the processed audio and video data through the sub-control servers corresponding to the other terminals.
Optionally, the acquisition processing module is specifically configured to:
analyzing the collected audio and video files through the terminal simulator to obtain audio streams and video streams;
adding, modifying and transcoding the video stream and the audio stream;
and packaging the processed video stream and audio stream into the audio and video data under the video networking protocol.
Optionally, the terminal simulator performs data transmission through N transmission channels, where each transmission channel transmits different audio and video data, and N is an integer greater than or equal to one;
the acquisition processing module is specifically configured to:
the PC terminal acquires audio and video data of the N transmission channels by using the FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator respectively;
the live broadcast release module is specifically used for:
and the PC terminal sends the audio and video data of the N transmission channels under the video networking protocol to a video networking server through the terminal simulator, so that other terminals in the video networking can watch live broadcast through the video networking server.
In a third aspect, an embodiment of the present application further discloses a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the method of any one of the first aspect is implemented.
In a fourth aspect, an embodiment of the present application further discloses a computer-readable storage medium, where a computer program for executing any one of the methods in the first aspect is stored in the computer-readable storage medium.
According to the live broadcasting method based on the video network, the PC terminal is accessed to the video network through the terminal simulator; the PC terminal sends a live broadcast request message to a video networking server through the terminal simulator; the PC terminal receives a live broadcast confirmation message sent by the video networking server through the terminal simulator; the PC terminal acquires audio and video data by utilizing fast forward compression coding FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator; and the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video networking can watch live broadcast through the video networking server. Therefore, the PC end can control the release of the video network live broadcast.
Drawings
Fig. 1 is a schematic networking diagram of a video network provided in an embodiment of the present application;
fig. 2 is a schematic hardware structure diagram of a node server according to an embodiment of the present application;
fig. 3 is a schematic hardware structure diagram of an access switch according to an embodiment of the present application;
fig. 4 is a schematic hardware structure diagram of an ethernet protocol conversion gateway according to an embodiment of the present application;
fig. 5 is a flowchart illustrating steps of a video network-based live broadcasting method according to an embodiment of the present application;
FIG. 6 is a system architecture diagram provided by an embodiment of the present application;
fig. 7 is a block diagram of a device provided in an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The video network is an important milestone in network development. It is a real-time network that can transmit high-definition video in real time and push many Internet applications towards high-definition, face-to-face quality.
The video network adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication and data, on one network platform, including high-definition video conferencing, video surveillance, intelligent monitoring and analysis, emergency command, digital broadcast television, delayed television, network teaching, live broadcasting, VOD on demand, television mail, Personal Video Recorder (PVR), intranet (self-run) channels, intelligent video broadcast control and information distribution, and it delivers high-definition-quality video through a television or a computer.
To better understand the embodiments of the present application, the video network is described below:
some of the techniques applied by the video network are as follows:
Network Technology (Network Technology)
Network technology innovation in the video network improves on traditional Ethernet in order to face the potentially enormous video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, the video networking technology adopts packet switching to satisfy the demands of streaming media (a data transmission technique that turns received data into a stable, continuous stream and keeps sending it, so that the sound or images received by the user are smooth and the user can start viewing before the whole file has been transmitted). The video networking technology has the flexibility, simplicity and low price of packet switching, and at the same time the quality and security guarantees of circuit switching, realizing seamless, switched virtual circuits and a unified data format across the whole network.
Switching Technology (Switching Technology)
The video network adopts the two advantages of Ethernet, asynchronism and packet switching, and eliminates Ethernet's defects while remaining fully compatible with it. It provides seamless end-to-end connection across the whole network, communicates directly with user terminals and carries IP data packets directly. User data requires no format conversion anywhere in the network. The video network is a higher-level form of Ethernet: a real-time switching platform that can achieve network-wide, large-scale, real-time transmission of high-definition video, which the existing Internet cannot, and it pushes many network video applications towards high definition and unification.
Server Technology (Server Technology)
The server technology of the video network and the unified video platform differs from that of a traditional server: its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signalling and data transmission. For voice and video services, streaming media processing on the video network and the unified video platform is far simpler than general data processing, and efficiency is improved by more than a hundred times compared with a traditional server.
Storage Technology (Storage Technology)
To handle media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. The program information in a server instruction is mapped to specific hard disk space, so the media content no longer passes through the server but is sent directly and instantly to the user terminal, and the user's waiting time is generally less than 0.2 second. Optimized sector allocation greatly reduces the mechanical head-seeking movement of the hard disk; resource consumption is only 20% of that of an IP Internet system of the same class, while the concurrent throughput is 3 times that of a traditional hard disk array and overall efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
Through measures such as independent permission control for each service and complete isolation of equipment and user data, the structural design of the video network eliminates, at the structural level, the network security problems that trouble the Internet. It generally needs no antivirus programs or firewalls, is immune to hacker and virus attacks, and provides users with a structurally worry-free, secure network.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services with transmission: whether for a single user, a private network user or an entire network, connection is established automatically and only once. A user terminal, set-top box or PC connects directly to the unified video platform to obtain multimedia video services in various forms. The unified video platform replaces traditional, complex application programming with a menu-style configuration table, so complex applications can be realized with very little code, enabling endless new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server may be directly connected to the access switch or may be directly connected to the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server here is the same node server as in the access network part; that is, the node server belongs to both the access network and the metropolitan area network.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the circled part in fig. 1), and a plurality of unified video platforms can form the video network; the unified video platforms may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 devices in the video network of the embodiment of the present application can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
As shown in fig. 2, the node server mainly includes a network interface module 201, a switching engine module 202, a CPU module 203 and a disk array module 204.
Packets coming from the network interface module 201, the CPU module 203 and the disk array module 204 all enter the switching engine module 202. The switching engine module 202 looks up the address table 205 for each incoming packet to obtain the packet's steering information and stores the packet in the queue of the corresponding packet buffer 206 according to that information; if the queue of the packet buffer 206 is nearly full, the packet is discarded. The switching engine module 202 polls all packet buffer queues and forwards from a queue if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control of the hard disks, including initialization, reading and writing; the CPU module 203 is mainly responsible for protocol processing with the access switches and terminals (not shown in the figure), for configuring the address table 205 (including the downlink protocol packet address table, the uplink protocol packet address table and the data packet address table), and for configuring the disk array module 204.
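For illustration only, the forwarding condition just described can be summarized in a short C sketch. The structure and function names (`PacketQueue`, `can_forward`, `poll_queues`) and their fields are hypothetical and are not part of the patent; the sketch merely shows the two conditions the switching engine checks before forwarding from a queue.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical per-queue state for one packet buffer queue (cf. packet buffer 206). */
typedef struct {
    size_t packet_counter;   /* number of packets currently queued                */
    bool   port_buffer_full; /* whether the destination port send buffer is full  */
} PacketQueue;

/* A queue may be forwarded only if the port send buffer is not full
 * and the queue packet counter is greater than zero. */
static bool can_forward(const PacketQueue *q)
{
    return !q->port_buffer_full && q->packet_counter > 0;
}

/* The switching engine polls all queues and forwards one packet
 * from each queue that satisfies the conditions. */
static void poll_queues(PacketQueue *queues, size_t n,
                        void (*forward_one)(PacketQueue *))
{
    for (size_t i = 0; i < n; i++) {
        if (can_forward(&queues[i]))
            forward_one(&queues[i]);
    }
}
```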
The access switch:
As shown in fig. 3, the access switch mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304.
A packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305. The packet detection module 305 checks whether the Destination Address (DA), Source Address (SA), packet type and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id) and passes the packet to the switching engine module 303, otherwise the packet is discarded. A packet (downlink data) coming from the uplink network interface module 302 enters the switching engine module 303 directly, as does a packet coming from the CPU module 304. The switching engine module 303 looks up the address table 306 for each incoming packet to obtain its steering information. If a packet entering the switching engine module 303 is going from a downlink network interface to an uplink network interface, it is stored in the queue of the corresponding packet buffer 307 in association with its stream-id; if that queue is nearly full, the packet is discarded. If a packet entering the switching engine module 303 is not going from a downlink network interface to an uplink network interface, it is stored in the queue of the corresponding packet buffer 307 according to its steering information; if that queue is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) and obtaining the token generated by the code rate control module.
If the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The code rate control module 308 is configured by the CPU module 304 and, at programmable intervals, generates tokens for all packet buffer queues going from a downlink network interface to an uplink network interface, so as to control the rate of uplink forwarding.
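The rate control can be pictured as a simple token mechanism: at a programmable interval the rate control module grants one token per downlink-to-uplink queue, and the third forwarding condition above consumes it. This is a hedged sketch with hypothetical names; the actual hardware implementation is not disclosed at this level of detail.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical state for one downlink-to-uplink packet buffer queue. */
typedef struct {
    size_t packet_counter;    /* queued packets                              */
    bool   port_buffer_full;  /* destination send buffer state               */
    size_t tokens;            /* tokens granted by the rate control module   */
} UpstreamQueue;

/* Called by the rate control module at a programmable interval:
 * every period each uplink-bound queue is granted one token. */
static void grant_tokens(UpstreamQueue *queues, size_t n)
{
    for (size_t i = 0; i < n; i++)
        queues[i].tokens++;
}

/* All three forwarding conditions for downlink-to-uplink traffic:
 * send buffer not full, at least one queued packet, and a token available. */
static bool can_forward_upstream(UpstreamQueue *q)
{
    if (q->port_buffer_full || q->packet_counter == 0 || q->tokens == 0)
        return false;
    q->tokens--;   /* consume a token, which limits the uplink forwarding rate */
    return true;
}
```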
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
A data packet coming from the downlink network interface module 401 enters the packet detection module 405. The packet detection module 405 checks whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id), the MAC deletion module 410 strips the MAC DA, MAC SA and length or frame type (2 bytes), and the packet enters the corresponding receiving buffer; otherwise the packet is discarded.
The downlink network interface module 401 monitors the sending buffer of the port; if a packet is present, it obtains the Ethernet MAC DA of the corresponding terminal according to the video networking destination address DA of the packet, adds the terminal's Ethernet MAC DA, the MAC SA of the Ethernet protocol conversion gateway and the Ethernet length or frame type, and sends the packet.
The other modules of the Ethernet protocol conversion gateway function similarly to those of the access switch.
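As a rough illustration of the header handling just described, the following C sketch strips the Ethernet header from an uplink packet and prepends one to a downlink packet. The field sizes follow common Ethernet framing (6-byte MAC addresses, 2-byte length/frame type); the function names, buffer handling and the way the MAC DA is obtained are hypothetical.

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

#define ETH_HDR_LEN 14   /* MAC DA (6) + MAC SA (6) + length or frame type (2) */

/* Uplink: the MAC deletion module removes the 14-byte Ethernet header so that
 * only the video networking packet enters the receiving buffer.
 * Returns the new length, or 0 if the packet is too short. */
static size_t strip_ethernet_header(uint8_t *pkt, size_t len)
{
    if (len <= ETH_HDR_LEN)
        return 0;
    memmove(pkt, pkt + ETH_HDR_LEN, len - ETH_HDR_LEN);
    return len - ETH_HDR_LEN;
}

/* Downlink: the MAC adding module prepends the terminal's MAC DA, the gateway's
 * MAC SA and the length/frame type before sending on Ethernet.
 * `out` must have room for len + ETH_HDR_LEN bytes. */
static size_t add_ethernet_header(const uint8_t *vn_pkt, size_t len,
                                  const uint8_t mac_da[6],
                                  const uint8_t mac_sa[6],
                                  uint16_t length_or_type,
                                  uint8_t *out)
{
    memcpy(out, mac_da, 6);
    memcpy(out + 6, mac_sa, 6);
    out[12] = (uint8_t)(length_or_type >> 8);   /* big-endian on the wire */
    out[13] = (uint8_t)(length_or_type & 0xff);
    memcpy(out + ETH_HDR_LEN, vn_pkt, len);
    return len + ETH_HDR_LEN;
}
```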
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 devices of the metropolitan area network part can be mainly classified into 3 types: node server, node exchanger, metropolitan area server. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (e.g. various protocol packets, multicast data packets, unicast data packets, etc.), there are at most 256 possibilities, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses.
The Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA).
The reserved byte consists of 2 bytes.
The length of the payload varies with the type of datagram: it is 64 bytes for the various protocol packets and 1056 bytes for unicast data packets, although it is not limited to these two cases.
The CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
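For readers who prefer code, the access network packet layout described above can be written as a C struct. The struct name and the fixed 1056-byte payload shown here are illustrative (protocol packets carry 64 bytes instead); the real on-wire layout is defined by the text above, not by this sketch.

```c
#include <stdint.h>

/* Access network data packet: DA | SA | Reserved | Payload | CRC.
 * Byte layout follows the description above. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  da[8];         /* byte 0: packet type (protocol/multicast/unicast...),
                               bytes 1-5: metropolitan area network address,
                               bytes 6-7: access network address               */
    uint8_t  sa[8];         /* source address, same layout as DA               */
    uint8_t  reserved[2];   /* reserved bytes                                  */
    uint8_t  payload[1056]; /* 1056 bytes for unicast data packets
                               (protocol packets carry 64 bytes instead)       */
    uint32_t crc;           /* standard Ethernet CRC-32                        */
} VnAccessPacket;
#pragma pack(pop)
```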
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be two or even more connections between two devices; that is, two or more connections may exist between a node switch and a node server, or between two node switches. However, the address of each metropolitan area network device is unique, so in order to describe the connection relationship between metropolitan area network devices accurately, a parameter is introduced in the embodiment of the present application: a label, used to uniquely describe a metropolitan area network device.
In this specification, the label is defined similarly to a label in Multi-Protocol Label Switching (MPLS). Assuming there are two connections between a device A and a device B, a packet going from device A to device B has 2 possible labels, and a packet going from device B to device A also has 2 possible labels. Labels are divided into incoming labels and outgoing labels: assuming the label of a packet entering device A (its incoming label) is 0x0000, the label of the packet when it leaves device A (its outgoing label) may become 0x0001. The network access process of the metropolitan area network is a network access process under centralized control; that is, both address allocation and label allocation for the metropolitan area network are directed by the metropolitan area server, and the node switches and node servers execute them passively. This differs from MPLS label allocation, which is the result of mutual negotiation between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
That is: Destination Address (DA), Source Address (SA), reserved bytes (Reserved), label, payload (PDU) and CRC. The format of the label may be defined as follows: the label is 32 bits long, with the upper 16 bits reserved and only the lower 16 bits used; its position is between the reserved bytes and the payload of the packet.
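Similarly, a hedged sketch of the metropolitan area network packet adds the 32-bit label between the reserved bytes and the payload, with only the lower 16 bits used. As before, the names and the fixed payload size are illustrative only.

```c
#include <stdint.h>

/* Metropolitan area network data packet: DA | SA | Reserved | Label | Payload | CRC. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  da[8];
    uint8_t  sa[8];
    uint8_t  reserved[2];
    uint32_t label;         /* 32 bits: upper 16 bits reserved, lower 16 bits carry
                               the in/out label assigned by the metropolitan server */
    uint8_t  payload[1056]; /* illustrative size; actual length depends on packet type */
    uint32_t crc;
} VnMetroPacket;
#pragma pack(pop)

/* Example of label swapping on one hop: a packet that entered device A with
 * incoming label 0x0000 may leave it with outgoing label 0x0001. */
static void swap_label(VnMetroPacket *p, uint16_t out_label)
{
    p->label = (p->label & 0xFFFF0000u) | out_label;
}
```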
In the prior art, live broadcasting is carried out in an Internet environment, whereas the live broadcasting method provided by the embodiment of the present application is carried out in a video networking environment. The FFMPEG used in the live broadcasting method of this embodiment is a set of computer programs for recording and converting digital audio and video and turning them into streams; it can convert between many video formats based on the MPEG family of video coding standards. The FFMPEG video decoding library also provides a screenshot function, and the key frames of the current live video can be extracted in real time through the API functions of that library. In the embodiment of the present application, the live broadcast is published through the terminal simulator so that other terminals in the video network can watch it.
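To make the key-frame remark concrete, the following is a minimal sketch using the public FFMPEG (libavformat) API: it opens an input, finds the video stream and reports packets flagged as key frames. The input URL and the idea of "reporting" are placeholders; the patent does not prescribe this exact code.

```c
#include <stdio.h>
#include <libavformat/avformat.h>

/* Minimal key-frame scan with the FFMPEG libraries (assumption: a file or
 * stream URL is available to the terminal simulator). */
int scan_key_frames(const char *url)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }
    int vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);

    AVPacket *pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == vstream && (pkt->flags & AV_PKT_FLAG_KEY))
            printf("key frame at pts %lld\n", (long long)pkt->pts);
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}
```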
Fig. 5 shows a video network-based live broadcasting method provided by an embodiment of the present application, which includes the following steps:
step 501: the PC terminal is accessed to the video network through the terminal simulator.
Step 502: and the PC terminal sends a live broadcast request message to a video network server through the terminal simulator.
Step 503: and the PC terminal receives the live broadcast confirmation message sent by the video network server through the terminal simulator.
Step 504: and the PC terminal acquires audio and video data by utilizing fast forward compression coding FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator.
Step 505: and the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video networking can watch live broadcast through the video networking server.
In one possible implementation, the video networking server comprises a sub-control server and a main control server. Terminal simulators are attached under each sub-control server, and each terminal simulator interacts with its sub-control server. The main control server may be understood as a management server that manages the network access status of all terminals. In the interaction process, a request message sent by the terminal simulator is delivered to the sub-control server by a switch, and the sub-control server's switch forwards the request message to the main control server; the switch is responsible for forwarding messages, which is prior art and is not described in detail here. The live broadcast request message is sent by the PC terminal to the main control server through the terminal simulator and the sub-control server, and the live broadcast confirmation message is sent by the main control server to the terminal simulator of the PC terminal through the sub-control server.
In the network access stage of step 501, the PC terminal sends a network access request message to the sub-control server through the terminal simulator, the sub-control server forwards the message to the main control server, and the main control server performs the terminal's network access operation according to the message. Successful network access means that the terminal is connected to the video networking server and can then use the various services of the video network. In general, any terminal simulator that sends a network access request message can access the network. Permissions may also be configured, so that only an authorized terminal simulator that passes authentication by the video networking server can access the network, which improves privacy.
In step 504, the specific process by which the PC terminal collects audio and video data with FFMPEG through the terminal simulator uses the prior art and is not described in detail in this application. Further, when the PC terminal processes the audio and video data through the terminal simulator, it may parse the collected audio and video file through the terminal simulator to obtain an audio stream and a video stream; add to, modify and transcode the video stream and the audio stream; and package the processed video stream and audio stream into audio and video data under the video networking protocol, as sketched below.
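A hedged sketch of this demultiplex-and-repackage step follows. It uses the public FFMPEG libavformat API to split an input into audio and video packets and hands each one to a hypothetical `send_vn_packet()` that would wrap it in a video networking protocol payload; that wrapper, the channel number and the transcoding step are placeholders, since the patent does not disclose the concrete packet layout used for live data.

```c
#include <stdint.h>
#include <libavformat/avformat.h>

/* Hypothetical sender: wraps one elementary-stream packet in a video
 * networking protocol packet and hands it to the terminal simulator. */
void send_vn_packet(int channel, int is_video, const uint8_t *data, int size);

/* Demultiplex a captured audio/video input into audio and video streams and
 * publish each packet on a live broadcast channel (sketch only). */
int publish_live(const char *input_url, int channel)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, input_url, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }
    int v = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    int a = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);

    AVPacket *pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        /* A real implementation may add/modify/transcode here before packaging. */
        if (pkt->stream_index == v)
            send_vn_packet(channel, 1, pkt->data, pkt->size);
        else if (pkt->stream_index == a)
            send_vn_packet(channel, 0, pkt->data, pkt->size);
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}
```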
In step 505, the PC terminal sends the processed audio and video data through the terminal simulator to the sub-control server corresponding to that terminal simulator, so that other terminals in the video network can obtain the processed audio and video data through the sub-control servers corresponding to those terminals.
In a possible implementation manner, the terminal simulator performs data transmission through N transmission channels, wherein different channels transmit different audio data, video data or audio/video data, and N is an integer greater than or equal to one; in step 504, the PC terminal acquires audio and video data of the N transmission channels by using the FFMPEG through the terminal simulator, and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator respectively; in step 505, the PC sends the audio and video data under the video networking protocol of the N transmission channels to a video networking server through the terminal simulator, so that other terminals in the video networking can watch live broadcast through the video networking server.
Fig. 6 shows an architecture suitable for the video networking live broadcast method provided by the embodiment of the present application, in which a PC terminal can publish live broadcasts on multiple channels through terminal simulators. One or more terminal simulators can be attached to each PC terminal. If a terminal simulator has five channels, the first channel can carry the audio and video of a first speaker, the second channel can carry the audio and video of a second speaker, and so on. Each PC terminal sends a network access request message to the sub-control server through the terminal simulator, and the sub-control server reports the message to the main control server to inform it that the terminal simulator has accessed the network.
After the video networking terminal simulator successfully accesses the network, it initiates a live broadcast request, which is sent to the main control server through the sub-control server. The live broadcast request may be binary data sent according to the video networking protocol. The main control server manages the live broadcast data published by the terminals, for example which terminal simulator publishes live data and which other terminals may watch the live data published by that terminal simulator. The main control server processes the live broadcast request and returns a corresponding response result (success or failure) to the video networking terminal simulator.
If the main control server returns a success response, it can then issue a live-broadcast start command, which is returned to the terminal simulator through the sub-control server to notify the video networking terminal simulator to start publishing live data. After receiving the start command from the main control server, the video networking terminal simulator collects audio and video data through FFMPEG, processes the data and publishes the processed audio and video data to the video networking server. The video and/or audio stream is carried on a fixed video networking live broadcast channel, so that the video networking terminals on that channel can watch the live broadcast.
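The signalling flow just described (live broadcast request, confirmation, start command, then publishing) can be summarized with a small state machine. All message type values, struct and function names and the transport helpers below are hypothetical; the patent only states that the messages are binary data sent according to the video networking protocol.

```c
#include <stdbool.h>

/* Hypothetical message types exchanged between the terminal simulator,
 * the sub-control server and the main control server. */
enum VnMsgType {
    VN_LIVE_REQUEST = 1,   /* terminal simulator -> sub-control -> main control */
    VN_LIVE_CONFIRM = 2,   /* main control -> sub-control -> terminal simulator */
    VN_LIVE_START   = 3,   /* main control: start publishing                    */
};

/* Placeholder transport helpers assumed to be provided by the terminal simulator. */
bool vn_send(enum VnMsgType type, const void *body, int len);
enum VnMsgType vn_wait_reply(void);
int  publish_live(const char *input_url, int channel);  /* see the earlier sketch */

/* Simplified control flow on the PC side. */
int start_live_broadcast(const char *input_url, int channel)
{
    if (!vn_send(VN_LIVE_REQUEST, NULL, 0))        /* step 502 */
        return -1;
    if (vn_wait_reply() != VN_LIVE_CONFIRM)        /* step 503: success or failure */
        return -1;
    if (vn_wait_reply() != VN_LIVE_START)          /* start command from main control */
        return -1;
    return publish_live(input_url, channel);       /* steps 504-505 */
}
```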
In summary, according to the live broadcasting method based on the video network provided by the embodiment of the application, the PC terminal accesses the video network through the terminal simulator; the PC terminal sends a live broadcast request message to a video networking server through the terminal simulator; the PC terminal receives a live broadcast confirmation message sent by the video networking server through the terminal simulator; the PC terminal acquires audio and video data by utilizing fast forward compression coding FFMPEG through the terminal simulator and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator; and the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video networking can watch live broadcast through the video networking server. Therefore, the publishing of the video networking live broadcast can be controlled at the PC end.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the embodiments. Further, those skilled in the art will also appreciate that the embodiments described in the specification are presently preferred and that no particular act is required of the embodiments of the application.
Based on the same technical concept, referring to fig. 7, a structural block diagram of a device provided in the embodiment of the present application is shown, where the device may be applied in a video network, and specifically may include the following modules:
and the network access module 701 is used for accessing the PC terminal to the video network through the terminal simulator.
And a message sending module 702, configured to send, by the PC terminal, a live broadcast request message to the video network server through the terminal simulator.
A message receiving module 703, configured to receive, by the PC terminal through the terminal simulator, a live broadcast confirmation message sent by the video network server.
And the acquisition processing module 704 is used for acquiring audio and video data by the PC terminal through the terminal simulator by utilizing fast forward compression coding FFMPEG and processing the audio and video data into audio and video data under a video networking protocol through the terminal simulator.
And a live broadcast release module 705, configured to send, by the PC terminal through the terminal simulator, the audio and video data under the video networking protocol to a video networking server, so that other terminals in the video networking can watch live broadcast through the video networking server.
In one possible implementation, the video network server comprises a sub-control server and a main control server; the live broadcast request message is sent to the main control server by the PC end through the terminal simulator and the sub-control server; and the live broadcast confirmation message is sent to the terminal simulator of the PC end by the main control server through the sub-control server.
The live broadcast releasing module 705 is specifically configured to: and the PC terminal sends the processed audio and video data to the sub-control servers corresponding to the terminal simulator through the terminal simulator, so that other terminals in the video network can acquire the processed audio and video data through the sub-control servers corresponding to the other terminals.
In a possible implementation, the acquisition processing module 704 is specifically configured to: analyzing the collected audio and video files through the terminal simulator to obtain audio streams and video streams; adding, modifying and transcoding the video stream and the audio stream; and packaging the processed video stream and audio stream into the audio and video data under the video networking protocol.
In a possible implementation manner, the terminal simulator performs data transmission through N transmission channels, wherein each transmission channel transmits different audio and video data, and N is an integer greater than or equal to one;
the acquisition processing module 704 is specifically configured to: and the PC terminal acquires the audio and video data of the N transmission channels by utilizing the FFMPEG through the terminal simulator and respectively processes the audio and video data into the audio and video data under a video networking protocol through the terminal simulator.
The live broadcast releasing module 705 is specifically configured to: and the PC terminal sends the audio and video data of the N transmission channels under the video networking protocol to a video networking server through the terminal simulator, so that other terminals in the video networking can watch live broadcast through the video networking server.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The embodiment of the application also provides computer equipment, which comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein when the processor executes the computer program, one or more video network-based live broadcast methods are realized.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, which causes a processor to execute the video network-based live broadcasting method according to the embodiments of the present application.
As will be appreciated by one of skill in the art, embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The live broadcasting method and device based on the video network provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and implementation of the present application, and the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be changes in the specific implementation and application scope according to the idea of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (8)

1. A live broadcast method based on the video network, characterized in that a video network server comprises a sub-control server and a main control server, and the method comprises the following steps:
the PC terminal accesses the video network through the terminal simulator; the terminal simulator carries out data transmission through N transmission channels, wherein each transmission channel transmits different audio and video data, and N is an integer greater than or equal to one;
the PC terminal sends a live broadcast request message to a sub-control server through the terminal simulator, and then the sub-control server sends the live broadcast request message to a main control server;
the PC terminal receives, through the terminal simulator, a live broadcast confirmation message sent by the sub-control server, wherein the live broadcast confirmation message is sent to the sub-control server by the main control server;
the PC terminal acquires audio and video data through the terminal simulator by utilizing the fast forward compression encoding tool FFMPEG, and processes the audio and video data into audio and video data under a video networking protocol through the terminal simulator, which specifically comprises the following steps:
the PC terminal acquires the audio and video data of the N transmission channels through the terminal simulator by using FFMPEG, and respectively processes the audio and video data into audio and video data under the video networking protocol through the terminal simulator;
the PC terminal sends the audio and video data under the video networking protocol to a video networking server through the terminal simulator so that other terminals in the video network watch the live broadcast through the video networking server, which comprises the following steps:
the PC terminal sends the audio and video data under the video networking protocol of the N transmission channels to the video networking server through the terminal simulator, so that the other terminals in the video network can watch the live broadcast through the video networking server;
and the PC terminal sends the audio and video data under the video networking protocol to a sub-control server through the terminal simulator, so that the other terminals in the video network can watch the live broadcast through the sub-control server.
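For illustration only, the PC-side flow of claim 1 (access the video network through the terminal simulator, request a live broadcast, wait for the confirmation relayed by the sub-control server, then acquire the N channels with FFMPEG and forward the repackaged data) might be sketched in Python roughly as below. The server address, the message codes, and the packet framing are invented placeholders, since the video networking protocol itself is not disclosed here; only the FFMPEG command-line options are standard.

import socket
import struct
import subprocess
import threading

SUB_CONTROL = ("192.0.2.10", 9000)                   # hypothetical sub-control server address
CHANNEL_SOURCES = ["channel0.mp4", "channel1.mp4"]   # one source per transmission channel (N = 2)
MSG_LIVE_REQUEST = 0x01                              # invented message codes for the handshake
MSG_LIVE_CONFIRM = 0x02

def request_live_broadcast(sock):
    # Send the live broadcast request and block until the confirmation arrives.
    sock.sendto(struct.pack("!B", MSG_LIVE_REQUEST), SUB_CONTROL)
    data, _ = sock.recvfrom(1024)
    if not data or data[0] != MSG_LIVE_CONFIRM:
        raise RuntimeError("live broadcast was not confirmed")

def stream_channel(sock, channel_id, source):
    # Acquire one channel with FFMPEG and forward it as channel-tagged datagrams.
    proc = subprocess.Popen(
        ["ffmpeg", "-re", "-i", source,          # read the source in real time
         "-c:v", "libx264", "-c:a", "aac",       # encode video and audio
         "-f", "mpegts", "pipe:1"],              # write an MPEG-TS stream to stdout
        stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
    seq = 0
    while True:
        chunk = proc.stdout.read(1316)           # 7 TS packets per datagram
        if not chunk:
            break
        # Invented "video networking" framing: channel id + sequence number.
        sock.sendto(struct.pack("!HI", channel_id, seq) + chunk, SUB_CONTROL)
        seq += 1

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    request_live_broadcast(sock)
    threads = [threading.Thread(target=stream_channel, args=(sock, i, src))
               for i, src in enumerate(CHANNEL_SOURCES)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

if __name__ == "__main__":
    main()

Each transmission channel gets its own FFMPEG process and its own channel identifier, which mirrors the per-channel acquisition and per-channel sending recited in the claim.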
2. The method of claim 1, wherein the step of the PC terminal sending the processed audio and video data to a sub-control server through the terminal simulator so that other terminals in the video network watch the live broadcast through the sub-control server comprises:
the PC terminal sends the processed audio and video data to the sub-control server corresponding to the terminal simulator through the terminal simulator, so that the other terminals in the video network can acquire the processed audio and video data through the sub-control servers corresponding to those terminals.
3. The method of claim 1, wherein the step of the PC terminal processing the audio and video data through the terminal simulator comprises:
analyzing the collected audio and video file through the terminal simulator to obtain an audio stream and a video stream;
adding, modifying and transcoding the video stream and the audio stream;
and packaging the processed video stream and audio stream into the audio and video data under the video networking protocol.
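Again only as a sketch, and assuming the collected file already carries H.264 video and AAC audio, the three processing steps of claim 3 (parse into audio and video streams, modify/transcode, then package) could be approximated with ordinary FFMPEG invocations as below. The file names, bitrate, and resolution are placeholders, and the final encapsulation into video networking protocol frames is represented here only by producing an MPEG-TS container.

import subprocess

SOURCE = "collected.mp4"   # hypothetical collected audio/video file (H.264 + AAC assumed)

# Step 1 (parsing): split the container into a raw video stream and a raw audio stream.
subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                "-map", "0:v:0", "-c:v", "copy", "video.h264",
                "-map", "0:a:0", "-c:a", "copy", "audio.aac"], check=True)

# Step 2 (modifying/transcoding): re-encode the video to an assumed target bitrate and resolution.
subprocess.run(["ffmpeg", "-y", "-i", "video.h264",
                "-c:v", "libx264", "-b:v", "1M", "-vf", "scale=1280:720",
                "video_transcoded.h264"], check=True)

# Step 3 (packaging): mux the processed streams back into a single transport stream,
# which the terminal simulator would then wrap into video networking protocol data.
subprocess.run(["ffmpeg", "-y", "-i", "video_transcoded.h264", "-i", "audio.aac",
                "-c", "copy", "-f", "mpegts", "packaged.ts"], check=True)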
4. A live broadcast device based on the video network, characterized in that the video network server comprises a sub-control server and a main control server, and the device comprises:
the network access module is used for connecting the PC terminal to the video network through the terminal simulator; the terminal simulator carries out data transmission through N transmission channels, wherein each transmission channel transmits different audio and video data, and N is an integer greater than or equal to one;
the message sending module is used for sending a live broadcast request message to the sub-control server through the terminal simulator and then sending the live broadcast request message to the main control server through the sub-control server;
the message receiving module is used for receiving a live broadcast confirmation message sent by the sub-control server through the terminal simulator, wherein the live broadcast confirmation message is sent to the sub-control server by the main control server;
the acquisition processing module is used for acquiring audio and video data through the terminal simulator by utilizing the fast forward compression encoding tool FFMPEG, and processing the audio and video data into audio and video data under a video networking protocol through the terminal simulator; the acquisition processing module is specifically configured to:
acquire the audio and video data of the N transmission channels through the terminal simulator by using FFMPEG, and respectively process the audio and video data into audio and video data under the video networking protocol through the terminal simulator;
the live broadcast module is specifically configured to:
send the audio and video data under the video networking protocol of the N transmission channels to a video networking server through the terminal simulator, so that other terminals in the video network can watch the live broadcast through the video networking server;
and the live broadcast release module is used for sending the audio and video data under the video networking protocol to the sub-control server through the terminal simulator, so that the other terminals in the video network can watch the live broadcast through the sub-control server.
5. The device of claim 4, wherein the live publishing module is specifically configured to:
the PC terminal sends the processed audio and video data to the sub-control server corresponding to the terminal simulator through the terminal simulator, so that the other terminals in the video network can acquire the processed audio and video data through the sub-control servers corresponding to those terminals.
6. The device of claim 4, wherein the acquisition processing module is specifically configured to:
analyzing the acquired audio and video file through the terminal simulator to obtain an audio stream and a video stream;
adding, modifying and transcoding the video stream and the audio stream;
and packaging the processed video stream and audio stream into the audio and video data under the video networking protocol.
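As a purely structural sketch of the device recited in claims 4 to 6, the modules could be organised on the PC side as a class whose methods mirror the recited modules. The terminal simulator interface used here (connect, send, receive, process) is an assumption made for illustration; the actual behaviour of each module is whatever the corresponding steps of claims 1 to 3 require.

class VideoNetworkLiveDevice:
    # One method per module recited in claims 4 to 6; the bodies are placeholders only.

    def __init__(self, terminal_simulator, channel_count):
        self.simulator = terminal_simulator    # assumed wrapper around the N transmission channels
        self.channel_count = channel_count

    def access_network(self):
        # Network access module: attach the PC terminal to the video network.
        self.simulator.connect()

    def send_live_request(self):
        # Message sending module: live broadcast request -> sub-control -> main control server.
        self.simulator.send("LIVE_REQUEST")

    def receive_live_confirm(self):
        # Message receiving module: confirmation relayed back by the sub-control server.
        return self.simulator.receive()

    def acquire_and_process(self, sources):
        # Acquisition processing module: FFMPEG capture per channel, repackaged per protocol.
        return [self.simulator.process(src) for src in sources]

    def publish(self, packets):
        # Live broadcast / live broadcast release modules: push the processed data onward.
        for packet in packets:
            self.simulator.send(packet)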
7. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 3 when executing the computer program.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for executing the method of any one of claims 1 to 3.
CN201910775817.9A 2019-08-21 2019-08-21 Live broadcasting method and equipment based on video networking Active CN110536148B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910775817.9A CN110536148B (en) 2019-08-21 2019-08-21 Live broadcasting method and equipment based on video networking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910775817.9A CN110536148B (en) 2019-08-21 2019-08-21 Live broadcasting method and equipment based on video networking

Publications (2)

Publication Number Publication Date
CN110536148A CN110536148A (en) 2019-12-03
CN110536148B true CN110536148B (en) 2022-05-06

Family

ID=68664011

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910775817.9A Active CN110536148B (en) 2019-08-21 2019-08-21 Live broadcasting method and equipment based on video networking

Country Status (1)

Country Link
CN (1) CN110536148B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111131059A (en) * 2019-12-04 2020-05-08 视联动力信息技术股份有限公司 Data transmission control method, device and computer readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106330967A (en) * 2016-10-24 2017-01-11 北京小米移动软件有限公司 Live broadcast data processing method and device
CN107396200A (en) * 2017-08-22 2017-11-24 深圳市中青合创传媒科技有限公司 Method for live video broadcast based on social software

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180054641A1 (en) * 2016-08-18 2018-02-22 Raymond L. Hall Method of Livestreaming an Audiovisual Audition
CN109587432A (en) * 2018-10-29 2019-04-05 视联动力信息技术股份有限公司 Conference speech terminal switching method and system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106330967A (en) * 2016-10-24 2017-01-11 北京小米移动软件有限公司 Live broadcast data processing method and device
CN107396200A (en) * 2017-08-22 2017-11-24 深圳市中青合创传媒科技有限公司 Method for live video broadcast based on social software

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Can Inke (映客) live stream on a computer? How to live stream with Inke on a computer; 阿北北; <http://www.downxia.com/zixun/22186.html>; 2017-05-05; full text *

Also Published As

Publication number Publication date
CN110536148A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN109167960B (en) Method and system for processing video stream data
CN111193788A (en) Audio and video stream load balancing method and device
CN109120879B (en) Video conference processing method and system
CN109547728B (en) Recorded broadcast source conference entering and conference recorded broadcast method and system
CN110022295B (en) Data transmission method and video networking system
CN110475090B (en) Conference control method and system
CN110049273B (en) Video networking-based conference recording method and transfer server
CN110191315B (en) Monitoring and checking method and device based on video network
CN109246135B (en) Method and system for acquiring streaming media data
CN109040656B (en) Video conference processing method and system
CN110113564B (en) Data acquisition method and video networking system
CN109743284B (en) Video processing method and system based on video network
CN109005378B (en) Video conference processing method and system
CN110769179B (en) Audio and video data stream processing method and system
CN110650147A (en) Data acquisition method and system
CN110769297A (en) Audio and video data processing method and system
CN110446058B (en) Video acquisition method, system, device and computer readable storage medium
CN110087147B (en) Audio and video stream transmission method and device
CN110072154B (en) Video networking-based clustering method and transfer server
CN109889516B (en) Method and device for establishing session channel
CN110677315A (en) Method and system for monitoring state
CN110536148B (en) Live broadcasting method and equipment based on video networking
CN110798450B (en) Audio and video data processing method and device and storage medium
CN110493311B (en) Service processing method and device
CN110113563B (en) Data processing method based on video network and video network server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant