CN114079795B - Network live broadcast static frame and mute fault detection method

Network live broadcast static frame and mute fault detection method

Info

Publication number
CN114079795B
Authority
CN
China
Prior art keywords
video
mute
audio
pid
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010834966.0A
Other languages
Chinese (zh)
Other versions
CN114079795A (en)
Inventor
吴雪波
黄荣谞
徐慧勇
翁昌清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dekscom Technologies Ltd
Original Assignee
Dekscom Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dekscom Technologies Ltd filed Critical Dekscom Technologies Ltd
Priority to CN202010834966.0A
Publication of CN114079795A
Application granted
Publication of CN114079795B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21: Server components or server architectures
    • H04N21/218: Source of audio or video content, e.g. local disk arrays
    • H04N21/2181: Source of audio or video content, e.g. local disk arrays comprising remotely distributed storage units, e.g. when movies are replicated over a plurality of video servers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80: Responding to QoS
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21: Server components or server architectures
    • H04N21/218: Source of audio or video content, e.g. local disk arrays
    • H04N21/2187: Live feed
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234: Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343: Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234327: Reformatting operations of video signals by decomposing into layers, e.g. base layer and one or more enhancement layers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/24: Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a method for detecting static frame and mute faults in network live broadcasting. Video quality monitoring equipment is deployed at the live program sources and CDN nodes, and IP video streams are fed into the equipment either by active probe pulling or by switch port mirroring. Packet capture and protocol analysis of the live media streams yield the network-layer and code-stream-layer alarm indicators associated with static frames and silence; audio/video decoding analysis of the media streams yields the content-layer alarm indicators. Accurate static frame and mute alarms are obtained by correlating the network-layer, code-stream-layer and content-layer indicators, and a live program feature library built from the broadcast attributes of specific programs filters out static frame and mute alarms that are not caused by broadcast faults. The method can effectively detect live static frame and mute faults caused by failures of the broadcast control platform, CDN servers and the network, while avoiding false alarms.

Description

Network live broadcast static frame and mute fault detection method
Technical Field
The invention belongs to the technical field of communication detection, relates to a fault detection method, and particularly relates to a network live broadcast static frame and mute fault detection method.
Background
In recent years, with the nationwide roll-out of "triple play" network convergence and the rapid development of internet video services in China, IPTV and network live broadcast services and traffic have grown at a remarkable pace. For paid live video services, consumers are no longer satisfied with the "best effort" quality of free internet video in the past. To win markets and users in the fierce video service competition, network operators and live platform companies pay ever more attention to live service quality assurance in order to improve their competitiveness. To manage and guarantee live service quality effectively, maintenance staff must not only handle severe faults reported through user complaints quickly, but also proactively perceive live fault phenomena that are not complained about yet degrade user experience (such as static frames, silence, screen corruption, black screens and audio-video desynchronization), so that live platform and network maintenance and optimization can be carried out in a more targeted manner, faults can be prevented before they occur, and user churn can be avoided.
According to current IPTV and internet video user behavior big data analysis reports, network live broadcasting remains the service with the largest number of viewers and the highest usage frequency. Because most TV users are accustomed to the viewing quality of live services delivered by traditional broadcast digital TV over dedicated channels, they hold similar or even higher expectations for the quality of network live video delivered over an unreliable IP network, which places extremely high demands on operators' live video quality assurance. In network live service assurance, besides the screen corruption and mosaic problems caused by network packet loss, static pictures and silence are among the important factors affecting user experience. The live static frame and silence problem is influenced by many factors, including the following:
(1) Static frames or silence caused by the nature of the live program content itself. For example, distance-education lectures showing PPT slides and important announcements in news broadcasts usually present still text pictures while the teacher's or announcer's voice continues; some performance programs (such as mime shows) or silent films play their pictures normally but carry no sound.
(2) During recording or transcoding, faults of related equipment at the live program source cause various problems, including: static frames or silence appearing at the content layer; loss of the video PID or audio PID at the MPEG2-TS code stream layer; or a large number of null packets stuffed into the audio/video PIDs.
(3) At the output of a video server or on the network transmission link, a server failure or a network outage interrupts the live stream, which also produces static frame and silence phenomena in the live video.
The multi-picture analyzer is currently a common tool in the industry for monitoring network live channels and detecting faults. It is usually deployed at the head end of the video program source; multiple live signals are fed into the analyzer as IP video streams, the video streams are decoded, and the decoded programs are presented on a large screen in a configured mosaic layout. Because the multi-picture analyzer decodes and restores the video images frame by frame, various abnormal picture phenomena (such as static frames, blue screens, black screens and color bars) can be detected through image pattern recognition and analysis. Since video decoding consumes considerable CPU resources on x86 equipment, a single multi-picture analyzer can typically support simultaneous decoding, display and picture fault detection for only 32-100 standard-definition programs (or 8-25 high-definition programs). With the continuous growth in the number of network live channels and the move to high-definition resolutions, the performance of the multi-picture analyzer can no longer meet the requirements of network live quality operation and maintenance. In addition, because the multi-picture analyzer analyzes only the audio/video content layer of live programs, it cannot further distinguish whether static frame and silence problems are caused by content-layer, code-stream-layer or network-layer faults.
In view of this, there is an urgent need to design a new fault detection method so as to overcome at least some of the above-mentioned drawbacks of the existing fault detection methods.
Disclosure of Invention
The invention provides a method for detecting live broadcast static frames and mute faults, which can effectively detect live broadcast static frames and mute faults caused by faults of a broadcast control platform, a CDN server and a network and avoid false alarms.
In order to solve the technical problems, according to one aspect of the present invention, the following technical scheme is adopted:
a live-over-network static frame and silence fault detection method, the method comprising:
step S1, video quality monitoring equipment is deployed at the live program sources and the CDN nodes; the live media stream data are captured and protocol-analyzed, the audio and video content is decoded and analyzed, and the KPI indicators and alarms of the network layer, the code stream layer and the content layer are calculated;
step S2, the video quality monitoring equipment analyzes the video stream alarm indicators, including: video stream interruption, video PID loss Vpid, audio PID loss Apid, video TS null packet VTnull, audio TS null packet ATnull, video picture frame similarity Sd, and video volume Vs;
video stream interruption: the interval between adjacent data packets of the video stream is tracked and analyzed, and the video is judged to be interrupted if the interval between adjacent network data packets exceeds To;
video PID loss Vpid: the video PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the video PID exceeds To, the video PID is judged to be lost and Vpid=1 is recorded; otherwise Vpid=0 is recorded;
audio PID loss Apid: the audio PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the audio PID exceeds To, the audio PID is judged to be lost and Apid=1 is recorded; otherwise Apid=0 is recorded;
video TS null packet VTnull: the TS null packets in the video stream are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; a video TS null packet alarm is recorded when the video TS null packet rate exceeds N1;
audio TS null packet ATnull: the audio TS null packets in the video stream are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; an audio TS null packet alarm is recorded when the audio TS null packet rate exceeds N1;
video picture frame similarity Sd: the video content layer is decoded, one video picture frame is extracted every second, and the similarity Sd of adjacent picture frames is calculated; a static frame is judged when Sd > T1 and the duration Dt of the similar picture frames exceeds T2;
video volume Vs: the audio content layer is decoded to obtain the audio volume Vs; silence is judged when Vs < T3 and the duration Dt exceeds T4;
step S3, if the video stream interruption indicator is detected, the system raises static frame and mute alarms;
if audio PID loss Apid or video PID loss Vpid is detected, the PID loss judgment sub-flow is entered: if the audio PID is lost and the video PID is normal, the system raises a mute alarm; if the audio PID is normal and the video PID is lost, the system raises a static frame alarm; if both the audio PID and the video PID are lost, the system raises static frame and mute alarms;
if audio TS null packets or video TS null packets are detected, the TS null packet judgment sub-flow is entered: if the audio TS packets are null and the video TS packets are normal, the system raises a mute alarm; if the audio TS packets are normal and the video TS packets are null, the system raises a static frame alarm; if audio TS null packets and video TS null packets occur at the same time, the system raises static frame and mute alarms;
if the similarity Sd of adjacent video picture frames exceeds T1, the adjacent picture frames are judged to be identical; if the duration Dt of identical picture frames exceeds T2, the video is judged to be abnormally static and a static frame fault may have occurred, and SFtmp=1 is recorded; otherwise SFtmp=0 is recorded;
if the video volume Vs is below T3 and the duration Dt exceeds T4, the video is judged to be abnormally silent and a mute fault may have occurred, and SLtmp=1 is recorded; otherwise SLtmp=0 is recorded;
step S4, the live program special scene alarm suppression flow is entered; for special-picture live programs and special-sound live programs, a program feature library is established to record the corresponding channel names and program broadcast time periods;
step S5, when SFtmp=1 and SLtmp=1, i.e. the video picture is still and there is no sound, the system raises static frame and mute alarms; when SFtmp=1 and SLtmp=0, i.e. the video picture is still but sound is present, no fault is raised if the alarmed program is in the special-picture live channel list Ch1 and the alarm time falls within the feature library time period Time1, otherwise the system raises a static frame alarm; when SFtmp=0 and SLtmp=1, i.e. the video picture is normal but there is no sound, no fault is raised if the alarmed program is in the special-sound live channel list Ch2 and the alarm time falls within the feature library time period Time2, otherwise the system raises a mute alarm; when SFtmp=0 and SLtmp=0, i.e. both the picture and the sound are normal, the video playout is judged to be normal, and the live program special scene alarm suppression flow ends.
As an embodiment of the present invention, the default value of To is 1000ms.
As an embodiment of the present invention, the default value of N1 is 99%.
As an embodiment of the present invention, an audio/video TS null packet is a TS packet whose PID is 0x1FFF.
As an embodiment of the present invention, T1 has a default value of 99.99% and T2 has a default value of 5 seconds.
As one embodiment of the present invention, T3 has a default value of-50 dB and T4 has a default value of 20 seconds.
The invention has the following beneficial effects: the live broadcast static frame and mute fault detection method can effectively detect live static frame and mute faults caused by failures of the broadcast control platform, the CDN server and the network, and avoids false alarms.
Drawings
Fig. 1 is a flowchart of a network live broadcast static frame and mute fault detection method according to an embodiment of the present invention.
Fig. 2 is a flowchart of the PID loss judgment sub-flow in an embodiment of the invention.
Fig. 3 is a flowchart of the TS null packet judgment sub-flow in an embodiment of the present invention.
Fig. 4 is a flowchart of the special program scene judgment sub-flow according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
For a further understanding of the present invention, preferred embodiments of the invention are described below in conjunction with the examples, but it should be understood that these descriptions are merely intended to illustrate further features and advantages of the invention, and are not limiting of the claims of the invention.
The description of this section is intended to be illustrative of only a few exemplary embodiments and the invention is not to be limited in scope by the description of the embodiments. It is also within the scope of the description and claims of the invention to interchange some of the technical features of the embodiments with other technical features of the same or similar prior art.
The invention discloses a method for detecting network live broadcast static frame and mute faults; Fig. 1 is a flowchart of the method in an embodiment of the invention. Referring to Fig. 1, the method includes:
step S1, video quality monitoring equipment is deployed at the live program sources and the CDN nodes; the live media stream data are captured and protocol-analyzed, the audio and video content is decoded and analyzed, and the KPI indicators and alarms of the network layer, the code stream layer and the content layer are calculated;
step S2, the video quality monitoring equipment analyzes the video stream alarm indicators, including: video stream interruption, video PID loss Vpid, audio PID loss Apid, video TS null packet VTnull, audio TS null packet ATnull, video picture frame similarity Sd, and video volume Vs;
video stream interruption: the interval between adjacent data packets of the video stream is tracked and analyzed; if the interval between adjacent network data packets exceeds To (default value 1000 ms), the video is judged to be interrupted;
video PID loss Vpid: the video PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the video PID exceeds To (default value 1000 ms), the video PID is judged to be lost and Vpid=1 is recorded; otherwise Vpid=0 is recorded;
audio PID loss Apid: the audio PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the audio PID exceeds To (default value 1000 ms), the audio PID is judged to be lost and Apid=1 is recorded; otherwise Apid=0 is recorded;
video TS null packet VTnull: the TS null packets in the video stream (i.e. TS packets with PID 0x1FFF) are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; a video TS null packet alarm is recorded when the video TS null packet rate exceeds N1 (default value 99%);
audio TS null packet ATnull: the audio TS null packets in the video stream (i.e. TS packets with PID 0x1FFF) are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; an audio TS null packet alarm is recorded when the audio TS null packet rate exceeds N1 (default value 99%);
video picture frame similarity Sd: the video content layer is decoded, one video picture frame is extracted every second, and the similarity Sd of adjacent picture frames is calculated; a static frame is judged when Sd > T1 (default value 99.99%) and the duration Dt of the similar picture frames exceeds T2 (default value 5 seconds);
video volume Vs: the audio content layer is decoded to obtain the audio volume Vs; silence is judged when Vs < T3 (default value -50 dB) and the duration Dt exceeds T4 (default value 20 seconds);
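As a purely illustrative sketch of how the step S2 code-stream-layer indicators can be computed (the patent does not prescribe an implementation), the following Python fragment assumes the captured TS packets have already been parsed into (arrival_time_ms, pid) tuples; the function names and the parsing stage itself are assumptions, not part of the claimed method.

TO_MS = 1000          # To: maximum allowed gap between adjacent packets (default 1000 ms)
N1 = 0.99             # N1: TS null packet rate threshold (default 99%)
NULL_PID = 0x1FFF     # PID reserved for TS null (stuffing) packets

def detect_interruption(arrival_times_ms):
    """Video stream interruption: any gap between adjacent packets exceeds To."""
    gaps = (b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:]))
    return any(gap > TO_MS for gap in gaps)

def pid_loss_flag(packets, target_pid):
    """Vpid / Apid: 1 if the gap between adjacent TS packets of one PID exceeds To."""
    times = [t for t, pid in packets if pid == target_pid]
    if not times:
        return 1                      # PID never observed: treat as lost
    return 1 if detect_interruption(times) else 0

def null_packet_rate(packets):
    """Null: ratio of TS null packets (PID 0x1FFF) to the total number of TS packets."""
    nulls = sum(1 for _, pid in packets if pid == NULL_PID)
    return nulls / len(packets) if packets else 0.0

# VTnull / ATnull are raised when the null packet rate exceeds N1.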
step S3, if the video cutoff output index is detected, the system prompts a mute frame and mute alarms;
FIG. 2 is a flow chart of a PID losing sub-process according to an embodiment of the invention; referring to fig. 2, in an embodiment of the present invention, if an audio PID loss Apid or a video PID loss Vpid is detected, a PID loss determination sub-flow is entered; if the audio PID is lost and the video PID is normal (namely Apid=1 and Vpid=0), the system prompts a mute alarm; if the audio PID is normal and the video PID is lost (namely Apid=0 and Vpid=1), the system prompts a static frame alarm; if the audio PID is lost and the video PID is lost (Apid=1, vpid=1), the system prompts a mute frame and mute alarm;
FIG. 3 is a flow chart of a TS empty packet sub-process according to an embodiment of the present invention; referring to fig. 3, in an embodiment of the present invention, if an audio TS null or a video TS null is detected, entering a TS null judgment sub-flow; if the audio TS is empty and the video TS is normal (ATNull=1, VTNull=0), the system prompts a mute alarm; if the audio TS packet is normal and the video TS packet is empty (ATNull=0, VTNull=1), the system prompts a static frame alarm; if the audio TS null packet and the video TS null packet simultaneously appear (ATnull=1, VTnull=1), the system prompts a mute frame and a mute alarm;
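The two sub-flows of Figs. 2 and 3 share the same correlation pattern; the sketch below expresses it as a small Python helper (an illustration only; the alarm strings and the function name are not taken from the patent).

def correlate(audio_fault, video_fault):
    """Map the (audio, video) fault flags of a sub-flow to an alarm type."""
    if audio_fault and video_fault:
        return "static frame + mute"
    if audio_fault:
        return "mute"
    if video_fault:
        return "static frame"
    return None                      # no code-stream-layer alarm

# PID loss sub-flow (Fig. 2):        correlate(Apid, Vpid)
# TS null packet sub-flow (Fig. 3):  correlate(ATnull, VTnull)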
if the similarity Sd of adjacent video picture frames exceeds T1 (default value 99.99%), the adjacent picture frames are judged to be identical; if the duration Dt of identical picture frames exceeds T2 (default value 5 seconds), the video is judged to be abnormally static and a static frame fault may have occurred, and SFtmp=1 is recorded; otherwise SFtmp=0 is recorded;
if the video volume Vs is below T3 (default value -50 dB) and the duration Dt exceeds T4 (default value 20 seconds), the video is judged to be abnormally silent and a mute fault may have occurred, and SLtmp=1 is recorded; otherwise SLtmp=0 is recorded;
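For the content-layer indicators, the patent specifies the thresholds but not the similarity or loudness metric; the sketch below uses a normalized mean-absolute-difference similarity on grayscale frames and an RMS level in dBFS purely as stand-ins, with frames and audio supplied as NumPy arrays sampled once per second.

import numpy as np

T1, T2 = 0.9999, 5     # similarity threshold and still-picture duration (seconds)
T3, T4 = -50.0, 20     # volume threshold (dB) and silence duration (seconds)

def frame_similarity(prev_frame, cur_frame):
    """Sd as a normalized mean-absolute-difference similarity (stand-in metric)."""
    diff = np.abs(prev_frame.astype(np.float64) - cur_frame.astype(np.float64))
    return 1.0 - diff.mean() / 255.0

def volume_db(samples):
    """Vs as the RMS level of float PCM samples in [-1, 1], expressed in dBFS."""
    rms = np.sqrt(np.mean(np.square(samples))) + 1e-12
    return 20.0 * np.log10(rms)

def longest_run(flags):
    """Length of the longest consecutive run of True values."""
    best = cur = 0
    for flag in flags:
        cur = cur + 1 if flag else 0
        best = max(best, cur)
    return best

def content_layer_flags(frames_per_sec, audio_per_sec):
    """Return (SFtmp, SLtmp) from per-second picture frames and audio chunks."""
    still = [frame_similarity(a, b) > T1
             for a, b in zip(frames_per_sec, frames_per_sec[1:])]
    silent = [volume_db(chunk) < T3 for chunk in audio_per_sec]
    sf_tmp = 1 if longest_run(still) > T2 else 0
    sl_tmp = 1 if longest_run(silent) > T4 else 0
    return sf_tmp, sl_tmp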
step S4, the live program special scene alarm suppression flow is entered; for special-picture live programs (such as news simulcasts, weather forecasts and distance education) and special-sound live programs (such as mime shows and silent dramas), a program feature library is established to record the corresponding channel names and program broadcast time periods;
step S5, Fig. 4 is a flowchart of the special program scene judgment sub-flow in an embodiment of the invention; referring to Fig. 4, when SFtmp=1 and SLtmp=1 (i.e. the video picture is still and there is no sound), the system raises a static frame and mute fault; when SFtmp=1 and SLtmp=0 (i.e. the video picture is still but sound is present), no fault is raised if the alarmed program is in the special-picture live channel list Ch1 and the alarm time falls within the feature library time period Time1, otherwise the system raises a static frame alarm; when SFtmp=0 and SLtmp=1 (i.e. the video picture is normal but there is no sound), no fault is raised if the alarmed program is in the special-sound live channel list Ch2 and the alarm time falls within the feature library time period Time2, otherwise the system raises a mute alarm; when SFtmp=0 and SLtmp=0 (i.e. both the picture and the sound are normal), the video playout is judged to be normal, and the live program special scene alarm suppression flow ends.
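A minimal sketch of the step S4/S5 suppression logic follows, assuming the feature library is held as in-memory dictionaries keyed by channel name; the channel names and broadcast windows shown are invented examples, not entries from any real feature library.

from datetime import time

# Hypothetical feature library: Ch1 lists special-picture programs, Ch2 lists
# special-sound programs, each with broadcast windows (Time1 / Time2).
CH1 = {"distance-education": [(time(8, 0), time(12, 0))]}
CH2 = {"mime-theatre": [(time(20, 0), time(22, 0))]}

def in_window(now, windows):
    """True if the alarm time falls inside any recorded broadcast window."""
    return any(start <= now <= end for start, end in windows)

def final_alarm(sf_tmp, sl_tmp, channel, now):
    """Step S5 decision with the step S4 special-scene suppression applied."""
    if sf_tmp and sl_tmp:
        return "static frame + mute"
    if sf_tmp:                                    # still picture, sound present
        return None if in_window(now, CH1.get(channel, [])) else "static frame"
    if sl_tmp:                                    # normal picture, no sound
        return None if in_window(now, CH2.get(channel, [])) else "mute"
    return None                                   # normal playout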
In one embodiment of the present invention, the method of the invention comprises: video quality monitoring equipment is deployed at the live program sources and CDN nodes, and IP video streams are fed into the equipment either by active probe pulling or by switch port mirroring; packet capture and protocol analysis of the live media streams yield the network-layer and code-stream-layer alarm indicators that cause static frames and silence (including stream interruption, audio/video PID loss and audio/video TS null packet stuffing); audio/video decoding analysis of the media streams yields the content-layer alarm indicators; accurate static frame and mute alarms are obtained by correlating the network-layer, code-stream-layer and content-layer indicators; and, according to the broadcast attributes of specific live programs, static frame and mute alarms not caused by broadcast faults are filtered out by means of a live program feature library. Through this comprehensive processing flow, network live static frame and mute faults caused by failures of the broadcast control platform, CDN servers and the network can be detected effectively while false alarms are avoided.
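Tying the above fragments together, a possible end-to-end evaluation of one monitored channel could look like the sketch below. It reuses the helpers sketched earlier, mirrors the order of Fig. 1 rather than reproducing any shipped implementation, and collapses the null-packet check into a single stream-level rate, whereas the patent distinguishes audio and video null packet rates.

def evaluate_channel(channel, now, packets, frames_per_sec, audio_per_sec,
                     video_pid, audio_pid):
    """One monitoring pass over a channel, following the order of Fig. 1."""
    arrival_times = [t for t, _ in packets]
    if detect_interruption(arrival_times):        # step S3: stream cut-off
        return "static frame + mute"
    alarm = correlate(pid_loss_flag(packets, audio_pid),
                      pid_loss_flag(packets, video_pid))
    if alarm:                                     # PID loss sub-flow (Fig. 2)
        return alarm
    if null_packet_rate(packets) > N1:            # TS null packet sub-flow (Fig. 3)
        return "static frame + mute"
    sf_tmp, sl_tmp = content_layer_flags(frames_per_sec, audio_per_sec)
    return final_alarm(sf_tmp, sl_tmp, channel, now)   # steps S4 and S5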
In summary, the live-broadcast static frame and mute fault detection method provided by the invention can effectively detect live-broadcast static frame and mute faults caused by faults of the broadcast control platform, the CDN server and the network, and avoid false alarms.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered to fall within the scope of this description.
The description and applications of the present invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Effects or advantages referred to in the embodiments may not be embodied in the embodiments due to interference of various factors, and description of the effects or advantages is not intended to limit the embodiments. Variations and modifications of the embodiments disclosed herein are possible, and alternatives and equivalents of the various components of the embodiments are known to those of ordinary skill in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other assemblies, materials, and components, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.

Claims (6)

1. A method for detecting live-broadcast static frames and mute faults, the method comprising:
step S1, deploying video quality monitoring equipment at the live program sources and the CDN nodes, capturing and protocol-analyzing the live media stream data, decoding and analyzing the audio and video content, and calculating the KPI indicators and alarms of the network layer, the code stream layer and the content layer;
step S2, the video quality monitoring equipment analyzes the video stream alarm indicators, including: video stream interruption, video PID loss Vpid, audio PID loss Apid, video TS null packet VTnull, audio TS null packet ATnull, video picture frame similarity Sd, and video volume Vs;
video stream interruption: the interval between adjacent data packets of the video stream is tracked and analyzed, and the video is judged to be interrupted if the interval between adjacent network data packets exceeds To;
video PID loss Vpid: the video PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the video PID exceeds To, the video PID is judged to be lost and Vpid=1 is recorded; otherwise Vpid=0 is recorded;
audio PID loss Apid: the audio PID (program identification number) of the MPEG2-TS layer of the video stream is tracked and analyzed; if the interval between adjacent TS packets of the audio PID exceeds To, the audio PID is judged to be lost and Apid=1 is recorded; otherwise Apid=0 is recorded;
video TS null packet VTnull: the TS null packets in the video stream are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; a video TS null packet alarm is recorded when the video TS null packet rate exceeds N1;
audio TS null packet ATnull: the audio TS null packets in the video stream are counted and the TS null packet rate Null, i.e. the ratio of the number of TS null packets to the total number of TS packets, is calculated; an audio TS null packet alarm is recorded when the audio TS null packet rate exceeds N1;
video picture frame similarity Sd: the video content layer is decoded, one video picture frame is extracted every second, and the similarity Sd of adjacent picture frames is calculated; a static frame is judged when Sd > T1 and the duration Dt of the similar picture frames exceeds T2;
video volume Vs: the audio content layer is decoded to obtain the audio volume Vs; silence is judged when Vs < T3 and the duration Dt exceeds T4;
step S3, if the video stream interruption indicator is detected, the system raises static frame and mute alarms;
if audio PID loss Apid or video PID loss Vpid is detected, the PID loss judgment sub-flow is entered: if the audio PID is lost and the video PID is normal, the system raises a mute alarm; if the audio PID is normal and the video PID is lost, the system raises a static frame alarm; if both the audio PID and the video PID are lost, the system raises static frame and mute alarms;
if audio TS null packets or video TS null packets are detected, the TS null packet judgment sub-flow is entered: if the audio TS packets are null and the video TS packets are normal, the system raises a mute alarm; if the audio TS packets are normal and the video TS packets are null, the system raises a static frame alarm; if audio TS null packets and video TS null packets occur at the same time, the system raises static frame and mute alarms;
if the similarity Sd of adjacent video picture frames exceeds T1, the adjacent picture frames are judged to be identical; if the duration Dt of identical picture frames exceeds T2, the video is judged to be abnormally static and a static frame fault may have occurred, and SFtmp=1 is recorded; otherwise SFtmp=0 is recorded;
if the video volume Vs is below T3 and the duration Dt exceeds T4, the video is judged to be abnormally silent and a mute fault may have occurred, and SLtmp=1 is recorded; otherwise SLtmp=0 is recorded;
step S4, the live program special scene alarm suppression flow is entered; for special-picture live programs and special-sound live programs, a program feature library is established to record the corresponding channel names and program broadcast time periods;
step S5, when SFtmp=1 and SLtmp=1, i.e. the video picture is still and there is no sound, the system raises static frame and mute alarms; when SFtmp=1 and SLtmp=0, i.e. the video picture is still but sound is present, no fault is raised if the alarmed program is in the special-picture live channel list Ch1 and the alarm time falls within the feature library time period Time1, otherwise the system raises a static frame alarm; when SFtmp=0 and SLtmp=1, i.e. the video picture is normal but there is no sound, no fault is raised if the alarmed program is in the special-sound live channel list Ch2 and the alarm time falls within the feature library time period Time2, otherwise the system raises a mute alarm; when SFtmp=0 and SLtmp=0, i.e. both the picture and the sound are normal, the video playout is judged to be normal, and the live program special scene alarm suppression flow ends.
2. The live-network static frame and mute fault detection method according to claim 1, wherein:
the default value for To is 1000ms.
3. The live-network static frame and mute fault detection method according to claim 1, wherein:
the default value for N1 is 99%.
4. The live-network static frame and mute fault detection method according to claim 1, wherein:
the audio/video TS empty packet PID is a TS packet of 0x1 FFF.
5. The live-network static frame and mute fault detection method according to claim 1, wherein:
the default value for T1 is 99.99% and the default value for T2 is 5 seconds.
6. The live-network static frame and mute fault detection method according to claim 1, wherein:
the default value for T3 is-50 dB and the default value for T4 is 20 seconds.
CN202010834966.0A 2020-08-19 2020-08-19 Network live broadcast static frame and mute fault detection method Active CN114079795B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010834966.0A CN114079795B (en) 2020-08-19 2020-08-19 Network live broadcast static frame and mute fault detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010834966.0A CN114079795B (en) 2020-08-19 2020-08-19 Network live broadcast static frame and mute fault detection method

Publications (2)

Publication Number Publication Date
CN114079795A CN114079795A (en) 2022-02-22
CN114079795B (en) 2023-09-15

Family

ID=80281793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010834966.0A Active CN114079795B (en) 2020-08-19 2020-08-19 Network live broadcast static frame and mute fault detection method

Country Status (1)

Country Link
CN (1) CN114079795B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106131668A (en) * 2016-06-30 2016-11-16 杭州当虹科技有限公司 A kind of audio-video monitoring warning system pushing alarm based on mobile device message
WO2017161998A1 (en) * 2016-03-24 2017-09-28 腾讯科技(深圳)有限公司 Video processing method and device and computer storage medium
CN108810524A (en) * 2017-05-05 2018-11-13 德科仕通信(上海)有限公司 The method of IPTV flopover phenomenons detection

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017161998A1 (en) * 2016-03-24 2017-09-28 腾讯科技(深圳)有限公司 Video processing method and device and computer storage medium
CN106131668A (en) * 2016-06-30 2016-11-16 杭州当虹科技有限公司 A kind of audio-video monitoring warning system pushing alarm based on mobile device message
CN108810524A (en) * 2017-05-05 2018-11-13 德科仕通信(上海)有限公司 The method of IPTV flopover phenomenons detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Brief analysis of the main functions and typical faults of the Display monitoring system; 陈功; Cable TV Technology (有线电视技术) (03); full text *
Optimization practice of a still-frame monitoring algorithm at a cable network company; 周清; Video Engineering (电视技术) (24); full text *

Also Published As

Publication number Publication date
CN114079795A (en) 2022-02-22

Similar Documents

Publication Publication Date Title
US11210704B2 (en) Monitoring and using telemetry data
US20220385717A1 (en) Unified end-to-end quality and latency measurement, optimization and management in multimedia communications
US20090089852A1 (en) Automated Multimedia Channel Error Reporting from Viewer Premises
CN102547475B (en) Method and system for improving quality of service alarming accuracy of Internet protocol (IP) video media stream
CN108810524B (en) IPTV picture fault phenomenon detection method
CN102651821B (en) Method and device for evaluating quality of video
US10306275B2 (en) Method and apparatus for managing video transport
US10560753B2 (en) Method and system for image alteration
US20130061278A1 (en) Method and System for Implementing Interaction between Set-Top Box (STB) and Home Gateway
US20150032883A1 (en) Method of identification of multimedia flows and corresponding appartus
CN102724562A (en) System and method for realizing multi-picture play processing of internet protocol television (IPTV) based on virtual set-top box
CN102761771A (en) Method and equipment for carrying out detection of inferior broadcasting of video on basis of image objective quality estimation
US8341663B2 (en) Facilitating real-time triggers in association with media streams
CN110061979B (en) Method and device for detecting business object
CN114079795B (en) Network live broadcast static frame and mute fault detection method
CN111210462A (en) Alarm method and device
EP2341680B1 (en) Method and apparatus for adaptation of a multimedia content
US20100146102A1 (en) Providing reports of received multimedia programs
CN105025308B (en) A kind of IP stream recording method based on fragment file
Wang et al. Begin with the end in mind: A unified end-to-end quality-of-experience monitoring, optimization and management framework
CN114915811A (en) Program pre-broadcasting detection system, method and device for IP broadcasting
CN108900831B (en) Flower screen event detecting method and its detection system
CN110677680A (en) Master and standby information source switching method based on decoder error detection
CN112954448A (en) Live broadcast content image feature code extraction method and live broadcast content consistency comparison method
EP2139191B1 (en) Method and device for processing data and system comprising such device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant