CN106034195A - Motion detection method based on grey relational analysis - Google Patents
Motion detection method based on grey relational analysis
- Publication number: CN106034195A (application number CN201510115097.5A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
- Classification: Image Analysis (AREA)
Abstract
A motion detection method based on grey relational analysis comprises the following steps: receiving a video frame that includes a plurality of input pixels; determining bit-rate changes of the input pixels by grey relational analysis, so as to establish a multi-quality background model; detecting moving objects through a two-stage, block-based and pixel-based detection procedure, so as to generate a binary motion mask; detecting luminance changes of the video frame through an entropy calculation, so as to update the background model at the proper time; and providing a setting interface through which a user sets a detection sensitivity, and evaluating false detections in the motion mask according to that sensitivity. With the multi-quality background model, moving objects can be detected correctly in a video stream having a variable bit rate; with the two-stage block- and pixel-based detection, complete and accurate motion detection can be achieved; by detecting luminance changes, the background model is updated in time so that the influence of luminance changes is eliminated; and with the adjustable detection sensitivity, accuracy can be increased and false detections reduced.
Description
Technical field
The present invention relates to the technical field of image processing, and particularly to a motion detection method for extracting information about moving objects from a video stream.
Background art
Motion detection is a crucial technology in automated video surveillance systems: it extracts information about moving objects from a video stream. Motion detection methods fall mainly into three types: temporal differencing, optical flow, and background subtraction. Temporal differencing is easy to implement and computationally cheap, but moving objects easily fracture internally, so it is hard to extract complete moving-object information. Optical flow can extract complete moving-object information and can be used for motion detection with a moving camera, but it is computationally intensive, ill-suited to real-time applications, and very sensitive to noise. Background subtraction is easy to implement; with a background model and a moderate amount of computation it can extract fairly complete moving-object information against a relatively static background, but it is quite sensitive to lighting changes. Background subtraction is therefore commonly used in motion detection applications, and many background subtraction methods have been developed, such as the Gaussian mixture model (GMM), sigma difference estimation (SDE), multiple SDE (MSDE), multiple temporal difference (MTD), and simple statistical difference (SSD) methods.
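Of the three families above, temporal differencing is the simplest to state. The following is a minimal sketch of the idea, not taken from the patent; the function name, array shapes, and threshold value of 25 are illustrative assumptions:

```python
import numpy as np

def temporal_difference_mask(prev_frame, curr_frame, threshold=25):
    """Temporal differencing: a pixel is marked as moving when its
    absolute intensity change between two consecutive frames exceeds
    a fixed threshold. The threshold of 25 is an arbitrary example."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# A static 4x4 background in which a 2x2 region suddenly brightens:
prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[1:3, 1:3] = 200
mask = temporal_difference_mask(prev, curr)
```

As the text notes, a moving object with a uniform interior produces differences mainly at its edges, which is why temporal differencing tends to fracture objects internally.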
With the development of video communication technology, wireless communication has become practical for motion detection applications, greatly extending the range over which moving objects can be detected. Unfortunately, wireless communication is limited by the bandwidth of the actual network, and network congestion or server crashes occur easily. Video streams therefore adopt video rate control techniques, for example the H.264/AVC video format, which use a variable bitrate (VBR) to adapt to actual network conditions. The existing background subtraction methods above can detect moving objects in a video stream with a fixed bit rate, because in such an ideal, stable environment it is easy to distinguish moving objects with a background model. However, since real networks seldom provide such an ideal, stable environment, in a video stream with a variable bit rate these existing methods are easily misled by sudden changes in the stream's bit rate, misjudge moving objects, and cannot detect them effectively.
For example, refer to Fig. 6, which is a measurement plot of the luminance value of the same pixel across a plurality of video frames in a video stream with a variable bit rate. Initially, during the 150th to 240th video frames, the stream is a high-bit-rate (high-quality) stream of 200 kbps, with a relatively high-fluctuation background signal B1. An existing background subtraction method builds its background model from this high-fluctuation background signal B1. When the video communication is hindered by network congestion, the rate control technique allocates the remaining network bandwidth, so during the 240th to 280th video frames the stream becomes a low-bit-rate (low-quality) stream of 5 kbps, with a relatively smooth background signal B2; in this example a moving object also passes by, producing a relatively high-fluctuation motion signal P1. If, when the motion signal P1 occurs, the background model has not yet been updated and is still based on the high-fluctuation background signal B1, the existing background subtraction method may misjudge the motion signal P1 as a background signal. After a while, the existing method may update its background model according to the relatively smooth background signal B2. However, when the video communication is no longer hindered, the stream reverts, during the 280th to 325th video frames, to the 200 kbps high-bit-rate (high-quality) stream, with a relatively high-fluctuation background signal B3. If, when the fluctuation of background signal B3 occurs, the background model has not yet been updated and is still based on the smooth background signal B2, the existing method may misjudge the fluctuation of B3 as a motion signal. Thus, whenever the bit rate of the video stream drops from high to low or rises from low to high, existing background subtraction methods may misjudge.
Summary of the invention
It is an object of the present invention to provide a motion detection method that can correctly detect moving objects in a video stream with a variable bit rate, achieve more complete and accurate motion detection, and eliminate the influence of luminance changes.
To achieve the above object, the present invention provides a motion detection method based on grey relational analysis, comprising:
S1) receiving a video frame, the video frame including a plurality of input pixels;
S2) establishing a multi-quality background model, comprising: S21) calculating the Euclidean distance between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels; S22) calculating, according to the Euclidean distances, the grey relational coefficients between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels; S23) judging whether the minimum of the grey relational coefficients is less than or equal to a first threshold; if so, judging that the bit rate of the input pixel has changed and adding the input pixel as a candidate background pixel; otherwise, judging that the bit rate of the input pixel has not changed;
S3) detecting moving objects, comprising: S31) dividing the video frame into a plurality of blocks and, for each block, accumulating the maximum of the grey relational coefficients between the pixel value of each input pixel in the block and the pixel values of the corresponding plural candidate background pixels, to produce a grey relational coefficient sum for the block; S32) judging whether the grey relational coefficient sum is greater than or equal to a second threshold; if so, judging the block to be a background block; otherwise, judging the block to be a moving block; S33) for each moving block, judging whether the maximum of the grey relational coefficients between the pixel value of each input pixel in the moving block and the pixel values of the corresponding plural candidate background pixels is greater than or equal to a third threshold; if so, judging the input pixel to be a background pixel; otherwise, judging the input pixel to be a moving pixel; S34) producing a binary motion mask;
S4) detecting a luminance change of the video frame, comprising: S41) calculating the entropy of the grey relational coefficient sum of each block and accumulating the entropies to produce an entropy sum for the video frame; S42) judging whether the difference between the entropy sums of the video frame and the previous video frame is greater than or equal to a fourth threshold; if so, judging that the video frame has a luminance change and updating the candidate background pixels corresponding to each input pixel according to the video frame; otherwise, judging that the video frame has no luminance change;
S5) evaluating false detections in the motion mask, comprising: S51) providing a setting interface for the user to set a detection sensitivity; S52) dividing the total number of moving pixels and background pixels in the motion mask by the area of the motion mask, to produce an evaluation value; S53) judging whether the evaluation value is greater than the product of a fifth threshold and the detection sensitivity, the fifth threshold being the estimated number of predicted true positive pixels divided by the area of the motion mask; if so, judging that a false detection exists; otherwise, judging that no false detection exists.
In one embodiment of the invention, the candidate background pixel to be updated for an input pixel is the candidate background pixel corresponding to the maximum among the Euclidean distances between the pixel value of the input pixel and the pixel values of the corresponding plural candidate background pixels.
In one embodiment of the invention, the pixel values of the input pixels and the candidate background pixels may each include a luminance value, a blue-difference chroma value, and a red-difference chroma value.
In one embodiment of the invention, the luminance value, blue-difference chroma value, and red-difference chroma value may each be represented with 8 bits.
In one embodiment of the invention, the first threshold may be set to 0.6.
In one embodiment of the invention, the second threshold may be set to 245.
In one embodiment of the invention, the third threshold may be set to 0.6.
In one embodiment of the invention, the fourth threshold may be set to 0.05.
In one embodiment of the invention, the estimated number of predicted true positive pixels may be set to 30 × 30, and the detection sensitivity may be set by the user to a value from 0 to 10.
In one embodiment of the invention, the plurality of blocks may be 16 × 16 blocks.
The technical means described in one of the above embodiments can be applied in another of the above embodiments to obtain a new embodiment, as long as the technical means do not conflict with one another.
By judging bit-rate changes of the input pixels with grey relational analysis so as to establish a multi-quality background model, the present invention can correctly detect moving objects in a video stream with a variable bit rate. Further, by detecting moving objects through the two-stage, block- and pixel-based detection step so as to produce a binary motion mask, more complete and accurate motion detection can be achieved. By detecting luminance changes of the video frame through the entropy calculation and updating the background model in time, the influence of luminance changes can be eliminated. And by providing a setting interface through which the user sets the detection sensitivity according to the demands of the environment, and evaluating false detections in the motion mask with this adjustable sensitivity, accuracy can be increased and false detections reduced.
Brief description of the drawings
The present invention is described in further detail below with reference to the accompanying drawings and the detailed description.
Fig. 1 is a flowchart of a motion detection method based on grey relational analysis according to an embodiment of the invention.
Fig. 2 is a flowchart of the step of establishing a multi-quality background model according to an embodiment of the invention.
Fig. 3 is a flowchart of the two-stage, block- and pixel-based detection step according to an embodiment of the invention.
Fig. 4 is a flowchart of the step of detecting luminance changes of the video frame according to an embodiment of the invention.
Fig. 5 is a flowchart of the step of evaluating false detections in the motion mask according to an embodiment of the invention.
Fig. 6 is a measurement plot of the luminance value of the same pixel across a plurality of video frames in a video stream with a variable bit rate.
Description of symbols:
B1, B2, B3: background signals
P1: motion signal
S1: receive a video frame including a plurality of input pixels
S2: judge bit-rate changes of the input pixels by grey relational analysis, to establish a multi-quality background model
S3: detect moving objects through the two-stage, block- and pixel-based detection step, to produce a binary motion mask
S4: detect luminance changes of the video frame by entropy calculation, to update the background model in time
S5: provide a setting interface for the user to set a detection sensitivity, and evaluate false detections in the motion mask accordingly
S21: calculate the Euclidean distance between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels
S22: calculate, according to the Euclidean distances, the grey relational coefficients between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels
S23: is the minimum of the grey relational coefficients less than or equal to the first threshold?
S231: the bit rate of the input pixel has changed; add the input pixel as a candidate background pixel
S232: the bit rate of the input pixel has not changed
S31: divide the video frame into a plurality of blocks; for each block, accumulate the maximum of the grey relational coefficients between the pixel value of each input pixel in the block and the pixel values of the corresponding plural candidate background pixels, to produce the block's grey relational coefficient sum
S32: is the grey relational coefficient sum greater than or equal to the second threshold?
S321: the block is a background block
S322: the block is a moving block
S33: in each moving block, is the maximum of the grey relational coefficients between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels greater than or equal to the third threshold?
S331: the input pixel is a background pixel
S332: the input pixel is a moving pixel
S34: produce a binary motion mask
S41: calculate the entropy of each block's grey relational coefficient sum, and accumulate to produce the entropy sum of the video frame
S42: is the difference between the entropy sums of the video frame and the previous video frame greater than or equal to the fourth threshold?
S421: the video frame has a luminance change; update the candidate background pixels corresponding to each input pixel according to the video frame
S422: the video frame has no luminance change
S51: provide a setting interface for the user to set a detection sensitivity
S52: divide the total number of moving pixels and background pixels in the motion mask by the area of the motion mask, to produce an evaluation value
S53: is the evaluation value greater than the product of the fifth threshold and the detection sensitivity, the fifth threshold being the estimated number of predicted true positive pixels divided by the area of the motion mask?
S531: a false detection exists
S532: no false detection exists.
Detailed description of the invention
Refer to Fig. 1, which is a flowchart of a motion detection method based on grey relational analysis according to an embodiment of the invention. In step S1, a video stream is received; the video stream includes a plurality of video frames, and each video frame includes a plurality of input pixels. For ease of reference, the t-th video frame is denoted I_t, and the input pixel at position (x, y) in the t-th video frame is denoted p_t(x, y). In the present embodiment the YCbCr color space is used, so the color (pixel value) of each input pixel p_t(x, y) is represented by three variables: a luminance value (Y), a blue-difference chroma value (Cb), and a red-difference chroma value (Cr). Each of Y, Cb, and Cr is represented with 8 bits, i.e. its value lies in 0~255.
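For readers working from RGB sources, the 8-bit YCbCr representation used here can be obtained with the standard full-range BT.601 conversion. This is an illustration only; the patent does not specify which RGB-to-YCbCr matrix is used:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range BT.601 RGB -> YCbCr with 8-bit components, matching
    the 0~255 value range for Y, Cb and Cr stated in the text."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    y  =         0.299    * r + 0.587    * g + 0.114    * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128.0 + 0.5      * r - 0.418688 * g - 0.081312 * b
    ycbcr = np.stack([y, cb, cr], axis=-1)
    return np.clip(ycbcr.round(), 0, 255).astype(np.uint8)

# A mid-gray RGB pixel maps to (Y, Cb, Cr) = (128, 128, 128):
gray = rgb_to_ycbcr(np.array([[[128, 128, 128]]], dtype=np.uint8))
```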
In step S2, bit-rate changes of the input pixels are judged by grey relational analysis in order to establish a multi-quality background model. In the present embodiment, as shown in Fig. 2, which is a flowchart of the step of establishing a multi-quality background model according to an embodiment of the invention, step S21 calculates the Euclidean distances Δ(1)~Δ(M) between the pixel value of each input pixel p_t(x, y) and the pixel values of the corresponding plural candidate background pixels B(x, y)_1~B(x, y)_M, expressed by the formula:

Δ(k) = ‖p_t(x, y) − B(x, y)_k‖

where k is any integer from 1 to M. Note that, since the YCbCr color space is used, the color (pixel value) of each candidate background pixel B(x, y)_k is likewise represented by the three variables Y, Cb, and Cr. The larger the Euclidean distance Δ(k), the larger the difference between the input pixel p_t(x, y) and the candidate background pixel B(x, y)_k. This difference is quantified below.
In step S22, the grey relational coefficients γ(p_t(x, y), B(x, y)_1)~γ(p_t(x, y), B(x, y)_M) between the pixel value of each input pixel p_t(x, y) and the pixel values of the corresponding plural candidate background pixels B(x, y)_1~B(x, y)_M are calculated from the Euclidean distances Δ(k), expressed by the formula:

γ(p_t(x, y), B(x, y)_k) = (Δ_min + ξ·Δ_max) / (Δ(k) + ξ·Δ_max)

where k is any integer from 1 to M, Δ_min denotes the minimum difference, and Δ_max denotes the maximum difference. Since Y, Cb, and Cr are each represented with 8 bits, i.e. values in 0~255, Δ_min and Δ_max can be set to 0 and 255. Further, ξ denotes the distinguishing coefficient, which lies between 0 and 1; in the present embodiment ξ is set to 0.2. The smaller the grey relational coefficient γ(p_t(x, y), B(x, y)_k), the weaker the relation between the input pixel p_t(x, y) and the candidate background pixel B(x, y)_k, i.e. the larger their difference.
In step S23, it is judged whether the minimum of the grey relational coefficients γ(p_t(x, y), B(x, y)_1)~γ(p_t(x, y), B(x, y)_M) is less than or equal to a first threshold ε. If so, the input pixel p_t(x, y) has no relation to the candidate background pixels B(x, y)_1~B(x, y)_M, and step S231 judges that the bit rate of the input pixel p_t(x, y) has changed, sets its bit-rate-change judgment value V_k to 1, and may add the input pixel p_t(x, y) as a candidate background pixel, thereby building up the multi-quality background model. If not, the input pixel p_t(x, y) has a strong relation to some candidate background pixel, and step S232 judges that the bit rate of the input pixel p_t(x, y) has not changed and sets its bit-rate-change judgment value V_k to 0. The above can be expressed by the formula:

V_k = 1 if min_k γ(p_t(x, y), B(x, y)_k) ≤ ε, and V_k = 0 otherwise

where k is any integer from 1 to M, and the first threshold ε can be set to 0.6.
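Steps S21~S23 can be sketched in NumPy as follows. This is a minimal illustration under the formulas above: the function names are mine, the coefficient formula is the standard grey relational coefficient consistent with the description, and, following step S23, it is the minimum coefficient that is compared with ε:

```python
import numpy as np

def grey_relational_coeffs(p, candidates, xi=0.2, d_min=0.0, d_max=255.0):
    """Steps S21-S22: Euclidean distances Delta(k) between the input
    pixel p = (Y, Cb, Cr) and its M candidate background pixels, turned
    into grey relational coefficients
        gamma_k = (d_min + xi * d_max) / (Delta(k) + xi * d_max),
    with d_min = 0, d_max = 255 and xi = 0.2 as in the embodiment."""
    deltas = np.linalg.norm(candidates - p, axis=1)
    return (d_min + xi * d_max) / (deltas + xi * d_max)

def bitrate_changed(coeffs, epsilon=0.6):
    """Step S23: the bit rate of the input pixel is judged to have
    changed when the minimum coefficient is <= the first threshold."""
    return bool(coeffs.min() <= epsilon)

p = np.array([100.0, 100.0, 100.0])            # an input pixel (Y, Cb, Cr)
near = np.array([[100.0, 100.0, 100.0],
                 [110.0, 100.0, 100.0]])       # two close candidates
far = np.array([[200.0, 200.0, 200.0]])        # one distant candidate
```

With the distant candidate, the coefficient drops below ε = 0.6 and the pixel would be appended as a new candidate background pixel; this is how the multi-quality model accumulates one candidate per quality level it encounters.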
In step S3, moving objects are detected through a two-stage, block- and pixel-based detection step, to produce a binary motion mask. In the present embodiment, as shown in Fig. 3, which is a flowchart of the two-stage, block- and pixel-based detection step according to an embodiment of the invention, the block-based detection stage includes steps S31, S32, S321, and S322. In step S31, the video frame I_t is divided into a plurality of blocks M(i, j); in the present embodiment, following the macroblock type supported by H.264 video coding, the video frame I_t is partitioned into 16 × 16 blocks, so i is any integer from 1 to 16 and j is any integer from 1 to 16. For each block M(i, j), the maximum of the grey relational coefficients γ(p_t(x, y), B(x, y)_1)~γ(p_t(x, y), B(x, y)_M) between the pixel value of each input pixel p_t(x, y) in the block and the pixel values of its corresponding plural candidate background pixels B(x, y)_1~B(x, y)_M is accumulated, to produce the block's grey relational coefficient sum S(i, j), expressed by the formula:

S(i, j) = Σ_{p_t(x, y) ∈ M(i, j)} max_k γ(p_t(x, y), B(x, y)_k)

where k is any integer from 1 to M.
In step S32, it is judged whether the grey relational coefficient sum S(i, j) is greater than or equal to a second threshold α. If so, block M(i, j) is not a moving block, and step S321 judges block M(i, j) to be a background block and sets its moving-block judgment value R(i, j) to 0. If not, many input pixels p_t(x, y) in block M(i, j) belong to a moving object, and step S322 judges block M(i, j) to be a moving block and sets its moving-block judgment value R(i, j) to 1. The above can be expressed by the formula:

R(i, j) = 0 if S(i, j) ≥ α, and R(i, j) = 1 otherwise

where the second threshold α can be set to 245.
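The block stage S31~S32 can be sketched as follows. This is an illustration under the definitions above: gamma_max is assumed to be a per-pixel array holding the maximum grey relational coefficient over the M candidates, and the block size and α follow the embodiment:

```python
import numpy as np

def block_stage(gamma_max, block=16, alpha=245.0):
    """Steps S31-S32: sum, inside every 16x16 block, the per-pixel
    maxima of the grey relational coefficients to get S(i, j), then
    mark blocks with S(i, j) < alpha as moving blocks (R(i, j) = 1)."""
    h, w = gamma_max.shape
    nby, nbx = h // block, w // block
    S = (gamma_max[:nby * block, :nbx * block]
         .reshape(nby, block, nbx, block).sum(axis=(1, 3)))
    R = (S < alpha).astype(np.uint8)
    return S, R

# Left block matches the background poorly, right block matches well:
gm = np.ones((16, 32))
gm[:, :16] = 0.5
S, R = block_stage(gm)   # S = [[128., 256.]], R = [[1, 0]]
```

A well-matched 16 × 16 block of perfect coefficients sums to 256, so α = 245 tolerates a few poorly matching pixels per block before the block is declared moving.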
The pixel-based detection stage includes steps S33, S331, S332, and S34. In step S33, for each moving block, it is judged whether the maximum of the grey relational coefficients γ(p_t(x, y), B(x, y)_1)~γ(p_t(x, y), B(x, y)_M) between the pixel value of each input pixel p_t(x, y) in the moving block and the pixel values of its corresponding plural candidate background pixels B(x, y)_1~B(x, y)_M is greater than or equal to a third threshold β. If so, the input pixel p_t(x, y) has a strong relation to some candidate background pixel, and step S331 judges the input pixel p_t(x, y) to be a background pixel. If not, the input pixel p_t(x, y) has no relation to the candidate background pixels B(x, y)_1~B(x, y)_M, and step S332 judges the input pixel p_t(x, y) to be a moving pixel. Then, in step S34, the binary motion mask BM is produced, expressed by the formula:

BM(x, y) = 0 if max_k γ(p_t(x, y), B(x, y)_k) ≥ β, and BM(x, y) = 1 otherwise

where k is any integer from 1 to M, and the third threshold β can be set to 0.6.
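The pixel stage S33~S34 then refines only the moving blocks. A sketch under the same assumed gamma_max array; np.kron simply expands the block map R back to pixel resolution:

```python
import numpy as np

def pixel_stage(gamma_max, R, block=16, beta=0.6):
    """Steps S33-S34: inside moving blocks (R(i, j) = 1), pixels whose
    best candidate match is below the third threshold beta become
    moving pixels (BM = 1); everything else is background (BM = 0)."""
    h, w = gamma_max.shape
    moving_blocks = np.kron(R, np.ones((block, block), dtype=np.uint8))[:h, :w]
    return ((gamma_max < beta) & (moving_blocks == 1)).astype(np.uint8)

# One moving block containing a single poorly matching pixel:
gm = np.full((16, 16), 0.9)
gm[2, 3] = 0.1
BM = pixel_stage(gm, np.array([[1]], dtype=np.uint8))
```

Because background blocks are skipped entirely, isolated noisy pixels inside them cannot leak into the mask, which is the point of running two stages.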
In step S4, luminance changes of the video frame are detected by an entropy calculation, so that the background model is updated in time. In the present embodiment, as shown in Fig. 4, which is a flowchart of the step of detecting luminance changes of the video frame according to an embodiment of the invention, step S41 calculates the entropy of each block's grey relational coefficient sum S(i, j), namely −S(i, j)·log(S(i, j)), and accumulates the entropies to produce the entropy sum E_t of the video frame I_t, expressed by the formula:

E_t = −Σ_i Σ_j S(i, j)·log(S(i, j))

In step S42, it is judged whether the difference between the entropy sum E_t of the video frame I_t and the entropy sum E_{t−1} of the previous video frame I_{t−1} is greater than or equal to a fourth threshold μ. If so, the video frame I_t has a sudden luminance change, and step S421 judges that the video frame I_t has a luminance change and sets the luminance-change judgment value L_t to 1. If not, step S422 judges that the video frame I_t has no luminance change and sets the luminance-change judgment value L_t to 0. The above can be expressed by the formula:

L_t = 1 if |E_t − E_{t−1}| ≥ μ, and L_t = 0 otherwise

where the fourth threshold μ can be set to 0.05. In addition, in step S421, since the video frame I_t is judged to have a luminance change, the candidate background pixel B(x, y)_s corresponding to each input pixel p_t(x, y) can be updated according to the video frame I_t, expressed by the formula:

B(x, y)_s ← B(x, y)_s + ρ·(p_t(x, y) − B(x, y)_s)

where B(x, y)_s on the right-hand side is the candidate background pixel being updated, B(x, y)_s on the left-hand side is the updated candidate background pixel, and ρ is a preset parameter. The candidate background pixel B(x, y)_s being updated is the one corresponding to the maximum among the Euclidean distances Δ(1)~Δ(M) between the pixel value of the input pixel p_t(x, y) and the pixel values of its corresponding plural candidate background pixels B(x, y)_1~B(x, y)_M, expressed by the formula:

s = argmax_k Δ(k)
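Steps S41~S42 and the update rule of S421 can be sketched as follows. This is an illustration: the entropy expression follows the −S·log S form above and assumes all S(i, j) are positive, and ρ = 0.05 is an assumed value for the preset parameter:

```python
import numpy as np

def frame_entropy(S):
    """Step S41: accumulate -S(i, j) * log(S(i, j)) over all blocks to
    get the entropy sum E_t of the frame (S entries must be positive)."""
    return float(-(S * np.log(S)).sum())

def luminance_changed(e_t, e_prev, mu=0.05):
    """Step S42: a luminance change is flagged when the entropy sums of
    consecutive frames differ by at least the fourth threshold mu."""
    return abs(e_t - e_prev) >= mu

def update_background(p, candidates, rho=0.05):
    """Step S421: apply B_s <- B_s + rho * (p - B_s) to the candidate
    background pixel farthest (largest Euclidean distance) from the
    input pixel p. Mutates and returns the candidate array."""
    deltas = np.linalg.norm(candidates - p, axis=1)
    s = int(deltas.argmax())
    candidates[s] = candidates[s] + rho * (p - candidates[s])
    return candidates
```

Updating only the worst-matching candidate nudges the stalest quality level toward the current appearance without disturbing the candidates that still match well.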
In step S5, a setting interface is provided for the user to set a detection sensitivity, and false detections in the motion mask are evaluated accordingly. In the present embodiment, as shown in Fig. 5, which is a flowchart of the step of evaluating false detections in the motion mask according to an embodiment of the invention, step S51 provides a setting interface, such as a graphical slider, for the user to set a detection sensitivity ds, which can for example be adjusted between 0 and 10.

In step S52, the total number n_dp of moving pixels and background pixels in the motion mask BM is divided by the area of the motion mask BM, to produce an evaluation value E_BM, expressed by the formula:

E_BM = n_dp / (dimX × dimY)

where n_dp = p_p + p_n, p_p is the number of moving pixels in the motion mask BM, p_n is the number of background pixels in the motion mask BM, and dimX and dimY are respectively the width and height of the motion mask BM.

In step S53, it is judged whether the evaluation value E_BM is greater than the product of a fifth threshold δ and the detection sensitivity ds, where the fifth threshold δ is the estimated number Ω of predicted true positive pixels divided by the area of the motion mask BM, expressed by the formula:

δ = Ω / (dimX × dimY)

If so, i.e. the evaluation value E_BM is greater than the product δ × ds, step S531 judges that a false detection exists in the motion mask BM and sets the false-alarm judgment value F to 1. If not, i.e. the evaluation value E_BM is less than or equal to the product δ × ds, step S532 judges that no false detection exists in the motion mask BM and sets the false-alarm judgment value F to 0. The above can be expressed by the formula:

F = 1 if E_BM > δ × ds, and F = 0 otherwise

where the estimated number Ω of predicted true positive pixels can be set to 30 × 30.
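Steps S52~S53 reduce to a few arithmetic operations. A sketch only: the function and parameter names are mine, while Ω = 30 × 30 and the sensitivity range follow the embodiment:

```python
def false_detection(n_moving, n_background, dim_x, dim_y,
                    sensitivity, omega=30 * 30):
    """Steps S52-S53: the evaluation value E_BM is the total number of
    moving and background pixels in the mask divided by the mask area;
    a false detection is flagged when E_BM exceeds delta * ds, with the
    fifth threshold delta = omega / area (omega being the estimated
    number of true positive pixels, 30x30 in the embodiment)."""
    area = dim_x * dim_y
    e_bm = (n_moving + n_background) / area
    delta = omega / area
    return e_bm > delta * sensitivity

# On a 320x240 mask with ds = 5, delta * ds corresponds to 4500 pixels,
# so a mask covering 10000 pixels is flagged as a false detection:
flagged = false_detection(9000, 1000, 320, 240, sensitivity=5)
```

Note that with this formula a larger ds raises the bar E_BM must exceed, so higher sensitivity settings tolerate larger masks before a false detection is reported.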
The foregoing describes only preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall be included within the scope of the present invention.
Claims (10)
1. A motion detection method based on grey relational analysis, characterized by comprising:
S1) receiving a video frame, the video frame including a plurality of input pixels;
S2) establishing a multi-quality background model, comprising:
S21) calculating the Euclidean distance between the pixel value of each input pixel and the pixel values of corresponding plural candidate background pixels,
S22) calculating, according to the Euclidean distances, the grey relational coefficients between the pixel value of each input pixel and the pixel values of the corresponding plural candidate background pixels,
S23) judging whether the minimum of the grey relational coefficients is less than or equal to a first threshold; if so, judging that the bit rate of the input pixel has changed and adding the input pixel as a candidate background pixel; otherwise, judging that the bit rate of the input pixel has not changed;
S3) detecting moving objects, comprising:
S31) dividing the video frame into a plurality of blocks and, for each block, accumulating the maximum of the grey relational coefficients between the pixel value of each input pixel in the block and the pixel values of the corresponding plural candidate background pixels, to produce a grey relational coefficient sum for the block,
S32) judging whether the grey relational coefficient sum is greater than or equal to a second threshold; if so, judging the block to be a background block; otherwise, judging the block to be a moving block,
S33) for each moving block, judging whether the maximum of the grey relational coefficients between the pixel value of each input pixel in the moving block and the pixel values of the corresponding plural candidate background pixels is greater than or equal to a third threshold; if so, judging the input pixel to be a background pixel; otherwise, judging the input pixel to be a moving pixel,
S34) producing a binary motion mask;
S4) detecting a luminance change of the video frame, comprising:
S41) calculating the entropy of the grey relational coefficient sum of each block and accumulating the entropies to produce an entropy sum for the video frame,
S42) judging whether the difference between the entropy sums of the video frame and the previous video frame is greater than or equal to a fourth threshold; if so, judging that the video frame has a luminance change and updating the candidate background pixels corresponding to each input pixel according to the video frame; otherwise, judging that the video frame has no luminance change;
S5) evaluating false detections in the motion mask, comprising:
S51) providing a setting interface for the user to set a detection sensitivity,
S52) dividing the total number of moving pixels and background pixels in the motion mask by the area of the motion mask, to produce an evaluation value,
S53) judging whether the evaluation value is greater than the product of a fifth threshold and the detection sensitivity, the fifth threshold being the estimated number of predicted true positive pixels divided by the area of the motion mask; if so, judging that a false detection exists; otherwise, judging that no false detection exists.
2. The motion detection method based on grey correlation analysis according to claim 1, wherein the candidate background pixel that is updated for each input pixel is the candidate background pixel corresponding to the maximum Euclidean distance between the pixel value of the input pixel and the pixel values of the corresponding plural candidate background pixels.
3. The motion detection method based on grey correlation analysis according to claim 1, wherein the pixel values of the input pixels and of the candidate background pixels each comprise a luminance value, a blue chrominance value, and a red chrominance value.
4. The motion detection method based on grey correlation analysis according to claim 3, wherein the luminance value, the blue chrominance value, and the red chrominance value are each represented by 8 bits.
5. The motion detection method based on grey correlation analysis according to claim 4, wherein the first threshold is set to 0.6.
6. The motion detection method based on grey correlation analysis according to claim 4, wherein the second threshold is set to 245.
7. The motion detection method based on grey correlation analysis according to claim 4, wherein the third threshold is set to 0.6.
8. The motion detection method based on grey correlation analysis according to claim 4, wherein the fourth threshold is set to 0.05.
9. The motion detection method based on grey correlation analysis according to claim 1, wherein the estimated area of pixels predicted as positive is set to 30 × 30, and the detection sensitivity is set by the user within the range of 0 to 10.
10. The motion detection method based on grey correlation analysis according to claim 1, wherein each of the plurality of blocks is a 16 × 16 block.
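The entropy test of steps S41–S42 can likewise be sketched in Python, here with the fourth threshold of 0.05 from claim 8. The claims do not specify how the per-block coefficient sums are turned into a probability distribution, so the normalization below (divide by the frame total) and the base-2 Shannon entropy are illustrative assumptions.

```python
import numpy as np

def frame_entropy(block_sums):
    """Step S41: Shannon entropy of the per-block grey correlation
    coefficient sums, normalized into a distribution (assumed form)."""
    p = np.asarray(block_sums, dtype=float)
    p = p / p.sum()                    # normalize so the bins sum to 1
    p = p[p > 0]                       # skip empty bins (0 log 0 := 0)
    return float(-(p * np.log2(p)).sum())

def brightness_changed(curr_sums, prev_sums, t4=0.05):
    """Step S42: a brightness change is flagged when the entropy of
    consecutive frames differs by at least the fourth threshold."""
    return abs(frame_entropy(curr_sums) - frame_entropy(prev_sums)) >= t4
```

A global illumination change (lights switched on, an auto-exposure step) shifts many block sums at once and moves the entropy, triggering the background-model update of S42, while a small moving object alters only a few blocks and leaves the entropy nearly unchanged.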
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510115097.5A CN106034195A (en) | 2015-03-16 | 2015-03-16 | Mobile detecting method based on grey correlation analysis |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510115097.5A CN106034195A (en) | 2015-03-16 | 2015-03-16 | Mobile detecting method based on grey correlation analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106034195A true CN106034195A (en) | 2016-10-19 |
Family
ID=57150145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510115097.5A Pending CN106034195A (en) | 2015-03-16 | 2015-03-16 | Mobile detecting method based on grey correlation analysis |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106034195A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002034037A (en) * | 2000-07-19 | 2002-01-31 | Fuji Electric Co Ltd | Method for encoding animation by jpeg |
CN101742319A (en) * | 2010-01-15 | 2010-06-16 | 北京大学 | Background modeling-based static camera video compression method and background modeling-based static camera video compression system |
CN101860757A (en) * | 2010-06-03 | 2010-10-13 | 无锡中星微电子有限公司 | Intelligent monitoring system and method for encoding and decoding images thereof |
US20140105498A1 (en) * | 2012-10-11 | 2014-04-17 | Ittiam Systems (P) Limited | System and method for low complexity change detection in a sequence of images through background estimation |
2015
- 2015-03-16: CN CN201510115097.5A patent CN106034195A filed; status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104063883B (en) | A kind of monitor video abstraction generating method being combined based on object and key frame | |
CN108182421B (en) | Video segmentation method and device | |
KR102153607B1 (en) | Apparatus and method for detecting foreground in image | |
CN110852283A (en) | Helmet wearing detection and tracking method based on improved YOLOv3 | |
CN103729858B (en) | A kind of video monitoring system is left over the detection method of article | |
CN104268900A (en) | Motion object detection method and device | |
CN105279772B (en) | A kind of trackability method of discrimination of infrared sequence image | |
US20130336582A1 (en) | Image processing apparatus, image processing method, and storage medium | |
TWI522967B (en) | Method and apparatus for moving object detection based on cerebellar model articulation controller network | |
KR102094506B1 (en) | Method for measuring changes of distance between the camera and the object using object tracking , Computer readable storage medium of recording the method and a device measuring changes of distance | |
CN106327488B (en) | Self-adaptive foreground detection method and detection device thereof | |
CN111160295A (en) | Video pedestrian re-identification method based on region guidance and space-time attention | |
CN105574896B (en) | A kind of efficient background modeling method towards high-resolution video | |
TW201537517A (en) | Moving object detection method and moving object detection apparatus | |
KR101750094B1 (en) | Method for classification of group behavior by real-time video monitoring | |
TWI512685B (en) | Method and apparatus for moving object detection | |
CN112637593B (en) | Video coding optimization method based on artificial intelligence and video analysis | |
CN103735269A (en) | Height measurement method based on video multi-target tracking | |
Chen et al. | Research on moving object detection based on improved mixture Gaussian model | |
CN103544703A (en) | Digital image stitching detecting method | |
IT202000016054A1 (en) | METHOD FOR DETERMINING THE CONFIDENCE OF A DISPARITY MAP BY SELF-ADAPTIVE LEARNING OF A NEURAL NETWORK, AND RELATED SENSOR SYSTEM | |
CN105654055A (en) | Method for performing face recognition training by using video data | |
Laumer et al. | Moving object detection in the H. 264/AVC compressed domain | |
CN106034195A (en) | Mobile detecting method based on grey correlation analysis | |
CN111726620A (en) | Encoding method and device for monitoring video background frame, electronic equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20161019 |