CN102289819B - Method for detecting infrared motion target in real time for background adaptive estimation


Info

Publication number
CN102289819B
Authority
CN
China
Prior art keywords
frame
image
coordinate
infrared image
Prior art date
Legal status
Expired - Fee Related
Application number
CN 201110211688
Other languages
Chinese (zh)
Other versions
CN102289819A (en)
Inventor
白俊奇
赵春光
翟尚礼
王寿峰
孙宁
Current Assignee
CETC 28 Research Institute
Original Assignee
CETC 28 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN 201110211688 priority Critical patent/CN102289819B/en
Publication of CN102289819A publication Critical patent/CN102289819A/en
Application granted granted Critical
Publication of CN102289819B publication Critical patent/CN102289819B/en

Landscapes

  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses a method for detecting an infrared moving target in real time with adaptive background estimation. The method comprises the following steps: (1) initializing a background model BG(0); (2) inputting the k-th frame of the original infrared image X_org(k), where k = 1, 2, ..., K; (3) suppressing the spatial non-uniformity noise and temporal random noise of X_org(k), and outputting the noise-suppressed image X_deal(k); (4) registering X_deal(k), and outputting the registered k-th frame infrared image X_reg(k); (5) updating the background model: updating the background model BG(k) of the k-th frame by an adaptive forgetting-factor method; (6) performing a difference operation between X_org(k) and BG(k), and outputting the difference result X_out(k); and (7) computing the optimal segmentation threshold Th_opt between target and background in X_out(k) according to the maximum between-class variance criterion, completing the detection of the moving target. By pre-processing the images, the method effectively overcomes the over-sensitivity of the conventional background difference method to noise and scene changes, and improves the target detection probability.

Description

Real-time infrared moving target detection method with adaptive background estimation
Technical field
The present invention relates to an infrared image target detection method, and in particular to a moving target detection method suited to real-time hardware implementation.
Background technology
Moving target detection analyzes the image sequence acquired by a sensor to detect moving targets, and has important practical significance in fields such as traffic monitoring and autonomous navigation. Typical moving target detection methods fall into three major classes: background difference methods, frame difference methods, and optical flow methods. The background difference method computes the difference between the current image and a known background, and extracts the moving target by thresholding. It assumes the background is accurately known, is not limited by the target's speed, and can extract the moving target fairly completely. However, it is too sensitive to scene changes caused by illumination and external conditions; when the image, and especially the background, is noisy, the extraction quality deteriorates. The frame difference method subtracts adjacent frames of the sequence and detects the target from the difference image; it is a simple approach to detecting moving targets in image sequences. It adapts well to the environment and runs fast, but it cannot locate the target accurately (the detected position is the mean of the target's positions in the two frames), tends to produce holes inside the target region, and has difficulty detecting overlapping moving targets. The optical flow method needs no background information; it computes the motion and structural parameters of the target from the time-varying optical flow field, and is an important method for analyzing moving targets in image sequences. However, estimating the motion and structural parameters requires first- or second-order derivatives of image intensity and optical flow; real images are contaminated by noise, differentiation amplifies noise (the higher the order, the stronger the amplification), and the computation of optical flow is complex, so its real-time performance is poor.
Summary of the invention
Object of the invention: to overcome the over-sensitivity of the background difference method to noise and scene changes, the object of the present invention is to design a real-time infrared moving target detection method with adaptive background estimation that is effective, computationally simple, and suited to real-time hardware implementation.
Technical scheme: to achieve the above object, the technical solution adopted by the present invention is a real-time infrared moving target detection method with adaptive background estimation, comprising the following steps:
(1) initialize the background model BG(0);
(2) input the k-th frame of the original infrared image X_org(k), k = 1, 2, ..., K;
(3) suppress the spatial non-uniformity noise and temporal random noise of the original infrared image X_org(k) of step (2), and output the noise-suppressed image X_deal(k);
(4) register the noise-suppressed image X_deal(k) of step (3), and output the registered k-th frame infrared image X_reg(k);
(5) update the background model: update the background model BG(k) from the registered image X_reg(k) of step (4) by the adaptive forgetting-factor method;
(6) compute the difference between the original infrared image X_org(k) and the background model BG(k), and output the difference result X_out(k);
(7) compute the optimal segmentation threshold Th_opt between target and background in X_out(k) of step (6) according to the maximum between-class variance criterion, completing the detection of the moving target.
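The control flow of steps (1)-(7) can be sketched as follows. This is a minimal illustrative sketch in Python/NumPy, not the patented implementation: the denoising and registration stages are replaced by identity stand-ins, and the forgetting factor `alpha` and threshold `thresh` are fixed hypothetical parameters standing in for the adaptive, entropy-driven and Otsu-derived values of steps (5) and (7).

```python
import numpy as np

def detect_moving_targets(frames, n_init=3, alpha=0.9, thresh=50.0):
    """Skeleton of steps (1)-(7) with simplified stand-in stages."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    # (1) initialise BG(0) as the pixel-wise temporal median of the first frames
    bg = np.median(np.stack(frames[:n_init]), axis=0)
    masks = []
    for x_org in frames[n_init:]:        # (2) input frame k
        x_deal = x_org                   # (3) noise suppression (omitted here)
        x_reg = x_deal                   # (4) registration (omitted here)
        # (5) background update with a fixed forgetting factor alpha
        bg = (1.0 - alpha) * x_reg + alpha * bg
        x_out = x_org - bg               # (6) difference operation
        masks.append(x_out > thresh)     # (7) thresholding (fixed, not Otsu)
    return masks
```

Each returned mask is a Boolean image marking the detected moving pixels of one frame.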
In said step (1), the initial N frames of the original image {X_org(k) | k = 1, 2, ..., N} are used to compute the background model BG(0); the background value BG(i,j,0) at coordinate (i,j) can be expressed as:
BG(i,j,0) = median[X_org(i,j,1), X_org(i,j,2), ..., X_org(i,j,N)]
where the operator median[·] denotes the median, and X_org(i,j,k) denotes the pixel value of the k-th original infrared image X_org(k) at coordinate (i,j).
In said step (3), the spatial non-uniformity noise of the original infrared image X_org(k) can be suppressed by the finite-support-domain high-frequency constant-statistics non-uniformity correction method. The suppression can proceed as follows:
Use a Gaussian low-pass filter g to split X_org(k) into a high-frequency component X_org^high(k) and a low-frequency component X_org^low(k):
X_org(k) = X_org^high(k) + X_org^low(k)
X_org^low(k) = X_org(k) ⊗ g
where the operator ⊗ denotes convolution. The value g(i,j) of the Gaussian low-pass filter at coordinate (i,j) is:
g(i,j) = (1 / (2πσ²)) · e^(−(i² + j²) / (2σ²))
where σ is the standard deviation of the Gaussian low-pass filter g.
Compute the bias coefficient O^high(k) and the gain coefficient G^high(k) of the original infrared image X_org(k). At coordinate (i,j) of frame k they are given by:
O^high(i,j,k) = O_tem^high(i,j,k) if |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε, otherwise O^high(i,j,k) = O^high(i,j,k−1)
O_tem^high(i,j,k) = (1/Num) × X_org^high(i,j,k) + (1 − 1/Num) × O^high(i,j,k−1)
G^high(i,j,k) = (1/Num) × |X_org^high(i,j,k) − O^high(i,j,k−1)| + (1 − 1/Num) × G^high(i,j,k−1)
O^high(i,j,0) = 0
O_tem^high(i,j,0) = 0
G^high(i,j,0) = 1
where O_tem^high(i,j,k) is the candidate bias coefficient, X_org^high(i,j,k) is the high-frequency component of the pixel value X_org(i,j,k) of the k-th original infrared image X_org(k) at coordinate (i,j), Num is the accumulation frame count, ε is an error constant, and the condition |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε defines the finite support domain of the finite-support-domain high-frequency constant-statistics non-uniformity correction method.
Therefore, the output X_deal^nu(i,j,k) of X_org(i,j,k) after processing by the finite-support-domain high-frequency constant-statistics non-uniformity correction method is:
X_deal^nu(i,j,k) = (X_org(i,j,k) − O^high(i,j,k)) / G^high(i,j,k)
where X_deal^nu(k) is the image after non-uniformity noise suppression.
The image X_deal^nu(k) of frame k after non-uniformity noise suppression can then be filtered to suppress temporal random noise, outputting the image X_deal(k) of frame k after temporal random noise suppression:
X_deal^H(i,j,k) = median[X_deal^nu(i, j+q', k) | q' = −2, −1, 0, 1, 2]
X_deal(i,j,k) = median[X_deal^H(i+p', j, k) | p' = −2, −1, 0, 1, 2]
where the operator median[·] denotes the median, X_deal^H(i,j,k) is the result of median filtering X_deal^nu(k) in the horizontal direction, and X_deal(i,j,k) is the result of median filtering X_deal^H(k) in the vertical direction, i.e. the output image after temporal random noise suppression.
In said step (4), the registered k-th frame infrared image X_reg(k) can be produced by the template registration method:
First, in the noise-suppressed image X_deal(k) of frame k, choose a standard template of size P × Q centred at coordinate (i,j).
Second, using the mean squared error function MSE as the matching criterion, match within an image region of size M × N centred at coordinate (i,j) in the noise-suppressed image X_deal(k−1) of frame k−1, with P < M and Q < N; compute the minimum mean squared error MSE_min, whose coordinate (i+Δi, j+Δj) is the registration position.
The mean squared error function MSE is:
MSE(i+Δi, j+Δj, k) = (1 / (P×Q)) · Σ_{p=1..P} Σ_{q=1..Q} ( X_deal(i+p−P/2, j+q−Q/2, k) − X_deal(i+Δi+p−P/2, j+Δj+q−Q/2, k−1) )²
The minimum mean squared error MSE_min is:
MSE_min(i+Δi, j+Δj, k) = min{ MSE(i+Δi, j+Δj, k) }
where (Δi, Δj) denotes the displacement relative to coordinate (i,j), with −(M−P)/2 ≤ Δi ≤ (M−P)/2 and −(N−Q)/2 ≤ Δj ≤ (N−Q)/2.
Finally, register the noise-suppressed image X_deal(k) of frame k:
X_reg(i,j,k) = X_deal(i+Δi, j+Δj, k).
In said step (5), the background model BG(i,j,k) of the k-th original infrared image X_org(k) at coordinate (i,j) can be expressed as:
BG(i,j,k) = (1 − α(i,j,k)) × X_reg(i,j,k) + α(i,j,k) × BG(i,j,k−1)
where X_reg(i,j,k) is the registration output of the k-th frame at coordinate (i,j), and α(i,j,k) is the forgetting factor of the k-th frame at coordinate (i,j), given by:
α(i,j,k) = |H(i,j,k) − H(i,j,k−1)| / H(i,j,k−1)
H(i,j,k) = − Σ_{m=1..M1} Σ_{n=1..N1} ( X_reg(m,n,k) / Σ_{m=1..M1} Σ_{n=1..N1} X_reg(m,n,k) ) · lg( X_reg(m,n,k) / Σ_{m=1..M1} Σ_{n=1..N1} X_reg(m,n,k) )
where H(i,j,k) is the neighbourhood entropy of the local neighbourhood of size M1 × N1 centred at coordinate (i,j) of the k-th frame, the sums over (m,n) run over the pixels of that neighbourhood, and the operator lg[·] denotes the logarithm.
In said step (6), the output X_out(i,j,k) of the k-th original infrared image X_org(k) at coordinate (i,j) can be expressed as:
X_out(i,j,k) = X_org(i,j,k) − BG(i,j,k)
where X_org(i,j,k) is the pixel value of X_org(k) at coordinate (i,j), and BG(i,j,k) is the pixel value of the background model BG(k) at coordinate (i,j).
In said step (7), the expression for Th_opt can be:
Th_opt = argmax_{0 ≤ Th ≤ 255} ( ω0(μ0 − μ)² + ω1(μ1 − μ)² )
where the operator argmax returns the maximizing value, Th is the segmentation threshold, ω0 and ω1 are the probabilities of occurrence of the background region and the target region respectively, μ0 is the mean of the background region, μ1 is the mean of the target region, and μ is the overall mean of background and target regions; their expressions are:
ω0 = Σ_{v=0..Th} p(v),  μ0 = (1/ω0) Σ_{v=0..Th} v·p(v)
ω1 = Σ_{v=Th+1..T} p(v),  μ1 = (1/ω1) Σ_{v=Th+1..T} v·p(v)
μ = ω0·μ0 + ω1·μ1
where p(v) is the probability that a pixel has gray value v, T is the number of gray levels, and Th is the segmentation threshold.
The expression for p(v) is:
p(v) = (1 / (M2 × N2)) · Σ 1{X_out(i,j,k) = v}
where 1{X_out(i,j,k) = v} is the indicator counting the pixels whose gray value X_out(i,j,k) equals v, and M2 and N2 are the number of columns and rows of the image, respectively.
Beneficial effects: compared with the prior art, the present invention has the following notable advantages: (1) through image pre-processing, it effectively overcomes the over-sensitivity of the conventional background difference method to noise and scene changes, improving the target detection probability; (2) it proposes the finite-support-domain high-frequency constant-statistics non-uniformity correction method (SC-HFCS), which adaptively corrects image non-uniformity, solves the slow convergence and "ghosting" problems of conventional scene-based correction algorithms while improving the image signal-to-noise ratio, and reduces the false alarm rate; (3) for the background update problem of the background difference method, it designs a forgetting-factor adaptive update model based on neighbourhood entropy, which effectively improves the background update speed and essentially solves the problem of foreground interference with the background; (4) the detection method involves no complex structures or high-order operations and is suited to real-time hardware implementation.
Description of drawings
Fig. 1 is the flow chart of the real-time infrared moving target detection method with adaptive background estimation of the present invention.
Embodiment
The present invention is further illustrated below with reference to the drawings and specific embodiments. It should be understood that these embodiments are only intended to illustrate the present invention and not to limit its scope; after reading the present invention, modifications of its various equivalent forms by those skilled in the art all fall within the scope defined by the claims of this application.
With reference to Fig. 1, the real-time infrared moving target detection method with adaptive background estimation of the present invention is illustrated below by an example. An uncooled staring infrared detector is used; the focal plane array size is 320 × 240, the frame rate is 25 Hz, and the data width is 8 bits. The infrared image (hereinafter referred to as the image) is transmitted over optical fibre to a dedicated image processing board with a DSP+FPGA architecture; the detection of the moving target is implemented in the DSP processor and meets real-time processing requirements. The specific implementation steps are as follows:
(1) Initialize the background model BG(0): collect the initial ten frames of the original image {X_org(k) | k = 1, 2, ..., 10}; the expression for BG(i,j,0) at coordinate (i,j) is:
BG(i,j,0) = median[X_org(i,j,1), X_org(i,j,2), ..., X_org(i,j,10)]
where the operator median[·] denotes the median, X_org(i,j,k) denotes the pixel value of the k-th original image X_org(k) at coordinate (i,j), 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
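The temporal-median initialization above can be sketched in Python/NumPy (a hedged illustration, not the DSP implementation):

```python
import numpy as np

def init_background(frames):
    """BG(i,j,0) = median over k of X_org(i,j,k), computed pixel-wise."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    return np.median(stack, axis=0)
```

For the embodiment, `frames` would hold the first ten 240 × 320 images.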
(2) Input the k-th frame of the original image X_org(k); the image size is 320 × 240.
(3) Suppress the spatial non-uniformity noise of the k-th frame original image X_org(k); the corrected output X_deal^nu(k) is the image after non-uniformity noise processing. The template size of the Gaussian low-pass filter g is set to 5 × 5 with standard deviation σ = 1; the 5 × 5 spatial Gaussian template is shown in Table 1:
0.002 0.013 0.022 0.013 0.002
0.013 0.060 0.098 0.060 0.013
0.022 0.098 0.162 0.098 0.022
0.013 0.060 0.098 0.060 0.013
0.002 0.013 0.022 0.013 0.002
Table 1
Compute the high-frequency component X_org^high(k) of X_org(k); the expression is:
X_org^high(k) = X_org(k) − X_org(k) ⊗ g
where the operator ⊗ denotes convolution.
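A sketch of this decomposition follows: building the Gaussian kernel from the g(i,j) formula and subtracting a "same"-size convolution. Edge replication at the image borders is an assumption of this illustration, since the patent does not specify the border handling.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """g(i,j) = (1/(2*pi*sigma^2)) * exp(-(i^2+j^2)/(2*sigma^2)),
    normalised to sum to 1 so the filter preserves the local mean."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return g / g.sum()

def high_frequency(img, g):
    """X_high = X - (X convolved with g), with edge-replicated borders."""
    img = np.asarray(img, dtype=float)
    r = g.shape[0] // 2
    padded = np.pad(img, r, mode='edge')
    low = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            low[i, j] = np.sum(padded[i:i + 2*r + 1, j:j + 2*r + 1] * g)
    return img - low
```

After normalization, the central coefficient of `gaussian_kernel()` comes out to about 0.162, consistent with Table 1.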
From X_org^high(k), compute the correction coefficients of the non-uniformity noise, namely the bias coefficient O^high(k) and the gain coefficient G^high(k). At coordinate (i,j) of frame k they are given by:
O^high(i,j,k) = O_tem^high(i,j,k) if |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε, otherwise O^high(i,j,k) = O^high(i,j,k−1)
O_tem^high(i,j,k) = (1/Num) × X_org^high(i,j,k) + (1 − 1/Num) × O^high(i,j,k−1)
G^high(i,j,k) = (1/Num) × |X_org^high(i,j,k) − O^high(i,j,k−1)| + (1 − 1/Num) × G^high(i,j,k−1)
O^high(i,j,0) = 0
O_tem^high(i,j,0) = 0
G^high(i,j,0) = 1
where O_tem^high(i,j,k) is the candidate bias coefficient, X_org^high(i,j,k) is the high-frequency component of X_org(i,j,k), Num is the accumulation frame count, ε is an error constant, the condition |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε defines the finite support domain of the SC-HFCS algorithm, 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
Therefore, the output X_deal^nu(i,j,k) of the pixel X_org(i,j,k) at coordinate (i,j) of frame k after SC-HFCS processing is:
X_deal^nu(i,j,k) = (X_org(i,j,k) − O^high(i,j,k)) / G^high(i,j,k)
where X_deal^nu(k) is the image of frame k after non-uniformity noise processing.
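One recursive update of the bias and gain coefficients can be sketched as follows. Keeping the previous coefficients at pixels outside the finite support domain (|O_tem − O_prev| > ε) is this sketch's reading of the piecewise definition above, and `num` and `eps` are illustrative values:

```python
import numpy as np

def sc_hfcs_update(x_high, o_prev, g_prev, num=100, eps=0.05):
    """Recursive bias/gain update; coefficients change only inside the
    finite support domain |O_tem - O_prev| <= eps."""
    o_tem = x_high / num + (1.0 - 1.0 / num) * o_prev
    support = np.abs(o_tem - o_prev) <= eps        # finite support domain
    o_new = np.where(support, o_tem, o_prev)
    g_tem = np.abs(x_high - o_prev) / num + (1.0 - 1.0 / num) * g_prev
    g_new = np.where(support, g_tem, g_prev)
    return o_new, g_new

def sc_hfcs_correct(x_org, o_high, g_high):
    """X_deal_nu = (X_org - O_high) / G_high."""
    return (x_org - o_high) / g_high
```

Starting from O = 0 and G = 1 as in the initial conditions, the coefficients converge gradually at a rate set by `num`.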
(4) Suppress the temporal random noise of the k-th frame image X_deal^nu(k), outputting the image X_deal(k) of frame k after temporal random noise suppression; the expressions are:
X_deal^H(i,j,k) = median[X_deal^nu(i, j+q', k) | q' = −2, −1, 0, 1, 2]
X_deal(i,j,k) = median[X_deal^H(i+p', j, k) | p' = −2, −1, 0, 1, 2]
where the operator median[·] denotes the median, X_deal^H(i,j,k) is the result of median filtering X_deal^nu(k) in the horizontal direction, and X_deal(i,j,k) is the result of median filtering X_deal^H(k) in the vertical direction, i.e. the output image after temporal random noise suppression, 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
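This two-pass 1 × 5 median filter can be sketched as follows; edge replication at the borders is an assumption of this illustration:

```python
import numpy as np

def median_1x5_two_pass(img):
    """Horizontal 1x5 median of each row, then vertical 5x1 median of each column."""
    def median_1d(a, axis):
        pad_width = [(0, 0), (0, 0)]
        pad_width[axis] = (2, 2)
        p = np.pad(np.asarray(a, dtype=float), pad_width, mode='edge')
        # stack the five shifted copies and take the per-pixel median
        shifted = np.stack([np.roll(p, s, axis=axis) for s in range(-2, 3)])
        out = np.median(shifted, axis=0)
        return out[2:-2, :] if axis == 0 else out[:, 2:-2]
    return median_1d(median_1d(img, axis=1), axis=0)
```

A single impulsive outlier is removed completely by either pass, which is the intended effect on random noise.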
(5) Register X_deal(k) by the template registration method, obtaining the registered k-th frame image X_reg(k).
First, in the image X_deal(k) of frame k after temporal random noise suppression, choose a standard template of size P × Q (P = Q = 30) centred at coordinate (i,j).
Second, using the mean squared error function (MSE) as the matching criterion, match within an image region of size M × N (M = N = 60) centred at coordinate (i,j) in the frame k−1 image X_deal(k−1); compute the minimum mean squared error MSE_min, whose coordinate (i+Δi, j+Δj) is the optimal matching (registration) position.
The mean squared error function (MSE) is:
MSE(i+Δi, j+Δj, k) = (1 / (P×Q)) · Σ_{p=1..P} Σ_{q=1..Q} ( X_deal(i+p−P/2, j+q−Q/2, k) − X_deal(i+Δi+p−P/2, j+Δj+q−Q/2, k−1) )²
The minimum mean squared error MSE_min is:
MSE_min(i+Δi, j+Δj, k) = min{ MSE(i+Δi, j+Δj, k) }
where (Δi, Δj) denotes the displacement relative to coordinate (i,j), with −(M−P)/2 ≤ Δi ≤ (M−P)/2 and −(N−Q)/2 ≤ Δj ≤ (N−Q)/2.
Finally, register the image X_deal(k) of frame k after temporal random noise suppression:
X_reg(i,j,k) = X_deal(i+Δi, j+Δj, k).
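The exhaustive MSE search can be sketched as follows; toy sizes P = Q = 4 and M = N = 8 replace the embodiment's 30 and 60 so the example stays small:

```python
import numpy as np

def register_displacement(cur, prev, i, j, P=4, Q=4, M=8, N=8):
    """Exhaustive MSE search: the P x Q template of `cur` centred at (i, j)
    is compared against displaced P x Q patches of `prev` inside an
    M x N search window; returns (di, dj) minimising the MSE."""
    tpl = cur[i - P // 2:i + P // 2, j - Q // 2:j + Q // 2].astype(float)
    best_mse, best = np.inf, (0, 0)
    for di in range(-(M - P) // 2, (M - P) // 2 + 1):
        for dj in range(-(N - Q) // 2, (N - Q) // 2 + 1):
            cand = prev[i + di - P // 2:i + di + P // 2,
                        j + dj - Q // 2:j + dj + Q // 2].astype(float)
            mse = np.mean((tpl - cand) ** 2)
            if mse < best_mse:
                best_mse, best = mse, (di, dj)
    return best
```

With a pure vertical shift between the two frames, the search recovers exactly that shift.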
(6) Update the background model BG(k) from X_reg(k); the background model BG(i,j,k) at coordinate (i,j) is:
BG(i,j,k) = (1 − α(i,j,k)) × X_reg(i,j,k) + α(i,j,k) × BG(i,j,k−1)
where X_reg(i,j,k) is the registration output of the k-th frame at coordinate (i,j), and α(i,j,k) is the forgetting factor of the registration output of the k-th frame at coordinate (i,j), given by:
α(i,j,k) = |H(i,j,k) − H(i,j,k−1)| / H(i,j,k−1)
H(i,j,k) = − Σ_{m=1..M1} Σ_{n=1..N1} ( X_reg(m,n,k) / Σ_{m=1..M1} Σ_{n=1..N1} X_reg(m,n,k) ) · lg( X_reg(m,n,k) / Σ_{m=1..M1} Σ_{n=1..N1} X_reg(m,n,k) )
where H(i,j,k) is the neighbourhood entropy of the local neighbourhood of size M1 × N1 centred at (i,j) of the k-th frame, the sums over (m,n) run over the pixels of that neighbourhood, the operator lg[·] is the logarithm, 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
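A sketch of the neighbourhood-entropy forgetting factor and the background update follows. Reading "lg" as the base-10 logarithm and skipping zero-valued pixels in the entropy sum (where lg is undefined) are assumptions of this illustration:

```python
import numpy as np

def neighborhood_entropy(img, i, j, half=2):
    """Entropy of the (2*half+1)^2 neighbourhood centred at (i, j),
    with pixels normalised by the neighbourhood sum."""
    w = np.asarray(img, dtype=float)[i - half:i + half + 1, j - half:j + half + 1]
    p = w / w.sum()
    p = p[p > 0]                      # lg(0) is undefined; skip zero weights
    return float(-np.sum(p * np.log10(p)))

def forgetting_factor(h_cur, h_prev):
    """alpha = |H(k) - H(k-1)| / H(k-1)."""
    return abs(h_cur - h_prev) / h_prev

def update_background(bg_prev, x_reg, alpha):
    """BG(k) = (1 - alpha) * X_reg(k) + alpha * BG(k-1)."""
    return (1.0 - alpha) * x_reg + alpha * bg_prev
```

When the neighbourhood entropy is unchanged between frames, alpha is 0 and the new registered frame fully replaces the background at that pixel; a large entropy change yields a large alpha and the background is held.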
(7) Compute the difference between the k-th frame original image X_org(k) and the background model BG(k), outputting the difference result X_out(k). The output X_out(i,j,k) at coordinate (i,j) is:
X_out(i,j,k) = X_org(i,j,k) − BG(i,j,k)
where X_org(i,j,k) is the pixel value of X_org(k) at coordinate (i,j), BG(i,j,k) is the pixel value of BG(k) at coordinate (i,j), 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
(8) Compute the optimal segmentation threshold Th_opt between target and background in X_out(k) according to the maximum between-class variance criterion, completing the detection of the moving target. The expression for Th_opt is:
Th_opt = argmax_{0 ≤ Th ≤ 255} ( ω0(μ0 − μ)² + ω1(μ1 − μ)² )
where the operator argmax returns the maximizing value, Th is the segmentation threshold, ω0 and ω1 are the probabilities of occurrence of the background region and the target region respectively, μ0 is the mean of the background region, μ1 is the mean of the target region, and μ is the overall mean of background and target regions; their expressions are:
ω0 = Σ_{v=0..Th} p(v),  μ0 = (1/ω0) Σ_{v=0..Th} v·p(v)
ω1 = Σ_{v=Th+1..T} p(v),  μ1 = (1/ω1) Σ_{v=Th+1..T} v·p(v)
μ = ω0·μ0 + ω1·μ1
where p(v) is the probability that a pixel has gray value v, T is the number of gray levels, and Th is the segmentation threshold. The expression for p(v) is:
p(v) = (1 / (240 × 320)) · Σ 1{X_out(i,j,k) = v}
where 1{X_out(i,j,k) = v} is the indicator counting the pixels whose gray value X_out(i,j,k) equals v, v = 0, 1, ..., 255, 1 ≤ i ≤ 240, 1 ≤ j ≤ 320.
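The maximum between-class variance search of step (8) can be sketched as an exhaustive scan over Th. This sketch assumes the input holds nonnegative integer gray values in [0, 255]; a real difference image would first need to be clipped or rescaled:

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Return Th maximising omega0*(mu0-mu)^2 + omega1*(mu1-mu)^2."""
    img = np.asarray(img)
    hist = np.bincount(img.ravel().astype(np.int64), minlength=levels).astype(float)
    p = hist / hist.sum()                     # p(v): probability of gray value v
    v = np.arange(levels, dtype=float)
    mu = float(np.sum(v * p))                 # overall mean
    best_th, best_var = 0, -1.0
    for th in range(levels):
        w0 = p[:th + 1].sum()                 # omega0
        w1 = 1.0 - w0                         # omega1
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        mu0 = np.sum(v[:th + 1] * p[:th + 1]) / w0
        mu1 = np.sum(v[th + 1:] * p[th + 1:]) / w1
        var = w0 * (mu0 - mu) ** 2 + w1 * (mu1 - mu) ** 2
        if var > best_var:
            best_var, best_th = var, th
    return best_th
```

On a clearly bimodal image, the returned threshold falls between the two modes, separating target from background.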

Claims (6)

1. A real-time infrared moving target detection method with adaptive background estimation, characterized in that it comprises the following steps:
(1) initialize the background model BG(0);
(2) input the k-th frame of the original infrared image X_org(k), k = 1, 2, ..., K;
(3) suppress the spatial non-uniformity noise and temporal random noise of the original infrared image X_org(k) of step (2), and output the noise-suppressed image X_deal(k) of frame k;
(4) register the noise-suppressed image X_deal(k) of step (3), and output the registered k-th frame infrared image X_reg(k);
(5) update the background model: update the background model BG(k) from the registered image X_reg(k) of step (4) by the adaptive forgetting-factor method;
(6) compute the difference between the original infrared image X_org(k) and the background model BG(k), and output the difference result X_out(k);
(7) compute the optimal segmentation threshold Th_opt between target and background in X_out(k) of step (6) according to the maximum between-class variance criterion, completing the detection of the moving target;
in said step (3), the spatial non-uniformity noise of the original infrared image X_org(k) is suppressed by the finite-support-domain high-frequency constant-statistics non-uniformity correction method, as follows:
use a Gaussian low-pass filter g to split X_org(k) into a high-frequency component X_org^high(k) and a low-frequency component X_org^low(k):
X_org(k) = X_org^high(k) + X_org^low(k)
X_org^low(k) = X_org(k) ⊗ g
where the operator ⊗ denotes convolution; the value g(i,j) of the Gaussian low-pass filter at coordinate (i,j) is:
g(i,j) = (1 / (2πσ²)) · e^(−(i² + j²) / (2σ²))
where σ is the standard deviation of the Gaussian low-pass filter g;
compute the bias coefficient O^high(k) and gain coefficient G^high(k) of X_org(k); at coordinate (i,j) of frame k they are given by:
O^high(i,j,k) = O_tem^high(i,j,k) if |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε, otherwise O^high(i,j,k) = O^high(i,j,k−1)
O_tem^high(i,j,k) = (1/Num) × X_org^high(i,j,k) + (1 − 1/Num) × O^high(i,j,k−1)
G^high(i,j,k) = (1/Num) × |X_org^high(i,j,k) − O^high(i,j,k−1)| + (1 − 1/Num) × G^high(i,j,k−1)
O^high(i,j,0) = 0
O_tem^high(i,j,0) = 0
G^high(i,j,0) = 1
where O_tem^high(i,j,k) is the candidate bias coefficient, X_org^high(i,j,k) is the high-frequency component of the pixel value X_org(i,j,k) of the k-th original infrared image X_org(k) at coordinate (i,j), Num is the accumulation frame count, ε is an error constant, and the condition |O_tem^high(i,j,k) − O^high(i,j,k−1)| ≤ ε defines the finite support domain of the finite-support-domain high-frequency constant-statistics non-uniformity correction method;
therefore, the output X_deal^nu(i,j,k) of X_org(i,j,k) after processing by the finite-support-domain high-frequency constant-statistics non-uniformity correction method is:
X_deal^nu(i,j,k) = (X_org(i,j,k) − O^high(i,j,k)) / G^high(i,j,k)
where X_deal^nu(k) is the image after non-uniformity noise suppression;
in said step (5), the background model BG(i,j,k) of the k-th original infrared image X_org(k) at coordinate (i,j) is given by:
BG(i,j,k) = (1 − α(i,j,k)) × X_reg(i,j,k) + α(i,j,k) × BG(i,j,k−1)
where X_reg(i,j,k) is the registration output of the k-th frame at coordinate (i,j), and α(i,j,k) is the forgetting factor of the k-th frame at coordinate (i,j), given by:
α(i,j,k) = |H(i,j,k) − H(i,j,k−1)| / H(i,j,k−1)
where H(i,j,k) is the neighbourhood entropy of the local neighbourhood of size M1 × N1 centred at coordinate (i,j) of the k-th frame.
2. The real-time infrared moving target detection method with adaptive background estimation according to claim 1, characterized in that, in said step (1), the initial N frames of the original image {X_org(k) | k = 1, 2, ..., N} are used to compute the background model BG(0); the background value BG(i,j,0) at coordinate (i,j) is given by:
BG(i,j,0) = median[X_org(i,j,1), X_org(i,j,2), ..., X_org(i,j,k), ..., X_org(i,j,N)]
where the operator median[·] denotes the median, and X_org(i,j,k) denotes the pixel value of the k-th original infrared image X_org(k) at coordinate (i,j).
3. The real-time infrared moving target detection method with adaptive background estimation according to claim 1, characterized in that the image X_deal^nu(k) of frame k after non-uniformity noise suppression is further filtered to suppress temporal random noise, outputting the image X_deal(k) of frame k after temporal random noise suppression:
X_deal^H(i,j,k) = median[X_deal^nu(i, j+q', k) | q' = −2, −1, 0, 1, 2]
X_deal(i,j,k) = median[X_deal^H(i+p', j, k) | p' = −2, −1, 0, 1, 2]
where the operator median[·] denotes the median, X_deal^H(i,j,k) is the result of median filtering X_deal^nu(k) in the horizontal direction, and X_deal(i,j,k) is the result of median filtering X_deal^H(k) in the vertical direction, i.e. the output image after temporal random noise suppression.
4. The method for detecting an infrared motion target in real time for background adaptive estimation according to claim 1, characterized in that in step (4), a template registration method is adopted to output the registered k-th frame infrared image X_reg(k):

First, in the noise-suppressed image X_deal(k-1) of the (k-1)-th frame, a standard template of size P×Q centered at coordinate (i,j) is chosen;

Second, with the mean square error function MSE as the matching criterion, matching is performed in the noise-suppressed image X_deal(k) of the k-th frame within an image region of size M×N centered at coordinate (i,j), where P<M and Q<N; the minimum mean square error MSE_min is computed, and the coordinate (i+Δi, j+Δj) corresponding to MSE_min is the registration position;

The mean square error function MSE is expressed as:

MSE(i+Δi, j+Δj, k) = (1/(P×Q)) Σ_{p=1..P} Σ_{q=1..Q} [ X_deal(i+Δi+p−P/2, j+Δj+q−Q/2, k) − X_deal(i+p−P/2, j+q−Q/2, k−1) ]²

The minimum mean square error MSE_min is expressed as:

MSE_min(i+Δi, j+Δj, k) = min{ MSE(i+Δi, j+Δj, k) }

where (Δi, Δj) denotes the displacement relative to coordinate (i,j), with −(M−P)/2 ≤ Δi ≤ (M−P)/2 and −(N−Q)/2 ≤ Δj ≤ (N−Q)/2;

Finally, the noise-suppressed image X_deal(k) of the k-th frame is registered:

X_reg(i,j,k) = X_deal(i+Δi, j+Δj, k).
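The template-matching registration of claim 4 can be sketched as an exhaustive search over displacements (Δi, Δj), keeping the one with minimum mean square error; the function name, indexing conventions, and the assumption of even P and Q are illustrative, not from the patent:

```python
import numpy as np

def register_mse(prev, curr, i, j, P, Q, M, N):
    """Find the displacement (di, dj) of the P×Q template centered at
    (i, j) in the previous frame that best matches the current frame,
    searching an M×N region (P < M, Q < N). Assumes even P, Q and that
    the search window stays inside the image."""
    tmpl = prev[i - P // 2:i + P // 2, j - Q // 2:j + Q // 2]
    best, best_d = np.inf, (0, 0)
    for di in range(-(M - P) // 2, (M - P) // 2 + 1):
        for dj in range(-(N - Q) // 2, (N - Q) // 2 + 1):
            win = curr[i + di - P // 2:i + di + P // 2,
                       j + dj - Q // 2:j + dj + Q // 2]
            mse = np.mean((win - tmpl) ** 2)  # MSE criterion of claim 4
            if mse < best:
                best, best_d = mse, (di, dj)
    return best_d
```

Shifting a random frame by (1, 2) pixels and running the search recovers exactly that displacement, since the MSE at the true offset is zero.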
5. The method for detecting an infrared motion target in real time for background adaptive estimation according to claim 1, characterized in that in step (6), the output X_out(i,j,k) at coordinate (i,j) of the k-th frame original infrared image X_org(k) is expressed as:

X_out(i,j,k) = X_org(i,j,k) − BG(i,j,k)

where X_org(i,j,k) is the pixel value at coordinate (i,j) of the k-th frame original infrared image X_org(k), and BG(i,j,k) is the pixel value at coordinate (i,j) of the background model BG(k).
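The difference operation of claim 5 is a single per-pixel subtraction; computing it in floating point (an implementation choice, not stated in the claim) avoids unsigned wrap-around and preserves negative differences:

```python
import numpy as np

def background_difference(x_org, bg):
    """Per-pixel difference X_out(k) = X_org(k) - BG(k), cast to float
    so that pixels darker than the background stay negative."""
    return np.asarray(x_org, dtype=float) - np.asarray(bg, dtype=float)
```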
6. The method for detecting an infrared motion target in real time for background adaptive estimation according to claim 1, characterized in that in step (7), Th_opt is expressed as:

Th_opt = arg max_{0 ≤ Th ≤ 255} [ ω_0(μ_0 − μ)² + ω_1(μ_1 − μ)² ]

where arg max returns the segmentation threshold Th that maximizes the between-class variance, ω_0 and ω_1 are respectively the probabilities of a pixel appearing in the background region and the target region, μ_0 is the mean of the background region, μ_1 is the mean of the target region, and μ is the overall mean of the background and target regions; their expressions are:

ω_0 = Σ_{v=0..Th} p(v),   μ_0 = (1/ω_0) Σ_{v=0..Th} v·p(v)
ω_1 = Σ_{v=Th+1..T} p(v),   μ_1 = (1/ω_1) Σ_{v=Th+1..T} v·p(v)
μ = ω_0·μ_0 + ω_1·μ_1

where p(v) is the probability that a pixel with gray value v appears, and T is the number of gray levels;

The expression of p(v) is:

p(v) = (1/(M_2×N_2)) Σ 1_{X_out(i,j,k)=v}

where 1_{X_out(i,j,k)=v} counts the pixels whose gray value X_out(i,j,k) equals v, and M_2 and N_2 denote the number of columns and rows of the image, respectively.
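Claim 7's criterion is the maximum between-class variance (Otsu) threshold; a direct histogram-based sketch follows, in which the function name and the exhaustive search over Th are illustrative:

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Maximum between-class variance threshold:
    Th_opt = argmax over Th of w0*(mu0-mu)^2 + w1*(mu1-mu)^2,
    where w0/w1 and mu0/mu1 come from the gray-level histogram.
    `levels` plays the role of the gray-level count T."""
    hist = np.bincount(np.asarray(img, dtype=np.int64).ravel(), minlength=levels)
    p = hist / hist.sum()          # p(v): probability of gray value v
    v = np.arange(levels)
    mu = (v * p).sum()             # overall mean
    best_th, best_var = 0, -1.0
    for th in range(levels - 1):
        w0 = p[:th + 1].sum()      # background probability
        w1 = 1.0 - w0              # target probability
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (v[:th + 1] * p[:th + 1]).sum() / w0
        mu1 = (v[th + 1:] * p[th + 1:]).sum() / w1
        var = w0 * (mu0 - mu) ** 2 + w1 * (mu1 - mu) ** 2
        if var > best_var:
            best_var, best_th = var, th
    return best_th
```

On a bimodal image with equal numbers of pixels at gray values 10 and 200, any threshold between the two modes maximizes the between-class variance, so the returned Th falls strictly between them.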
CN 201110211688 2011-07-27 2011-07-27 Method for detecting infrared motion target in real time for background adaptive estimation Expired - Fee Related CN102289819B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110211688 CN102289819B (en) 2011-07-27 2011-07-27 Method for detecting infrared motion target in real time for background adaptive estimation


Publications (2)

Publication Number Publication Date
CN102289819A CN102289819A (en) 2011-12-21
CN102289819B true CN102289819B (en) 2013-05-08

Family

ID=45336208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110211688 Expired - Fee Related CN102289819B (en) 2011-07-27 2011-07-27 Method for detecting infrared motion target in real time for background adaptive estimation

Country Status (1)

Country Link
CN (1) CN102289819B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103544693B (en) * 2013-04-11 2017-05-10 Tcl集团股份有限公司 Method and system for extracting foreground object
CN103208105B (en) * 2013-05-02 2015-08-19 中国电子科技集团公司第二十八研究所 A kind of infrared image details strengthens and noise Adaptive Suppression method
CN103218792B (en) * 2013-05-03 2015-07-08 中国电子科技集团公司第二十八研究所 Infrared image noise time domain filtering method based on registration
CN103945089A (en) * 2014-04-18 2014-07-23 上海复控华龙微***技术有限公司 Dynamic target detection method based on brightness flicker correction and IP camera
CN106874949B (en) * 2017-02-10 2019-10-11 华中科技大学 Movement imaging platform moving target detecting method and system based on infrared image
CN107194932B (en) * 2017-04-24 2020-05-05 江苏理工学院 Adaptive background reconstruction algorithm based on exponential forgetting
CN107564030A (en) * 2017-08-21 2018-01-09 叶军 A kind of image object detection method for infrared sensor
CN108288038A (en) * 2018-01-19 2018-07-17 东华大学 Night robot motion's decision-making technique based on scene cut
CN112802020B (en) * 2021-04-06 2021-06-25 中国空气动力研究与发展中心计算空气动力研究所 Infrared dim target detection method based on image inpainting and background estimation
CN113421282B (en) * 2021-05-28 2022-11-18 深圳数联天下智能科技有限公司 Motion detection method, apparatus, and medium
CN115471710B (en) * 2022-09-29 2023-08-15 中国电子科技集团公司信息科学研究院 Infrared detection recognition system and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101236606A (en) * 2008-03-07 2008-08-06 北京中星微电子有限公司 Shadow cancelling method and system in vision frequency monitoring
CN102024146A (en) * 2010-12-08 2011-04-20 江苏大学 Method for extracting foreground in piggery monitoring video
CN102034239A (en) * 2010-12-07 2011-04-27 北京理工大学 Local gray abrupt change-based infrared small target detection method


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Su Likun et al., "Video background updating based on median filtering", Opto-Electronic Engineering, vol. 37, no. 1, 31 Jan 2010, p. 132, paragraph 7 to final paragraph *
Zhou Xianguo et al., "Research on the application of Bayesian decision analysis to moving target detection in medical gait analysis", China Medical Devices, vol. 25, no. 9, 31 Dec 2010, full text *
Jin Keqiong, "Research on moving target detection and tracking in video surveillance systems", China Master's Theses Full-text Database, Information Science and Technology, 15 Dec 2010, p. 5 paragraph 1 to p. 28 final paragraph, Figs. 3-4 *


Similar Documents

Publication Publication Date Title
CN102289819B (en) Method for detecting infrared motion target in real time for background adaptive estimation
CN106875415B (en) Continuous and stable tracking method for small and weak moving targets in dynamic background
Rong et al. An improved CANNY edge detection algorithm
US10068343B2 (en) Method and apparatus for recognizing moving target
CN106846359A (en) Moving target method for quick based on video sequence
CN102968765B (en) Method for correcting infrared focal plane heterogeneity based on sigma filter
CN101311964B (en) Method and device for real time cutting motion area for checking motion in monitor system
CN104657945A (en) Infrared small target detection method for multi-scale spatio-temporal union filtering under complex background
CN102982537B (en) A kind of method and system detecting scene change
CN106548488B (en) A kind of foreground detection method based on background model and inter-frame difference
CN110428466B (en) Method and equipment for correcting nonuniformity
CN110567584B (en) Method for detecting, extracting and correcting blind pixels of real-time infrared detector
Qiu et al. Adaptive scale patch-based contrast measure for dim and small infrared target detection
CN103945089A (en) Dynamic target detection method based on brightness flicker correction and IP camera
CN104766079A (en) Remote infrared weak object detecting method
CN105427286A (en) Gray scale and gradient segmentation-based infrared target detection method
TWI394097B (en) Detecting method and system for moving object
CN108257153B (en) Target tracking method based on direction gradient statistical characteristics
CN110717454B (en) Wheel type robot obstacle detection method in stage environment
Lin et al. A new prediction method for edge detection based on human visual feature
CN112435278B (en) Visual SLAM method and device based on dynamic target detection
CN107169992A (en) A kind of traffic video moving target detecting method
CN110866863A (en) Automobile A-pillar perspective algorithm
CN106951902B (en) Image binarization processing method and device
CN103473753A (en) Target detection method based on multi-scale wavelet threshold denoising

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130508

Termination date: 20180727