CN101127120A - Target tracking algorithm for effectively suppressing template drift - Google Patents

Target tracking algorithm for effectively suppressing template drift

Info

Publication number: CN101127120A
Authority: CN (China)
Prior art keywords: template, drift, noise power, target, target tracking
Legal status: Granted
Application number: CNA200710045939XA
Other languages: Chinese (zh)
Other versions: CN101127120B (en)
Inventors: 潘吉彦 (Jiyan Pan), 胡波 (Bo Hu), 张建秋 (Jianqiu Zhang)
Current Assignee: Shanghai Longdong Optoelectronic Co., Ltd.
Original Assignee: Fudan University
Application filed by Fudan University
Priority to CN200710045939XA
Publication of CN101127120A; application granted; publication of CN101127120B
Legal status: Expired - Fee Related

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a target tracking algorithm that effectively suppresses the template drift that frequently occurs during target tracking, and belongs to the technical field of computer vision and pattern analysis. Aiming at the cause of template drift, the invention proposes a quantitative expression called the "drift noise power" and incorporates it into the framework of the Kalman template-update filter. With the drift noise power, the Kalman template-update filter adaptively achieves, in both time and space, the optimal balance between updating the target appearance promptly and avoiding template drift. Experimental results on several different kinds of real and synthetic video streams demonstrate the effectiveness of the invention.

Description

A target tracking algorithm for effectively suppressing template drift
Technical field
The invention belongs to the technical field of computer vision and pattern analysis, and specifically relates to a target tracking algorithm that effectively suppresses template drift.
Background technology
Target tracking is widely applied in human-computer interaction, automatic surveillance, video retrieval, traffic monitoring, and vehicle navigation. The task of target tracking is to determine the geometric state of the target in each frame of a video stream, including its position, size, orientation, and so on. Because the appearance of the tracked target is unconstrained and may change during tracking, and because of interference from complex backgrounds, target tracking algorithms face many challenges, and target tracking remains one of the research focuses of computer vision.
Target tracking algorithms fall into two broad classes: point tracking [1,2] and kernel tracking [3-6]. When the target is very small relative to the whole field of view, as in radar imagery, point tracking algorithms can be adopted; for images captured by ordinary cameras, kernel tracking algorithms are used more often. Kernel tracking algorithms can in turn be divided into template matching [3,4], which matches a target template, and contour tracking [5,6], which tracks only the target contour. Because template-matching tracking algorithms integrate both the overall appearance information and the geometric information of the target, they are very widely used.
A template-matching tracking algorithm usually characterizes the target by a rectangular or elliptical template. The motion of the target is usually described by a coordinate transform of the template (translation, scaling, rotation, etc.). Different coordinate-transformation parameters give different image regions, and the parameters of the image region that best matches the template provide the geometric information reflecting the current target [7].
Because the target appearance changes continually during tracking, the template must also be updated continually. The simplest update strategy takes the matched image region as the new template every frame or every few frames [8,9]. A frequently observed phenomenon, however, is that during tracking the target gradually moves out of the template while background objects gradually move into it, finally causing the target to be lost. This phenomenon is called template drift [3,7].
Many papers have given preliminary qualitative explanations of the cause of template drift [3,10,11]. The root cause is that template matching has geometric errors; at every template update these errors are introduced into the template and become appearance errors of the template. Although these errors are tiny, they accumulate, making the target drift continuously relative to the template. The more frequently the template is updated, the more serious the drift. This creates a contradiction: to make the template reflect the current target appearance as promptly and accurately as possible, the template must be updated frequently; on the other hand, frequent updating lets the drift phenomenon destroy the template appearance.
To resolve this contradiction, various template-update algorithms have been proposed. Some methods use the original template to correct the result of template matching [1,7,10]. These methods avoid template drift fairly well when the target appearance changes very little over the whole tracking period, but in most cases the target appearance changes considerably after a while; the original template then becomes invalid for correcting the match and instead degrades the tracking.
After comparing various template-update strategies that do not rely on the original template, [11] concludes that filtering the template with a Kalman template-update filter is the strategy most robust to drift and noise. [12] further studies how to choose the Kalman gain of the template-update filter. In [11] and [12], however, the Kalman gain, once selected, remains unchanged throughout tracking, so the template-update filter cannot adapt the update strength to the severity of the appearance change and the likely degree of template drift.
In [4] and [13], by estimating one of the two noises of the Kalman template-update filter, the Kalman gain of the filter is adjusted automatically according to how the target appearance changes. However, [4] and [13] assume either that the state-transition noise power is constant or that the observation noise power is constant, both of which are rare in practice. Hence the methods of [4] and [13] still cannot reduce template drift well on real-world video streams. To estimate the noise powers of the Kalman template-update filter effectively, the influence of template drift on the noise power must be computed quantitatively.
List of references
[1] C. Rasmussen, G. Hager. Probabilistic data association methods for tracking complex visual objects [J]. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2001, 23(6): 560-576.
[2] C. Hue, J. L. Cadre, P. Pérez. Sequential Monte Carlo methods for multiple target tracking and data fusion [J]. IEEE Trans. on Signal Processing, 2002, 50(2): 309-325.
[3] I. Matthews, T. Ishikawa, S. Baker. The template update problem [J]. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2004, 26(6): 810-815.
[4] H. T. Nguyen, A. W. M. Smeulders. Fast occluded object tracking by a robust appearance filter [J]. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2004, 26(8): 1099-1104.
[5] A. Yilmaz, X. Li, M. Shah. Contour based object tracking with occlusion handling in video acquired using mobile cameras [J]. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2004, 26(11): 1531-1536.
[6] Y. Chen, Y. Rui, T. Huang. JPDAF based HMM for real-time contour tracking [A]. Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition [C]. 1: 543-550, 2001.
[7] Z. Jia, A. Balasuriya, S. Challa. Target tracking with Bayesian fusion based template matching [A]. Proc. IEEE Int. Conf. on Image Processing [C]. 2: II-826-829, 2005.
[8] M. J. Black, Y. Yacoob. Recognizing facial expressions in image sequences using local parameterized models of image motion [J]. Int. J. Computer Vision, 1997, 25(1): 23-48.
[9] H. Sidenbladh, M. J. Black, D. J. Fleet. Stochastic tracking of 3D human figures using 2D image motion [A]. Proc. European Conf. on Computer Vision [C]. 2: 702-718, 2000.
[10] T. Kaneko, O. Hori. Template update criterion for template matching of image sequences [A]. Proc. IEEE Int. Conf. on Pattern Recognition [C]. 2: 1-5, 2002.
[11] A. M. Peacock, S. Matsunaga, D. Renshaw, J. Hannah, A. Murray. Reference block updating when tracking with block matching algorithm [J]. Electronics Letters, 2000, 36: 309-310.
[12] C. Haworth, A. M. Peacock, D. Renshaw. Performance of reference block updating techniques when tracking with the block matching algorithm [A]. Proc. IEEE Int. Conf. on Image Processing [C]. 1: 365-368, 2001.
[13] H. T. Nguyen, M. Worring, R. van den Boomgaard. Occlusion robust adaptive template tracking [A]. Proc. IEEE Int. Conf. on Computer Vision [C]. 1: 678-683, 2001.
[14] L. K. Liu, E. Feig. A block-based gradient descent search algorithm for block motion estimation in video coding [J]. IEEE Trans. on Circuits and Systems for Video Technology, 1996, 6: 419-422.
[15] S. Baker, I. Matthews. Lucas-Kanade 20 years on: a unifying framework [J]. Int. J. Computer Vision, 2004, 56(3): 221-255.
[16] J. Pan, B. Hu, J. Q. Zhang. An efficient object tracking algorithm with adaptive prediction of initial searching point [A]. Lecture Notes in Computer Science, 4319/2006: 1113-1122, 2006.
[17] R. G. Brown, P. Y. C. Hwang. Introduction to Random Signals and Applied Kalman Filtering [M]. John Wiley, 1992.
Summary of the invention
The object of the invention is to propose a target tracking algorithm that can effectively suppress template drift.

The key of the invention is to model the cause of template drift quantitatively as a drift noise and to take this drift noise as part of the observation noise of the template-update filter.

The target tracking algorithm of the invention comprises:

Updating the template with a Kalman filter, and suppressing the occurrence of template drift by explicitly incorporating the influence of template drift into the Kalman filter.

Expressing the influence of template drift quantitatively as an error power, namely the drift noise power $\sigma_{MD}^2$.

Taking the drift noise power $\sigma_{MD}^2$ as one component of the observation noise power $\sigma_M^2$ of the Kalman template-update filter.

Taking the camera noise power $\sigma_{MC}^2$ as the other component of the observation noise power $\sigma_M^2$.

Obtaining the observation noise power of the Kalman template-update filter as the weighted sum $\sigma_M^2=\lambda\,\sigma_{MD}^2+\sigma_{MC}^2$, where $\lambda$ is a constant.

Computing the drift noise power $\sigma_{MD}^2$ quantitatively from the quantization error of the coordinate-transformation coefficients and the probability distribution of the true pixel values.

When the camera noise parameters are unknown, estimating the camera noise power $\sigma_{MC}^2$ as the pixel variance of a test image produced from a uniform gray background.

When the target motion can be described by translation and scaling alone, computing the drift noise power $\sigma_{MD}^2$ with a fast algorithm that merges into one all summation terms whose transformation-parameter vectors produce the same transformed coordinate, converting an m-dimensional summation into a two-dimensional one and greatly reducing the amount of computation.
The invention is described in further detail below.
1. Target tracking based on template matching
In template-matching-based target tracking, the target is represented by a sub-image describing its appearance, called the template. The original template is usually the appearance of the target in the first frame. In this invention the template is denoted T(x), where x = [x, y]^T is a pixel coordinate. Because of observation noise the true target appearance cannot be obtained, so what the tracking algorithm actually uses is the template estimate $\hat T(x)$.

In each frame, $\hat T$ is mapped into the image frame by a coordinate transform φ(x; a), where a is the transformation-parameter vector. φ(x; a) describes the motion and deformation of the target. For a coordinate transform comprising translation, scaling, and rotation, a is a four-dimensional vector, a = (a₁, a₂, a₃, a₄), and φ(x; a) can be written as

$$\phi(x;a)=a_1\begin{bmatrix}\cos a_2 & -\sin a_2\\ \sin a_2 & \cos a_2\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}+\begin{bmatrix}a_3\\a_4\end{bmatrix} \qquad (1)$$

Here a₁ is the scaling factor, a₂ the rotation angle, and (a₃, a₄) the translation in the (x, y) coordinate system. In theory φ(x; a) can have arbitrarily many parameters and describe arbitrarily complex target motion, but the motion model of eq. (1) covers most situations met in practice.
The transformation-parameter vector a reflects the geometric information of the target in the current frame. The optimal estimate of this information is obtained by searching for the image region of the current frame that best matches the template:

$$\hat a=\arg\min_a \frac{1}{N}\sum_{x\in\Omega_T}\bigl|I_n[\phi(x;a)]-\hat T(x)\bigr| \qquad (2)$$

In this formula, $\hat a$ is the optimal estimate of the transformation-parameter vector for frame n; I_n(x) is the pixel value at coordinate x of frame n; Ω_T is the set of template pixel coordinates; and N is the number of pixels in the template. A series of fast search algorithms exist for implementing eq. (2) [14,15], and the initial search point can be obtained by the algorithm of [16]. In addition, because the coordinates produced by the transform φ are not necessarily integers, I_n[φ(x; a)] must be computed by interpolation.
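For concreteness, when φ is a pure translation, φ(x; a) = x + a, the search of eq. (2) reduces to exhaustive block matching over integer offsets. The sketch below (our own illustration, not taken from the patent; NumPy-based, with hypothetical names) minimizes the mean absolute difference over a square search window:

```python
import numpy as np

def match_template(frame, template, center, radius=8):
    """Brute-force search for the integer translation a minimizing the
    mean absolute difference of eq. (2) over a square window around
    `center` (top-left corner of the candidate regions)."""
    th, tw = template.shape
    best_cost, best_a = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = center[0] + dy, center[1] + dx
            region = frame[y:y + th, x:x + tw]
            cost = np.mean(np.abs(region.astype(float) - template))
            if cost < best_cost:
                best_cost, best_a = cost, (dy, dx)
    return best_a, best_cost
```

In practice the exhaustive loop would be replaced by one of the fast search strategies of [14,15], but the cost function is the same.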
In the ideal case, the transformation-parameter vector $\hat a$ obtained from eq. (2) reflects the true geometric state of the target. In practice, however, the final search result of eq. (2) must be taken from a discrete vector space, and the resulting quantization error inevitably introduces a deviation between the optimal estimate $\hat a$ and the true value a₀ of the transformation-parameter vector. Consequently, in every frame the observed target appearance I_n[φ(x; $\hat a$)] departs from its true value I_n[φ(x; a₀)]. We call the part of the observation noise caused by this appearance deviation the drift noise. The accumulation of drift noise is the fundamental cause of the template-drift phenomenon. The drift noise and the camera noise together constitute the observation noise.
2. The Kalman template-update filter
To obtain an optimal estimate of the true target appearance, a Kalman filter is applied to the template update. To analyze how each template pixel changes, we run a Kalman filter separately for every template pixel. For template pixel x, the state equation is

$$T(x,n)=T(x,n-1)+\varepsilon_S(x,n-1) \qquad (3)$$

where T(x, n) is the gray value of template pixel x at frame n, and ε_S(x, n) is the state-transition noise, which reflects the change of the target appearance itself from frame n to frame n+1. It is reasonable to model the state-transition noise as zero-mean white Gaussian noise with power spectral density $\sigma_S^2(x,n)$.

The observation equation of template pixel x is

$$I_n[\phi(x;\hat a)]=T(x,n)+\varepsilon_M(x,n) \qquad (4)$$

where ε_M(x, n) is the observation noise, likewise modeled as zero-mean white Gaussian noise with power spectral density $\sigma_M^2(x,n)$.
Let e_P(x, n) be the prediction error of T(x, n) and e_E(x, n) its estimation error; then

$$T(x,n-1)=\hat T_E(x,n-1)+e_E(x,n-1) \qquad (5)$$

$$T(x,n)=\hat T_P(x,n)+e_P(x,n) \qquad (6)$$

where $\hat T_P(x,n)$ is the prediction of T(x, n) given the observations of the first n−1 frames, and $\hat T_E(x,n)$ is the estimate of T(x, n) given the observations of the first n frames. Since the state-transition coefficient in eq. (3) is 1,

$$\hat T_P(x,n)=\hat T_E(x,n-1) \qquad (7)$$

From eqs. (3) and (5)-(7), the prediction error and the estimation error of T(x, n) are related by

$$e_P(x,n)=e_E(x,n-1)+\varepsilon_S(x,n-1) \qquad (8)$$

Because the state-transition noise is uncorrelated with the estimation error, we obtain

$$\sigma_P^2(x,n)=\sigma_E^2(x,n-1)+\sigma_S^2(x,n-1) \qquad (9)$$

where $\sigma_P^2$ and $\sigma_E^2$ are the power spectral densities of the prediction error and the estimation error, respectively. Where no confusion can arise, we abbreviate power spectral density as power below.
According to Kalman filtering theory [17], the optimal Kalman gain of template pixel x should be taken as

$$G(x,n)=\frac{1}{1+\sigma_M^2(x,n)/\sigma_P^2(x,n)} \qquad (10)$$

and the template is then updated according to

$$\hat T_E(x,n)=\hat T_P(x,n)+G(x,n)\bigl\{I_n[\phi(x;\hat a)]-\hat T_P(x,n)\bigr\}=\hat T_P(x,n)+G(x,n)\,\alpha(x,n) \qquad (11)$$

where $\alpha(x,n)=I_n[\phi(x;\hat a)]-\hat T_P(x,n)$ is the innovation at frame n. After the template update, the estimation-error power of T(x, n) becomes

$$\sigma_E^2(x,n)=[1-G(x,n)]\,\sigma_P^2(x,n) \qquad (12)$$

Equations (7) and (9)-(12) constitute one complete iteration of the Kalman template update.

The only quantity that needs initialization is the estimation-error power $\sigma_E^2$ of the template. Because the original template is cut directly from the target region of the first frame, the estimation error at that moment is caused by the camera noise alone. Hence the initial estimation-error power equals the camera noise power:

$$\sigma_E^2(x,0)=\sigma_{MC}^2 \qquad (13)$$

where $\sigma_{MC}^2$ is the camera noise power.
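One iteration of eqs. (7) and (9)-(12) can be sketched per pixel as follows. This is our own NumPy illustration of the equations above (names are ours); every argument is an array with the template's shape, so the whole template is updated at once:

```python
import numpy as np

def kalman_template_update(T_est, sigma_E2, obs, sigma_S2, sigma_M2):
    """One per-pixel iteration of the Kalman template-update filter.
    T_est:    previous estimated template, T̂_E(x, n-1)
    sigma_E2: previous estimation-error power, σ_E²(x, n-1)
    obs:      observed appearance, I_n[φ(x; â)]
    sigma_S2: state-transition noise power, σ_S²(x, n-1)
    sigma_M2: observation noise power, σ_M²(x, n)"""
    T_pred = T_est                           # eq. (7): prediction
    sigma_P2 = sigma_E2 + sigma_S2           # eq. (9): prediction-error power
    G = 1.0 / (1.0 + sigma_M2 / sigma_P2)    # eq. (10): Kalman gain
    innovation = obs - T_pred                # α(x, n)
    T_new = T_pred + G * innovation          # eq. (11): updated template
    sigma_E2_new = (1.0 - G) * sigma_P2      # eq. (12): new error power
    return T_new, sigma_E2_new
```

Initialization per eq. (13) is simply `sigma_E2 = np.full(template.shape, sigma_MC2)`.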
3. The relation between the two noise powers
In the standard Kalman filtering problem, the state-transition noise power $\sigma_S^2$ and the observation noise power $\sigma_M^2$ are both assumed known a priori. In target tracking, however, both noise powers must be estimated online. Fortunately, once one of them is obtained, the other is easy to estimate [13]. Combining eqs. (3)-(5) and (7), we immediately obtain an equation relating the innovation, the estimation error, and the two noises:

$$\alpha(x,n)=e_E(x,n-1)+\varepsilon_S(x,n-1)+\varepsilon_M(x,n) \qquad (14)$$

Since the terms on the right-hand side are pairwise uncorrelated,

$$\sigma_\alpha^2(x,n)=\sigma_E^2(x,n-1)+\sigma_S^2(x,n-1)+\sigma_M^2(x,n) \qquad (15)$$

where $\sigma_\alpha^2(x,n)$ is the power of the innovation α(x, n). The innovation power can be estimated by the mean squared value over a temporal-spatial window:

$$\sigma_\alpha^2(x,n)\approx\frac{1}{N_L}\sum_{k=n-L+1}^{n}\ \sum_{z\in\Omega_L(x)}[\alpha(z,k)]^2 \qquad (16)$$

where L is the length of the temporal moving-average window, generally taken between 15 and 25; Ω_L(x) is a spatial image block centered at x; and N_L is the total number of pixels taking part in the average. In the examples of this invention we take L = 20 and a spatial block of 11 × 11 pixels.
Because the estimation-error power $\sigma_E^2$ is produced automatically during Kalman filtering, eq. (15) shows that once $\sigma_M^2$ (or $\sigma_S^2$) is known, $\sigma_S^2$ (or $\sigma_M^2$) follows immediately. The question is therefore how to estimate one of the two noise powers first. Documents [4] and [13] make different assumptions about the values of the two noise powers; as noted above, those assumptions correspond to two extreme cases rarely met in practice, and a more effective estimate of the noise power is needed. Since the state-transition noise reflects the change of the target appearance itself, which can be entirely arbitrary, the state-transition noise power is not easy to estimate directly. As we shall see below, however, the observation noise power can be obtained online by estimating the drift noise power. Once the observation noise power $\sigma_M^2$ has been estimated, the state-transition noise power $\sigma_S^2$ follows immediately:

$$\sigma_S^2(x,n-1)=\sigma_\alpha^2(x,n)-\sigma_E^2(x,n-1)-\sigma_M^2(x,n) \qquad (17)$$

In some cases eq. (17) yields a negative value, which indicates that the target appearance at x has barely changed from frame n−1 to frame n. In that case $\sigma_S^2$ should be set to zero, and $\sigma_M^2$ should correspondingly be adjusted to

$$\sigma_M^2(x,n)=\sigma_\alpha^2(x,n)-\sigma_E^2(x,n-1) \qquad (18)$$

Next we discuss concretely how to estimate the powers of the drift noise and the observation noise online.
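Eqs. (16)-(18) can be sketched as follows for a single pixel. This is our own illustration (names are ours); the extra clamp on σ_M² in the negative branch is our assumption, added so the returned power can never be negative:

```python
import numpy as np

def estimate_state_noise(alpha_history, sigma_E2_prev, sigma_M2):
    """Estimate σ_S² from the innovation power.
    alpha_history: innovations α over the L-frame temporal window and the
    spatial block Ω_L(x), flattened into one array (eq. (16))."""
    sigma_a2 = np.mean(alpha_history ** 2)          # eq. (16)
    sigma_S2 = sigma_a2 - sigma_E2_prev - sigma_M2  # eq. (17)
    if sigma_S2 < 0:            # appearance barely changed at this pixel
        sigma_S2 = 0.0
        sigma_M2 = max(sigma_a2 - sigma_E2_prev, 0.0)  # eq. (18), clamped
    return sigma_S2, sigma_M2
```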
4. Estimating the observation noise power
As noted above, the accumulation of the drift-noise component of the observation noise is the fundamental cause of template drift. Hence, for the Kalman template-update filter to suppress template drift, the power of the drift noise must be estimated quantitatively.

Fig. 1 illustrates how the discrepancy between the optimal estimate $\hat a$ and the true value a₀ of the transformation-parameter vector produces a drift error in I_n[φ(x; $\hat a$)]. In Fig. 1, the true position φ(x; a₀) of template pixel x in frame n lies at some point of the neighborhood Ω_u of φ(x; $\hat a$), so the true value of template pixel x is also taken from some point in Ω_u. The drift noise of template pixel x is in fact the error between I_n[φ(x; $\hat a$)] and the expected true value of x over Ω_u. When the search precision of eq. (2) decreases, Ω_u grows, which generally increases the drift noise.

For brevity, we henceforth write a for the true value a₀ of the transformation-parameter vector. By the argument above, the drift noise power of template pixel x can be expressed as

$$\sigma_{MD}^2(x,n)=\int_a\bigl\{I_n[\phi(x;a)]-I_n[\phi(x;\hat a)]\bigr\}^2\,p_a(a\,|\,\hat a)\,da \qquad (19)$$

where $\sigma_{MD}^2(x,n)$ is the drift noise power of template pixel x at frame n, and p_a is the joint posterior probability density of the components of a given $\hat a$. When $\hat a$ is close to a, the posterior densities of the components of a can be regarded as mutually independent, because then the value taken by one component cannot influence the values of the others. As long as the target is not lost, this condition always holds. Eq. (19) can therefore be rewritten as

$$\sigma_{MD}^2(x,n)=\int_{a_1}\!\int_{a_2}\!\cdots\!\int_{a_m}\bigl\{I_n[\phi(x;a)]-I_n[\phi(x;\hat a)]\bigr\}^2\prod_{i=1}^{m}p_i(a_i\,|\,\hat a_i)\,da_i \qquad (20)$$

where p_i is the posterior probability density of the i-th component a_i of a, and m is the number of parameters of the coordinate transform φ.
The next problem is how to compute p_i. As shown in Fig. 2, $\hat a_i$ can only take quantized values, and its conditional probability is

$$P_i(\hat a_i\,|\,a_i)=\begin{cases}1, & |\hat a_i-a_i|\le\Delta_i/2\\ 0, & \text{else}\end{cases} \qquad (21)$$

where P_i is the conditional probability of $\hat a_i$ given a_i, and Δ_i is the final step size used when searching for $\hat a_i$ in eq. (2). By Bayes' rule, the posterior distribution of a_i is

$$p_i(a_i\,|\,\hat a_i)=\frac{P_i(\hat a_i\,|\,a_i)\,p(a_i)}{\int P_i(\hat a_i\,|\,a_i)\,p(a_i)\,da_i} \qquad (22)$$

Substituting eq. (21) into eq. (22) gives

$$p_i(a_i\,|\,\hat a_i)=\begin{cases}\dfrac{p(a_i)}{\int_{\hat a_i-\Delta_i/2}^{\hat a_i+\Delta_i/2}p(a_i)\,da_i}, & |a_i-\hat a_i|\le\Delta_i/2\\[2ex] 0, & \text{else}\end{cases} \qquad (23)$$

Although an explicit value of p(a_i) is not easy to obtain, we may reasonably take p(a_i) to be approximately constant over the integration range of eq. (23), because Δ_i is small and the integration range lies close to a rather flat maximum of p(a_i). With this approximation, eq. (23) reduces to

$$p_i(a_i\,|\,\hat a_i)=\begin{cases}1/\Delta_i, & |a_i-\hat a_i|\le\Delta_i/2\\ 0, & \text{else}\end{cases} \qquad (24)$$
Substituting eq. (24) into eq. (20) yields the drift noise power of template pixel x at frame n. However, because the computation involves interpolation, an analytical result for the drift noise power is very hard to obtain. We can instead obtain an approximate numerical result by discretizing the integral:

$$\sigma_{MD}^2(x,n)\approx\sum_{k_1}\sum_{k_2}\cdots\sum_{k_m}\bigl\{I_n[\phi(x;a_k)]-I_n[\phi(x;\hat a)]\bigr\}^2\prod_{i=1}^{m}p_i(k_i\Delta a_i\,|\,\hat a_i)\,\Delta a_i
=\Bigl(\prod_{i=1}^{m}\frac{\Delta a_i}{\Delta_i}\Bigr)\sum_{k_1}\sum_{k_2}\cdots\sum_{k_m}\bigl\{I_n[\phi(x;a_k)]-I_n[\phi(x;\hat a)]\bigr\}^2 \qquad (25)$$

where Δa₁ … Δa_m form the summation cell in the high-dimensional space containing a, and $a_k=[k_1\Delta a_1,\ldots,k_m\Delta a_m]^T$. The size of the summation cell determines the precision of eq. (25). The integers k_i range over

$$|k_i\Delta a_i-\hat a_i|\le\Delta_i/2,\quad i=1,2,\ldots,m \qquad (26)$$
Having estimated the drift noise power of the template pixels, we need to add the camera noise power to obtain the final observation noise power. Unlike the drift noise power, the camera noise power is regarded as constant in both space and time. Its value can be found in the technical manual of the sensor, or obtained as the pixel variance of a test image produced from a uniform gray background:

$$\sigma_{MC}^2=\frac{1}{M}\sum_{x\in\Omega_G}\Bigl[I_G(x)-\frac{1}{M}\sum_{x\in\Omega_G}I_G(x)\Bigr]^2 \qquad (27)$$

where I_G is the test image, Ω_G is the coordinate set of the test region, and M is the number of pixels in the test region.
Finally, the observation noise power is given by

$$\sigma_M^2(x,n)=\sigma_{MC}^2+\lambda\,\sigma_{MD}^2(x,n) \qquad (28)$$

where λ is a constant greater than 1 whose value depends on the precision of the interpolation method. In general 1 < λ ≤ 3, and the higher the interpolation precision, the closer λ is to 1. For the bilinear interpolation adopted in this invention, λ is taken as 2.
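Eqs. (27) and (28) are straightforward to implement; the sketch below is our own illustration (names are ours), with λ = 2 as the default since that is the value used here for bilinear interpolation:

```python
import numpy as np

def camera_noise_power(test_image):
    """Eq. (27): pixel variance of a test image of a uniform gray background."""
    return np.mean((test_image - test_image.mean()) ** 2)

def observation_noise_power(sigma_MC2, sigma_MD2, lam=2.0):
    """Eq. (28): combine camera noise power and per-pixel drift noise power."""
    return sigma_MC2 + lam * sigma_MD2
```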
The derivation and discussion above show that the observation noise power depends strongly on the target appearance itself: the denser the texture or edges at a location, the larger the observation noise generated there. This is understandable, because the same geometric deviation causes a larger appearance error where the target appearance is more complex, and hence damages the template there more severely. Thus the more detail the target appearance contains, the more serious the template drift. In our algorithm, the more appearance detail a location has at a given moment, the larger its observation noise power at that moment; this keeps the corresponding Kalman gain from growing too large, effectively protecting the template and markedly suppressing drift.
5. A fast algorithm
If, from p_a and φ, we can obtain the joint posterior density p_u of the transformed coordinate u = φ(x; a) = [v, w]^T, the computational cost of eq. (25) can be reduced dramatically. Rewriting eq. (19) in terms of the transformed coordinate u gives

$$\sigma_{MD}^2(x,n)=\int_{u\in\Omega_u}[I_n(u)-I_n(\hat u)]^2\,p_u(u\,|\,\hat u)\,du \qquad (29)$$

and eq. (25) becomes

$$\sigma_{MD}^2(x,n)\approx\Delta v\,\Delta w\sum_{l_1}\sum_{l_2}[I_n(l_1\Delta v,l_2\Delta w)-I_n(\hat v,\hat w)]^2\,p_u(l_1\Delta v,l_2\Delta w\,|\,\hat v,\hat w) \qquad (30)$$

In the two formulas above, $\hat u=\phi(x;\hat a)=[\hat v,\hat w]^T$; Ω_u is the region shown in Fig. 1; Δv Δw is the rectangular summation cell; and the integers l₁, l₂ range over

$$[l_1\Delta v,\,l_2\Delta w]^T\in\Omega_u \qquad (31)$$

By replacing eq. (25) with eq. (30), all summation terms whose transformation-parameter vectors produce the same transformed coordinate are merged into one, converting an m-dimensional summation into a two-dimensional one and greatly reducing the amount of computation.
When the coordinate transform φ is too complex, solving for the joint posterior density of the transformed coordinate u is very difficult. If, however, φ comprises only translation and scaling (the two most common kinds of target motion), i.e.

$$u=\phi(x;a)=\begin{bmatrix}v\\w\end{bmatrix}=a_1\begin{bmatrix}x\\y\end{bmatrix}+\begin{bmatrix}a_2\\a_3\end{bmatrix} \qquad (32)$$

then an analytical expression for p_u can be obtained.
The distribution function of u can be written as

$$F_u(u\,|\,\hat u)=P\{V\le v,\,W\le w\,|\,\hat v,\hat w\} \qquad (33)$$

where capital letters denote random variables. In view of eq. (32), eq. (33) can be rewritten as

$$F_u(u\,|\,\hat u)=P\{A_2\le v-A_1x,\,A_3\le w-A_1y\,|\,\hat a_1,\hat a_2,\hat a_3\}
=\int_{-\infty}^{\infty}\Bigl[\int_{-\infty}^{v-a_1x}p_2(a_2\,|\,\hat a_2,a_1)\,da_2\cdot\int_{-\infty}^{w-a_1y}p_3(a_3\,|\,\hat a_3,a_1)\,da_3\Bigr]p_1(a_1\,|\,\hat a_1)\,da_1
=\int_{-\infty}^{\infty}\Bigl[\int_{-\infty}^{v-a_1x}p_2(a_2\,|\,\hat a_2)\,da_2\cdot\int_{-\infty}^{w-a_1y}p_3(a_3\,|\,\hat a_3)\,da_3\Bigr]p_1(a_1\,|\,\hat a_1)\,da_1 \qquad (34)$$

where the last equality holds because of the independence among the coordinate-transformation parameters.
The joint posterior density of u is obtained by taking partial derivatives of eq. (34):

$$p_u(u\,|\,\hat u)=\frac{\partial^2 F_u(v,w)}{\partial v\,\partial w}=\int_{-\infty}^{\infty}p_2(v-a_1x\,|\,\hat a_2)\,p_3(w-a_1y\,|\,\hat a_3)\,p_1(a_1\,|\,\hat a_1)\,da_1 \qquad (35)$$

Substituting eq. (24) for each distribution in the formula above, we obtain

$$p_u(u\,|\,\hat u)=\frac{1}{\Delta_1\Delta_2\Delta_3}\int_{B_L}^{B_H}da_1=\begin{cases}(B_H-B_L)/(\Delta_1\Delta_2\Delta_3), & B_H\ge B_L\\ 0, & B_H<B_L\end{cases} \qquad (36)$$

where it can be shown that, for x ≠ 0 and y ≠ 0,

$$B_L=\max\Bigl\{\hat a_1-\frac{\Delta_1}{2},\ \frac{v-\hat a_2}{x}-\frac{\Delta_2}{2|x|},\ \frac{w-\hat a_3}{y}-\frac{\Delta_3}{2|y|}\Bigr\} \qquad (37)$$

$$B_H=\min\Bigl\{\hat a_1+\frac{\Delta_1}{2},\ \frac{v-\hat a_2}{x}+\frac{\Delta_2}{2|x|},\ \frac{w-\hat a_3}{y}+\frac{\Delta_3}{2|y|}\Bigr\} \qquad (38)$$

If x = 0, then when $|v-\hat a_2|\le\Delta_2/2$ the second term inside the braces of the two formulas above disappears, and otherwise the joint posterior density of u equals zero; if y = 0, then when $|w-\hat a_3|\le\Delta_3/2$ the third term disappears, and otherwise the joint posterior density of u equals zero. For brevity, the derivation of B_L and B_H is omitted here. When the target motion can be described by translation and scaling alone, eqs. (30) and (36)-(38) greatly reduce the cost of estimating the drift noise power.
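Eqs. (36)-(38) can be sketched directly for a pixel with x ≠ 0 and y ≠ 0. The function below is our own illustration (names are ours) of the closed-form density that the fast algorithm plugs into eq. (30):

```python
def p_u(v, w, x, y, a_hat, deltas):
    """Eqs. (36)-(38): posterior density of the transformed coordinate
    u = (v, w) under translation+scaling, at template pixel (x, y) with
    x != 0 and y != 0. a_hat = (a1_hat, a2_hat, a3_hat) is the matched
    parameter vector; deltas = (D1, D2, D3) are the final search steps."""
    a1, a2, a3 = a_hat
    d1, d2, d3 = deltas
    B_L = max(a1 - d1 / 2,
              (v - a2) / x - d2 / (2 * abs(x)),
              (w - a3) / y - d3 / (2 * abs(y)))    # eq. (37)
    B_H = min(a1 + d1 / 2,
              (v - a2) / x + d2 / (2 * abs(x)),
              (w - a3) / y + d3 / (2 * abs(y)))    # eq. (38)
    return max(B_H - B_L, 0.0) / (d1 * d2 * d3)    # eq. (36)
```

Evaluating this density on the rectangular grid of eq. (30) replaces the m-dimensional summation of eq. (25) by a two-dimensional one.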
Based on the foregoing, the concrete steps of the target tracking algorithm of the present invention for effectively suppressing template drift are as follows:
1. initialization evaluated error power σ E 2, promptly move (13) formula.σ wherein MC 2Be the camera noise power, the technical manual that can look into the camera sensor obtains, if there are not the data of this respect, then uses (27) formula estimation camera noise power σ MC 2
2. selected target zone in first frame.
3.
Figure A20071004593900127
Following initialization: by initial coordinate conversion φ (x; a s) sampling Initial R OI, promptly T ^ E ( x , 0 ) = I n [ &phi; ( x ; a s ) ] , A wherein sInitial coordinate transformation parameter for target.
4. read in next frame.
5. Take the estimation template of the previous frame as the prediction template of the current frame, i.e. run formula (7).
6. Map the prediction template into the current frame by the coordinate transform φ(x; a). Obtain the transformation parameter vector â, which reflects the geometric information of the target in the current frame, by searching for the image region of the current frame that best matches the prediction template, i.e. run formula (2).
7. Obtain the drift noise power σ_MD² of each template pixel by computing the mean square error between the mathematical expectation of the pixel observation, under the quantization error of the transformation parameter vector in formula (2), and its actual value. Since no analytical expression can be obtained, discretize the transformation parameters to convert the integral of formula (19) into the summation of formula (25); specifically, estimate σ_MD² with formulas (25) and (26). When the target motion can be described by translation and scaling alone, the summation terms whose transformation parameter vectors map a pixel back to the same transformed coordinates can be merged into one, converting the m-dimensional summation into a two-dimensional one and greatly reducing the computation; specifically, this fast algorithm uses formulas (30) and (36)-(38) to significantly reduce the cost of estimating the drift noise power.
8. The observation noise power σ_M² of each template pixel is the weighted sum of the drift noise power and the camera noise power σ_MC²; specifically, compute σ_M² with formula (28), where λ is a constant greater than 1 whose value depends on the precision of the interpolation method: the higher the interpolation precision, the closer λ is to 1. For the bilinear interpolation adopted in the embodiment of the present invention, λ is taken as 2.
9. Estimate the innovation power σ_α² of each template pixel with formula (16).
10. Compute the state transition noise power σ_S² of each template pixel with formula (17); if formula (17) yields a negative value, set σ_S² to zero and adjust σ_M² accordingly per formula (18).
11. Run formula (9) to obtain the prediction error power σ_P² of each template pixel.
12. Determine the optimal Kalman gain G(x, n) of each template pixel per formula (10).
13. Obtain the estimation template of the current frame per formula (11).
14. Obtain the estimation error power σ_E² of each template pixel of the current frame per formula (12).
15. If the video stream is not finished, go to step 5; otherwise, end.
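The per-pixel update of steps 8 through 14 can be sketched as the following scalar Kalman recursion, applied element-wise over the template. This is an illustration only: the exact forms of formulas (16)-(18) are not reproduced here, so the innovation power is estimated by exponential smoothing with an assumed forgetting factor rho, and all function and parameter names are our own.

```python
import numpy as np

def kalman_template_update(T_pred, obs, sigma_E2, sigma_MD2, sigma_MC2,
                           alpha2_smooth, lam=2.0, rho=0.9):
    """One per-pixel Kalman template update (steps 8-14). All arrays share
    the template's shape; lam is the interpolation-dependent weight of (28)
    and rho is an assumed smoothing factor standing in for formula (16)."""
    # Step 8: observation noise = weighted drift noise + camera noise, (28).
    sigma_M2 = lam * sigma_MD2 + sigma_MC2
    # Step 9: estimate the innovation power from the smoothed squared innovation.
    alpha = obs - T_pred
    alpha2_smooth = rho * alpha2_smooth + (1.0 - rho) * alpha ** 2
    # Step 10: state-transition noise; clamp negatives to zero and fold the
    # deficit back into the observation noise (cf. formulas (17)-(18)).
    sigma_S2 = alpha2_smooth - sigma_E2 - sigma_M2
    sigma_M2 = np.where(sigma_S2 < 0,
                        np.maximum(alpha2_smooth - sigma_E2, 0.0), sigma_M2)
    sigma_S2 = np.maximum(sigma_S2, 0.0)
    # Step 11: prediction-error power, (9).
    sigma_P2 = sigma_E2 + sigma_S2
    # Step 12: optimal per-pixel Kalman gain, (10).
    G = sigma_P2 / (sigma_P2 + sigma_M2 + 1e-12)
    # Step 13: estimated template, (11); step 14: its error power, (12).
    T_est = T_pred + G * alpha
    sigma_E2_new = (1.0 - G) * sigma_P2
    return T_est, sigma_E2_new, alpha2_smooth
```

Note the adaptivity this structure gives: where the drift noise power is large, σ_M² grows, the gain G shrinks, and the template pixel is barely updated; where the innovation genuinely reflects appearance change, σ_S² grows instead and the update is prompt.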
Description of drawings
Fig. 1: the true position φ(x; a_0) of template pixel x in the image frame lies within a neighborhood of the search result φ(x; â), which gives rise to the drift noise.
Fig. 2: the quantization process during the search for the optimal transformation parameter vector.
Fig. 3: comparison of how well different tracking algorithms suppress template drift. The four rows show the tracking results on four different video streams. In each row, the leftmost image is the initial frame common to all algorithms, and the next three images show, from left to right, the final tracking results of the algorithms of reference [3], reference [4], and the present invention, respectively. The current template is displayed in the lower right corner of each image. In the fourth video stream, the algorithm of reference [3] lost the target (see d2), so its tracking ended early.
Fig. 4: the original "Lake" image and a frame from the synthetic video stream. The sailboat inside the red circle in the right image is the tracked target.
Fig. 5: comparison of the computational cost of the template update filter before and after adopting the fast algorithm.
Embodiment
1. Real-scene video stream experiments
We first compared the drift-suppression performance of different tracking algorithms on a large number of real-scene video streams, in which the tracked targets exhibit varying degrees of appearance change. Because all video streams within each class of experiments yielded similar results, we present one typical stream per class, as shown in Fig. 3. Each row of Fig. 3 represents one video stream; from the first row to the fourth, the degree of target appearance change grows from none to large. We compared three algorithms: those of reference [3], reference [4], and the present invention. In each row of Fig. 3, the leftmost image is the initial frame common to all algorithms, and the next three images show, from left to right, the final results of the three algorithms. In the fourth video stream, the algorithm of reference [3] lost the target during tracking, so its tracking ended early. The lower right corner of each image shows the current template.
In all experiments we observe that the algorithm of reference [3], because it uses the first-frame template to correct the target position, exhibits almost no template drift when the target appearance changes little (see Fig. 3a2). Yet as the degree of appearance change grows, its performance degrades sharply (see Fig. 3b2, 3c2), and it may even lose the target (see Fig. 3d2), because the first-frame template is no longer valid for correcting the target position.
For the algorithm of reference [4], when the target appearance changes little, overly fast template updating causes very serious template drift (see Fig. 3a3); and as the degree of appearance change grows, the drift decreases somewhat but remains obvious (see Fig. 3b3, 3c3, 3d3). This is because that algorithm does not account for the drift noise power, so its template update filter cannot achieve a satisfactory drift-suppression effect.
For the algorithm proposed by the present invention, the template drift is always effectively suppressed regardless of the degree of target appearance change (see Fig. 3a4, 3b4, 3c4, 3d4). This is because the template update filter of our algorithm models the drift noise explicitly and can therefore adapt its template update strategy optimally to the various tracking scenes.
2. Synthetic video stream experiments
To quantitatively compare the degree of template drift produced by different algorithms, we also tested each algorithm on synthetic video streams, in addition to the extensive experiments on real-scene streams. Synthetic streams are used because the true values of the target's geometric parameters are then known in every frame, and the experimental conditions can be controlled precisely, giving deeper insight into algorithm performance.
In our experiments we generate the test video stream from the 512 × 512 standard test image "Lake" as follows: the sailboat in the original image (rows 367-447, columns 296-350) is extracted as the target, scaled, altered in appearance, pasted back onto the original image, and moved across it along a prescribed trajectory. The trajectory is a constant-speed spiral of the following form:
\begin{cases} x(n) = (10 + r \cdot n)\cos(\pi \cdot r \cdot n / 20) + 256 \\ y(n) = (10 + r \cdot n)\sin(\pi \cdot r \cdot n / 20) + 256 \end{cases}, \quad n = 0, 1, \ldots \qquad (39)
where r varies continuously so that the target's translation speed stays at 2 pixels/frame throughout. The scale of the target varies between 0.5 and 1.5 at a rate of 0.03/frame. The appearance of the target changes according to:
I'(x, n) = \begin{cases} [I(x, n) - 128] \cdot k + 128, & x \in \Omega_A \\ I(x, n), & \text{else} \end{cases}, \quad n = 0, 1, \ldots \qquad (40)
where I(x, n) and I'(x, n) denote the target pixel values before and after the appearance change, respectively; k is a time-varying parameter that sweeps between −1 and 1 at a fixed rate; Ω_A is the region whose appearance is modified, taken in turn to be the upper, lower, left, and right halves of the target, with k completing one full cycle within each part. By changing the rate of change of k, we can control the speed of the target's appearance change. The original image and a frame of the synthetic video stream are shown in Fig. 4.
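The stream generation of formulas (39) and (40) can be sketched as follows. This is an illustration only: r is held fixed here, whereas the experiment varies r per frame to keep the speed at 2 pixels/frame, the cycling of Ω_A over the target's halves is left to the caller, and the function names are our own.

```python
import numpy as np

def spiral_position(n, r=0.5):
    """Target centre on the spiral trajectory of (39) at frame n.
    The patent varies r per frame to hold the speed at 2 pixels/frame;
    r is fixed here for simplicity."""
    x = (10 + r * n) * np.cos(np.pi * r * n / 20) + 256
    y = (10 + r * n) * np.sin(np.pi * r * n / 20) + 256
    return x, y

def change_appearance(patch, k, mask):
    """Appearance change of (40): the contrast of the masked region
    Omega_A is scaled by k about the mid-gray level 128; pixels outside
    the mask are left unchanged."""
    out = patch.astype(np.float64).copy()
    out[mask] = (out[mask] - 128.0) * k + 128.0
    return out
```

With k = 1 the target is unchanged, with k = 0 the masked region collapses to uniform gray, and with k = −1 its contrast is inverted, which matches the sweep of k between −1 and 1 described above.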
First, we hold k at zero and generate a 300-frame video stream in which the target appearance remains unchanged. The tracking errors of the algorithms are listed in Table 1. Here, tracking error means the average Euclidean distance between the true coordinate transformation parameter vector of the target in each frame and the value estimated by the tracking algorithm. Columns 2 to 4 of Table 1 give the tracking errors obtained by fixing the Kalman gain of the template update filter at 0, 0.5, and 1 on all pixels; columns 5 to 8 give the tracking errors of references [3], [4], [13] and the algorithm of the present invention; Δ_L and Δ_S are the search precisions of the position and scale coordinate transformation parameters, respectively.
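The tracking-error metric defined above, the average Euclidean distance between the true and estimated coordinate transformation parameter vectors over all frames, can be computed as:

```python
import numpy as np

def tracking_error(a_true, a_est):
    """Mean Euclidean distance between true and estimated parameter
    vectors; a_true and a_est have shape (num_frames, num_params)."""
    diff = np.asarray(a_true) - np.asarray(a_est)
    return float(np.mean(np.linalg.norm(diff, axis=1)))
```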
As Table 1 shows, fixing the Kalman gain at zero gives the smallest tracking error. This matches expectation: when the target appearance is constant, the best update strategy is simply never to update the template. The tracking error of the algorithm of reference [3] is very close to this minimum, because in this case the first-frame template always corrects the target position effectively. Although the target appearance stays constant in this experiment, we observe that when the target scale is small (so that a unit area contains more detail), much larger innovations arise than at other times. Clearly these innovations are not caused by appearance change itself but stem from the drift noise introduced by matching error. The algorithm of the present invention correctly identifies the source of these innovations and automatically raises the observation noise power (rather than the state transition noise power), keeping the Kalman gain of every pixel of the template update filter at a very low level; consequently its tracking error stays close to the minimum at all search precisions, with drift barely visible to the naked eye. The algorithm of reference [4], that of reference [13], and the method fixing the Kalman gain at 0.5 all show larger tracking errors, because their template update strategies are not optimal. The largest tracking error occurs with the gain fixed at 1: updating the template wholesale in every frame causes severe template drift.
To observe the tracking errors when the target appearance changes, we let k fluctuate at a constant rate and generate a 300-frame test video stream. The results are listed in Table 2. In this experiment, the strategy of keeping the template fixed and the algorithm of reference [3] both show greatly increased tracking errors; the method fixing the Kalman gain at 1 still produces very large template drift; and among the remaining algorithms, the algorithm of the present invention achieves a much smaller tracking error than all others at every search precision.
These experiments show that whether or not the target appearance changes, the algorithm of the present invention updates the template pixels in an optimal manner at the appropriate time and position, according to the appearance change and the search precision, so that the template is updated promptly without being over-updated; this effectively reduces the tracking error and suppresses template drift.
3. Effect of the fast algorithm
Fig. 5 shows the number of multiply-accumulate (MAC) operations required for one template update before and after adopting the fast algorithm. The x-axis is the mean estimation error of the observation noise power over all template pixels; the exact value of the observation noise power is obtained by using very small summation cells in formula (25). By varying the cell sizes of formulas (25) and (30), we obtain a series of estimation errors of the observation noise power, and the corresponding computational costs, with and without the fast algorithm. As Fig. 5 shows, without the fast algorithm the computation grows sharply as the estimation error decreases, whereas with the fast algorithm the growth is very slow.
Table 1: comparison of the tracking errors of different algorithms when the target appearance is fixed
k=1 G=0 G=0.5 G=1 [3] [4] [13] The present invention
Δ L=1,Δ S=0.05 0.0125 0.0557 0.2666 0.0125 0.0443 0.3222 0.0137
Δ L=1,Δ S=0.08 0.0207 0.1116 0.1057 0.0207 0.0574 0.1057 0.0238
Δ L=2,Δ S=0.05 0.7481 3.8261 49.878 0.7973 1.0834 1.5804 1.0356
Δ L=2,Δ S=0.08 0.6982 2.0403 60.996 0.7830 1.7115 2.5966 1.3418
The data in the table are the average Euclidean distance between the true coordinate transformation parameter vector of the target in each frame and the value estimated by the tracking algorithm.
Table 2: comparison of the tracking errors of different algorithms when the target appearance changes
|Δk/Δn|=0.01 G=0 G=0.5 G=1 [3] [4] [13] The present invention
Δ L=1,Δ S=0.05 7.3143 0.0607 0.2666 0.3357 0.0599 0.1510 0.0146
Δ L=1,Δ S=0.08 7.3981 0.1350 0.7376 0.2164 0.0571 0.8215 0.0238
Δ L=2,Δ S=0.05 16.324 2.0436 57.152 17.192 1.7046 1.3896 1.2039
Δ L=2,Δ S=0.08 19.243 2.6870 12.120 13.020 1.7864 3.5415 1.2619
The data in the table are the average Euclidean distance between the true coordinate transformation parameter vector of the target in each frame and the value estimated by the tracking algorithm.

Claims (8)

1. A target tracking algorithm that effectively suppresses template drift, characterized in that the template is updated with a Kalman filter, and the influence of template drift is explicitly incorporated into the Kalman filter so as to effectively suppress the occurrence of template drift.
2. The target tracking algorithm for effectively suppressing template drift according to claim 1, characterized in that the influence of template drift noise is quantitatively expressed as an error power, namely the drift noise power σ_MD².
3. The target tracking algorithm for effectively suppressing template drift according to claim 1 or 2, characterized in that the drift noise power σ_MD² serves as a component of the observation noise power σ_M² in the Kalman template update filter.
4. The target tracking algorithm for effectively suppressing template drift according to claim 1, characterized in that the camera noise power σ_MC² serves as another component of the observation noise power σ_M² in the Kalman template update filter.
5. The target tracking algorithm for effectively suppressing template drift according to any one of claims 1 to 4, characterized in that the observation noise power σ_M² in the Kalman template update filter is obtained as the weighted sum σ_M² = λσ_MD² + σ_MC², where λ is a constant.
6. The target tracking algorithm for effectively suppressing template drift according to claim 2 or 3, characterized in that the drift noise power σ_MD² is quantitatively analyzed through the quantization error of the coordinate transformation coefficients and the probability distribution of the true pixel values.
7. The target tracking algorithm for effectively suppressing template drift according to claim 4, characterized in that when the camera noise power parameter is unknown, the camera noise power σ_MC² is estimated by computing the pixel variance of a test image produced from a uniform gray background.
8. The target tracking algorithm for effectively suppressing template drift according to claim 6, characterized in that when the target motion can be described by translation and scaling alone, a fast algorithm is adopted to compute the drift noise power σ_MD²: the summation terms whose transformation parameter vectors map a pixel back to the same transformed coordinates are merged into one, converting the m-dimensional summation into a two-dimensional one and thereby greatly reducing the computation.
CN200710045939XA 2007-09-13 2007-09-13 Target tracking algorism for effectively suppressing template drift Expired - Fee Related CN101127120B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200710045939XA CN101127120B (en) 2007-09-13 2007-09-13 Target tracking algorism for effectively suppressing template drift

Publications (2)

Publication Number Publication Date
CN101127120A true CN101127120A (en) 2008-02-20
CN101127120B CN101127120B (en) 2011-02-09

Family

ID=39095144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200710045939XA Expired - Fee Related CN101127120B (en) 2007-09-13 2007-09-13 Target tracking algorism for effectively suppressing template drift

Country Status (1)

Country Link
CN (1) CN101127120B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6917692B1 (en) * 1999-05-25 2005-07-12 Thomson Licensing S.A. Kalman tracking of color objects
DE60234649D1 (en) * 2002-12-26 2010-01-14 Mitsubishi Electric Corp IMAGE PROCESSOR
CN100359536C (en) * 2006-03-02 2008-01-02 复旦大学 Image tracking algorithm based on adaptive Kalman initial searching position selection

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853511A (en) * 2010-05-17 2010-10-06 哈尔滨工程大学 Anti-shelter target trajectory predicting and tracking method
CN101853511B (en) * 2010-05-17 2012-07-11 哈尔滨工程大学 Anti-shelter target trajectory predicting and tracking method
CN103500455A (en) * 2013-10-15 2014-01-08 北京航空航天大学 Improved maneuvering target tracking method based on unbiased finite impulse response (UFIR) filter
CN103500455B (en) * 2013-10-15 2016-05-11 北京航空航天大学 A kind of improvement maneuvering target tracking method based on without inclined to one side finite impulse response filter (UFIR)
CN107818330A (en) * 2016-09-12 2018-03-20 波音公司 The system and method that space filtering is carried out using the data with extensive different error sizes

Also Published As

Publication number Publication date
CN101127120B (en) 2011-02-09


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: SHANGHAI LONGDONG PHOTOELECTRON CO., LTD.

Free format text: FORMER OWNER: FUDAN UNIVERSITY

Effective date: 20130325

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 200433 YANGPU, SHANGHAI TO: 201613 SONGJIANG, SHANGHAI

TR01 Transfer of patent right

Effective date of registration: 20130325

Address after: 201613, No. 88 Rong Yue Dong Road, Shanghai, Songjiang District

Patentee after: Shanghai Longdong Optoelectronic Co., Ltd.

Address before: 220 Handan Road, Shanghai, No. 200433

Patentee before: Fudan University

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110209

Termination date: 20190913

CF01 Termination of patent right due to non-payment of annual fee