CN102110296A - Method for tracking moving target in complex scene - Google Patents

Method for tracking moving target in complex scene

Info

Publication number
CN102110296A
CN102110296A CN201110043782A CN 201110043782
Authority
CN
China
Prior art keywords
target
color
probability
background
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201110043782
Other languages
Chinese (zh)
Inventor
汪东
谢少荣
李恒宇
缪金松
郭其明
徐元玉
李超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN 201110043782 priority Critical patent/CN102110296A/en
Publication of CN102110296A publication Critical patent/CN102110296A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a method for tracking a moving target in a complex scene. The method comprises the following steps: computing a background-weighted statistical histogram of the H component over an initially designated image region; for each image of the video stream during tracking, building an adaptively background-updated colour probability distribution map with the Bayes formula, updating in real time the probability that colours in the search region belong to the target, and iterating the Camshift algorithm on the updated colour probability distribution map to obtain the centroid position; and performing motion prediction with a greedy prediction method during tracking, continuously repeating the tracking steps. The method alleviates the problem that tracking of a moving target in a complex scene is often unsatisfactory, and offers better accuracy and robustness.

Description

Method for tracking a moving target in a complex scene
Technical field
The present invention relates to a target tracking method in the field of image processing, and in particular to a method for tracking a moving target in a complex scene.
Background technology
Research on target tracking and its applications is an important branch of computer vision, widely used in fields such as missile guidance, traffic intersection monitoring, aerospace, security surveillance, and sports. The main task of target tracking is to extract characteristic information, including position, shape, size, contour, or colour, from a video stream and to complete the tracking of the target on the basis of this information.
Commonly used moving-target tracking methods currently fall into three classes: methods based on motion analysis, methods based on image feature matching, and methods based on colour feature distribution.
Methods based on motion analysis include the inter-frame difference method and the optical flow method. The inter-frame difference method detects a moving target mainly from the changes of pixel values between consecutive frames; the algorithm is simple, but it is rather sensitive to changes of illumination and has poor anti-interference capability. The optical flow method realizes tracking by computing the motion vectors of pixels, but its time complexity is high, so it is difficult to use for real-time tracking of a moving target in a complex scene.
Methods based on image feature matching include the feature-point detection operator based on the grey-level autocorrelation function of the image proposed by Moravec, and model matching methods such as the Snake active contour model proposed by Kass et al. These algorithms are computationally heavy and do not adapt well to target occlusion or deformation.
Methods based on colour feature distribution: the Camshift algorithm, for example, is a colour-space target tracking algorithm that copes well with changes of illumination intensity and is insensitive to translation, rotation, and partial occlusion of the target. In some complex scenes, however, especially when the background colour is similar to the target colour, the tracking result can be unsatisfactory and the tracked target may even be lost.
How to track a designated target robustly, effectively, and in real time in a video sequence captured in a complex scene has always been a key research topic of target tracking.
Summary of the invention
In view of this situation, the purpose of the present invention is to provide a method for tracking a moving target in a complex scene that better solves the problem of dynamically tracking a moving object in a complex scene.
To achieve the above purpose, the concept of the present invention is as follows: a background-weighted histogram of the H component is computed over the target region designated in the initial video image; during tracking, an adaptively background-updated colour probability distribution map is built for the current frame, and the Camshift algorithm is iterated on it to obtain the centroid position. In this way the probability that a colour in the target search region belongs to the target is updated in real time, which reduces the influence of the background colour on tracking. At the same time, a greedy prediction method is used to predict the target motion, which strengthens the accuracy and robustness of tracking.
According to the above inventive concept, the present invention adopts the following technical scheme:
A method for tracking a moving target in a complex scene, characterized in that the operation steps are as follows:
(1) Build the H-component background-weighted histogram of the target region
For the image of the region in which the target is designated in the initial video image, convert the RGB colour space to the HSV colour space, weight each pixel according to its distance from the centre of the target rectangle, and compute the weighted histogram of the H component; this H-component background-weighted histogram is stored as a look-up table serving as the reference model. The look-up table expresses the colour information of the target and directly affects the accuracy of tracking.
(2) Build the adaptively background-updated colour probability distribution map
Analyse the target region in the current video frame; using the Bayes formula, combine the H component of the current colour histogram with the H-component background-weighted look-up table from step (1) to obtain the probability that each H-component colour belongs to the target, and build the updated colour probability distribution map of the rectangular region.
(3) Iterate the Camshift tracking algorithm to obtain the centroid position
For the updated colour probability distribution map within the search region, use the Camshift algorithm to compute the centroid of the map. Repeat steps (2) and (3) until the centroid position converges.
(4) Predict the target motion with a greedy prediction method
A greedy prediction method is used to predict the target motion. Because the time between two adjacent frames of video processing is short, the motion state of the target changes little, i.e. its velocity varies only slightly. The greedy prediction method assumes that the moving target moves at a constant velocity over two adjacent frames to predict the target coordinates, then uses the actual target position as a reference to compute an error compensation, and finally predicts the coordinate position of the target in the next frame.
The H-component background-weighted histogram of the target region in step (1) is built as follows:
1. Colour space conversion: the search window designated at initialization in the video sequence captured by the camera is converted from the RGB colour space to the HSV colour space, which reduces the influence of illumination brightness on tracking. The H parameter represents the colour information, i.e. the position in the colour spectrum; it is expressed as an angle, with red, green, and blue separated by 120 degrees and complementary colours differing by 180 degrees.
2. Compute the background-weighted histogram of the H component: when a rectangular frame is used to designate the initial target, part of the background colour is usually included. The weighted-background histogram gives the background colours smaller weights, which reduces their influence on tracking. Since the background colours mostly lie near the edge of the rectangle, every point in the rectangular region is given a weight whose size depends on its distance from the centre point and is regulated by a Gaussian kernel function. The H-component background-weighted histogram is computed according to formula (1):

q_u = C \sum_{i=1}^{n} k\!\left( \left\| \frac{x_i - O}{h} \right\|^2 \right) \delta\!\left[ b(x_i) - u \right], \qquad \sum_{u} q_u = 1    (1)

where u is the index of a colour level of the H component, q_u is the probability that a colour whose H component equals u belongs to the target, δ is the Delta function, C is a normalization constant chosen so that the q_u sum to 1, x_i denotes a pixel in the rectangular region, O denotes the centre point (O_x, O_y)^T of the target rectangle, h denotes the radius of the rectangular frame, and b(x_i) denotes the colour-level index of the H component of pixel x_i inside the target rectangle. k(x) is a kernel function: a monotonically decreasing profile with its maximum at the centre is chosen as the weight of a pixel with respect to the target centre, so the farther a pixel is from the centre, the smaller its weight, and vice versa. This reduces the influence of the background colour on the target histogram. The H-component background-weighted histogram built over the region is stored as a reference table serving as the reference model.
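A minimal Python sketch of this step (using OpenCV and NumPy) is given below; the function name weighted_h_histogram, the number of H bins, and the Gaussian profile used for k(x) are illustrative assumptions, not part of the patent text.

```python
import cv2
import numpy as np

def weighted_h_histogram(frame_bgr, rect, n_bins=16):
    """Kernel-weighted histogram of the H component inside the target rectangle.

    Pixels near the rectangle centre get large weights, pixels near the edge
    (more likely background) get small weights, as in formula (1).
    """
    x, y, w, h = rect
    roi = frame_bgr[y:y + h, x:x + w]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0].astype(np.int32)      # OpenCV hue range is 0..179
    bins = hue * n_bins // 180                # b(x_i): colour-level index per pixel

    # Normalised squared distance of every pixel from the rectangle centre.
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    radius = np.hypot(cx, cy)
    d2 = ((xs - cx) ** 2 + (ys - cy) ** 2) / (radius ** 2)

    # Monotonically decreasing kernel profile k(x); a Gaussian is assumed here.
    weights = np.exp(-0.5 * d2 / 0.3)

    hist = np.zeros(n_bins, dtype=np.float64)
    np.add.at(hist, bins.ravel(), weights.ravel())
    return hist / hist.sum()                  # normalisation constant C

# Example (hypothetical file name and rectangle):
# frame0 = cv2.imread("frame0.png")
# q_u = weighted_h_histogram(frame0, (120, 80, 60, 90))
```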
The adaptively background-updated colour probability distribution map in step (2) is built as follows:
For the colour features in each search box, the probability of the target colour is recomputed. The Bayes formula (2) expresses the probability density of the target colour within the search box:

P(O|C) = \frac{P(C|O)\,P(O)}{P(C)}    (2)

where P(O|C) is the probability that a pixel of colour C belongs to the target O, P(C|O) is the probability of the H-component colour C within the tracked target O, and P(O) and P(C) are, respectively, the prior probability of the target (the fraction of the search-region area it occupies) and the probability of colour C in the whole search region. The denominator of formula (2) can be expanded as formula (3):

P(C) = P(C|O)\,P(O) + P(C|B)\,P(B)    (3)

where P(B) denotes the probability of the background within the search region. The search region is set to s times the target region, so P(O) = 1/s and P(B) = (s−1)/s. After each move of the search window, the colour histogram of all pixels in the search region is rebuilt and the distribution P(C) is updated, so the denominator of the Bayes formula is updated in real time. When the target enters a background area whose colour is close to some colour C_0 of the target, the probability density of C_0 in the search region, i.e. the denominator of formula (2), increases, while the numerator P(C_0|O)P(O), the prior probability that the colour belongs to the target, remains constant, so the value of P(O|C_0) decreases. Mapping the H component of each pixel through the reference table of step (1) gives the probability that the pixel belongs to the target, from which the updated colour probability distribution map is built. By continually updating the probability that colours in the search region belong to the target, the probability of target colours that are close to the background is reduced adaptively in the distribution map, which effectively solves the tracking problem in a complex scene.
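The following Python sketch illustrates how formulas (2) and (3) can be turned into a per-pixel probability map; the helper name, the search-region scale s, and the bin count are illustrative assumptions, and q_target is the reference histogram from the previous sketch.

```python
import cv2
import numpy as np

def backproject_bayes(frame_bgr, search_rect, q_target, s=4.0, n_bins=16):
    """Per-pixel probability P(O|C) over the search region, formulas (2)-(3).

    q_target is the H-component reference histogram P(C|O) from step (1).
    P(C) is re-estimated from the current search region every time the window
    moves, so the denominator of the Bayes formula adapts to the background.
    """
    x, y, w, h = search_rect
    roi = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    bins = roi[:, :, 0].astype(np.int32) * n_bins // 180   # b(x_i) per pixel

    p_o = 1.0 / s                 # P(O) = 1/s: search region is s times the target
    # P(C): colour histogram of all pixels currently inside the search region,
    # i.e. P(C|O)P(O) + P(C|B)P(B) of formula (3), estimated empirically.
    p_c = np.bincount(bins.ravel(), minlength=n_bins).astype(np.float64)
    p_c /= max(p_c.sum(), 1.0)

    p_o_given_c = (q_target * p_o) / np.maximum(p_c, 1e-6)  # formula (2)
    p_o_given_c = np.clip(p_o_given_c, 0.0, 1.0)
    return p_o_given_c[bins]      # probability map of the search region
```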
The Camshift iteration in step (3) to obtain the centroid position is carried out as follows:
For the adaptively background-updated colour probability distribution map, the Camshift algorithm computes the centroid position of the map inside the search box. The moments and centroid are computed by formulas (4)–(7):

Zeroth-order moment:
Z_{00} = \sum_x \sum_y I(x, y)    (4)

First-order moments:
Z_{10} = \sum_x \sum_y x\,I(x, y)    (5)
Z_{01} = \sum_x \sum_y y\,I(x, y)    (6)

Centroid (x_c, y_c):
x_c = \frac{Z_{10}}{Z_{00}}, \qquad y_c = \frac{Z_{01}}{Z_{00}}    (7)

The centroid (x_c, y_c) of the moving region is thus computed by the moment method, where Z_{00} is the zeroth-order moment, Z_{10} and Z_{01} are the first-order moments, and I(x, y) is the colour probability of the adaptive-background distribution map at pixel (x, y). The centre of the search box is moved to this centroid, and the iteration continues until the distance between two successive centroid positions is smaller than a preset threshold.
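A sketch of the centroid iteration of formulas (4)–(7) follows; in practice OpenCV's cv2.CamShift performs a comparable mean-shift iteration on a back-projection, but the explicit moment computation is shown here to match the formulas. Window-size adaptation is omitted, and the function name and stopping parameters are assumptions.

```python
import numpy as np

def camshift_centroid(prob_map, window, eps=1.0, max_iter=20):
    """Move the search window to the centroid of the probability map,
    formulas (4)-(7), until the shift between iterations is below eps."""
    x, y, w, h = window
    for _ in range(max_iter):
        patch = prob_map[y:y + h, x:x + w]
        z00 = patch.sum()                          # zeroth-order moment, formula (4)
        if z00 <= 0:
            break
        ys, xs = np.mgrid[0:h, 0:w]
        z10 = (xs * patch).sum()                   # first-order moments, (5)-(6)
        z01 = (ys * patch).sum()
        xc, yc = z10 / z00, z01 / z00              # centroid, formula (7)

        new_x = int(round(x + xc - w / 2.0))
        new_y = int(round(y + yc - h / 2.0))
        new_x = min(max(new_x, 0), prob_map.shape[1] - w)
        new_y = min(max(new_y, 0), prob_map.shape[0] - h)
        if np.hypot(new_x - x, new_y - y) < eps:   # convergence test on the shift
            x, y = new_x, new_y
            break
        x, y = new_x, new_y
    return (x, y, w, h)
```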
The greedy prediction of the target motion in step (4) is carried out as follows:
The greedy prediction algorithm does not pursue the optimal solution; it only computes a reasonably satisfactory one. Because the time between two adjacent frames of video processing is very short, the motion state of the target changes little, i.e. its velocity varies only slightly. The greedy prediction algorithm first assumes that the moving target moves at a constant velocity over two adjacent frames to predict the target coordinates, then uses the actual target position as a reference to compute an error compensation, which corrects the predicted position of the next frame. The steps of the greedy prediction algorithm are as follows:
1. Take the actual coordinates of the target in three adjacent frames, (x_{-2}, y_{-2}), (x_{-1}, y_{-1}), (x_0, y_0), where (x_0, y_0) is the current frame, (x_{-1}, y_{-1}) the previous frame, and (x_{-2}, y_{-2}) the frame before that;
2. Compute the difference between the current frame and the previous frame: Δx = x_0 − x_{-1}, Δy = y_0 − y_{-1};
3. Compute the error compensation: ρ(x) = (x_0 − x_0') + (x_0 + x_{-2} − 2x_{-1}), ρ(y) = (y_0 − y_0') + (y_0 + y_{-2} − 2y_{-1}), where (x_0', y_0') is the coordinate that was predicted for the current frame. The error consists of two parts: the first part corrects the difference between the predicted value and the actual value, and the second part corrects for the change of velocity between adjacent frames, treating the motion as uniformly varying;
4. Predict the coordinate of the target position in the next frame and use it as the centroid of the search box of the next frame: x_1' = x_0 + Δx + ρ(x), y_1' = y_0 + Δy + ρ(y).
The coordinate points are then advanced by one frame and steps 1–4 are repeated, so the position of the target can be predicted simply and relatively accurately.
The greedy prediction algorithm needs only the actual centroid positions of the moving target in the previous three frames to predict the position of the target in the next frame relatively accurately. The algorithm is simple, and its small computational cost greatly improves the accuracy and real-time performance of tracking.
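A minimal Python sketch of the greedy predictor follows; the class name and the way the first two frames are seeded are illustrative assumptions.

```python
class GreedyPredictor:
    """Constant-velocity prediction with the error compensation of step 3."""

    def __init__(self):
        self.history = []             # actual centroids (x, y) of recent frames
        self.last_prediction = None   # (x0', y0'): what was predicted for the current frame

    def update_and_predict(self, actual_xy):
        """Record the measured centroid of the current frame, return the prediction for the next."""
        self.history = (self.history + [actual_xy])[-3:]
        if len(self.history) < 3:
            return actual_xy          # not enough frames yet; no prediction possible

        (x_2, y_2), (x_1, y_1), (x0, y0) = self.history
        dx, dy = x0 - x_1, y0 - y_1   # step 2: frame-to-frame difference

        # Step 3: error compensation rho = (actual - predicted) + velocity-change term.
        px, py = self.last_prediction if self.last_prediction else (x0, y0)
        rho_x = (x0 - px) + (x0 + x_2 - 2 * x_1)
        rho_y = (y0 - py) + (y0 + y_2 - 2 * y_1)

        # Step 4: predicted centroid of the next frame.
        pred = (x0 + dx + rho_x, y0 + dy + rho_y)
        self.last_prediction = pred
        return pred
```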
The present invention has the following outstanding features and notable advantages. The background-weighting technique is used to compute the background-weighted histogram of the H component over the initial target region, which reduces the influence of the background surrounding the target inside the initially designated rectangle on the tracking process. During tracking, the colour probabilities are updated in real time by building the adaptively background-updated colour probability distribution map, which handles tracking against a complex background well and in particular addresses the unsatisfactory tracking that occurs when the background colour is close to the target colour; in addition, a greedy prediction method is added for motion prediction. The invention strengthens the accuracy and robustness of tracking and improves the success rate of tracking a moving target in a complex environment.
Description of drawings
Fig. 1 is the implementation flowchart of the embodiment of the present invention;
Fig. 2 is the implementation flowchart of obtaining the centroid position by iterating the Camshift tracking algorithm in the embodiment of the present invention.
Embodiment
To make the purpose, technical solutions, and advantages of the present invention clearer, an embodiment of the present invention is described below with reference to the accompanying drawings.
Fig. 1 is the flowchart of the target tracking method in the embodiment of the present invention. As shown in Fig. 1, the flow comprises the following steps:
1. Video acquisition and input of the image/video stream: receive the video image input of the camera together with the designated coordinates and region of the initial target. In this example a rectangular frame is used to represent the position coordinates and region of the target. Every feature point in this region serves as the basis of the subsequent tracking algorithm.
2. Build the H-component background-weighted histogram of the target region: convert the colour space of the designated region from RGB to HSV. To reduce the influence of the background colour on the target histogram, the weighted-histogram approach is used: every point in the rectangular region is given a weight whose size depends on its distance from the centre point and is regulated by a Gaussian kernel function. The H-component background-weighted histogram is computed according to formula (1):

q_u = C \sum_{i=1}^{n} k\!\left( \left\| \frac{x_i - O}{h} \right\|^2 \right) \delta\!\left[ b(x_i) - u \right], \qquad \sum_{u} q_u = 1    (1)

where u is the index of a colour level of the H component, q_u is the probability that a colour whose H component equals u belongs to the target, δ is the Delta function, C is a normalization constant chosen so that the q_u sum to 1, x_i denotes a pixel in the rectangular region, O denotes the centre point (O_x, O_y)^T of the target rectangle, h denotes the radius of the rectangular frame, and b(x_i) denotes the colour-level index of the H component of pixel x_i inside the target rectangle. k(x) is a kernel function: a monotonically decreasing profile with its maximum at the centre is chosen as the weight of a pixel with respect to the target centre, so the farther a pixel is from the centre, the smaller its weight, and vice versa. This reduces the influence of the background colour on the target histogram. The H-component background-weighted histogram built over the region is stored as a reference table serving as the reference model.
3. Build the adaptively background-updated colour probability distribution map: for the colour features in each search box, the probability of the target colour is recomputed. The Bayes formula (2) expresses the probability density of the target colour within the search box:

P(O|C) = \frac{P(C|O)\,P(O)}{P(C)}    (2)

where P(O|C) is the probability that a pixel of colour C belongs to the target O, P(C|O) is the probability of the H-component colour C within the tracked target O, and P(O) and P(C) are, respectively, the prior probability of the target (the fraction of the search-region area it occupies) and the probability of colour C in the whole search region. The denominator of formula (2) can be expanded as formula (3):

P(C) = P(C|O)\,P(O) + P(C|B)\,P(B)    (3)

where P(B) denotes the probability of the background within the search region. The search region is set to s times the target region, so P(O) = 1/s and P(B) = (s−1)/s. After each move of the search window, the colour histogram of all pixels in the search region is rebuilt and the distribution P(C) is updated, so the denominator of the Bayes formula is updated in real time. When the target enters a background area whose colour is close to some colour C_0 of the target, the probability density of C_0 in the search region, i.e. the denominator of formula (2), increases, while the numerator P(C_0|O)P(O), the prior probability that the colour belongs to the target, remains constant, so the value of P(O|C_0) decreases. Mapping the H component of each pixel through the reference table built in step 2 gives the probability that the pixel belongs to the target, from which the updated colour probability distribution map is built.
4. Iterate the Camshift tracking algorithm to obtain the centroid position: run the Camshift algorithm on the updated colour probability distribution map of the video frame computed in step 3 and compute the centroid position of the map inside the search box. The moments and centroid are computed by formulas (5)–(8):

Zeroth-order moment:
Z_{00} = \sum_x \sum_y I(x, y)    (5)

First-order moments:
Z_{10} = \sum_x \sum_y x\,I(x, y)    (6)
Z_{01} = \sum_x \sum_y y\,I(x, y)    (7)

Centroid (x_c, y_c):
x_c = \frac{Z_{10}}{Z_{00}}, \qquad y_c = \frac{Z_{01}}{Z_{00}}    (8)

The centroid (x_c, y_c) of the moving region is computed by the moment method, formulas (5)–(8), where Z_{00} is the zeroth-order moment, Z_{10} and Z_{01} are the first-order moments, and I(x, y) is the colour probability of the adaptively background-updated colour probability distribution map computed in step 3. The centre of the search box is moved to this centroid, and the iteration continues until the distance between two successive centroid positions is smaller than a preset threshold.
5. Predict the target motion with a greedy prediction method: this algorithm does not pursue the optimal solution; it only computes a reasonably satisfactory one. Because the time between two adjacent frames of video processing is very short, the motion state of the target changes little, i.e. its velocity varies only slightly. The greedy prediction algorithm first assumes that the moving target moves at a constant velocity over two adjacent frames to predict the target coordinates, then uses the actual target position as a reference to compute an error compensation, which corrects the predicted position of the next frame. The steps of the greedy prediction algorithm are as follows:
1. Take the actual coordinates of the target in three adjacent frames, (x_{-2}, y_{-2}), (x_{-1}, y_{-1}), (x_0, y_0), where (x_0, y_0) is the current frame, (x_{-1}, y_{-1}) the previous frame, and (x_{-2}, y_{-2}) the frame before that;
2. Compute the difference between the current frame and the previous frame: Δx = x_0 − x_{-1}, Δy = y_0 − y_{-1};
3. Compute the error compensation: ρ(x) = (x_0 − x_0') + (x_0 + x_{-2} − 2x_{-1}), ρ(y) = (y_0 − y_0') + (y_0 + y_{-2} − 2y_{-1}), where (x_0', y_0') is the coordinate that was predicted for the current frame. The error consists of two parts: the first part corrects the difference between the predicted value and the actual value, and the second part corrects for the change of velocity between adjacent frames, treating the motion as uniformly varying;
4. Predict the coordinate of the target position in the next frame and use it as the centroid of the search box of the next frame: x_1' = x_0 + Δx + ρ(x), y_1' = y_0 + Δy + ρ(y).
The coordinate points are then advanced by one frame and steps 1–4 are repeated, so the position of the target can be predicted simply and relatively accurately.
6. Use the centroid predicted in step 5 as the centre of the tracking frame of the target region in the next frame, and repeat steps 2 to 5, thereby completing the moving-target tracking process.
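Putting the pieces together, a hedged end-to-end sketch of the tracking loop of Fig. 1 might look as follows; the video source, the initial rectangle, the search-region enlargement, and the helper functions from the earlier sketches (weighted_h_histogram, backproject_bayes, camshift_centroid, GreedyPredictor) are assumptions for illustration, not the patent's reference implementation.

```python
import cv2

def track(video_path, init_rect, s=4.0):
    """End-to-end loop covering steps 1-6 of the embodiment, using the sketches above."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("cannot read the first frame")

    q_target = weighted_h_histogram(frame, init_rect)   # step 2: reference model
    x, y, w, h = init_rect
    predictor = GreedyPredictor()

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Search region: the target window enlarged by sqrt(s) around its centre.
        k = s ** 0.5
        sw, sh = int(w * k), int(h * k)
        sx = max(0, min(x + w // 2 - sw // 2, frame.shape[1] - sw))
        sy = max(0, min(y + h // 2 - sh // 2, frame.shape[0] - sh))

        # Steps 3-4: adaptive probability map and Camshift centroid iteration.
        prob = backproject_bayes(frame, (sx, sy, sw, sh), q_target, s=s)
        wx, wy, _, _ = camshift_centroid(prob, ((sw - w) // 2, (sh - h) // 2, w, h))
        x, y = sx + wx, sy + wy                          # window back in image coordinates

        # Steps 5-6: greedy prediction of the next-frame centroid seeds the next search.
        cx, cy = predictor.update_and_predict((x + w / 2.0, y + h / 2.0))
        x = max(0, min(int(round(cx - w / 2.0)), frame.shape[1] - w))
        y = max(0, min(int(round(cy - h / 2.0)), frame.shape[0] - h))

        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:                  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()

# Example (hypothetical path and initial rectangle):
# track("scene.mp4", (120, 80, 60, 90))
```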

Claims (5)

1. A method for tracking a moving target in a complex scene, characterized in that the operation steps are as follows:
(1) Build the H-component background-weighted histogram of the target region: convert the image inside the target rectangle of the initial video image from the RGB colour space to the HSV colour space; weight each pixel according to its distance from the centre of the target rectangle, compute the background-weighted histogram of the H component, and store this H-component background-weighted histogram as a look-up table serving as the reference model;
(2) Build the adaptively background-updated colour probability distribution map: analyse the target region in the current video frame; using the Bayes formula, combine the H component of the current colour histogram with the H-component background-weighted look-up table from step (1) to obtain the probability that each H-component colour belongs to the target, and build the updated colour probability distribution map of the rectangular region;
(3) Iterate the Camshift tracking algorithm to obtain the centroid position: for the updated colour probability distribution map within the search region, use the Camshift algorithm to compute the centroid of the map; repeat steps (2) and (3) until the centroid position converges;
(4) Predict the target motion with a greedy prediction method: the greedy prediction method assumes that the moving target moves at a constant velocity over two adjacent frames to predict the target coordinates, then uses the actual target position as a reference to compute an error compensation, and finally predicts the coordinate position of the target in the next frame;
(5) Use the coordinate position predicted in step (4) as the centre of the tracking frame of the target region in the next frame, and repeat steps (2) to (4), thereby realizing the tracking of the moving target.
2. The method for tracking a moving target in a complex scene according to claim 1, characterized in that the H-component background-weighted histogram of the target region in step (1) is built as follows:
1. the rectangular search window designated at initialization in the video sequence captured by the camera, i.e. the position and region of the designated target object, is converted from the RGB colour space to the HSV colour space;
2. the weighted-background histogram is used to reduce the influence of the background colour on tracking: since the background colours mostly lie near the edge of the rectangle, every point in the rectangular region is given a weight whose size depends on its distance from the centre point and is regulated by a Gaussian kernel function; the background-weighted histogram is computed according to formula (1):

q_u = C \sum_{i=1}^{n} k\!\left( \left\| \frac{x_i - O}{h} \right\|^2 \right) \delta\!\left[ b(x_i) - u \right], \qquad \sum_{u} q_u = 1    (1)

where u is the index of a colour level of the H component, q_u is the probability that a colour whose H component equals u belongs to the target, δ is the Delta function, C is a normalization constant chosen so that the q_u sum to 1, x_i denotes a pixel in the rectangular region, O denotes the centre point (O_x, O_y)^T of the target rectangle, h denotes the radius of the rectangular frame, and b(x_i) denotes the colour-level index of the H component of pixel x_i inside the target rectangle; k(x) is a kernel function, a monotonically decreasing profile with its maximum at the centre being chosen as the weight of a pixel with respect to the target centre; the H-component background-weighted histogram is built over the region and stored as a reference table serving as the reference model.
3. The method for tracking a moving target in a complex scene according to claim 1, characterized in that the adaptively background-updated colour probability distribution map in step (2) is built as follows: for the colour features in each search box, the probability of the target colour is recomputed; the Bayes formula (2) expresses the probability density of the target colour within the search box:

P(O|C) = \frac{P(C|O)\,P(O)}{P(C)}    (2)

where P(O|C) is the probability that a pixel of colour C belongs to the target O; P(C|O) is the probability of the H-component colour C within the tracked target O; P(O) and P(C) are, respectively, the prior probability of the target (the fraction of the search-region area it occupies) and the probability of colour C in the whole search region; the denominator of formula (2) can be expanded as formula (3):

P(C) = P(C|O)\,P(O) + P(C|B)\,P(B)    (3)

where P(B) denotes the probability of the background within the search region; the search region is set to s times the target region, so P(O) = 1/s and P(B) = (s−1)/s; after each move of the search window, the colour histogram of all pixels in the search region is rebuilt and the distribution P(C) is updated, so the denominator of the Bayes formula is updated in real time; when the target enters a background area whose colour is close to some colour C_0 of the target, the probability density of C_0 in the search region, i.e. the denominator of formula (2), increases, while the numerator P(C_0|O)P(O), the prior probability that the colour belongs to the target, remains constant, so the value of P(O|C_0) decreases; mapping the H component of each pixel through the reference table of step (1) gives the probability that the pixel belongs to the target, from which the updated colour probability distribution map is built.
4. The method for tracking a moving target in a complex scene according to claim 1, characterized in that the Camshift iteration in step (3) to obtain the centroid position is carried out as follows: for the adaptively background-updated colour probability distribution map, the Camshift algorithm computes the centroid position of the map inside the search box; the moments and centroid are computed by formulas (4)–(7):

Zeroth-order moment:
Z_{00} = \sum_x \sum_y I(x, y)    (4)

First-order moments:
Z_{10} = \sum_x \sum_y x\,I(x, y)    (5)
Z_{01} = \sum_x \sum_y y\,I(x, y)    (6)

Centroid (x_c, y_c):
x_c = \frac{Z_{10}}{Z_{00}}, \qquad y_c = \frac{Z_{01}}{Z_{00}}    (7)

The centroid (x_c, y_c) of the moving region is computed by the moment method, where Z_{00} is the zeroth-order moment, Z_{10} and Z_{01} are the first-order moments, and I(x, y) is the colour probability of the adaptive-background distribution map at pixel (x, y); the centre of the search box is moved to this centroid, and the iteration continues until the distance between two successive centroid positions is smaller than a preset threshold.
5. The method for tracking a moving target in a complex scene according to claim 1, characterized in that the greedy prediction of the target motion in step (4) is carried out as follows: the greedy prediction algorithm is used for motion estimation; because the time between two adjacent frames of video processing is very short, the motion state and velocity of the target change little; the greedy prediction algorithm first assumes that the moving target moves at a constant velocity over two adjacent frames to predict the target coordinates, then uses the actual target position as a reference to compute an error compensation, which corrects the predicted position of the next frame; the steps of the greedy prediction algorithm are as follows:
1. take the actual coordinates of the target in three adjacent frames, (x_{-2}, y_{-2}), (x_{-1}, y_{-1}), (x_0, y_0), where (x_0, y_0) is the current frame, (x_{-1}, y_{-1}) the previous frame, and (x_{-2}, y_{-2}) the frame before that;
2. compute the difference between the current frame and the previous frame: Δx = x_0 − x_{-1}, Δy = y_0 − y_{-1};
3. compute the error compensation: ρ(x) = (x_0 − x_0') + (x_0 + x_{-2} − 2x_{-1}), ρ(y) = (y_0 − y_0') + (y_0 + y_{-2} − 2y_{-1}), where (x_0', y_0') is the coordinate that was predicted for the current frame; the error consists of two parts: the first part corrects the difference between the predicted value and the actual value, and the second part corrects for the change of velocity between adjacent frames, treating the motion as uniformly varying;
4. predict the coordinate of the target position in the next frame and use it as the centroid of the search box of the next frame: x_1' = x_0 + Δx + ρ(x), y_1' = y_0 + Δy + ρ(y);
the coordinate points are then advanced by one frame and steps 1 to 4 are repeated, so the position of the target can be predicted simply and relatively accurately.
CN 201110043782 2011-02-24 2011-02-24 Method for tracking moving target in complex scene Pending CN102110296A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110043782 CN102110296A (en) 2011-02-24 2011-02-24 Method for tracking moving target in complex scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110043782 CN102110296A (en) 2011-02-24 2011-02-24 Method for tracking moving target in complex scene

Publications (1)

Publication Number Publication Date
CN102110296A true CN102110296A (en) 2011-06-29

Family

ID=44174443

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110043782 Pending CN102110296A (en) 2011-02-24 2011-02-24 Method for tracking moving target in complex scene

Country Status (1)

Country Link
CN (1) CN102110296A (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102332166A (en) * 2011-09-26 2012-01-25 北京航空航天大学 Probabilistic model based automatic target tracking method for moving camera
CN102831166A (en) * 2012-07-24 2012-12-19 武汉大千信息技术有限公司 Criminal investigation video preprocessing method based on color feature detection
CN103065331A (en) * 2013-01-15 2013-04-24 南京工程学院 Target tracking method based on correlation of space-time-domain edge and color feature
CN103473792A (en) * 2013-09-11 2013-12-25 清华大学深圳研究生院 Method for detecting moving target
CN104182971A (en) * 2014-08-08 2014-12-03 北京控制工程研究所 High-precision image moment positioning method
CN104200485A (en) * 2014-07-10 2014-12-10 浙江工业大学 Video-monitoring-oriented human body tracking method
CN104253981A (en) * 2014-09-28 2014-12-31 武汉烽火众智数字技术有限责任公司 Method for sequencing movement objects for video detection according to colors
CN104680504A (en) * 2013-11-26 2015-06-03 杭州海康威视数字技术股份有限公司 Scene change detection method and device thereof
CN104820997A (en) * 2015-05-14 2015-08-05 北京理工大学 Target tracking method based on block sparse expression and HSV feature fusion
CN104867163A (en) * 2015-05-28 2015-08-26 深圳大学 Marginal distribution passing measurement-driven target tracking method and tracking system thereof
CN105139406A (en) * 2015-09-08 2015-12-09 哈尔滨工业大学 Tracking accuracy inversion method based on sequence images
CN106124175A (en) * 2016-06-14 2016-11-16 电子科技大学 A kind of compressor valve method for diagnosing faults based on Bayesian network
CN106152949A (en) * 2016-07-15 2016-11-23 同济大学 A kind of noncontact video displacement measurement method
CN106250805A (en) * 2015-06-03 2016-12-21 通用汽车环球科技运作有限责任公司 Relative motion based on object performs the method and apparatus of image screen
CN106792194A (en) * 2016-12-23 2017-05-31 深圳Tcl新技术有限公司 Television shutdown method and system
CN106960447A (en) * 2017-05-17 2017-07-18 成都通甲优博科技有限责任公司 The position correcting method and system of a kind of video frequency object tracking
WO2017124299A1 (en) * 2016-01-19 2017-07-27 深圳大学 Multi-target tracking method and tracking system based on sequential bayesian filtering
CN107845290A (en) * 2016-09-21 2018-03-27 意法半导体股份有限公司 Junction alarm method, processing system, junction alarm system and vehicle
CN108198414A (en) * 2017-12-27 2018-06-22 北斗七星(重庆)物联网技术有限公司 A kind of method, apparatus, equipment and the storage medium of road monitoring point position distribution
CN108259703A (en) * 2017-12-31 2018-07-06 深圳市秦墨科技有限公司 A kind of holder with clapping control method, device and holder
CN108334824A (en) * 2018-01-19 2018-07-27 国网电力科学研究院武汉南瑞有限责任公司 High voltage isolator state identification method based on background difference and iterative search
CN108376246A (en) * 2018-02-05 2018-08-07 南京蓝泰交通设施有限责任公司 A kind of identification of plurality of human faces and tracking system and method
CN108447078A (en) * 2018-02-28 2018-08-24 长沙师范学院 The interference of view-based access control model conspicuousness perceives track algorithm
CN109165628A (en) * 2018-09-12 2019-01-08 首都师范大学 Improve method, apparatus, electronic equipment and the storage medium of moving-target detection accuracy
CN109345566A (en) * 2018-09-28 2019-02-15 上海应用技术大学 Motion target tracking method and system
CN110501696A (en) * 2019-06-28 2019-11-26 电子科技大学 A kind of radar target tracking method based on Doppler measurements self-adaptive processing
CN110636248A (en) * 2018-06-22 2019-12-31 华为技术有限公司 Target tracking method and device
CN110717003A (en) * 2019-09-27 2020-01-21 四川长虹电器股份有限公司 Intelligent shopping cart autonomous navigation and automatic following method based on path planning
CN111145344A (en) * 2019-12-30 2020-05-12 哈尔滨工业大学 Structured light measuring method for snow carving 3D reconstruction
CN111521245A (en) * 2020-07-06 2020-08-11 华夏天信(北京)智能低碳技术研究院有限公司 Liquid level control method based on visual analysis
CN112733770A (en) * 2021-01-18 2021-04-30 全程(上海)智能科技有限公司 Regional intrusion monitoring method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551909A (en) * 2009-04-09 2009-10-07 上海交通大学 Tracking method based on kernel and target continuous adaptive distribution characteristics

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551909A (en) * 2009-04-09 2009-10-07 上海交通大学 Tracking method based on kernel and target continuous adaptive distribution characteristics

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
微计算机信息 (Microcomputer Information), vol. 23, no. 7-3, 25 July 2007, 刘雪 et al., "基于改进Camshift算法的视频对象跟踪方法" (Video object tracking method based on an improved Camshift algorithm), sections 2.2 and 3; relevant to claims 1-5 *
控制与决策 (Control and Decision), vol. 25, no. 8, 15 August 2010, 陈爱斌 et al., "一种基于目标和背景加权的目标跟踪方法" (A target tracking method based on target and background weighting), entire document; relevant to claims 1-5 *
计算机工程 (Computer Engineering), vol. 37, no. 4, 20 February 2011, 汪东 et al., "基于ABCshift算法的目标检测与跟踪" (Target detection and tracking based on the ABCshift algorithm), sections 1, 3 and 4; relevant to claims 1-5 *
计算机应用研究 (Application Research of Computers), vol. 27, no. 9, 15 September 2010, 李晶 et al., "自动人脸跟踪方法研究" (Research on automatic face tracking methods), section 3.1; relevant to claims 1-5 *

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102332166A (en) * 2011-09-26 2012-01-25 北京航空航天大学 Probabilistic model based automatic target tracking method for moving camera
CN102831166B (en) * 2012-07-24 2015-05-27 武汉大千信息技术有限公司 Criminal investigation video preprocessing method based on color feature detection
CN102831166A (en) * 2012-07-24 2012-12-19 武汉大千信息技术有限公司 Criminal investigation video preprocessing method based on color feature detection
CN103065331A (en) * 2013-01-15 2013-04-24 南京工程学院 Target tracking method based on correlation of space-time-domain edge and color feature
CN103065331B (en) * 2013-01-15 2015-07-08 南京工程学院 Target tracking method based on correlation of space-time-domain edge and color feature
CN103473792A (en) * 2013-09-11 2013-12-25 清华大学深圳研究生院 Method for detecting moving target
CN104680504A (en) * 2013-11-26 2015-06-03 杭州海康威视数字技术股份有限公司 Scene change detection method and device thereof
CN104680504B (en) * 2013-11-26 2018-06-08 杭州海康威视数字技术股份有限公司 Scene-change detecting method and its device
CN104200485B (en) * 2014-07-10 2017-05-17 浙江工业大学 Video-monitoring-oriented human body tracking method
CN104200485A (en) * 2014-07-10 2014-12-10 浙江工业大学 Video-monitoring-oriented human body tracking method
CN104182971A (en) * 2014-08-08 2014-12-03 北京控制工程研究所 High-precision image moment positioning method
CN104182971B (en) * 2014-08-08 2017-05-31 北京控制工程研究所 A kind of high precision image square localization method
CN104253981A (en) * 2014-09-28 2014-12-31 武汉烽火众智数字技术有限责任公司 Method for sequencing movement objects for video detection according to colors
CN104253981B (en) * 2014-09-28 2017-11-28 武汉烽火众智数字技术有限责任公司 A kind of method that moving target for video investigation presses color sequence
CN104820997A (en) * 2015-05-14 2015-08-05 北京理工大学 Target tracking method based on block sparse expression and HSV feature fusion
CN104867163A (en) * 2015-05-28 2015-08-26 深圳大学 Marginal distribution passing measurement-driven target tracking method and tracking system thereof
CN106250805B (en) * 2015-06-03 2019-10-22 通用汽车环球科技运作有限责任公司 For handling the method and system of multiple images relevant to motor vehicles
CN106250805A (en) * 2015-06-03 2016-12-21 通用汽车环球科技运作有限责任公司 Relative motion based on object performs the method and apparatus of image screen
CN105139406B (en) * 2015-09-08 2018-02-23 哈尔滨工业大学 A kind of tracking accuracy inversion method based on sequence image
CN105139406A (en) * 2015-09-08 2015-12-09 哈尔滨工业大学 Tracking accuracy inversion method based on sequence images
WO2017124299A1 (en) * 2016-01-19 2017-07-27 深圳大学 Multi-target tracking method and tracking system based on sequential bayesian filtering
CN106124175B (en) * 2016-06-14 2019-08-06 电子科技大学 A kind of compressor valve method for diagnosing faults based on Bayesian network
CN106124175A (en) * 2016-06-14 2016-11-16 电子科技大学 A kind of compressor valve method for diagnosing faults based on Bayesian network
CN106152949A (en) * 2016-07-15 2016-11-23 同济大学 A kind of noncontact video displacement measurement method
CN107845290A (en) * 2016-09-21 2018-03-27 意法半导体股份有限公司 Junction alarm method, processing system, junction alarm system and vehicle
CN107845290B (en) * 2016-09-21 2021-07-27 意法半导体股份有限公司 Intersection warning method, processing system, intersection warning system and vehicle
CN106792194A (en) * 2016-12-23 2017-05-31 深圳Tcl新技术有限公司 Television shutdown method and system
CN106960447B (en) * 2017-05-17 2020-01-21 成都通甲优博科技有限责任公司 Position correction method and system for video target tracking
CN106960447A (en) * 2017-05-17 2017-07-18 成都通甲优博科技有限责任公司 The position correcting method and system of a kind of video frequency object tracking
CN108198414A (en) * 2017-12-27 2018-06-22 北斗七星(重庆)物联网技术有限公司 A kind of method, apparatus, equipment and the storage medium of road monitoring point position distribution
CN108259703A (en) * 2017-12-31 2018-07-06 深圳市秦墨科技有限公司 A kind of holder with clapping control method, device and holder
CN108334824A (en) * 2018-01-19 2018-07-27 国网电力科学研究院武汉南瑞有限责任公司 High voltage isolator state identification method based on background difference and iterative search
CN108376246A (en) * 2018-02-05 2018-08-07 南京蓝泰交通设施有限责任公司 A kind of identification of plurality of human faces and tracking system and method
CN108447078A (en) * 2018-02-28 2018-08-24 长沙师范学院 The interference of view-based access control model conspicuousness perceives track algorithm
CN108447078B (en) * 2018-02-28 2022-06-10 长沙师范学院 Interference perception tracking algorithm based on visual saliency
CN110636248A (en) * 2018-06-22 2019-12-31 华为技术有限公司 Target tracking method and device
CN109165628A (en) * 2018-09-12 2019-01-08 首都师范大学 Improve method, apparatus, electronic equipment and the storage medium of moving-target detection accuracy
CN109165628B (en) * 2018-09-12 2022-06-28 首都师范大学 Method and device for improving moving target detection precision, electronic equipment and storage medium
CN109345566A (en) * 2018-09-28 2019-02-15 上海应用技术大学 Motion target tracking method and system
CN110501696B (en) * 2019-06-28 2022-05-31 电子科技大学 Radar target tracking method based on Doppler measurement adaptive processing
CN110501696A (en) * 2019-06-28 2019-11-26 电子科技大学 A kind of radar target tracking method based on Doppler measurements self-adaptive processing
CN110717003A (en) * 2019-09-27 2020-01-21 四川长虹电器股份有限公司 Intelligent shopping cart autonomous navigation and automatic following method based on path planning
CN111145344A (en) * 2019-12-30 2020-05-12 哈尔滨工业大学 Structured light measuring method for snow carving 3D reconstruction
CN111145344B (en) * 2019-12-30 2023-03-28 哈尔滨工业大学 Structured light measuring method for snow carving 3D reconstruction
CN111521245A (en) * 2020-07-06 2020-08-11 华夏天信(北京)智能低碳技术研究院有限公司 Liquid level control method based on visual analysis
CN112733770A (en) * 2021-01-18 2021-04-30 全程(上海)智能科技有限公司 Regional intrusion monitoring method and device

Similar Documents

Publication Publication Date Title
CN102110296A (en) Method for tracking moving target in complex scene
Liu et al. Optical flow based urban road vehicle tracking
CN110533687B (en) Multi-target three-dimensional track tracking method and device
CN104915969B (en) A kind of stencil matching tracking based on particle group optimizing
CN111539273A (en) Traffic video background modeling method and system
CN110009665A (en) A kind of target detection tracking method blocked under environment
CN107968946B (en) Video frame rate improving method and device
CN103514441A (en) Facial feature point locating tracking method based on mobile platform
CN101840507A (en) Target tracking method based on character feature invariant and graph theory clustering
CN107403439B (en) Cam-shift-based prediction tracking method
CN104601964A (en) Non-overlap vision field trans-camera indoor pedestrian target tracking method and non-overlap vision field trans-camera indoor pedestrian target tracking system
CN103871079A (en) Vehicle tracking method based on machine learning and optical flow
CN106952294B (en) A kind of video tracing method based on RGB-D data
CN102999920A (en) Target tracking method based on nearest neighbor classifier and mean shift
CN110321937B (en) Motion human body tracking method combining fast-RCNN with Kalman filtering
CN107301657A (en) A kind of video target tracking method for considering target movable information
CN104123733B (en) A kind of method of motion detection and reduction error rate based on Block- matching
CN103281476A (en) Television image moving target-based automatic tracking method
CN102609945A (en) Automatic registration method of visible light and thermal infrared image sequences
Chen et al. A stereo visual-inertial SLAM approach for indoor mobile robots in unknown environments without occlusions
CN105046721A (en) Camshift algorithm for tracking centroid correction model on the basis of Grabcut and LBP (Local Binary Pattern)
CN102074000B (en) Tracking method for adaptively adjusting window width by utilizing optimal solution of variance rate
CN110706252A (en) Robot nuclear correlation filtering tracking algorithm under guidance of motion model
Najafzadeh et al. Multiple soccer players tracking
CN117036397A (en) Multi-target tracking method based on fusion information association and camera motion compensation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110629