CN100585622C - Human body tracing method based on gauss hybrid models - Google Patents

Human body tracing method based on gauss hybrid models Download PDF

Info

Publication number
CN100585622C
CN100585622C CN200810034533A
Authority
CN
China
Prior art keywords
human body
tracking
motion
state
foot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200810034533A
Other languages
Chinese (zh)
Other versions
CN101251895A (en)
Inventor
张玉冰
曾贵华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN200810034533A priority Critical patent/CN100585622C/en
Publication of CN101251895A publication Critical patent/CN101251895A/en
Application granted granted Critical
Publication of CN100585622C publication Critical patent/CN100585622C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

A human body tracking method based on a Gaussian mixture model, in the field of image processing. The present invention first models the human body with a stick-figure (line) model and takes the head, neck, centroid and two feet of the human body as the tracked parts; in the subsequent frames each tracked part is followed with a Kalman filter, wherein the two feet are tracked on the basis of the Gaussian mixture model: the left foot is tracked with a Kalman filter, and the motion state of the right foot is inferred from the state of the left foot, which simplifies the tracking of the right foot. The present invention predicts the parameters of each part in the next frame with the Kalman filter and corrects the prediction with the observation, which guarantees the accuracy of part tracking. Modeling the tracked object with a Gaussian mixture model improves tracking efficiency and makes the motion state of the two feet identifiable.

Description

Human body tracking method based on Gaussian mixture models
Technical field
The present invention relates to a method in the field of image processing, and in particular to a human body tracking method based on a Gaussian mixture model.
Background art
Human motion analysis is an important technology combining modern biomechanics and computer vision. It has wide application in fields such as intelligent surveillance, human-computer interaction, biometric identification and virtual reality. The motion analysis of the human legs in particular provides important reference data for research in medicine, sports and other areas.
A search of the prior art shows that Qi Zhao et al., in "Part Based Human Tracking in a Multiple Cues Fusion Framework", published at the International Conference on Pattern Recognition (2006), propose building a hidden Markov model (HMM) for each tracked part of the human body and tracking each part with a maximum a posteriori probability criterion. Its shortcoming is that each tracked part is followed as a whole object, so the occlusion problem cannot be overcome effectively.
Summary of the invention
The objective of the present invention is to overcome the deficiencies of the above method by proposing a human body tracking method based on a Gaussian mixture model (GMM). The human body is modeled with a stick-figure (line) model; the head, neck, centroid and two feet of the human body are each tracked with a Kalman filter, and a GMM is built for the motion state of the two feet. This not only improves tracking efficiency but also identifies the state of the two feet.
The present invention is achieved through the following technical solution. The human body is first modeled with the stick-figure model, and the head, neck, centroid and two feet of the human body are taken as the tracked parts. In the subsequent frames each tracked part is followed with a Kalman filter. The two feet are tracked on the basis of the GMM: the left foot is tracked with a Kalman filter, and the motion state of the right foot is inferred from the state of the left foot, which simplifies the tracking of the right foot.
Modeling the human body with the stick-figure model means: the head, neck, centroid and two feet of the human body are represented by circles, and these circles are the tracked parts; the tracked parts are connected by line segments, and the height ratio of each corresponding body part is obtained.
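As an illustrative sketch only, the stick-figure model can be represented by a small data structure; the height-ratio values below are assumptions and are not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class StickFigureModel:
    """Minimal sketch of the stick-figure body model: circles for the tracked
    parts, line segments connecting them, and assumed height ratios."""
    # (x, y) image coordinates of each tracked part
    head: tuple = (0.0, 0.0)
    neck: tuple = (0.0, 0.0)
    centroid: tuple = (0.0, 0.0)
    left_foot: tuple = (0.0, 0.0)
    right_foot: tuple = (0.0, 0.0)
    # fraction of total body height from the head down to each part (assumed values)
    height_ratios: dict = field(default_factory=lambda: {
        "neck": 0.13, "centroid": 0.47, "feet": 1.0})

    def segments(self):
        """Line segments connecting the tracked parts."""
        return [(self.head, self.neck),
                (self.neck, self.centroid),
                (self.centroid, self.left_foot),
                (self.centroid, self.right_foot)]
```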
In the tracking, the features tracked are the head position, neck position and centroid position of the human body and the swing angles of the two legs.
Tracking each part of the human body means: each part of the body model is tracked with a Kalman filter, which comprises a prediction part and a correction part. The prediction part means: the prediction equations use the state value and the prediction error of the previous moment to make a prediction and obtain the position of each tracked part at the current moment. The correction part means: since the prediction contains a certain error, the correction equations use the observation at the current moment to correct the prediction.
Tracking the two feet on the basis of the GMM means: a pendulum model is built for the centroid and the two feet of the human body, and the two feet are located by the swing angles of the legs. At the same time, a GMM is built for the motion state of the two feet. The motion of the two feet is divided into two states: left foot moving while the right foot is static, and right foot moving while the left foot is static. A Gaussian model is built for each of the two states, and the state of the left-foot swing determines which Gaussian model the motion of the right foot belongs to. When the right foot is static relative to the ground, no Kalman prediction is needed for it; correcting the motion parameters with the correction equations is sufficient to guarantee tracking accuracy.
Compared with the prior art, the present invention has the following beneficial effects: the present invention achieves a higher tracking accuracy, and the stick-figure model is simple and intuitive to compute. The Kalman filter predicts the parameters of each part in the next frame and corrects the prediction with the observation, which guarantees the accuracy of part tracking. Modeling the tracked object with the GMM improves tracking efficiency and identifies the motion state of the two feet; the accuracy of the judgment of the right-foot motion state is 88.8%.
Description of drawings
Fig. 1 is a schematic diagram of the stick-figure model built for the human body in the present invention, wherein figure (a) shows the human body structure and figure (b) shows the stick-figure model;
Fig. 2 is the pendulum model of the two legs in the present invention;
Fig. 3 shows the experimental results of the embodiment of the present invention;
Fig. 4 is the tracking curve of the swing angle of the right foot in the present invention.
Embodiment
The embodiments of the present invention are described in detail below with reference to the accompanying drawings. The embodiment is implemented on the premise of the technical solution of the present invention; detailed implementation modes and concrete operating procedures are given, but the protection scope of the present invention is not limited to the following embodiment.
As shown in Fig. 3, the present embodiment tracks the pedestrian in a video sequence.
The present embodiment comprises the following steps:
Step 1: build a model of the human body with the stick-figure model. As shown in Fig. 1, figure (a) is the human body structure and figure (b) is the schematic diagram of the stick-figure model. The head, neck, centroid and two feet of the human body are represented by circles; these circles are the tracked parts and are connected by line segments. Tracking is performed according to the position or the swing angle of each part in the image sequence. The tracking of the head, neck and centroid is mutually independent; the two feet are tracked using their swing angles as the feature;
Step 2: the positions of the head, neck and centroid are tracked with a Kalman filter, whose operation is divided into a prediction part and a correction part.
The prediction part is as follows: the prediction equations use the state value and the prediction error of the previous moment to obtain the predicted value at the current moment. The prediction equations are:
x′_k = A·x_{k-1}
P′_k = A·P_{k-1}·A^T + Q    (formula 1)
wherein: x′_k is the predicted state variable at moment k, x_{k-1} is the state variable at moment k-1, A is the state transition matrix, P′_k is the prediction error covariance matrix at moment k, P_{k-1} is the corrected error covariance matrix at moment k-1, and Q is the process (motion) noise covariance matrix.
The correction part is as follows: after the predicted value and the prediction error are obtained, the correction equations use the observation at the current moment to correct the predicted value and the prediction error. The correction equations are:
K_k = P′_k·H^T·(H·P′_k·H^T + R)^(-1)
x_k = x′_k + K_k·(z_k − H·x′_k)
P_k = (I − K_k·H)·P′_k    (formula 2)
wherein: K_k is the Kalman gain matrix, H is the measurement matrix, R is the measurement noise covariance matrix, and z_k is the observation.
In the present embodiment, the observation is set to the position of the center of the tracked part, and the state variable is set to the position, velocity and acceleration of the part center. Assuming the part moves with uniform acceleration, the parameters are set as follows:
z = [s_x, s_y]^T,  x = [s_x, v_x, a_x, s_y, v_y, a_y]^T
wherein: z is the observation, the subscripts x and y denote the horizontal and vertical coordinates, and s, v, a denote the position, velocity and acceleration of the part center respectively.
State transition matrix:
A = [ 1  1  0.5  0  0  0
      0  1  1    0  0  0
      0  0  1    0  0  0
      0  0  0    1  1  0.5
      0  0  0    0  1  1
      0  0  0    0  0  1 ]
Measurement matrix:
H = [ 1  0  0  0  0  0
      0  0  0  1  0  0 ]
The initial values of the parameters P_k, Q and R need to be determined from prior knowledge.
The correction equations correct the current predicted value with the observation z_k to obtain the corrected state estimate and the corrected error covariance estimate.
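A minimal sketch of the prediction/correction cycle of formulas 1 and 2, using the state transition and measurement matrices of the embodiment, is given below; the initial values of P, Q and R are placeholder assumptions, since the patent only states that they are set from prior knowledge:

```python
import numpy as np

# Constant-acceleration model of the embodiment: state [sx, vx, ax, sy, vy, ay]
A = np.array([[1, 1, 0.5, 0, 0, 0],
              [0, 1, 1,   0, 0, 0],
              [0, 0, 1,   0, 0, 0],
              [0, 0, 0,   1, 1, 0.5],
              [0, 0, 0,   0, 1, 1],
              [0, 0, 0,   0, 0, 1]], dtype=float)
# Measurement matrix: only the position (sx, sy) is observed
H = np.array([[1, 0, 0, 0, 0, 0],
              [0, 0, 0, 1, 0, 0]], dtype=float)
# Placeholder noise covariances and initial error covariance (assumed values)
Q = np.eye(6) * 1e-2
R = np.eye(2) * 1.0
P0 = np.eye(6) * 10.0

def kalman_predict(x, P):
    """Formula 1: predict the state and the error covariance."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def kalman_correct(x_pred, P_pred, z):
    """Formula 2: correct the prediction with the observation z."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(6) - K @ H) @ P_pred
    return x, P
```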
The observation z_k is computed as follows: a spiral search is carried out in a small neighborhood centered at the predicted position (s′_x, s′_y), and the first coordinate (x, y)^T that satisfies ρ[p_k(x, y), q_{k-1}] < l is taken as z_k, wherein p_k(x, y) denotes the histogram of the small neighborhood centered at coordinate (x, y) in frame k, q_{k-1} denotes the object color model histogram of frame k-1, ρ[p_k(x, y), q_{k-1}] is the distance between the histogram at the current coordinate and the object color model, and l is a distance threshold.
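The spiral search can be sketched as follows; the neighborhood size, the histogram computation and the Bhattacharyya-style distance used here are assumptions, since the patent does not fix these details:

```python
import numpy as np

def neighborhood_histogram(image, center, half_size=8, bins=16):
    """Assumed observation model: normalized gray-level histogram of the
    small neighborhood around `center`."""
    x, y = int(center[0]), int(center[1])
    patch = image[max(y - half_size, 0):y + half_size,
                  max(x - half_size, 0):x + half_size]
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def histogram_distance(p, q):
    """Assumed distance rho between histograms (Bhattacharyya-based)."""
    return np.sqrt(max(0.0, 1.0 - np.sum(np.sqrt(p * q))))

def spiral_search(image, predicted_pos, q_prev, dist_threshold, max_radius=10):
    """Search outward, ring by ring, from the predicted position and return the
    first coordinate whose neighborhood histogram is within the threshold."""
    px, py = predicted_pos
    for r in range(max_radius + 1):
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                if max(abs(dx), abs(dy)) != r:   # only visit the ring of radius r
                    continue
                cand = (px + dx, py + dy)
                p_k = neighborhood_histogram(image, cand)
                if histogram_distance(p_k, q_prev) < dist_threshold:
                    return np.array(cand, dtype=float)   # observation z_k
    return np.array(predicted_pos, dtype=float)          # fall back to the prediction
```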
Step 3: the two feet are tracked on the basis of the GMM:
First, the left foot is tracked with a Kalman filter. As shown in Fig. 2, let the angle between the horizontal and the line connecting the centroid and the left foot be θ_l, the angle between the horizontal and the line connecting the centroid and the right foot be θ_r, the coordinate of the centroid be (x_o, y_o), the coordinate of the left foot be (x_l, y_l), and the length of the leg be l. Then:
x_l = x_o + l·cos θ_l
y_l = y_o + l·sin θ_l    (formula 3)
A Kalman filter is used to track θ_l. The observation variable of the Kalman filter is z = [θ_l]^T and the state variable is x = [θ_l, v_lθ]^T, where v_lθ is the angular velocity of the leg swing. Assuming θ_l changes with uniform velocity, then:
A = [ 1  1
      0  1 ],   H = [ 1  0 ]
In the correction step of the Kalman filter, the left-foot coordinate (x_l, y_l) is obtained through formula 3; a spiral search is carried out in a small neighborhood around this coordinate, and the first coordinate that satisfies ρ[p_k(x, y), q_{k-1}] < l is taken as (x_t, y_t), wherein p_k(x, y) denotes the histogram of the small neighborhood centered at coordinate (x, y) in frame k, q_{k-1} denotes the object color model histogram of frame k-1, ρ[p_k(x, y), q_{k-1}] is the distance between the histogram at the current coordinate and the object color model, and l is a distance threshold.
The relation between the swing angle θ_l of the left foot and the observed left-foot coordinate (x_t, y_t) is:
θ_l = arctan( (y_t − y_o) / (x_t − x_o) ).    (formula 4)
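The pendulum geometry of formulas 3 and 4 and the constant-velocity model for θ_l can be sketched as follows; this is an illustrative sketch only, and the noise covariances below are assumed values not given in the patent:

```python
import numpy as np

def foot_from_angle(centroid, leg_length, theta):
    """Formula 3: foot position from the swing angle (image coordinates)."""
    xo, yo = centroid
    return (xo + leg_length * np.cos(theta),
            yo + leg_length * np.sin(theta))

def angle_from_foot(centroid, foot):
    """Formula 4: swing angle recovered from the observed foot position
    (arctan2 is used here so that all quadrants are handled)."""
    xo, yo = centroid
    xt, yt = foot
    return np.arctan2(yt - yo, xt - xo)

# Constant-velocity model for the left-foot swing angle: state [theta, omega]
A_theta = np.array([[1.0, 1.0],
                    [0.0, 1.0]])
H_theta = np.array([[1.0, 0.0]])
Q_theta = np.eye(2) * 1e-3      # assumed process noise
R_theta = np.array([[1e-2]])    # assumed measurement noise
```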
After the Kalman filter yields the swing-state parameters θ_l and v_lθ of the left foot, the GMM is used to determine, from the motion parameters of the left foot, which motion state the two feet belong to.
The motion of the two feet of the human body is divided into two states, namely the state in which the left foot moves while the right foot is static and the state in which the right foot moves while the left foot is static. A Gaussian model, denoted by its parameter λ_i, is built for each of the two states. The identification task is to determine the state i* whose model λ_{i*} gives the sample set X the maximum posterior probability p(λ_i | X). According to Bayesian theory, i* = argmax_i p(X | λ_i), and i* is the identified motion state of the two feet.
Under the pendulum model, the motion state i* of the two feet is judged by the following criterion:
i* = λ_1 (left foot moving, right foot static) if p(v_lθ > 0) > α, otherwise i* = λ_2 (right foot moving, left foot static),    (formula 5)
wherein v_lθ is the angular velocity of the left-foot swing and α is a probability threshold.
The angular velocity v′_lθ predicted by the Kalman filter obeys a Gaussian distribution, v′_lθ = v_lθ + Δ ∈ λ_i, i = 1, 2, where Δ is noise obeying a Gaussian distribution with parameters (μ_Δ, σ_Δ).
When the probability p(v_lθ > 0) that the left foot is moving is greater than the probability threshold α, the left foot is judged to be in motion, i.e.
p(v_lθ > 0)
= p(v_lθ + Δ > Δ)
= p(v′_lθ > Δ)
= p( (Δ − μ_Δ)/σ_Δ < (v′_lθ − μ_Δ)/σ_Δ ) > α.    (formula 6)
From formula 6 it follows that
(v′_lθ − μ_Δ) / σ_Δ > β,    (formula 7)
wherein β is determined by (1/√(2π)) ∫_{−∞}^{β} e^{−x²/2} dx = α.
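A minimal sketch of this decision rule (formulas 5 to 7) follows; the values of α, μ_Δ and σ_Δ are assumptions, since the patent leaves them to be estimated in practice:

```python
from scipy.stats import norm

def right_foot_state(v_l_pred, mu_delta, sigma_delta, alpha=0.8):
    """Formula 7: the right foot is static (left foot moving) when the
    standardized predicted left-foot angular velocity exceeds beta,
    where beta is the alpha-quantile of the standard normal distribution."""
    beta = norm.ppf(alpha)                       # Phi(beta) = alpha
    left_foot_moving = (v_l_pred - mu_delta) / sigma_delta > beta
    return "right_foot_static" if left_foot_moving else "right_foot_moving"

# Example usage with assumed noise parameters
state = right_foot_state(v_l_pred=0.12, mu_delta=0.0, sigma_delta=0.05, alpha=0.8)
```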
When the right foot is judged to be in motion, the swing angle of the right foot is tracked with a Kalman filter; when the right foot is static, only the correction of the Kalman filter parameters is needed.
Figs. 3 and 4 show the results of the present embodiment. As shown in Fig. 3, every tracked part of the moving human body in the video sequence is located accurately (in the embodiment the left foot is marked with a box to distinguish it from the right foot). Figures (a) to (j) are the captured video frames 48, 60, 71, 82, 92, 111, 125, 140, 171 and 180 respectively. In Fig. 3(b) the two feet occlude each other, and the present embodiment still tracks them correctly; in Fig. 3(e) the two feet are apart, and the judgment of the two feet is still correct.
Fig. 4 is a statistic of the swing angle of the right foot in each frame: a circle indicates that the right foot is judged to be in motion from the swing velocity of the left foot, and a circle with a plus sign indicates that the right foot is judged to be static in that frame. As shown in Fig. 4, the accuracy of the judgment of the right-foot motion state is 88.8%, but through the correction of the right-foot position the accuracy of the right-foot tracking is still guaranteed.

Claims (3)

1. A human body tracking method based on a Gaussian mixture model, characterized in that: the head, neck, centroid and two feet of the human body are first represented by circles, these circles being the tracked parts; the tracked parts are connected by line segments, and the height ratio of each corresponding body part is obtained; the head, neck, centroid and two feet of the human body are taken as the tracked parts, and in the subsequent frames each tracked part of the human body is followed with a Kalman filter, the Kalman filter tracking comprising a prediction part and a correction part, wherein the prediction part means that the prediction equations use the state value and the prediction error of the previous moment to make a prediction and obtain the position of each tracked part at the current moment, and the correction part means that, since the prediction contains error, the correction equations use the observation at the current moment to correct the prediction; the two feet are tracked on the basis of the Gaussian mixture model, wherein a pendulum model is built for the centroid and the two feet of the human body and the two feet are located by the swing angles of the legs; at the same time a Gaussian mixture model is built for the motion state of the two feet, the motion of the two feet being divided into two states, namely the state in which the left foot moves while the right foot is static and the state in which the right foot moves while the left foot is static; a Gaussian model is built for each of the two states, and the state of the left-foot swing determines which Gaussian model the motion of the right foot belongs to; when the right foot is static relative to the ground, no Kalman filter prediction is needed for it, and correcting the motion parameters with the correction equations is sufficient; the left foot is tracked with a Kalman filter, and the motion state of the right foot is inferred from the state of the left foot, which simplifies the tracking of the right foot;
determining from the state of the left-foot swing which Gaussian model the motion of the right foot belongs to means: under the pendulum model, the motion state i* of the two feet is judged by the following criterion:
i* = λ_1 (left foot moving, right foot static) if p(v_lθ > 0) > α, otherwise i* = λ_2 (right foot moving, left foot static),    (formula 5)
wherein the angular velocity v′_lθ predicted by the Kalman filter obeys a Gaussian distribution, v′_lθ = v_lθ + Δ ∈ λ_i, i = 1, 2, and Δ is noise obeying a Gaussian distribution with parameters (μ_Δ, σ_Δ);
when the probability p(v_lθ > 0) that the left foot is moving is greater than the probability threshold α, the left foot is judged to be in motion, i.e.
p(v_lθ > 0)
= p(v_lθ + Δ > Δ)
= p(v′_lθ > Δ)
= p( (Δ − μ_Δ)/σ_Δ < (v′_lθ − μ_Δ)/σ_Δ ) > α;    (formula 6)
from formula 6 it follows that
(v′_lθ − μ_Δ) / σ_Δ > β,    (formula 7)
wherein β is determined by (1/√(2π)) ∫_{−∞}^{β} e^{−x²/2} dx = α.
2. The human body tracking method based on a Gaussian mixture model according to claim 1, characterized in that in the tracking, the features tracked are the head position, neck position and centroid position of the human body and the swing angles of the two legs.
3. The human body tracking method based on a Gaussian mixture model according to claim 1, characterized in that the observation of the Kalman filter means: a spiral search is carried out in a small neighborhood centered at the predicted position (s′_x, s′_y) obtained from the Kalman filter prediction equations, and the first coordinate (x, y)^T that satisfies ρ[p_k(x, y), q_{k-1}] < l is taken as the observation z_k, wherein p_k(x, y) denotes the histogram of the small neighborhood centered at coordinate (x, y) in frame k, q_{k-1} denotes the object color model histogram of frame k-1, ρ[p_k(x, y), q_{k-1}] is the distance between the histogram at the current coordinate and the object color model, and l is a distance threshold.
CN200810034533A 2008-03-13 2008-03-13 Human body tracing method based on gauss hybrid models Expired - Fee Related CN100585622C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810034533A CN100585622C (en) 2008-03-13 2008-03-13 Human body tracing method based on gauss hybrid models

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810034533A CN100585622C (en) 2008-03-13 2008-03-13 Human body tracing method based on gauss hybrid models

Publications (2)

Publication Number Publication Date
CN101251895A CN101251895A (en) 2008-08-27
CN100585622C true CN100585622C (en) 2010-01-27

Family

ID=39955279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810034533A Expired - Fee Related CN100585622C (en) 2008-03-13 2008-03-13 Human body tracing method based on gauss hybrid models

Country Status (1)

Country Link
CN (1) CN100585622C (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129688B (en) * 2011-02-24 2012-09-05 哈尔滨工业大学 Moving target detection method aiming at complex background
CN102368301A (en) * 2011-09-07 2012-03-07 常州蓝城信息科技有限公司 Moving human body detection and tracking system based on video
CN103123689B (en) * 2013-01-21 2016-11-09 信帧电子技术(北京)有限公司 A kind of run detection method and device based on the detection of people's leg
CN105741312B (en) * 2014-12-09 2018-04-27 株式会社理光 Destination object tracking and equipment
CN104574444B (en) * 2015-01-19 2017-06-09 天津工业大学 A kind of Camshift trackings based on goal decomposition
CN104535995B (en) * 2015-01-27 2017-03-15 北京智谷睿拓技术服务有限公司 Information getting method, information acquisition device and user equipment
WO2019144263A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Control method and device for mobile platform, and computer readable storage medium
CN108874146B (en) * 2018-07-09 2021-11-26 北京掌中飞天科技股份有限公司 Moving human body mass center displacement calculation method applied to virtual reality system
CN110530365B (en) * 2019-08-05 2021-05-18 浙江工业大学 Human body attitude estimation method based on adaptive Kalman filtering

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Research on region and dynamic shape tracking in human body tracking. 孙正杰. China Excellent Master's Theses Full-text Database, 2007 *
Research on moving human body detection and tracking methods. 何淑贤. China Excellent Master's Theses Full-text Database, 2007 *

Also Published As

Publication number Publication date
CN101251895A (en) 2008-08-27

Similar Documents

Publication Publication Date Title
CN100585622C (en) Human body tracing method based on gauss hybrid models
CN109636829B (en) Multi-target tracking method based on semantic information and scene information
CN107635204B (en) Indoor fusion positioning method and device assisted by exercise behaviors and storage medium
CN106169188B (en) A kind of method for tracing object based on the search of Monte Carlo tree
CN102789568B (en) Gesture identification method based on depth information
US20130294651A1 (en) System and method for gesture recognition
CN110555387B (en) Behavior identification method based on space-time volume of local joint point track in skeleton sequence
CN104616028B (en) Human body limb gesture actions recognition methods based on space segmentation study
CN103093199B (en) Based on the Given Face tracking of ONLINE RECOGNITION
CN103793926B (en) Method for tracking target based on sample reselection procedure
CN106373145B (en) Multi-object tracking method based on tracking segment confidence level and the study of distinction appearance
CN104915969A (en) Template matching tracking method based on particle swarm optimization
CN104680559A (en) Multi-view indoor pedestrian tracking method based on movement behavior mode
CN103105924A (en) Man-machine interaction method and device
CN111462180B (en) Object tracking method based on AND-OR graph AOG
CN104091352A (en) Visual tracking method based on structural similarity
CN103996207A (en) Object tracking method
CN104318264A (en) Facial feature point tracking method based on human eye preferential fitting
CN102663773A (en) Dual-core type adaptive fusion tracking method of video object
Nguyen et al. Tracking facial features under occlusions and recognizing facial expressions in sign language
CN102509289A (en) Characteristic matching cell division method based on Kalman frame
Elmezain et al. A robust method for hand tracking using mean-shift algorithm and kalman filter in stereo color image sequences
CN110070120B (en) Depth measurement learning method and system based on discrimination sampling strategy
CN116952277A (en) Potential position estimation method and system based on artificial intelligence and behavior characteristics
CN116343335A (en) Motion gesture correction method based on motion recognition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100127

Termination date: 20130313