CN108053422A - Mobile target monitoring method - Google Patents
Publication number: CN108053422A (application CN201711086204.1A); authority: CN (China); legal status: Pending.
Classifications
- G06T7/20 — Image analysis; analysis of motion
- G06T5/70 — Image enhancement or restoration; denoising, smoothing
- G06T7/90 — Image analysis; determination of colour characteristics
- G06V10/24 — Image preprocessing; aligning, centring, orientation detection or correction of the image
- G06V10/28 — Image preprocessing; quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
- G06V20/52 — Scenes; surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06T2207/10016 — Image acquisition modality: video; image sequence
- G06T2207/30232 — Subject of image: surveillance
Abstract
The invention discloses a mobile target monitoring method comprising three parts: target detection, target centroid feature extraction, and target tracking. The method can be applied to passenger-flow analysis in public places such as exhibition halls, shopping malls and museums, covering people counting, flow direction and regional density, so that the distribution of people is grasped accurately in real time and a decision basis is provided for safety precautions and environmental conditioning. It can also be applied to logistics monitoring in factories and enterprises, monitoring and counting the quantity, position, flow direction and distribution density of materials. The method extracts moving targets directly from the colour image, which is more convenient than the conventional detection procedure of first converting the colour image to a grey-level image, preprocessing it, and only then detecting targets. Moreover, colour features are easier to discriminate than grey-level features, so this detection algorithm overcomes the technical problem of conventional methods that, when the grey values of the target and the background are close, the segmented target region suffers large-scale fragmentation or holes.
Description
Technical field
The present invention relates to a monitoring method, and more particularly to a mobile target monitoring method, belonging to the field of monitoring technology.
Background technology
A mobile target monitoring system allows an operator to view a monitored site directly by controlling cameras and auxiliary equipment such as pan-tilt heads and lenses, and to record all or part of the site's images, providing important evidence for handling certain events in real time or later. Such a system can also operate in coordination with other security facilities such as burglar alarms and fire-prevention systems, strengthening overall protective capability against crime and fire incidents; it is therefore an important component of a security system. Applied to the counting and analysis of moving objects, mobile target monitoring can grasp the distribution, density and flow direction of targets accurately in real time, providing a decision basis for statistics, crowd-crush prevention, traffic guidance, environmental conditioning and energy-saving control. Traditional target detection first converts the colour image to a grey-level image, preprocesses it and then performs detection; detection methods such as background subtraction suffer from large-scale fragmentation or holes in the segmented target region when the grey values of the target and the background are close. The patent document of Application No. 201610180413.1, entitled "intelligent electric meter based on image detection", employs conventional grey-scale conversion, filtering, binarization and other image preprocessing steps, making the detection procedure overly cumbersome. The patent document of Application No. 201410422500.4, entitled "target tracking method based on improved particle filter", uses a particle-filter resampling method whose computational load is heavy.
Summary of the invention
The object of the present invention is to provide a mobile target monitoring method that carries out target detection, target centroid feature extraction and target tracking, and is applied to the statistical analysis of target number, flow direction and density, grasping the distribution of targets accurately in real time and providing a decision basis for work such as statistics, crowd-crush prevention, traffic guidance, environmental conditioning and energy-saving control.
The object of the present invention is achieved by the following technical solutions:
A mobile target monitoring method, comprising the following steps:
1) acquiring images in RGB format;
2) extracting L frames, denoted I1, I2, ..., IL, from the captured sequence of H consecutive frames f1, f2, ..., fH; when the target moves faster than 0.1 m/s, L = H; when the target moves slower than 0.1 m/s, L is obtained by sampling one frame out of every 3;
3) letting r_k(i,j), g_k(i,j) and b_k(i,j) denote the RGB components of the k-th frame at pixel (i,j), where i and j are the pixel's abscissa and ordinate, then sorting the three sequences over k = 1, ..., L into increasing or decreasing order;
4) taking the median of each of the three sorted sequences, and using these three values as the RGB value of the background at pixel (i,j);
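Steps 2) to 4) amount to a per-pixel, per-channel temporal median over the L sampled frames: sorting each pixel's channel sequence and taking the middle value is exactly the median along the time axis. A minimal numpy sketch (the function name and array layout are illustrative, not from the patent):

```python
import numpy as np

def median_background(frames):
    """Per-pixel, per-channel median of L sampled RGB frames.

    frames: array of shape (L, H, W, 3); sorting each pixel's channel
    sequence and taking the middle value (steps 3 and 4) is exactly
    the temporal median along axis 0.
    """
    return np.median(np.asarray(frames), axis=0)

# Toy example: 3 frames of a 1x1 RGB image.
frames = np.array([[[[10, 200, 30]]],
                   [[[12, 210, 31]]],
                   [[[90,  20, 29]]]], dtype=np.float64)
bg = median_background(frames)
```

Because the median rejects values that appear in only a minority of frames, a target passing through a pixel does not corrupt the background estimate.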
5) updating the background in real time, introducing a background update factor for adaptive background updating: in the update formula, I_n^r(i,j), I_n^g(i,j) and I_n^b(i,j) are the RGB components of the current frame at pixel (i,j) and I_n(i,j) is the current frame's pixel value there; C_{n-1}(i,j) is the pixel value of the background frame at (i,j) before the update and C_n(i,j) the pixel value of the updated background frame; α is the background update factor, taken as 0.5; Δ1 is the sum of the RGB differences between the current frame and the pre-update background frame at pixel (i,j); and TH1 is the threshold dividing the current image from the background image, taken as 100 to 120;
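A common form of the adaptive update consistent with the symbols above is C_n = (1 − α)·C_{n−1} + α·I_n at pixels where Δ1 < TH1, with the old background kept elsewhere; the gating direction is an assumption, since only the symbol definitions are given here. A sketch:

```python
import numpy as np

def update_background(bg_prev, frame, alpha=0.5, th1=110):
    """Adaptive background update (step 5), sketched under the assumption
    that pixels whose RGB difference sum Delta1 is below TH1 are treated
    as background and blended with factor alpha, while changed (target)
    pixels keep the old background value."""
    delta1 = np.abs(frame.astype(np.float64) - bg_prev.astype(np.float64)).sum(axis=-1)
    is_bg = delta1 < th1
    blended = (1 - alpha) * bg_prev + alpha * frame
    return np.where(is_bg[..., None], blended, bg_prev)

bg = np.full((2, 2, 3), 100.0)
frame = bg.copy()
frame[0, 0] = [110.0, 110.0, 110.0]   # small change: Delta1 = 30 < TH1
frame[1, 1] = [250.0, 250.0, 250.0]   # large change: Delta1 = 450 >= TH1
bg_new = update_background(bg, frame)
```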
6) performing difference processing separately on the R, G and B channels of each pixel of the current frame: the difference-image components at pixel (i,j) are the differences between the RGB components of the current frame and of the background frame there; the difference image is then processed as follows to obtain the binary target image R_In(i,j): R_In(i,j) is set to 1 where the difference exceeds the target detection threshold TH2 and to 0 elsewhere, TH2 being an integer between 0 and 255 with optimal value 125;
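Step 6) can be sketched as below; summing the three channel differences before comparing against TH2 is an assumption about how the per-channel differences are combined, since the thresholding formula itself is not shown:

```python
import numpy as np

def detect_binary(frame, bg, th2=125):
    """Step 6: per-channel absolute difference against the background,
    then thresholding the summed difference into a binary target image.
    Summing the three channel differences before comparing with TH2 is
    an assumption; the patent's exact combination rule is not shown."""
    diff = np.abs(frame.astype(np.float64) - bg.astype(np.float64))
    return (diff.sum(axis=-1) > th2).astype(np.uint8)

bg = np.zeros((2, 2, 3))
frame = np.zeros((2, 2, 3))
frame[0, 0] = [60, 60, 60]    # summed diff 180 > 125 -> target
frame[1, 0] = [20, 20, 20]    # summed diff 60 <= 125 -> background
binary = detect_binary(frame, bg)
```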
7) because of image noise, holes or discontinuities can appear inside the moving object; the differenced binary image is therefore processed with the mathematical-morphology opening operation, erosion followed by dilation, which removes speckle noise in the target image and smooths the target edges, yielding a smooth target image without changing the target's area and features;
8) extracting the centroid of the moving target: the centroid of the moving target image is obtained by
x_o = Σ_x Σ_y x·f(x,y) / Σ_x Σ_y f(x,y), y_o = Σ_x Σ_y y·f(x,y) / Σ_x Σ_y f(x,y),
where f(x,y) is the M × N digital image, M and N are the numbers of pixels in the X and Y directions, and x_o and y_o are the abscissa and ordinate of the target centroid;
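The centroid formula above can be evaluated directly on the binary target image; a numpy sketch (the axis convention here, x indexing columns and y indexing rows, is an assumption):

```python
import numpy as np

def centroid(binary):
    """Step 8: intensity-weighted centroid of the binary target image
    f(x, y); x is taken as the column index and y as the row index."""
    f = np.asarray(binary, dtype=np.float64)
    total = f.sum()
    yy, xx = np.indices(f.shape)          # per-pixel row/column coordinates
    x0 = (xx * f).sum() / total
    y0 = (yy * f).sum() / total
    return x0, y0

img = np.zeros((5, 5))
img[2, 1] = 1
img[2, 3] = 1
x0, y0 = centroid(img)   # midpoint of the two marked pixels
```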
9) carrying out target tracking: the moving target is tracked with the SADD tracking algorithm (minimum sum-of-absolute-differences error algorithm), and the target motion vector is obtained by connecting the centroids of the moving target image in two consecutive frames;
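Step 9)'s sum-of-absolute-differences matching can be sketched as an exhaustive search over a small displacement window; the window size and function name are illustrative, not the patent's exact routine:

```python
import numpy as np

def sad_match(prev_frame, cur_frame, top, left, h, w, search=2):
    """Locate in cur_frame the (h, w) template cut from prev_frame at
    (top, left) by minimising the sum of absolute differences over a
    +/-search displacement window; returns the best (dy, dx) motion
    vector. A sketch of SAD matching under the stated assumptions."""
    tmpl = prev_frame[top:top + h, left:left + w].astype(np.float64)
    best, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            t, l = top + dy, left + dx
            if t < 0 or l < 0 or t + h > cur_frame.shape[0] or l + w > cur_frame.shape[1]:
                continue                      # candidate window leaves the image
            cand = cur_frame[t:t + h, l:l + w].astype(np.float64)
            sad = np.abs(cand - tmpl).sum()
            if best is None or sad < best:
                best, best_vec = sad, (dy, dx)
    return best_vec

prev = np.zeros((8, 8))
prev[2:4, 2:4] = 1          # 2x2 target block in the previous frame
cur = np.zeros((8, 8))
cur[3:5, 4:6] = 1           # same block moved by (dy, dx) = (1, 2)
vec = sad_match(prev, cur, 2, 2, 2, 2)
```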
10) analysing the target motion vector to determine whether the target flows in or out;
11) counting the number of targets;
12) calculating the target density of the region.
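Steps 10) to 12) can be sketched as follows; which vector component counts as "inward" and the region area used for the density are assumptions for illustration only:

```python
def flow_direction(vec, inward_axis=0):
    """Step 10 sketch: classify a centroid motion vector as flowing in
    or out by the sign of its component along the counting line's
    normal (which axis counts as 'inward' is an assumption)."""
    return "in" if vec[inward_axis] > 0 else "out"

def region_density(target_count, region_area):
    """Step 12 sketch: targets per unit area of the monitored region."""
    return target_count / region_area

directions = [flow_direction(v) for v in [(1, 0), (-2, 1), (3, 3)]]
inflow = directions.count("in")          # step 11: count targets per class
density = region_density(inflow, 50.0)   # assumed region area of 50 units
```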
The object of the present invention can be further realised by the following technical measures:
In the foregoing mobile target monitoring method, in step 7) the erosion operation is realised as follows: S is the structuring element, here taken as a circle of radius 4 pixels; X is the target image; Θ is the symbol of the erosion operation, denoting erosion of the target image X by the structuring element S; S_xy denotes the structuring element S centred at (x, y); E(X) is the target image after erosion. The formula means that when the origin of the structuring element S is moved to point (x, y), the point is 1 in the new image if S is completely contained in X, and 0 otherwise. The main function of erosion is to eliminate the boundary points of the target image.
The dilation operation is realised as follows: S is the structuring element, here a circle of radius 4 pixels; X is the target image; ⊕ is the symbol of the dilation operation, denoting dilation of the target image X by the structuring element S; S_xy denotes the structuring element S centred at (x, y); D(X) is the target image after dilation. The formula means that when the origin of S is moved to point (x, y), the point is 1 in the new image if S and X share at least one point that is 1, and 0 otherwise; if S and X do not intersect at all, point (x, y) is 0 in the new image. The main function of dilation is to remove tiny holes inside the image.
The opening operation is realised as follows: S is the structuring element, here a circle of radius 4 pixels; X is the target image; ∘ is the symbol of the opening operation, denoting opening of the target image X by the structuring element S; L(X) is the target image after opening. The mathematical-morphology opening operator, erosion followed by dilation, removes speckle noise in the target image and smooths the target edges, obtaining a smooth target image without changing the target's area and features.
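The erosion, dilation and opening definitions above can be implemented directly from their set formulations; a slow but literal numpy sketch with a circular structuring element (radius 1 here to keep the example small, versus the patent's radius of 4 pixels):

```python
import numpy as np

def disk(radius):
    """Circular structuring element S of the given pixel radius."""
    y, x = np.indices((2 * radius + 1, 2 * radius + 1)) - radius
    return (x * x + y * y <= radius * radius).astype(np.uint8)

def erode(img, se):
    """E(X): output is 1 where S, centred at (x, y), fits entirely in X."""
    r = se.shape[0] // 2
    pad = np.pad(img, r, constant_values=0)
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = pad[y:y + se.shape[0], x:x + se.shape[1]]
            out[y, x] = int(np.all(win[se == 1] == 1))
    return out

def dilate(img, se):
    """D(X): output is 1 where S, centred at (x, y), hits X anywhere."""
    r = se.shape[0] // 2
    pad = np.pad(img, r, constant_values=0)
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = pad[y:y + se.shape[0], x:x + se.shape[1]]
            out[y, x] = int(np.any(win[se == 1] == 1))
    return out

def opening(img, se):
    """L(X): erosion followed by dilation, removing speckle noise while
    roughly preserving the target's area and shape."""
    return dilate(erode(img, se), se)

img = np.zeros((9, 9), dtype=np.uint8)
img[2:7, 2:7] = 1      # 5x5 target
img[0, 8] = 1          # isolated speck of noise
cleaned = opening(img, disk(1))   # speck removed, target body retained
```

In practice a library routine such as `scipy.ndimage.binary_opening` would replace these loops; the sketch just mirrors the set definitions in the text.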
In the foregoing mobile target monitoring method, step 9) may alternatively carry out target tracking with a region tracking algorithm (blob tracking), after which the target motion vector is obtained by connecting the centroids of the moving target image in two consecutive frames.
In the foregoing mobile target monitoring method, step 9) may carry out target tracking with an adaptive particle filter tracking algorithm based on dynamic resampling:
Assume the state-space mathematical model of the system is x_k = f_{k-1}(x_{k-1}) + w_{k-1}, z_k = h_k(x_k) + v_k, where f_{k-1}(·) and h_{k-1}(·) are the system transfer function and the measurement function, and w_k and v_k are the mutually uncorrelated process noise and measurement noise. The flow is as follows:
(1) initialise the particles and weights: at k = 0, sample from the prior probability p(x_0), i.e. initialise N equally weighted particles centred on the target centroid (x_0, y_0);
(2) system state transfer: set k = k + 1, draw the samples and calculate the importance weights; the prior probability density is chosen as the importance density function for ease of implementation;
(3) normalise the particle weights;
(4) carry out particle-filter resampling: define the effective sample size N_eff, estimated approximately from the variance of the normalised weights; set an effective sample number N_th as the resampling threshold; when N_eff ≤ N_th, perform resampling, converting the original weighted samples into equally weighted samples;
(5) output the state estimate and variance estimate of the target;
(6) enter time k + 1 and judge whether the current frame is the last; if so the tracking ends, otherwise the next tracking cycle continues.
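Step (4) can be sketched as follows; N_eff ≈ 1/Σ w_i² is the usual approximation of the effective sample size, and systematic resampling is one standard scheme for converting weighted samples into equally weighted ones (the patent does not name the exact scheme):

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff estimate for step (4): with normalised weights w_i, the
    common approximation is N_eff ~= 1 / sum_i(w_i^2)."""
    w = np.asarray(weights, dtype=np.float64)
    return 1.0 / np.sum(w * w)

def systematic_resample(particles, weights, rng=None):
    """Replace the weighted set {x_i, w_i} by N equally weighted
    particles {x_j, 1/N} using systematic resampling."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n   # one jittered comb of n points
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                                # guard against rounding error
    idx = np.searchsorted(cumsum, positions)
    return [particles[i] for i in idx], [1.0 / n] * n

w = [0.7, 0.1, 0.1, 0.1]
neff = effective_sample_size(w)                     # 1 / 0.52, well below N = 4
parts, new_w = systematic_resample([0, 1, 2, 3], w)
```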
In the foregoing mobile target monitoring method, step 9) may carry out target tracking with an adaptive particle filter tracking algorithm based on particle swarm optimisation, realised as follows:
(1) at the initial time, initialise N Gaussian-distributed particles centred on the extracted target centroid, with N = 30; set this frame as the first frame, extract a feature vector from the selected target region of the image, initialise the tracking equations, and set the tracking process noise to R_k;
(2) enter the next frame; extract the features of the estimated target at each particle's position, compute the similarity to the initial features to obtain each particle's weight, normalise the weights over the N particles, and obtain the predicted target position from the weighted sum;
(3) particle swarm optimisation: define K as the upper limit on the number of PSO iterations, after which the PSO iteration is exited; the velocity and position equations of the particle swarm optimisation are as follows:
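A standard form of the PSO velocity and position equations, consistent with the symbols pb_id, pg_d, r1, r2, c1 and c2 defined just below (the inertia weight ω is an assumption; some formulations omit it):

```latex
v_{id}^{k+1} = \omega\, v_{id}^{k}
             + c_1 r_1 \left( pb_{id} - x_{id}^{k} \right)
             + c_2 r_2 \left( pg_{d} - x_{id}^{k} \right), \qquad
x_{id}^{k+1} = x_{id}^{k} + v_{id}^{k+1}
```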
In the equations, pb_id is the personal best of particle i and pg_d the global best; r1 and r2 are random numbers in [0, 1], and c1 and c2 are learning factors with value 2. Initialise the fitness value F_N(i) of each particle in the swarm; update each particle's velocity and position through the PSO velocity and position equations and compute its fitness. If the current fitness F_N(i) exceeds the personal best pb_id, replace pb_id with F_N(i); if F_N(i) exceeds the global best pg_d, replace pg_d with F_N(i). After all particles are updated, judge whether the similarity between the predicted target position and the initial target reaches the set convergence value; if so, the PSO process is exited, and it is likewise exited once all K iterations have run;
(4) after exiting the PSO process, recompute the particle weights and normalise them;
(5) carry out state estimation: estimate the target position from the particle positions and weights, and display the estimated position as the identification result;
(6) judge whether the current frame is the last; if so the tracking ends, otherwise the next tracking cycle continues.
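One iteration of the PSO update in step (3) can be sketched as follows; the velocity clamp v_max echoes the velocity limits mentioned at the end of the description, though its value here is an assumption, and the 1-D state is illustrative:

```python
import random

def pso_step(positions, velocities, pbest, gbest, c1=2.0, c2=2.0,
             v_max=1.0, rng=None):
    """One PSO velocity/position update for the tracking particles,
    using the standard equations with learning factors c1 = c2 = 2 and
    r1, r2 drawn uniformly from [0, 1]; the clamp to [-v_max, v_max]
    is an assumption (the patent mentions velocity limits but their
    values are not shown)."""
    rng = rng or random.Random(0)
    new_pos, new_vel = [], []
    for x, v, pb in zip(positions, velocities, pbest):
        r1, r2 = rng.random(), rng.random()
        v_new = v + c1 * r1 * (pb - x) + c2 * r2 * (gbest - x)
        v_new = max(-v_max, min(v_max, v_new))   # clamp to [v_min, v_max]
        new_vel.append(v_new)
        new_pos.append(x + v_new)
    return new_pos, new_vel

# Two particles pulled toward their personal bests and the global best.
pos, vel = pso_step([0.0, 4.0], [0.0, 0.0], pbest=[1.0, 3.0], gbest=2.0)
```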
Compared with the prior art, the present invention has the following advantages. The mobile target monitoring method can be applied to passenger-flow analysis in public places such as exhibition halls, shopping malls and museums, covering people counting, flow direction and regional density, grasping the distribution of people accurately in real time and providing a decision basis for safety precautions, environmental conditioning and energy saving; it can also be applied to logistics monitoring in factories and enterprises, monitoring and counting the quantity, position, flow direction and distribution density of materials. The target detection algorithm based on the RGB colour space extracts moving targets directly from the colour image, which is more convenient than the conventional procedure of first converting the colour image to grey scale, preprocessing it, and then detecting; moreover, colour features are easier to discriminate than grey-level features, so this detection algorithm overcomes the technical problem of conventional methods such as background subtraction that, when the grey values of the target and the background are close, the segmented target region suffers large-scale fragmentation or holes. For target tracking, the invention introduces an adaptive particle filter algorithm based on particle swarm optimisation. Compared with Kalman filtering, the particle filter is conceptually clear and free of the constraint of linear filtering that the random quantities must follow a Gaussian distribution; it can also be computed in parallel, better meeting the requirements of real-time applications. As the number of particle filter iterations grows, particle degeneracy appears, and resampling in turn causes particle impoverishment, losing particle diversity. The adaptive particle filter adopted by the invention combines the particle swarm optimisation algorithm with the particle filter, skipping the conventional particle-filter resampling step, preserving the diversity of the particles and greatly reducing the computational load.
Description of the drawings
Fig. 1 is a flow chart of the mobile target monitoring method of the present invention.
Specific embodiment
The invention is further described below with reference to the drawings and specific embodiments.
As shown in Fig. 1, the mobile target monitoring method of the present invention comprises the following steps:
1) acquiring images in RGB format;
2) extracting L frames, denoted I1, I2, ..., IL, from the captured sequence of H consecutive frames f1, f2, ..., fH; when the target moves faster than 0.1 m/s, L = H; when the target moves slower than 0.1 m/s, L is obtained by sampling one frame out of every 3;
3) letting r_k(i,j), g_k(i,j) and b_k(i,j) denote the RGB components of the k-th frame at pixel (i,j), where i and j are the pixel's abscissa and ordinate, then sorting the three sequences over k = 1, ..., L into increasing or decreasing order;
4) taking the median of each of the three sorted sequences, and using these three values as the RGB value of the background at pixel (i,j);
5) updating the background in real time, introducing a background update factor for adaptive background updating: in the update formula, I_n^r(i,j), I_n^g(i,j) and I_n^b(i,j) are the RGB components of the current frame at pixel (i,j) and I_n(i,j) is the current frame's pixel value there; C_{n-1}(i,j) is the pixel value of the background frame at (i,j) before the update and C_n(i,j) the pixel value of the updated background frame; α is the background update factor, taken as 0.5; Δ1 is the sum of the RGB differences between the current frame and the pre-update background frame at pixel (i,j); and TH1 is the threshold dividing the current image from the background image, for which a value of 110 works well;
6) performing difference processing separately on the R, G and B channels of each pixel of the current frame: the difference-image components at pixel (i,j) are the differences between the RGB components of the current frame and of the background frame there; the difference image is then processed as follows to obtain the binary target image R_In(i,j): R_In(i,j) is set to 1 where the difference exceeds the target detection threshold TH2 and to 0 elsewhere, a value of TH2 = 125 working well;
7) because of image noise, holes or discontinuities can appear inside the moving object; the differenced binary image is therefore processed with the mathematical-morphology opening operation, erosion followed by dilation, which removes speckle noise in the target image and smooths the target edges, yielding a smooth target image without changing the target's area and features. The erosion operation is realised as follows: S is the structuring element, here taken as a circle of radius 4 pixels; X is the target image; Θ is the symbol of the erosion operation, denoting erosion of the target image X by the structuring element S; S_xy denotes the structuring element S centred at (x, y); E(X) is the target image after erosion. The formula means that when the origin of the structuring element S is moved to point (x, y), the point is 1 in the new image if S is completely contained in X, and 0 otherwise. The main function of erosion is to eliminate the boundary points of the target image.
The dilation operation is realised as follows: S is the structuring element, here a circle of radius 4 pixels; X is the target image; ⊕ is the symbol of the dilation operation, denoting dilation of the target image X by the structuring element S; S_xy denotes the structuring element S centred at (x, y); D(X) is the target image after dilation. The formula means that when the origin of S is moved to point (x, y), the point is 1 in the new image if S and X share at least one point that is 1, and 0 otherwise; if S and X do not intersect at all, point (x, y) is 0 in the new image. The main function of dilation is to remove tiny holes inside the image.
The opening operation is realised as follows: S is the structuring element, here a circle of radius 4 pixels; X is the target image; ∘ is the symbol of the opening operation, denoting opening of the target image X by the structuring element S; L(X) is the target image after opening. The mathematical-morphology opening operator, erosion followed by dilation, removes speckle noise in the target image and smooths the target edges, obtaining a smooth target image without changing the target's area and features.
8) extracting the centroid of the moving target: the centroid of the moving target image is obtained by
x_o = Σ_x Σ_y x·f(x,y) / Σ_x Σ_y f(x,y), y_o = Σ_x Σ_y y·f(x,y) / Σ_x Σ_y f(x,y),
where f(x,y) is the M × N digital image, M and N are the numbers of pixels in the X and Y directions, and x_o and y_o are the abscissa and ordinate of the target centroid;
9) carrying out target tracking: two levels of tracking are available. When moving targets merge (i.e. there are closely adjacent multiple target images), the SADD tracking algorithm (minimum sum-of-absolute-differences error algorithm) is used to resolve the mutual interference among the targets; when the moving targets do not merge, the region tracking method (blob tracking) is used, a strategy that improves efficiency and saves energy. The target motion vector is then obtained by connecting the centroids of the moving target image in two consecutive frames;
10) analysing the target motion vector to determine whether the target flows in or out;
11) counting the number of targets;
12) calculating the target density of the region.
With the data obtained by the above method, further processing can yield information such as the real-time personnel distribution density and average moving speed in the monitored public place, which can be used to prevent crowd-crush collapses, guard against personnel clustering, and evacuate congested passages, or to control the lighting, fresh-air volume, temperature and humidity of the region according to the personnel distribution, thereby adjusting the environment and saving energy. The method can also be applied to logistics monitoring in factories and enterprises, monitoring and counting the quantity, position, flow direction and distribution density of materials.
The adaptive particle filter target tracking algorithm based on dynamic resampling can also be used for target tracking:
Assume the state-space mathematical model of the system is x_k = f_{k-1}(x_{k-1}) + w_{k-1}, z_k = h_k(x_k) + v_k, where f_{k-1}(·) and h_{k-1}(·) are the system transfer function and the measurement function, and w_k and v_k are the mutually uncorrelated process noise and measurement noise. The flow is as follows:
(1) initialise the particles and weights: at k = 0, sample from the prior probability p(x_0), i.e. initialise N equally weighted particles centred on the target centroid (x_0, y_0);
(2) system state transfer: set k = k + 1, draw the samples and calculate the importance weights; the prior probability density is chosen as the importance density function for ease of implementation;
(3) normalise the particle weights;
(4) carry out particle-filter resampling: define the effective sample size N_eff, estimated approximately from the variance of the normalised weights; set an effective sample number N_th as the resampling threshold; when N_eff ≤ N_th, perform resampling, converting the original weighted samples into equally weighted samples;
(5) output the state estimate and variance estimate of the target;
(6) enter time k + 1 and judge whether the current frame is the last; if so the tracking ends, otherwise the next tracking cycle continues.
But the problem of being present with particle degeneracy for the increase with particle filter iterations, using resampling
Method can generate the phenomenon that particle dilution, lose particle diversity, therefore more optimizedly using based on the adaptive of particle group optimizing
Particle filter target tracking algorism method is answered, realizes that process is as follows:
(1) At the initial time, N particles with a Gaussian distribution are initialized centered at the extracted target centroid, with N = 30; this frame is set as the first frame, a feature vector is extracted from the selected image target region, the tracking equations are initialized, and the tracking process noise is set to R_k;
(2) Enter the next frame; according to the position of the i-th particle, extract the features for the current particle's target estimate and compute their similarity to the initial features, obtaining the current particle's weight; after collecting statistics over all N particles, normalize the weights and obtain the predicted target position;
(3) Particle swarm optimization: the iteration upper limit of the particle swarm optimization is defined as K; once this limit is reached, the particle swarm optimization iteration is exited. The velocity and position equations of the particle swarm optimization are as follows:
where pb_{id} is the personal best of particle i, pg_d is the global best, r_1 and r_2 are random numbers in [0, 1], and c_1 and c_2 are learning factors with value 2. Initialize the fitness function value F_N(i) of each particle in the swarm; update each particle's velocity and position via the particle swarm optimization velocity and position equations, and compute each particle's fitness. Compare the particle's current fitness F_N(i) with its personal best pb_{id}: if F_N(i) > pb_{id}, replace pb_{id} with F_N(i). Compare the particle's current fitness F_N(i) with the global best pg_d: if F_N(i) > pg_d, replace pg_d with F_N(i). After all particles have been updated, judge whether the similarity between the predicted target position and the initial target reaches the set convergence value; if so, the particle swarm optimization process is exited; if it is not reached after K iterations, the process is likewise exited;
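The velocity and position equations, the iteration cap K, and the convergence break can be sketched as a generic one-dimensional minimizer. The fitness function, search range, and velocity bound below are illustrative; in the patent the fitness is the feature similarity of the tracked target:

```python
import random

def pso_minimize(f, n=30, k_max=50, tol=1e-6, c1=2.0, c2=2.0, vmax=1.0):
    # Generic 1-D particle swarm optimizer following the velocity/position
    # equations, the iteration cap K, and the convergence break in the text.
    xs = [random.uniform(-10.0, 10.0) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                       # personal best positions (pb_id)
    gbest = min(pbest, key=f)           # global best position (pg_d)
    for _ in range(k_max):              # iteration upper limit K
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vs[i] += c1 * r1 * (pbest[i] - xs[i]) + c2 * r2 * (gbest - xs[i])
            vs[i] = max(-vmax, min(vmax, vs[i]))  # clamp to [v_min, v_max]
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):  # better fitness replaces pb_id
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)       # best pb_id replaces pg_d
        if f(gbest) < tol:              # convergence value reached: jump out
            break
    return gbest
```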
During the particle swarm optimization iterations, the particle velocity upper limit v_max and lower limit v_min are crucial to the performance of the optimization; this patent determines suitable limits by dynamically regulating v_max and v_min. Suitable upper and lower velocity limits are initialized, and a reference integer L with value 5 is set. After each iteration update of all the particles, the velocity of each particle is compared with v_max: if at least L particles have velocities above or near the upper limit, v_max was set too small and is incremented by one; if the velocities of all particles are below the upper limit, v_max was set too large and is decremented by one. Similarly, if at least L particles have velocities below or near the lower limit, v_min was set too large and is decremented by one; if the velocities of all particles are above the lower limit, v_min was set too small and is incremented by one.
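The dynamic regulation of v_max and v_min described above can be sketched as follows. The "near the limit" tolerance eps is an assumed detail that the text leaves unspecified:

```python
def adjust_speed_limits(speeds, vmax, vmin, l=5, eps=1e-3):
    # After one iteration over all particles, nudge the velocity limits:
    # >= L particles at or near a limit means it is too tight; no particle
    # near it means it is too loose. eps is an assumed tolerance.
    near_upper = sum(1 for s in speeds if s >= vmax - eps)
    near_lower = sum(1 for s in speeds if s <= vmin + eps)
    if near_upper >= l:
        vmax += 1              # v_max set too small: increment by one
    elif all(s < vmax - eps for s in speeds):
        vmax -= 1              # v_max set too large: decrement by one
    if near_lower >= l:
        vmin -= 1              # v_min set too large: decrement by one
    elif all(s > vmin + eps for s in speeds):
        vmin += 1              # v_min set too small: increment by one
    return vmax, vmin
```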
Two integers H_1 and H_2 are given. If, during consecutive iteration updates, the global optimum P_g is updated in each of H_1 consecutive iterations, the current swarm is continuously entering new states, and the number of particles can be reduced appropriately. Conversely, if the global optimum is not updated for H_2 consecutive iterations, the swarm may be trapped at a local best point and unable to escape, i.e. the tracked target position may become inaccurate; the current particle state then needs adjusting by increasing the number of particles and expanding the search scope, so as to jump out of the possible local optimum. The particle swarm optimization process is exited once the particle state estimate is within the given accuracy or the maximum iteration limit is exceeded.
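The H_1/H_2 population adjustment can be sketched as follows. The adjustment step and the population bounds are illustrative, since the text does not fix them:

```python
def adjust_particle_count(n, gbest_updated, h1=3, h2=3,
                          n_min=10, n_max=100, step=5):
    # gbest_updated: per-iteration booleans, True when the global best P_g
    # improved. H_1 straight improvements -> shrink the population; H_2
    # straight stalls -> grow it to widen the search. step, n_min and
    # n_max are assumed values.
    if len(gbest_updated) >= h1 and all(gbest_updated[-h1:]):
        n = max(n_min, n - step)       # swarm keeps finding new states
    elif len(gbest_updated) >= h2 and not any(gbest_updated[-h2:]):
        n = min(n_max, n + step)       # possibly trapped at a local best
    return n
```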
(4) After exiting the particle swarm optimization process, the particle weights are recomputed and normalized;
(5) State estimation: the target position is estimated from the particles' positions and weights, and the estimated target position is displayed for identification;
(6) Judge whether the current frame is the last one; if so, tracking ends; if not, the tracking process for the next state begins.
The adaptive particle filter algorithm based on particle swarm optimization used in the present invention combines the particle swarm optimization algorithm with the particle filter algorithm, skipping the conventional particle filter resampling step; it both preserves particle diversity and significantly reduces the amount of computation.
Besides the above embodiment, the present invention may have other embodiments; all technical solutions formed by equivalent substitution or equivalent transformation fall within the scope of protection claimed by the present invention.
Claims (5)
1. A mobile target monitoring method, characterized by comprising the following steps:
1) acquire an image in RGB format;
2) extract L frames from the collected continuous H-frame image sequence f_1, f_2, ..., f_H, denoted I_1, I_2, ..., I_L; when the target's moving speed exceeds 0.1 m/s, L = H; when the target's speed is below 0.1 m/s, L is chosen by extracting images at an interval of 3 frames;
3) let I_k^R(i,j), I_k^G(i,j) and I_k^B(i,j) denote the RGB components of the k-th frame image at pixel (i, j), where i and j are the pixel's abscissa and ordinate respectively; then sort each of the three component sequences over the L frames in increasing or decreasing order;
4) take the median of each of the three ordered sequences respectively, and use these three values as the RGB value of the background pixel at pixel (i, j);
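Steps 3) and 4) amount to a per-pixel, per-channel median over the L sampled frames. A minimal sketch, with images represented as 2-D lists of (R, G, B) tuples:

```python
def median_background(frames):
    # frames: list of L images, each a 2-D list of (R, G, B) tuples.
    # The background value at each pixel is the per-channel median of the
    # ordered sequence over the L frames (steps 3 and 4).
    rows, cols = len(frames[0]), len(frames[0][0])
    background = []
    for i in range(rows):
        row = []
        for j in range(cols):
            pixel = []
            for c in range(3):
                ordered = sorted(frame[i][j][c] for frame in frames)
                pixel.append(ordered[len(ordered) // 2])  # median value
            row.append(tuple(pixel))
        background.append(row)
    return background
```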
5) update the background in real time, introducing a background update factor for adaptive background updating:
Δ1 = |I_n^R(i,j) − C_{n−1}^R(i,j)| + |I_n^G(i,j) − C_{n−1}^G(i,j)| + |I_n^B(i,j) − C_{n−1}^B(i,j)|    (1)
C_n(i,j) = C_{n−1}(i,j),                         Δ1 ≤ TH1
C_n(i,j) = (1 − α)·C_{n−1}(i,j) + α·I_n(i,j),    Δ1 > TH1    (2)
where I_n^R(i,j), I_n^G(i,j), I_n^B(i,j) are the RGB components of the current frame image at pixel (i, j); I_n(i,j) is the pixel value of the current frame at pixel (i, j); C_{n−1}^R(i,j), C_{n−1}^G(i,j), C_{n−1}^B(i,j) are the RGB components of the background frame image before updating at pixel (i, j); C_{n−1}(i,j) is the pixel value of the pre-update background frame at point (i, j); C_n(i,j) is the pixel value of the updated background frame at point (i, j); α is the background update factor, taken as 0.5; Δ1 is the sum of the RGB differences between the current frame and the pre-update background frame at pixel (i, j); and TH1 is the division threshold between the current image and the background image, taken as 100 to 120;
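Equations (1) and (2) can be sketched as follows, using the stated α = 0.5 and a TH1 inside the stated 100–120 range:

```python
def update_background(bg, frame, alpha=0.5, th1=110):
    # Equations (1)-(2): Delta1 is the summed absolute RGB difference at a
    # pixel; above TH1 the background is blended with the current frame,
    # otherwise the old background value is kept.
    new_bg = []
    for bg_row, frame_row in zip(bg, frame):
        row = []
        for c_prev, i_cur in zip(bg_row, frame_row):
            delta1 = sum(abs(a - b) for a, b in zip(i_cur, c_prev))
            if delta1 <= th1:
                row.append(c_prev)                       # keep C_{n-1}
            else:
                row.append(tuple((1 - alpha) * b + alpha * a
                                 for a, b in zip(i_cur, c_prev)))
        new_bg.append(row)
    return new_bg
```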
6) perform difference processing on the R, G, B channels of each pixel of the current frame image respectively:
D_n^R(i,j) = |I_n^R(i,j) − C_n^R(i,j)|
D_n^G(i,j) = |I_n^G(i,j) − C_n^G(i,j)|
D_n^B(i,j) = |I_n^B(i,j) − C_n^B(i,j)|    (3)
where I_n^R(i,j), I_n^G(i,j), I_n^B(i,j) are the RGB components of the current frame image at pixel (i, j); C_n^R(i,j), C_n^G(i,j), C_n^B(i,j) are the RGB components of the background frame image at pixel (i, j); and D_n^R(i,j), D_n^G(i,j), D_n^B(i,j) are the RGB components of the difference image at pixel (i, j);
then D_n^R(i,j), D_n^G(i,j), D_n^B(i,j) are processed as follows, so as to obtain the target binary image RI_n(i,j):
Δ2 = [ (1/3)·D_n^R(i,j) + (1/3)·D_n^G(i,j) + (1/3)·D_n^B(i,j) ]    (4)
RI_n(i,j) = 1,  Δ2 ≥ TH2
RI_n(i,j) = 0,  Δ2 < TH2    (5)
where RI_n(i,j) is the binary target image at pixel (i, j), and TH2 is the target detection threshold, an integer in the range 0 to 255, with an optimal value of 125;
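Equations (3) to (5) reduce to an averaged per-channel background difference thresholded at TH2. A minimal sketch, using the stated optimal TH2 = 125:

```python
def detect_target(frame, bg, th2=125):
    # Equations (3)-(5): average the per-channel absolute differences
    # against the background; mark the pixel 1 when the mean reaches TH2.
    mask = []
    for frame_row, bg_row in zip(frame, bg):
        row = []
        for i_cur, c_cur in zip(frame_row, bg_row):
            diffs = [abs(a - b) for a, b in zip(i_cur, c_cur)]  # D^R, D^G, D^B
            delta2 = sum(diffs) / 3.0
            row.append(1 if delta2 >= th2 else 0)
        mask.append(row)
    return mask
```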
7) owing to image interference, holes or discontinuities may appear inside the moving object; the differenced binary image is therefore processed with a mathematical morphology opening operation, i.e. an erosion operation followed by a dilation operation, which removes speckle noise in the target image and smooths the target image edges without changing the target image's area or features;
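The opening operation (erosion followed by dilation) can be sketched on binary images represented as 2-D 0/1 lists. A small square structuring element is used here for brevity, whereas the claims use a disc of radius 4 pixels:

```python
def erode(img, radius=1):
    # Binary erosion: the output pixel is 1 only if the whole neighbourhood
    # lies inside the foreground (S contained in X).
    rows, cols = len(img), len(img[0])
    return [[1 if all(0 <= i + di < rows and 0 <= j + dj < cols
                      and img[i + di][j + dj]
                      for di in range(-radius, radius + 1)
                      for dj in range(-radius, radius + 1)) else 0
             for j in range(cols)] for i in range(rows)]

def dilate(img, radius=1):
    # Binary dilation: 1 if the neighbourhood touches any foreground pixel.
    rows, cols = len(img), len(img[0])
    return [[1 if any(0 <= i + di < rows and 0 <= j + dj < cols
                      and img[i + di][j + dj]
                      for di in range(-radius, radius + 1)
                      for dj in range(-radius, radius + 1)) else 0
             for j in range(cols)] for i in range(rows)]

def opening(img, radius=1):
    # Opening = erosion then dilation: removes speckle noise while
    # roughly preserving the area of larger regions.
    return dilate(erode(img, radius), radius)
```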
8) extract the centroid of the moving target:
the centroid of the moving target image is obtained from the following formula:
x_o = (Σ_{x=1}^{M} Σ_{y=1}^{N} x·f(x,y)) / (Σ_{x=1}^{M} Σ_{y=1}^{N} f(x,y))
y_o = (Σ_{x=1}^{M} Σ_{y=1}^{N} y·f(x,y)) / (Σ_{x=1}^{M} Σ_{y=1}^{N} f(x,y))    (6)
where f(x, y) denotes an M × N digital image, M and N are the numbers of pixels in the X and Y directions respectively, and x_o and y_o are the abscissa and ordinate of the target centroid;
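Equation (6) can be sketched directly; here x indexes columns and y indexes rows, both starting at 1 as in the formula:

```python
def centroid(f):
    # Equation (6): weighted centroid of an M x N image f; for a binary
    # target image this is the mean position of the foreground pixels.
    num_x = num_y = den = 0
    for y, row in enumerate(f, start=1):
        for x, value in enumerate(row, start=1):
            num_x += x * value
            num_y += y * value
            den += value
    return (num_x / den, num_y / den) if den else (0.0, 0.0)
```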
9) perform target tracking: track the moving target using the SADD tracking algorithm (minimum sum of absolute difference errors algorithm), and connect the centroids of the moving target images in two consecutive frames to obtain the target motion vector;
10) analyze the target motion vector to determine whether the target is flowing in or out;
11) count the number of targets;
12) compute the regional target density.
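Steps 9) and 10) connect consecutive centroids into a motion vector and classify its direction. The inward boundary normal used to decide in/out below is an illustrative assumption, since the claim does not specify the decision rule:

```python
def motion_vector(centroid_prev, centroid_cur):
    # Step 9: connect the centroids of two consecutive frames.
    return (centroid_cur[0] - centroid_prev[0],
            centroid_cur[1] - centroid_prev[1])

def flow_direction(vec, inward_normal):
    # Step 10 (sketch): project the motion vector onto the assumed inward
    # normal of the monitored boundary; positive -> flowing in,
    # negative -> flowing out.
    dot = vec[0] * inward_normal[0] + vec[1] * inward_normal[1]
    if dot > 0:
        return "in"
    if dot < 0:
        return "out"
    return "tangential"
```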
2. The mobile target monitoring method according to claim 1, characterized in that, in step 7), the erosion operation is realized as follows:
E(X) = X Θ S = {x, y | S_xy ⊆ X}
where S is the structuring element, here a circle with a radius of 4 pixels; X is the target image; Θ is the erosion operator, denoting erosion of the target image X by the structuring element S; S_xy denotes the structuring element S with its center at (x, y); and E(X) is the target image after erosion. The formula means that when the origin of S is moved to point (x, y), the point is 1 in the new image if S is completely contained in X, and 0 otherwise; the main function of erosion is to eliminate the boundary points of the target image;
The dilation operation is realized as follows:
D(X) = X ⊕ S = {x, y | S_xy ∩ X ≠ ∅}
where S is the structuring element, here a circle with a radius of 4 pixels; X is the target image; ⊕ is the dilation operator, denoting dilation of the target image X by the structuring element S; S_xy denotes the structuring element S with its center at (x, y); and D(X) is the target image after dilation. The formula means that when the origin of S is moved to point (x, y), the point is 1 in the new image if S and X share at least one point that is 1, and 0 otherwise; if S does not intersect X at all, point (x, y) in the new image is 0. The main function of dilation is to remove tiny holes inside the image;
The opening operation is realized as follows:
L(X) = X ∘ S = (X Θ S) ⊕ S
where S is the structuring element, here a circle with a radius of 4 pixels; X is the target image; ∘ is the opening operator, denoting the opening of the target image X by the structuring element S; and L(X) is the target image after opening. Using the mathematical morphology opening operator, erosion followed by dilation removes the speckle noise in the target image and smooths the target image edges, yielding a smooth target image without changing the target image's area or features.
3. The mobile target monitoring method according to claim 1, characterized in that step 9) performs target tracking using a region tracking algorithm to track the moving target, then connects the centroids of the moving target images in two consecutive frames to obtain the target motion vector.
4. The mobile target monitoring method according to claim 1, characterized in that step 9) performs target tracking using the adaptive particle filter target tracking algorithm based on dynamic resampling:
assume the state-space mathematical model of the system is:
x_k = f(x_{k−1}, w_{k−1})
z_k = h(x_{k−1}, v_k)
where f_{k−1}(·) and h_{k−1}(·) denote the system transfer function and the measurement function respectively, and w_k and v_k denote the mutually uncorrelated process noise and measurement noise of the system; the flow is as follows:
(1) Initialize particles and weights: at k = 0, sample from the prior probability p(x_0), i.e. initialize N equally weighted candidate particles centered at the target centroid (x_0, y_0);
(2) System state transition: set k = k + 1, draw samples and compute the importance weights;
the importance density function is chosen as the prior probability density for ease of implementation;
(3) Normalize the particle weights;
(4) Perform particle filter resampling: define the effective sample size N_eff (approximately estimated as N_eff ≈ 1 / Σ_i (w_k^i)^2 for normalized weights) and set an effective-sample threshold N_th as the resampling threshold; when N_eff ≤ N_th, perform resampling, converting the original weighted sample set into an equally weighted one;
(5) Output the target's state estimate and variance estimate:
x̂_k = Σ_{i=1}^{N} w_k^i · x_k^i
p_k = Σ_{i=1}^{N} w_k^i · (x_k^i − x̂_k)(x_k^i − x̂_k)^T
(6) Advance to time k + 1 and judge whether the current frame is the last one; if so, tracking ends; if not, continue with the next tracking cycle.
5. The mobile target monitoring method according to claim 1, characterized in that step 9) performs target tracking using the adaptive particle filter target tracking algorithm based on particle swarm optimization, realized as follows:
(1) At the initial time, N particles with a Gaussian distribution are initialized centered at the extracted target centroid, with N = 30; this frame is set as the first frame, a feature vector is extracted from the selected image target region, the tracking equations are initialized, and the tracking process noise is set to R_k;
(2) Enter the next frame; according to the position of the i-th particle, extract the features for the current particle's target estimate and compute their similarity to the initial features, obtaining the current particle's weight; after collecting statistics over all N particles, normalize the weights and obtain the predicted target position;
(3) Particle swarm optimization: the iteration upper limit of the particle swarm optimization is defined as K; once this limit is reached, the particle swarm optimization iteration is exited. The velocity and position equations of the particle swarm optimization are as follows:
v_{id}^{k+1} = v_{id}^k + c_1·r_1·(pb_{id}^k − x_{id}^k) + c_2·r_2·(pg_d^k − x_{id}^k)
x_{id}^{k+1} = x_{id}^k + v_{id}^{k+1}
where pb_{id} is the personal best of particle i, pg_d is the global best, r_1 and r_2 are random numbers in [0, 1], and c_1 and c_2 are learning factors with value 2. Initialize the fitness function value F_N(i) of each particle in the swarm; update each particle's velocity and position via the particle swarm optimization velocity and position equations, and compute each particle's fitness. Compare the particle's current fitness F_N(i) with its personal best pb_{id}: if F_N(i) > pb_{id}, replace pb_{id} with F_N(i). Compare the particle's current fitness F_N(i) with the global best pg_d: if F_N(i) > pg_d, replace pg_d with F_N(i). After all particles have been updated, judge whether the similarity between the predicted target position and the initial target reaches the set convergence value; if so, the particle swarm optimization process is exited; if it is not reached after K iterations, the process is likewise exited;
(4) After exiting the particle swarm optimization process, the particle weights are recomputed and normalized;
(5) State estimation: the target position is estimated from the particles' positions and weights, and the estimated target position is displayed for identification;
(6) Judge whether the current frame is the last one; if so, tracking ends; if not, the tracking process for the next state begins.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711086204.1A CN108053422A (en) | 2017-11-07 | 2017-11-07 | Mobile target monitoring method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108053422A true CN108053422A (en) | 2018-05-18 |
Family
ID=62119058
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711086204.1A Pending CN108053422A (en) | 2017-11-07 | 2017-11-07 | Mobile target monitoring method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108053422A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109685833A (en) * | 2018-12-28 | 2019-04-26 | 镇江市高等专科学校 | Method for tracking moving target |
CN109978941A (en) * | 2019-04-15 | 2019-07-05 | 云南民族大学 | Non-contact type sleeper localization method under a kind of tamping operation |
CN111144421A (en) * | 2019-12-10 | 2020-05-12 | 深圳市优必选科技股份有限公司 | Object color identification method and device and throwing equipment |
CN115731658A (en) * | 2021-08-31 | 2023-03-03 | 国家电网有限公司 | Security positioning device and method for power system equipment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102012419A (en) * | 2010-11-03 | 2011-04-13 | 浙江工业大学 | Biologic water quality monitoring system for perceiving fish behaviors based on vision |
CN102750710A (en) * | 2012-05-31 | 2012-10-24 | 信帧电子技术(北京)有限公司 | Method and device for counting motion targets in images |
2017-11-07 — CN application CN201711086204.1A filed; status: Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102012419A (en) * | 2010-11-03 | 2011-04-13 | 浙江工业大学 | Biologic water quality monitoring system for perceiving fish behaviors based on vision |
CN102750710A (en) * | 2012-05-31 | 2012-10-24 | 信帧电子技术(北京)有限公司 | Method and device for counting motion targets in images |
Non-Patent Citations (4)
Title |
---|
张昊堃: "Research on Target Tracking Based on Particle Filter Algorithms", China Master's Theses Full-text Database, Information Science and Technology * |
徐以美: "Research on Moving Target Detection and Tracking in Video Sequences", China Master's Theses Full-text Database, Information Science and Technology * |
汤石晨: "Research on Video-based People Counting Using Optical Flow", China Master's Theses Full-text Database, Information Science and Technology * |
王佩思: "Research on Target Tracking Algorithms Based on Particle Filtering", China Master's Theses Full-text Database, Information Science and Technology * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109685833A (en) * | 2018-12-28 | 2019-04-26 | 镇江市高等专科学校 | Method for tracking moving target |
CN109685833B (en) * | 2018-12-28 | 2023-04-07 | 镇江市高等专科学校 | Moving target tracking method |
CN109978941A (en) * | 2019-04-15 | 2019-07-05 | 云南民族大学 | Non-contact type sleeper localization method under a kind of tamping operation |
CN111144421A (en) * | 2019-12-10 | 2020-05-12 | 深圳市优必选科技股份有限公司 | Object color identification method and device and throwing equipment |
CN111144421B (en) * | 2019-12-10 | 2024-02-13 | 深圳市优必选科技股份有限公司 | Object color recognition method and device and throwing equipment |
CN115731658A (en) * | 2021-08-31 | 2023-03-03 | 国家电网有限公司 | Security positioning device and method for power system equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Zhou et al. | Safety helmet detection based on YOLOv5 | |
CN103098076B (en) | Gesture recognition system for TV control | |
WO2020173226A1 (en) | Spatial-temporal behavior detection method | |
CN108053422A (en) | Mobile target monitoring method | |
CN109816692A (en) | A kind of motion target tracking method based on Camshift algorithm | |
CN111598066A (en) | Helmet wearing identification method based on cascade prediction | |
CN112257569B (en) | Target detection and identification method based on real-time video stream | |
CN106127812B (en) | A kind of passenger flow statistical method of the non-gate area in passenger station based on video monitoring | |
CN108694356B (en) | Pedestrian detection device and method and auxiliary driving system | |
CN104601964A (en) | Non-overlap vision field trans-camera indoor pedestrian target tracking method and non-overlap vision field trans-camera indoor pedestrian target tracking system | |
CN102542289A (en) | Pedestrian volume statistical method based on plurality of Gaussian counting models | |
CN103824070A (en) | Rapid pedestrian detection method based on computer vision | |
CN102999920A (en) | Target tracking method based on nearest neighbor classifier and mean shift | |
CN103605983A (en) | Remnant detection and tracking method | |
CN102663775A (en) | Target tracking method oriented to video with low frame rate | |
CN111931654A (en) | Intelligent monitoring method, system and device for personnel tracking | |
KR20170015299A (en) | Method and apparatus for object tracking and segmentation via background tracking | |
CN117994987B (en) | Traffic parameter extraction method and related device based on target detection technology | |
CN112132862A (en) | Adaptive scale estimation target tracking algorithm based on unmanned aerial vehicle | |
Wang et al. | Forest smoke detection based on deep learning and background modeling | |
CN109165592B (en) | Real-time rotatable face detection method based on PICO algorithm | |
CN107452019A (en) | A kind of object detection method based on models switching, device, system and storage medium | |
CN103905826A (en) | Self-adaptation global motion estimation method | |
CN104182990B (en) | A kind of Realtime sequence images motion target area acquisition methods | |
Guo et al. | Overlapped pedestrian detection based on yolov5 in crowded scenes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180518
RJ01 | Rejection of invention patent application after publication |