CN103400395A - Optical flow tracking method based on HAAR feature detection - Google Patents

Optical flow tracking method based on HAAR feature detection

Info

Publication number
CN103400395A
CN103400395A CN2013103179930A CN201310317993A
Authority
CN
China
Prior art keywords
tracking
frame
error
correlation coefficient
HAAR feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013103179930A
Other languages
Chinese (zh)
Inventor
毛亮
冯琰一
张少文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PCI Suntek Technology Co Ltd
Original Assignee
PCI Suntek Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PCI Suntek Technology Co Ltd filed Critical PCI Suntek Technology Co Ltd
Priority to CN2013103179930A priority Critical patent/CN103400395A/en
Publication of CN103400395A publication Critical patent/CN103400395A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an optical flow tracking method based on HAAR feature detection. The method comprises the following steps: first, acquiring a video frame, performing illumination normalization, and detecting the target with an ADABOOST detection algorithm based on HAAR features; next, computing the error between tracking points in the preceding and following frames with a median optical flow tracking algorithm, and computing the corresponding correlation coefficient by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, screening the tracking points so that fewer than 50% of the feature points are retained, accumulating the error and correlation coefficient to compute the tracking offset, and predicting the target position in the next frame, thereby tracking the target effectively. The method adapts to the influence of complex dynamic background disturbance and illumination change, tracks a moving target accurately, and at the same time has good robustness.

Description

An optical flow tracking method based on HAAR feature detection
Technical field
The present invention relates to computer vision technology, and in particular to an optical flow tracking method based on HAAR feature detection.
Background art
With the development of science and technology and people's continually growing demand for security, a new generation of video surveillance systems with intelligent analysis capabilities has begun to play a very positive role in the field of security monitoring and is already penetrating our daily lives.
Intelligent video surveillance refers to analyzing video sequences automatically with computer vision methods, without human intervention, to achieve moving object detection, classification, recognition, tracking, and so on; on this basis, the behavior of targets is analyzed against predefined rules to provide a reference for taking further measures (such as raising an automatic alarm when an object enters an armed zone). The purpose of moving target tracking is to find the position of the moving target of interest in each frame of an image sequence, that is, to mark the tracked object in the different frames of a video segment.
At present, the more widely applied target tracking methods include the Kalman filtering algorithm and the CAMSHIFT algorithm. The Kalman filtering algorithm is efficient: through a time-update equation and a measurement-update equation, it realizes a predict-then-correct strategy. The CAMSHIFT algorithm uses the color features of the target in the video image to find the position and size of the moving target; in the next video frame, the search window is initialized with the current position and size of the moving target, and repeating this process achieves continuous tracking of the target. CAMSHIFT can effectively cope with target deformation and occlusion, has modest resource requirements and low time complexity, and achieves good tracking results against simple backgrounds. However, when the background is relatively complex, or when many pixels of a color similar to the target's interfere, tracking can fail.
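For background, the core iteration behind CAMSHIFT is mean shift: the target's color histogram is back-projected into a probability map, and the search window is repeatedly moved to the centroid of the probability mass it covers. The minimal NumPy sketch below illustrates only this window shift (CAMSHIFT's adaptive window sizing and the histogram back-projection itself are omitted); it is an illustration of the background art, not the method of the patent.

```python
import numpy as np

def mean_shift(prob, window, max_iter=20, eps=1.0):
    """Move an (x, y, w, h) window to the centroid of the probability
    mass it covers in the back-projection map `prob` (rows = y)."""
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:                       # no target mass under the window
            break
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        dx = (xs * roi).sum() / total - (w - 1) / 2.0
        dy = (ys * roi).sum() / total - (h - 1) / 2.0
        x, y = int(round(x + dx)), int(round(y + dy))
        if abs(dx) < eps and abs(dy) < eps:  # converged on the centroid
            break
    return x, y, w, h

# A 5x5 blob of probability centred near column 40, row 30; a window
# started off-centre walks onto the blob in a couple of iterations.
prob = np.zeros((100, 100))
prob[28:33, 38:43] = 1.0
print(mean_shift(prob, (22, 14, 20, 20)))  # → (30, 20, 20, 20)
```

This also illustrates the failure mode the background paragraph mentions: if the map contains probability mass from similarly colored background pixels, the centroid is pulled toward them and the window drifts.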
To improve tracking accuracy, the optical flow tracking method based on HAAR feature detection first acquires a video frame, performs illumination normalization, and detects the target with an ADABOOST detection algorithm based on HAAR features; it then computes the error between tracking points in the preceding and following frames with a median optical flow tracking algorithm, and computes the corresponding correlation coefficient by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, it screens the tracking points so that fewer than 50% of the feature points are retained, accumulates the error and correlation coefficient to compute the tracking offset, and predicts the target position in the next frame, thereby tracking the target effectively. The method adapts to the influence of complex dynamic background disturbance and illumination change, tracks a moving target accurately, and at the same time has good robustness.
Moving target tracking is affected by uneven illumination, interference from other moving targets, occlusion, and other factors; it is therefore impractical to build a single target tracking algorithm suited to all situations, and an effective tracking algorithm must be built according to the actual conditions.
Summary of the invention
The invention provides an optical flow tracking method based on HAAR feature detection. The method adapts to the influence of complex dynamic background disturbance and illumination change, tracks a moving target accurately, and at the same time has good robustness.
To achieve these goals, the present invention comprises the following technical features: first, acquiring a video frame, performing illumination normalization, and detecting the target with an ADABOOST detection algorithm based on HAAR features; then computing the error between tracking points in the preceding and following frames with a median optical flow tracking algorithm, and computing the corresponding correlation coefficient by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, screening the tracking points so that fewer than 50% of the feature points are retained, accumulating the error and correlation coefficient to compute the tracking offset, and predicting the target position in the next frame, thereby tracking the target effectively.
Compared with existing methods, the method proposed here combines HAAR-feature-based ADABOOST detection with median optical flow tracking as described above. It adapts to the influence of complex dynamic background disturbance and illumination change, tracks a moving target accurately, and at the same time has good robustness.
Brief description of the drawing
The accompanying drawing is the overall flowchart of the present invention.
Embodiment
The present invention designs an optical flow tracking method based on HAAR feature detection; the method adapts to the influence of complex dynamic background disturbance and illumination change, tracks a moving target accurately, and at the same time has good robustness.
As shown in the drawing, the flow of the method comprises: acquiring a video frame; performing illumination normalization; detecting the target with an ADABOOST detection algorithm based on HAAR features; computing the error between tracking points in the preceding and following frames; computing the correlation coefficient; judging whether the error and correlation coefficient exceed a certain threshold and, where they do, screening the tracking points so that fewer than 50% of the feature points are retained; accumulating the error and correlation coefficient to compute the tracking offset; and predicting the target position in the next frame, thereby tracking the target effectively.
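The text does not specify which illumination normalization is used. As an illustrative sketch only (the function name and the target mean and standard deviation of 128 and 48 are assumptions, not part of the patent), a common choice is to remap each frame to a fixed global mean and contrast:

```python
import numpy as np

def normalize_illumination(frame, target_mean=128.0, target_std=48.0):
    """Global mean/variance normalization: remap gray levels so that
    every frame ends up with the same mean brightness and contrast."""
    f = frame.astype(np.float64)
    std = f.std()
    if std < 1e-6:                        # flat frame: shift the mean only
        out = f - f.mean() + target_mean
    else:
        out = (f - f.mean()) / std * target_std + target_mean
    return np.clip(out, 0, 255).astype(np.uint8)

# The same gradient scene captured under dim and bright illumination
# normalizes to (almost) the same brightness statistics.
scene = np.tile(np.arange(64, dtype=np.float64), (64, 1))
dim = (0.3 * scene + 10).astype(np.uint8)
bright = (0.9 * scene + 120).astype(np.uint8)
a, b = normalize_illumination(dim), normalize_illumination(bright)
print(round(float(a.mean())), round(float(b.mean())))
```

After this step, the detector and the frame-matching correlation see comparable gray-level statistics regardless of global lighting, which is what lets the method tolerate illumination change.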
The specific implementation is: acquire a video frame, perform illumination normalization, and detect the target with an ADABOOST detection algorithm based on HAAR features; then compute the error between tracking points in the preceding and following frames with a median optical flow tracking algorithm, and compute the corresponding correlation coefficient by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, screen the tracking points so that fewer than 50% of the feature points are retained, accumulate the error and correlation coefficient to compute the tracking offset, and predict the target position in the next frame, thereby tracking the target effectively.
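The HAAR features consumed by the ADABOOST detector are differences of pixel sums over adjacent rectangles, which an integral image makes computable in constant time. The sketch below shows this building block for a single two-rectangle edge feature; the function names are illustrative, and the actual feature set and trained cascade are outside the text of the patent.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended, so any
    rectangle sum costs four lookups regardless of rectangle size."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of img[y:y+h, x:x+w] in O(1) via the integral image."""
    return int(ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x])

def haar_edge_x(ii, x, y, w, h):
    """Two-rectangle HAAR feature: left half minus right half.
    Responds strongly to vertical edges; `w` must be even."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

# A vertical step edge: dark left half, bright right half.
img = np.zeros((8, 8), dtype=np.int64)
img[:, 4:] = 10
ii = integral_image(img)
print(haar_edge_x(ii, 0, 0, 8, 8))  # → -320 (left sums to 0, right to 320)
```

An ADABOOST cascade thresholds many such feature responses, computed at every candidate window position and scale from the same integral image.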
Step 1: acquire a video frame, perform illumination normalization, and detect the target with an ADABOOST detection algorithm based on HAAR features.
Step 2: compute the error between tracking points in the preceding and following frames with the median optical flow tracking algorithm, and compute the corresponding correlation coefficient by matching the current frame against the previous frame.
Step 3: where the error and correlation coefficient exceed a certain threshold, screen the tracking points so that fewer than 50% of the feature points are retained; accumulate the error and correlation coefficient to compute the tracking offset, and predict the target position in the next frame, thereby tracking the target effectively.
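The screening and offset computation of the final step can be sketched as follows. Since the patent gives no explicit formulas, the sketch follows the standard median-flow formulation as an assumption: points are kept only if their forward-backward error is at or below the median and their correlation coefficient is at or above it (so at most 50% of the points survive), and the median displacement of the survivors is taken as the tracking offset.

```python
import numpy as np

def screen_and_offset(pts_prev, pts_curr, fb_error, ncc):
    """Median-flow style screening: keep points whose forward-backward
    error is at or below the median AND whose correlation coefficient is
    at or above the median (so at most 50% of points survive), then
    return the median displacement of the survivors as the offset."""
    fb_error = np.asarray(fb_error, dtype=float)
    ncc = np.asarray(ncc, dtype=float)
    keep = (fb_error <= np.median(fb_error)) & (ncc >= np.median(ncc))
    disp = np.asarray(pts_curr, float)[keep] - np.asarray(pts_prev, float)[keep]
    return np.median(disp, axis=0), keep

# Four tracked points: three agree on a (+2, +1) motion, one is an outlier.
prev = [(10, 10), (20, 10), (10, 20), (20, 20)]
curr = [(12, 11), (22, 11), (12, 21), (35, 35)]
err  = [0.2, 0.3, 0.1, 5.0]    # outlier has a large forward-backward error
corr = [0.9, 0.8, 0.95, 0.1]   # ...and a poor correlation coefficient
offset, keep = screen_and_offset(prev, curr, err, corr)
print(offset.tolist(), keep.tolist())  # → [2.0, 1.0] [True, False, True, False]
```

With this rule, the outlier's large error and poor correlation exclude it from the median offset, which is what makes the predicted next-frame position robust to individual drifting points.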
It can be seen from the above that the specific embodiment of the present invention tracks a moving target in a scene. Further, video frames are acquired and an initial background frame is extracted to detect the target; the median optical flow tracking algorithm then computes the error between tracking points in the preceding and following frames, and the corresponding correlation coefficient is computed by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, the tracking points are screened so that fewer than 50% of the feature points are retained, the error and correlation coefficient are accumulated to compute the tracking offset, and the target position in the next frame is predicted, thereby tracking the target effectively.
It is therefore easy to understand that the foregoing is merely a preferred embodiment of the present invention and is not intended to limit its spirit or scope of protection; any equivalent variation or replacement made by one of ordinary skill in the art should be considered to fall within the scope of protection of the present invention.

Claims (4)

1. An optical flow tracking method based on HAAR feature detection, characterized in that: first, a video frame is acquired, illumination normalization is performed, and the target is detected with an ADABOOST detection algorithm based on HAAR features; then the error between tracking points in the preceding and following frames is computed with a median optical flow tracking algorithm, and the corresponding correlation coefficient is computed by matching the current frame against the previous frame; finally, where the error and correlation coefficient exceed a certain threshold, the tracking points are screened so that fewer than 50% of the feature points are retained, the error and correlation coefficient are accumulated to compute the tracking offset, and the target position in the next frame is predicted, thereby tracking the target effectively.
2. The optical flow tracking method based on HAAR feature detection according to claim 1, characterized in that: a video frame is acquired, illumination normalization is performed, and the target is detected with an ADABOOST detection algorithm based on HAAR features.
3. The optical flow tracking method based on HAAR feature detection according to claim 1, characterized in that: the error between tracking points in the preceding and following frames is computed with a median optical flow tracking algorithm, and the corresponding correlation coefficient is computed by matching the current frame against the previous frame.
4. The optical flow tracking method based on HAAR feature detection according to claim 1, characterized in that: where the error and correlation coefficient exceed a certain threshold, the tracking points are screened so that fewer than 50% of the feature points are retained; the error and correlation coefficient are accumulated to compute the tracking offset, and the target position in the next frame is predicted, thereby tracking the target effectively.
CN2013103179930A 2013-07-24 2013-07-24 Optical flow tracking method based on HAAR feature detection Pending CN103400395A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013103179930A CN103400395A (en) 2013-07-24 2013-07-24 Optical flow tracking method based on HAAR feature detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013103179930A CN103400395A (en) 2013-07-24 2013-07-24 Optical flow tracking method based on HAAR feature detection

Publications (1)

Publication Number Publication Date
CN103400395A true CN103400395A (en) 2013-11-20

Family

ID=49564007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013103179930A Pending CN103400395A (en) 2013-07-24 2013-07-24 Optical flow tracking method based on HAAR feature detection

Country Status (1)

Country Link
CN (1) CN103400395A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010276529A (en) * 2009-05-29 2010-12-09 Panasonic Corp Apparatus and method of identifying object
CN102156991A (en) * 2011-04-11 2011-08-17 上海交通大学 Quaternion based object optical flow tracking method
CN102867311A (en) * 2011-07-07 2013-01-09 株式会社理光 Target tracking method and target tracking device
CN102903122A (en) * 2012-09-13 2013-01-30 西北工业大学 Video object tracking method based on feature optical flow and online ensemble learning


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG Minghao et al., "A robust method for eliminating erroneous optical flow tracking points", Journal of Computer-Aided Design & Computer Graphics *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018233438A1 (en) * 2017-06-21 2018-12-27 腾讯科技(深圳)有限公司 Human face feature point tracking method, device, storage medium and apparatus
US10943091B2 (en) 2017-06-21 2021-03-09 Tencent Technology (Shenzhen) Company Limited Facial feature point tracking method, apparatus, storage medium, and device
CN108876812A (en) * 2017-11-01 2018-11-23 北京旷视科技有限公司 Image processing method, device and equipment for object detection in video
CN108470354A (en) * 2018-03-23 2018-08-31 云南大学 Video target tracking method, device and realization device
CN110517296A (en) * 2018-05-21 2019-11-29 北京京东尚科信息技术有限公司 Method for tracking target, device, storage medium and electronic equipment
CN110517296B (en) * 2018-05-21 2022-06-07 北京京东尚科信息技术有限公司 Target tracking method and device, storage medium and electronic equipment
CN111882583A (en) * 2020-07-29 2020-11-03 成都英飞睿技术有限公司 Moving target detection method, device, equipment and medium
CN111882583B (en) * 2020-07-29 2023-11-14 成都英飞睿技术有限公司 Moving object detection method, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN100545867C (en) Aerial shooting traffic video frequency vehicle rapid checking method
CN105023278B (en) A kind of motion target tracking method and system based on optical flow method
EP2801078B1 (en) Context aware moving object detection
CN107123131B (en) Moving target detection method based on deep learning
CN101916447B (en) Robust motion target detecting and tracking image processing system
CN105046719B (en) A kind of video frequency monitoring method and system
CN112669349A (en) Passenger flow statistical method, electronic equipment and storage medium
CN102426785B (en) Traffic flow information perception method based on contour and local characteristic point and system thereof
CN102307274A (en) Motion detection method based on edge detection and frame difference
CN103400395A (en) Optical flow tracking method based on HAAR feature detection
CN103810717A (en) Human behavior detection method and device
WO2014082480A1 (en) Method and device for calculating number of pedestrians and crowd movement directions
CN103093198A (en) Crowd density monitoring method and device
CN103473533B (en) Moving Objects in Video Sequences abnormal behaviour automatic testing method
CN105374049B (en) Multi-corner point tracking method and device based on sparse optical flow method
CN103945089A (en) Dynamic target detection method based on brightness flicker correction and IP camera
CN103428409A (en) Video denoising processing method and device based on fixed scene
CN110619651A (en) Driving road segmentation method based on monitoring video
CN104778723A (en) Method for performing motion detection on infrared image with three-frame difference method
Kawakatsu et al. Traffic surveillance system for bridge vibration analysis
CN105426928B (en) A kind of pedestrian detection method based on Haar feature and EOH feature
Furuya et al. Road intersection monitoring from video with large perspective deformation
Tao et al. Real-time detection and tracking of moving object
CN107729811B (en) Night flame detection method based on scene modeling
CN103093481B (en) A kind of based on moving target detecting method under the static background of watershed segmentation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20131120
