KR20140035176A - Apparatus and method for object tracking using adaptive multiple feature weight decision - Google Patents

Apparatus and method for object tracking using adaptive multiple feature weight decision Download PDF

Info

Publication number
KR20140035176A
KR20140035176A KR1020120101718A
Authority
KR
South Korea
Prior art keywords
feature
tracker
weight
state
object tracking
Prior art date
Application number
KR1020120101718A
Other languages
Korean (ko)
Inventor
최진우
문대성
이한성
유장희
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020120101718A priority Critical patent/KR20140035176A/en
Publication of KR20140035176A publication Critical patent/KR20140035176A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/124Quantisation
    • H04N19/126Details of normalisation or weighting functions, e.g. normalisation matrices or variable uniform quantisers

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an apparatus and method for tracking objects by reflecting adaptive multi-feature weights. A single multi-feature is extracted as a weighted sum of the feature items of each object, and the weighting coefficients are determined so that the feature items are maximally differentiated. Estimating the state of each object with these coefficients improves the accuracy of tracking multiple objects.

Description

Apparatus and method for object tracking using adaptive multiple feature weight decision

TECHNICAL FIELD The present invention relates to object tracking techniques, and more particularly, to an object tracking apparatus and method that reflects adaptive multiple feature weights.

Recently, video security technology using CCTV has been spreading rapidly. However, having operators in a security control center watch multiple video feeds simultaneously with the naked eye is limited in terms of cost and efficiency because of the physical and mental limits of human observers.

To overcome these limitations, intelligent video security based on computer vision technology is attracting attention. Implementing intelligent video security requires a number of component technologies.

Among them, object tracking is a core technology: it estimates the state of one specific object, or of a plurality of unspecified objects, in every frame of an image sequence acquired by a camera.

Multiple-object tracking based on the particle filtering framework, unlike tracking based on mean-shift or the Kalman filter, is relatively robust to temporary overlap, occlusion, background clutter, and the like.

In particular, object tracking methods that use color histogram information, such as the one presented in Korean Patent Registration No. 10-0886323 (2009. 02. 23), are robust to lighting changes and track with relatively high accuracy even low-resolution objects that have few feature points. However, when multiple objects are tracked with color histogram information alone, the probability of tracking failure increases when adjacent or overlapping objects have similar color distributions. Tracking accuracy also degrades when the color distribution of an object changes dynamically with the surrounding environment, for example under a change in illumination.

Color histogram-based object tracking, which is widely used among object tracking techniques, therefore has a high probability of tracking failure when adjacent objects have similar colors. The accuracy of object tracking may be improved by using various feature information such as texture and edge instead of color information alone.

Accordingly, the present inventors have studied a technique that adaptively weights the various feature information of each object so that the features of the objects in the image are maximally differentiated, thereby improving the accuracy of tracking a plurality of objects.

Domestic Patent No. 10-0886323 (2009. 02. 23)

The present invention was devised with the above in mind. It is an object of the present invention to provide an object tracking apparatus and method that reflects adaptive multi-feature weights: a single multi-feature is extracted as a weighted sum of the feature items of each object, the weighted-sum coefficients are determined so that the feature items are maximally differentiated, and the state of each object is estimated with those coefficients, improving the accuracy of tracking a plurality of objects.

According to an aspect of the present invention for achieving this object, an object tracking apparatus reflecting adaptive multi-feature weights extracts one multi-feature as a weighted sum of the feature items of the trackers assigned to the tracked objects, determines the weighted-sum coefficient value of each feature item included in each tracker's multi-feature so that the feature items are maximally differentiated, and estimates the state of each tracker by applying the determined weighted-sum coefficient values.

The present invention extracts one multi-feature as a weighted sum of the feature items of each object and determines the weighted-sum coefficient of each feature item so that the feature items are maximally differentiated before estimating the state of the object. This has the useful effect of improving accuracy when tracking a plurality of objects.

FIG. 1 is a block diagram showing the configuration of an embodiment of an object tracking apparatus reflecting adaptive multi-feature weights according to the present invention.
FIG. 2 is a flowchart illustrating the configuration of an embodiment of an object tracking method reflecting adaptive multi-feature weights according to the present invention.
FIG. 3 is a flowchart illustrating the configuration of another embodiment of an object tracking method reflecting adaptive multi-feature weights according to the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.

In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.

The terms used throughout this specification have been defined in consideration of the functions of the embodiments of the present invention and may vary according to the intentions or customs of the user or operator. Their definitions should therefore be based on the contents of the entire specification.

FIG. 1 is a block diagram showing the configuration of an embodiment of the object tracking apparatus reflecting adaptive multi-feature weights according to the present invention. As shown in FIG. 1, the object tracking apparatus 100 reflecting adaptive multi-feature weights according to this embodiment includes a multi-feature extractor 110, a weight determiner 120, and an object estimator 130.

The multi-feature extractor 110 extracts one multi-feature as a weighted sum of the feature items of the trackers assigned to the tracked objects. The feature items combined into the multi-feature may be any set of items that characterize an object, such as a color feature item, a texture feature item, and an edge feature item.

For example, the multi-feature extractor 110 may extract the color feature by calculating color histograms of the particles included in each tracker, extract the texture feature by calculating LBP (Local Binary Pattern) histograms of those particles, and extract the edge feature by calculating edge histograms of those particles.
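As an illustration only (the patent does not specify libraries, bin counts, or region handling, so the OpenCV and scikit-image calls and all parameter values below are assumptions), the three per-particle histograms described above could be computed along the following lines in Python:

    import cv2
    import numpy as np
    from skimage.feature import local_binary_pattern

    def particle_features(frame_bgr, particle_box, bins=16):
        """Return color, texture (LBP) and edge histograms for one particle region.
        particle_box = (x, y, w, h); the bin counts are illustrative assumptions."""
        x, y, w, h = particle_box
        patch = frame_bgr[y:y + h, x:x + w]
        gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)

        # Color feature: hue histogram of the patch.
        hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
        hist_color = cv2.calcHist([hsv], [0], None, [bins], [0, 180]).ravel()

        # Texture feature: histogram of local binary patterns.
        lbp = local_binary_pattern(gray, P=8, R=1, method='uniform')
        hist_lbp, _ = np.histogram(lbp, bins=bins, range=(0, lbp.max() + 1))

        # Edge feature: histogram of gradient magnitudes from a Sobel filter.
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
        mag = cv2.magnitude(gx, gy)
        hist_edge, _ = np.histogram(mag, bins=bins, range=(0, mag.max() + 1e-6))

        # Normalize each histogram so the weighted sum is scale-consistent.
        norm = lambda h: h / (h.sum() + 1e-12)
        return norm(hist_color), norm(hist_lbp.astype(float)), norm(hist_edge.astype(float))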

The weight determiner 120 determines the weighted-sum coefficient of each feature item included in the multi-features extracted by the multi-feature extractor 110, choosing the coefficients so that the feature items are maximally differentiated.

For example, the weighted-sum coefficients may be constrained to sum to one. In this case, the weight determiner 120 may be implemented to adjust the coefficients under this constraint and select, as the weights, the coefficient values that produce the largest deviation among the multi-feature values.

If the color feature of the k-th tracker is hist_color_k, its texture feature is hist_LBP_k, its edge feature is hist_edge_k, and the corresponding weighting coefficients are W_C, W_L, and W_E with W_C + W_L + W_E = 1, then the multi-feature can be defined as W_C * hist_color_k + W_L * hist_LBP_k + W_E * hist_edge_k.

The weight determiner 120 adjusts the values of W_C, W_L, and W_E, computes the resulting multi-feature values, averages them, calculates the deviation of the multi-feature values from that average, and selects as the weights the coefficient values for which this deviation is maximal.
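The following sketch illustrates one way such a search could be carried out; the coarse grid step and the use of variance as the deviation measure are assumptions, since the patent does not specify how the coefficients are adjusted:

    import itertools
    import numpy as np

    def choose_weights(hist_color, hist_lbp, hist_edge, step=0.1):
        """Select (W_C, W_L, W_E), with W_C + W_L + W_E = 1, that maximizes the
        deviation of the trackers' multi-feature values from their average.
        hist_* are (num_trackers, bins) arrays; the grid step and the variance
        deviation measure are assumptions."""
        best, best_dev = (1.0, 0.0, 0.0), -1.0
        grid = np.arange(0.0, 1.0 + 1e-9, step)
        for w_c, w_l in itertools.product(grid, grid):
            w_e = 1.0 - w_c - w_l
            if w_e < -1e-9:                      # enforce the sum-to-one constraint
                continue
            w_e = max(w_e, 0.0)
            # Multi-feature of every tracker under this coefficient combination.
            multi = w_c * hist_color + w_l * hist_lbp + w_e * hist_edge
            # Spread of the trackers' multi-features around their average.
            dev = np.var(multi - multi.mean(axis=0))
            if dev > best_dev:
                best, best_dev = (w_c, w_l, w_e), dev
        return best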

The object estimator 130 estimates the state of each tracker by applying the weighted-sum coefficient values determined by the weight determiner 120. To this end, the object estimator 130 may include an observation value calculator 131, a particle re-extractor 132, and a state estimation value calculator 133.

The observation value calculator 131 calculates an observation value for each particle included in each tracker using the weighted-sum coefficient values determined by the weight determiner 120. For example, the observation value may be the probability of each particle in the current frame, predicted from the probability distribution of the particles obtained in the previous frame using the weighted-sum coefficient values determined by the weight determiner 120.
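As a hypothetical illustration of such an observation value (the patent does not fix a particular likelihood form, so the Bhattacharyya-based Gaussian likelihood and the sigma value below are assumptions), one might compute:

    import numpy as np

    def observation_value(particle_hist, reference_hist, sigma=0.1):
        """Observation value (likelihood) of one particle: similarity between the
        particle's weighted multi-feature histogram and the tracker's reference
        histogram. The Bhattacharyya/Gaussian form and sigma are assumptions."""
        bc = np.sum(np.sqrt(particle_hist * reference_hist))   # Bhattacharyya coefficient
        dist = np.sqrt(max(1.0 - bc, 0.0))                     # Bhattacharyya distance
        return np.exp(-dist ** 2 / (2.0 * sigma ** 2))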

The particle re-extractor 132 resamples the particles in each tracker using the observation values calculated by the observation value calculator 131. The particle re-extractor 132 may be implemented to extract the particles whose state values have at least a given probability of being close to the true value.

For example, the particle re-extractor 132 may compare the observation values of the particles calculated by the observation value calculator 131 with the state values of the particles in the current frame obtained from the state transition equation, and extract the particles whose probability exceeds a threshold.

The state estimation value calculator 133 obtains a state estimate for each tracker by averaging the state values of the particles re-extracted by the particle re-extractor 132, and thereby estimates the state of each object in the current frame.
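A minimal sketch of this re-extraction and averaging step is given below; the mean-based default threshold and the proportional resampling among the kept particles are assumptions consistent with, but not dictated by, the description above:

    import numpy as np

    def resample_and_estimate(states, obs_values, threshold=None):
        """Re-extract (resample) particles using their observation values and
        estimate the tracker state as the mean of the surviving particles.
        states: (N, D) array of particle state vectors; obs_values: (N,) likelihoods."""
        states = np.asarray(states, dtype=float)
        obs = np.asarray(obs_values, dtype=float)
        if threshold is None:
            threshold = obs.mean()                     # illustrative default
        keep = obs >= threshold
        if not keep.any():                             # fall back to all particles
            keep = np.ones_like(obs, dtype=bool)
        probs = obs[keep] / obs[keep].sum()
        idx = np.random.choice(np.flatnonzero(keep), size=len(obs), p=probs)
        new_states = states[idx]
        state_estimate = new_states.mean(axis=0)       # average of re-extracted particles
        return new_states, state_estimate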

Implemented in this way, the present invention extracts one multi-feature as a weighted sum of the feature items of each object, determines the weighted-sum coefficients so that the feature items are maximally differentiated, and estimates the state of each object accordingly, which improves accuracy when tracking a large number of objects.

Meanwhile, according to an additional aspect of the present invention, the object tracking apparatus 100 reflecting the adaptive multi-feature weight may further include a state transition equation calculator 140. The state transition equation calculator 140 calculates a state transition equation of particles included in each tracker every frame.

For example, the state transition equation calculator 140 may be implemented to compute, in each frame, the state values of the particles from the states of the particles in the previous frame, a state transition matrix, and a motion model of the particles.

Here, the state of a particle includes the coordinates of the particle and may further include the movement speed and size of the particle. The motion model of the particles may be a Gaussian model, a random walk model, or the like.
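The sketch below illustrates one such propagation step, assuming a constant-velocity particle state (x, y, vx, vy) with Gaussian motion noise; the particular state layout, transition matrix, and noise magnitudes are assumptions, since the patent leaves the motion model open (Gaussian, random walk, or the like):

    import numpy as np

    def propagate_particles(states, dt=1.0, noise_std=(2.0, 2.0, 0.5, 0.5)):
        """Propagate (N, 4) particle states with a linear state transition matrix
        plus Gaussian motion noise. The constant-velocity matrix and the noise
        magnitudes are illustrative assumptions."""
        A = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], dtype=float)     # state transition matrix
        noise = np.random.normal(0.0, noise_std, size=states.shape)
        return states @ A.T + noise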

According to an additional aspect of the present invention, the object tracking apparatus 100 reflecting adaptive multi-feature weights may further include an object detector 150 and an initializer 160. The object detector 150 detects the objects to be tracked and generates a tracker for each detected object, and the initializer 160 sets the initial position of each tracker generated by the object detector 150.

The present invention can operate in two modes: a feature-weight-determination-based object tracking mode that operates when a new object is detected, and an independent object tracking mode that operates when no new object is detected.

In the feature-weight-determination-based object tracking mode, which operates when a new object is detected, the object detector 150 generates a tracker for each detected object, the initializer 160 sets the initial positions of the generated trackers, the state transition equation calculator 140 calculates the state transition equations of the particles included in each tracker, the multi-feature extractor 110 extracts the multi-features of the trackers, the weight determiner 120 determines the weights of the trackers, and the object estimator 130 estimates the states of the trackers.

In the independent object tracking mode, which operates when no new object is detected, the object detector 150, the initializer 160, and the weight determiner 120 do not need to operate. Using the already determined weights, the state transition equation calculator 140 calculates the state transition equation of each particle included in each tracker, the multi-feature extractor 110 extracts the multi-features of the trackers, and the object estimator 130 estimates the state of each tracker.

The object estimation operation of the object tracking apparatus reflecting adaptive multi-feature weights according to the present invention is described below with reference to FIGS. 2 and 3. FIG. 2 is a flowchart illustrating an embodiment of the object tracking method reflecting adaptive multi-feature weights according to the present invention, showing the operation in the feature-weight-determination-based object tracking mode that runs when a new object is detected.

In the feature-weight-determination-based object tracking mode, when an object to be tracked is detected, the object tracking apparatus reflecting adaptive multi-feature weights generates a tracker for the detected object in step 210.

Next, in step 220, the object tracking apparatus reflecting adaptive multi-feature weights sets the initial position of the tracker generated in step 210.

Next, in step 230, the object tracking apparatus reflecting adaptive multi-feature weights calculates the state transition equation of each particle included in the tracker. Since the state transition equation calculation has been described above, a redundant description is omitted.

Next, in step 240, the object tracking apparatus reflecting adaptive multi-feature weights extracts one multi-feature as a weighted sum of the feature items of the tracker. Since multi-feature extraction has been described above, a redundant description is omitted.

Next, in step 250, the object tracking apparatus reflecting adaptive multi-feature weights determines the weighted-sum coefficient value of each feature item included in the multi-feature extracted in step 240, choosing the values so that the feature items are maximally differentiated. Since the weight determination has been described above, a redundant description is omitted.

Next, in step 260, the object tracking apparatus reflecting adaptive multi-feature weights estimates the state of the tracker by applying the weighted-sum coefficient values determined in step 250. The tracker state is estimated through the following process.

First, in step 261, the object tracking apparatus reflecting adaptive multi-feature weights calculates an observation value for each particle included in the tracker using the determined weighted-sum coefficient values. Since the calculation of particle observation values has been described above, a redundant description is omitted.

Next, in step 262, the object tracking apparatus reflecting adaptive multi-feature weights re-extracts the particles included in the tracker using the observation values of those particles. Since particle re-extraction has been described above, a redundant description is omitted.

Next, in step 263, the object tracking apparatus reflecting adaptive multi-feature weights takes the average of the state values of the re-extracted particles in the tracker to obtain a state estimate and thereby estimates the tracker state.

FIG. 3 is a flowchart illustrating another embodiment of the object tracking method reflecting adaptive multi-feature weights according to the present invention, showing the operation in the independent object tracking mode that runs when no new object is detected.

In the independent object tracking mode, in step 310, the object tracking apparatus reflecting adaptive multi-feature weights calculates the state transition equation of each particle included in each tracker. Since the state transition equation calculation has been described above, a redundant description is omitted.

Next, in step 320, the object tracking apparatus reflecting adaptive multi-feature weights extracts one multi-feature as a weighted sum of the feature items of each tracker. Since multi-feature extraction has been described above, a redundant description is omitted.

Next, in step 330, the object tracking apparatus reflecting adaptive multi-feature weights estimates the state of each tracker by applying the already determined weighted-sum coefficient values. The state of each tracker is estimated through the following process.

First, in step 331, the object tracking apparatus reflecting adaptive multi-feature weights calculates an observation value for each particle included in each tracker using the determined weighted-sum coefficient values. Since the calculation of particle observation values has been described above, a redundant description is omitted.

Next, in step 332, the object tracking apparatus reflecting adaptive multi-feature weights re-extracts the particles included in each tracker using the observation values of those particles. Since particle re-extraction has been described above, a redundant description is omitted.

Next, in step 333, the object tracking apparatus reflecting adaptive multi-feature weights takes the average of the state values of the re-extracted particles in each tracker to obtain a state estimate of each tracker and thereby estimates the state of each tracker.

Accordingly, implemented in this way, the present invention extracts one multi-feature as a weighted sum of the feature items of each object, determines the weighted-sum coefficients so that the feature items are maximally differentiated, and estimates the state of each object accordingly, so that the accuracy of tracking a large number of objects is improved and the above object of the present invention is achieved.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

The present invention is industrially applicable in the field of object tracking and its applications.

100: object tracking device 110: multi-feature extraction unit
120: weight determination unit 130: object estimation unit
131: observation value calculation unit 132: particle re-extraction unit
133: state estimation value calculation unit 140: state transition equation calculation unit
150: object detection unit 160: initialization unit

Claims (1)

A multi-feature extractor for extracting one multi-feature by weighting the feature items of trackers assigned to each of the tracked objects;
A weight determiner for determining a weighted-sum coefficient for each feature item included in the multi-features of the trackers extracted by the multi-feature extractor, the coefficients being determined such that the differentiation of the feature items is maximized; and
An object estimator for estimating the state of each tracker by applying the weighted-sum coefficient values determined by the weight determiner,
An apparatus for object tracking reflecting adaptive multi-feature weights, characterized in that it comprises the above.
KR1020120101718A 2012-09-13 2012-09-13 Apparatus and method for object tracking using adaptive multiple feature weight decision KR20140035176A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120101718A KR20140035176A (en) 2012-09-13 2012-09-13 Apparatus and method for object tracking using adaptive multiple feature weight decision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120101718A KR20140035176A (en) 2012-09-13 2012-09-13 Apparatus and method for object tracking using adaptive multiple feature weight decision

Publications (1)

Publication Number Publication Date
KR20140035176A true KR20140035176A (en) 2014-03-21

Family

ID=50645271

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120101718A KR20140035176A (en) 2012-09-13 2012-09-13 Apparatus and method for object tracking using adaptive multiple feature weight decision

Country Status (1)

Country Link
KR (1) KR20140035176A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101406334B1 (en) * 2013-04-18 2014-06-19 전북대학교산학협력단 System and method for tracking multiple object using reliability and delayed decision
CN104392465A (en) * 2014-11-13 2015-03-04 南京航空航天大学 Multi-core target tracking method based on D-S evidence theory information integration


Similar Documents

Publication Publication Date Title
Ojha et al. Image processing techniques for object tracking in video surveillance-A survey
Lalonde et al. Detecting ground shadows in outdoor consumer photographs
US10902614B2 (en) Image recognition by setting a ratio of combining the feature correlation coefficients for the respective acquired image feature amounts
Subburaman et al. Counting people in the crowd using a generic head detector
Nallasivam et al. Moving human target detection and tracking in video frames
Yadav Efficient method for moving object detection in cluttered background using Gaussian Mixture Model
WO2012141663A1 (en) A method for individual tracking of multiple objects
Mirabi et al. People tracking in outdoor environment using Kalman filter
Socek et al. A hybrid color-based foreground object detection method for automated marine surveillance
Izadi et al. Robust region-based background subtraction and shadow removing using color and gradient information
KR20140035176A (en) Apparatus and method for object tracking using adaptive multiple feature weight decision
Junejo et al. Dynamic scene modeling for object detection using single-class SVM
Hoseinnezhad et al. Visual tracking of multiple targets by multi-Bernoulli filtering of background subtracted image data
KR20150081797A (en) Apparatus and method for tracking object
Waykole et al. Detecting and tracking of moving objects from video
Mashak et al. Background subtraction for object detection under varying environments
Lai et al. Robust background extraction scheme using histogram-wise for real-time tracking in urban traffic video
Chen et al. Moving human full body and body parts detection, tracking, and applications on human activity estimation, walking pattern and face recognition
Meshgi et al. Fusion of multiple cues from color and depth domains using occlusion aware bayesian tracker
Singh et al. Human activity tracking using star skeleton and activity recognition using hmms and neural network
Kushwaha et al. Performance evaluation of various moving object segmentation techniques for intelligent video surveillance system
Álvarez et al. Monocular vision-based target detection on dynamic transport infrastructures
Boufama et al. Tracking multiple people in the context of video surveillance
Mandal et al. Embedded local feature based background modeling for video object detection
Rahmatian et al. Online multiple people tracking-by-detection in crowded scenes

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination