CN111354012A - Tracking method before complex scene moving small target detection based on subspace projection - Google Patents

Tracking method before complex scene moving small target detection based on subspace projection

Info

Publication number
CN111354012A
Authority
CN
China
Prior art keywords
motion
effective
track
dimensional space
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010099912.4A
Other languages
Chinese (zh)
Other versions
CN111354012B (en)
Inventor
陈华杰
白浩然
吕丹妮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN202010099912.4A priority Critical patent/CN111354012B/en
Publication of CN111354012A publication Critical patent/CN111354012A/en
Application granted granted Critical
Publication of CN111354012B publication Critical patent/CN111354012B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/215Motion-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a tracking method used before the detection of small moving targets in complex scenes, based on subspace projection. Target motion tracks and clutter tracks are distinguished on a three-dimensional space-time sequence and processed in a staged cascade: first, the three-dimensional space-time image sequence is projected onto a two-dimensional subspace plane, and most irregular clutter points are removed according to the morphological difference between the motion track and typical clutter tracks on the projection plane; then, local three-dimensional space-time track backtracking is performed on the filtered candidate track regions, effective candidate point traces are screened and their center points are calculated; finally, the continuity and regularity of the track are evaluated for the effective motion point traces among the candidates, interference point traces are deleted, and the effective target motion track is extracted. The method greatly improves the speed of target tracking in image sequences under dense moving-clutter interference.

Description

Tracking method before complex scene moving small target detection based on subspace projection
Technical Field
The invention belongs to the field of moving target detection, and relates to a track-before-detect method for small moving targets in complex scenes based on subspace projection and three-dimensional space-time track backtracking.
Background
Small target tracking in complex environments must overcome compound interference (static clutter and dynamic clutter), which severely degrades small target tracking results; existing target tracking methods have clear limitations in such compound-interference environments.
For the Hough transform and particle filtering methods, real-time performance cannot meet practical requirements in applications with high speed demands. The dynamic programming method improves real-time performance to some extent, but its efficiency is still insufficient when many clutter points are present in the scene. If the target track is interrupted, the weaknesses of the dynamic programming iteration are amplified, and under heavy clutter interference its real-time performance degrades sharply.
Disclosure of Invention
To address compound interference in complex environments, the invention provides a fast TBD (track-before-detect) method based on subspace projection. Target motion tracks and clutter tracks are distinguished on a three-dimensional space-time sequence and processed in a staged cascade: first, the three-dimensional space-time image sequence is projected onto a two-dimensional subspace plane, and most irregular clutter points are removed according to the morphological difference between the motion track and typical clutter tracks on the projection plane; then, local three-dimensional space-time track backtracking is performed on the filtered candidate track regions, effective candidate point traces are screened and their center points are calculated; finally, the continuity and regularity of the track are evaluated for the effective motion point traces among the candidates, interference point traces are deleted, and the effective target motion track is extracted.
Step (1): perform foreground/background detection on the original image sequence.
1.1 Separate the foreground and background of the continuous image sequence using the ViBe (visual background extraction) method.
Processing the continuous image sequence with the ViBe method yields a binary image with the foreground separated from the background.
1.2 Filter the foreground/background separation result.
Factors such as illumination changes and objects like vegetation and the water surface in the image sequence produce flickering noise points in the ViBe result; by comparing ViBe results across the image sequence, such noise among the foreground points is reassigned to the background.
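As an illustrative aid (not part of the patent text), a minimal Python sketch of step (1) follows. Stock OpenCV does not include ViBe, so the built-in MOG2 background subtractor is used as a stand-in background model, and the flicker-noise filtering is approximated by keeping only pixels that stay in the foreground over several consecutive frames; the function name, the stable_frames window and all parameter values are assumptions for illustration.

    import cv2
    import numpy as np

    def foreground_masks(frames, stable_frames=3):
        """Step (1) sketch: foreground/background separation with flicker suppression.

        frames: list of grayscale (H, W) uint8 images.
        Returns one binary (0/1) foreground mask per frame.
        """
        # MOG2 stands in for ViBe here (ViBe is not shipped with OpenCV).
        subtractor = cv2.createBackgroundSubtractorMOG2(
            history=100, varThreshold=16, detectShadows=False)
        raw, filtered = [], []
        for frame in frames:
            mask = subtractor.apply(frame)          # 0 = background, 255 = foreground
            raw.append((mask > 0).astype(np.uint8))
            if len(raw) >= stable_frames:
                # ANDing the last few masks pushes one-frame flicker
                # (illumination, vegetation, water surface) back to the background
                stable = raw[-stable_frames]
                for m in raw[-stable_frames + 1:]:
                    stable = stable & m
                filtered.append(stable)
            else:
                filtered.append(raw[-1])
        return filtered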
Step (2): calculation of the subspace projection.
2.1 Extract a three-dimensional space-time segment from the foreground result and map it onto a two-dimensional spatial plane:
[formula image BDA0002386522190000021: p(x, y) is obtained by projecting f(x, y, t) along the time axis t]
where p(x, y) denotes the pixel value at position (x, y) in the projection image, and f(x, y, t) denotes the pixel value at position (x, y) in the t-th image of the extracted sequence;
2.2 Apply morphological processing to the projected two-dimensional plane image to expand the small target motion track in the plane;
2.3 Compute the size of each connected region, remove invalid clutter interference and obtain the projection region of the motion track;
[formula image BDA0002386522190000022: connected-region size criterion relating the diagonal length L to w, v and t']
where w denotes the target width, v the target motion speed, t' the time length of the three-dimensional space-time sequence, and L the diagonal length of the connected region;
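The following Python sketch illustrates step (2) (again, not part of the patent text). The exact projection formula and the connected-region size criterion appear only as formula images in the original document, so two plausible choices are assumed here: the projection takes the maximum of the binary foreground along the time axis, and a connected region is kept when its bounding-box diagonal L is close to w + v·t' (target width plus the distance travelled over the clip). The function name, the max-projection, the diagonal criterion and the tolerance tol are all assumptions.

    import cv2
    import numpy as np

    def project_and_filter(masks, target_width, target_speed, tol=0.5):
        """Step (2) sketch: 2-D subspace projection of a 3-D space-time clip.

        masks: list of binary (H, W) foreground masks forming one clip.
        Returns a mask of candidate motion-track regions on the projection plane.
        """
        clip_len = len(masks)
        # assumed projection: p(x, y) = max over t of f(x, y, t)
        proj = (np.max(np.stack(masks, axis=0), axis=0) > 0).astype(np.uint8) * 255

        # morphological dilation expands the thin motion track in the plane (2.2)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        proj = cv2.dilate(proj, kernel)

        # connected-region analysis (2.3): keep regions whose diagonal matches
        # the expected track length, discard irregular clutter blobs
        expected = target_width + target_speed * clip_len
        n, labels, stats, _ = cv2.connectedComponentsWithStats(proj)
        valid = np.zeros_like(proj)
        for i in range(1, n):                       # label 0 is the background
            w = stats[i, cv2.CC_STAT_WIDTH]
            h = stats[i, cv2.CC_STAT_HEIGHT]
            if abs(np.hypot(w, h) - expected) <= tol * expected:
                valid[labels == i] = 255
        return valid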
Step (3): local three-dimensional space-time track backtracking.
3.1 Gate the current frame with the effective connected region of the two-dimensional projection plane;
3.2 Compute the center-point position of the effective moving target in the current frame of the three-dimensional space-time sequence:
X_c = mean(X_f)
where X_c denotes the center-point position of the effective moving target in the current frame image, and mean(X_f) denotes the mean position of the connected region containing the moving target in the current frame image;
3.3 Compute the center-point positions of the effective motion over the space-time sequence and reconstruct the motion point traces in the three-dimensional space-time sequence;
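A sketch of step (3), under the same caveat: each frame's foreground is gated by the valid projection region, and the center X_c of the remaining foreground pixels is taken as the point trace of that frame. Returning one (x, y, t) center per frame and the helper name are illustrative assumptions.

    import numpy as np

    def backtrack_centroids(masks, valid_region):
        """Step (3) sketch: local 3-D space-time track backtracking.

        masks: per-frame binary foreground masks of the clip.
        valid_region: candidate-track mask from the projection plane.
        Returns one (x, y, t) point trace per frame, or None if the frame
        contains no valid foreground.
        """
        traces = []
        gate = valid_region > 0
        for t, mask in enumerate(masks):
            ys, xs = np.nonzero((mask > 0) & gate)   # X_f: gated foreground pixels
            if xs.size == 0:
                traces.append(None)                  # no effective point trace here
            else:
                # X_c = mean(X_f): center of the valid moving-target pixels
                traces.append((float(xs.mean()), float(ys.mean()), t))
        return traces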
Step (4): effective track detection.
4.1 Ratio of effective moving-target point traces in the track:
a ratio threshold is set for the target's effective motion point traces in the space-time sequence; tracks whose ratio exceeds the threshold are preliminarily selected as effective tracks;
4.2 Compute the inter-frame relative displacement within the track:
S_k = sqrt((X_k - X_{k-1})^2 + (Y_k - Y_{k-1})^2)
where S_k denotes the relative distance of the moving target in the current frame, (X_k, Y_k) the center-point coordinates of the moving target in the current frame, and (X_{k-1}, Y_{k-1}) the center-point coordinates of the moving target in the previous frame;
according to the actual motion law of the object, its motion is approximately linear over a short time interval, so the direction and magnitude of the track displacement in the current frame should be consistent with the overall motion trend; the preliminary selection result of 4.1 is further screened on this basis;
4.3 Compute the short-term cumulative displacement within the track:
the displacements of the moving target are accumulated over every frame of the current space-time sequence:
S = ∑_k S_k
where S denotes the cumulative displacement of the moving target;
a cumulative displacement threshold is set, and the selection result of 4.2 is further screened: tracks whose cumulative displacement exceeds the threshold are kept, invalid motion point traces are eliminated, and the effective motion point trace set is obtained;
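Step (4) reduces to three scalar tests on the sequence of point traces from step (3): the fraction of frames that contain a trace (4.1), the short-time consistency of the inter-frame displacement (4.2), and the cumulative displacement S = ∑ S_k (4.3). A sketch follows; the threshold values ratio_min, angle_max_deg and s_min, and the use of an angle test for direction consistency, are assumptions, since the patent only states that thresholds are set.

    import numpy as np

    def is_effective_track(traces, ratio_min=0.6, angle_max_deg=45.0, s_min=20.0):
        """Step (4) sketch: keep a candidate track only if it passes three tests."""
        valid = [p for p in traces if p is not None]
        # 4.1: ratio of frames with an effective point trace
        if not traces or len(valid) / len(traces) < ratio_min:
            return False

        pts = np.array([(x, y) for x, y, _ in valid], dtype=float)
        steps = np.diff(pts, axis=0)                 # inter-frame displacement vectors
        s_k = np.hypot(steps[:, 0], steps[:, 1])     # S_k for each frame pair

        # 4.3: cumulative displacement S = sum of S_k must be large enough
        if s_k.sum() < s_min:
            return False

        # 4.2: consecutive displacement vectors should stay roughly aligned
        # (near-linear motion over a short time interval)
        for a, b in zip(steps[:-1], steps[1:]):
            na, nb = np.linalg.norm(a), np.linalg.norm(b)
            if na == 0 or nb == 0:
                continue
            cos = np.clip(np.dot(a, b) / (na * nb), -1.0, 1.0)
            if np.degrees(np.arccos(cos)) > angle_max_deg:
                return False
        return True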
Step (5): cluster the effective motion point trace set and obtain continuous effective motion tracks from the clustering results.
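The patent does not name the clustering algorithm used in step (5); as one reasonable choice, the sketch below groups the surviving (x, y, t) point traces with DBSCAN from scikit-learn, so that traces close in both space and time fall onto the same continuous track. The eps, min_samples and time_scale values are assumptions.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_traces(traces, eps=10.0, min_samples=3, time_scale=2.0):
        """Step (5) sketch: cluster the effective point-trace set into continuous tracks.

        traces: list of (x, y, t) tuples that survived step (4) (None entries removed).
        Returns a list of tracks, each a time-ordered list of (x, y, t) traces.
        """
        if not traces:
            return []
        # scale the time axis so spatial and temporal proximity are comparable
        data = np.array([(x, y, t * time_scale) for x, y, t in traces], dtype=float)
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(data)
        tracks = {}
        for trace, lab in zip(traces, labels):
            if lab >= 0:                              # label -1 marks noise traces
                tracks.setdefault(lab, []).append(trace)
        return [sorted(trk, key=lambda p: p[2]) for trk in tracks.values()]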
The invention has the following beneficial effects:
For target tracking in compound-interference environments, the method performs well not only on static clutter, which existing methods already handle, but also on dense dynamic clutter. Compared with the Hough transform and particle filtering methods, the proposed method greatly improves the speed of target tracking in image sequences under dense moving-clutter interference. Compared with the dynamic programming method, whose weaknesses show up in the target-expansion problem and in track-interruption situations, the method resolves these cases well and achieves fast, accurate and continuous detection of the target motion track under compound interference.
Drawings
Figure 1 is a flow chart of the subspace projection TBD method.
Figure 2 shows the subspace projection of a moving target trajectory.
Detailed Description
The present invention is described in further detail below with reference to a specific example.
In the experiment, a group of sonar image sequences is used as the tracking/detection input. As shown in fig. 1, the specific steps of the subspace-projection-based track-before-detect task for small moving targets in a complex scene are as follows:
and (1) carrying out foreground background detection on the original image sequence.
1.1 the separation of foreground and background of continuous image sequence is realized by using the Vibe method.
And processing the continuous image sequence by a Vibe method to obtain a binary image with separated foreground and background.
And 1.2, filtering the foreground and background separation result.
And (3) generating flickering noise points on the Vibe result by factors such as illumination conditions in the picture sequence and flowers, trees, water surface and the like in the image, and summarizing the noise in the foreground point to the background through comparison calculation of the Vibe result of the image sequence.
Step (2): calculate the subspace projection.
2.1 Extract a three-dimensional space-time segment from the foreground result and map it onto a two-dimensional spatial plane:
[formula image BDA0002386522190000041: p(x, y) is obtained by projecting f(x, y, t) along the time axis t]
where p(x, y) denotes the pixel value at position (x, y) in the projection image, and f(x, y, t) denotes the pixel value at position (x, y) in the t-th image of the extracted sequence.
2.2 Apply morphological processing and connected-region computation to the projected two-dimensional plane image to expand the small target motion track in the plane.
2.3 Compute the size of each connected region, remove invalid clutter interference and obtain the projection region of the motion track.
[formula image BDA0002386522190000042: connected-region size criterion relating the diagonal length L to w, v and t']
where w denotes the target width, v the target motion speed, t' the time length of the three-dimensional space-time sequence, and L the diagonal length of the connected region, as shown in fig. 2;
and (3) backtracking the local three-dimensional space-time flight path.
And 3.1, filtering the current frame by utilizing an effective communication area of the two-dimensional space projection plane.
3.2 calculating the effective moving target central point position of the current frame picture in the three-dimensional space-time sequence.
Xc=mean(Xf)
Wherein XcRepresenting the position of the center point of the effective moving object in the current frame image, mean (X)f) And the position mean value of the connected region where the moving target is located in the current frame image is represented.
3.3 Compute the effective motion positions over the space-time sequence and reconstruct the motion point traces in the three-dimensional space-time sequence.
Step (4): detect effective tracks.
4.1 Ratio of effective moving-target point traces in the track.
An effective track in the space-time sequence must have a sufficiently high ratio of the target's effective motion point traces within the current space-time sequence.
4.2 Inter-frame relative displacement within the track.
According to the actual motion law of the object, its motion is approximately linear over a short time interval, so the direction and magnitude of the track displacement should remain nearly the same from frame to frame.
S_k = sqrt((X_k - X_{k-1})^2 + (Y_k - Y_{k-1})^2)
where S_k denotes the relative distance of the moving target in the current frame, (X_k, Y_k) the center-point coordinates of the moving target in the current frame, and (X_{k-1}, Y_{k-1}) the center-point coordinates of the moving target in the previous frame.
4.3 Short-term cumulative displacement within the track.
The displacements of the moving target are accumulated over every frame of the current space-time sequence:
S = ∑_k S_k
where S denotes the cumulative displacement of the moving target.
4.4 Effective motion point traces must satisfy the target's effective-point-trace ratio, effective displacement and cumulative displacement criteria; invalid motion point traces are eliminated.
Step (5): cluster the effective motion point trace set and obtain continuous effective motion tracks from the clustering results.
The above embodiment does not limit the present invention; any embodiment that meets the requirements of the present invention falls within its scope.

Claims (2)

1. A method for tracking small moving targets in a complex scene before detection based on subspace projection, characterized by comprising the following steps: first, the three-dimensional space-time image sequence is projected onto a two-dimensional subspace plane, and most irregular clutter points are excluded according to the morphological difference between the motion track and typical clutter tracks on the projection plane; then, local three-dimensional space-time track backtracking is performed on the filtered candidate track regions, effective candidate point traces are screened and their center points are calculated; finally, the continuity and regularity of the track are evaluated for the effective motion point traces among the candidates, interference point traces are deleted, and the effective target motion track is extracted.
2. The method for tracking small moving targets in a complex scene before detection based on subspace projection according to claim 1, characterized in that the method comprises the following steps:
Step (1): perform foreground/background detection on the original image sequence;
1.1 separate the foreground and background of the continuous image sequence using the ViBe (visual background extraction) method;
after the continuous image sequence is processed by the ViBe method, a binary image with the foreground separated from the background is obtained;
1.2 filter the foreground/background separation result;
by comparing ViBe results across the image sequence, noise among the foreground points is reassigned to the background;
Step (2): calculation of the subspace projection;
2.1 extract a three-dimensional space-time segment from the foreground result and map it onto a two-dimensional spatial plane:
[formula image FDA0002386522180000011: p(x, y) is obtained by projecting f(x, y, t) along the time axis t]
where p(x, y) denotes the pixel value at position (x, y) in the projection image, and f(x, y, t) denotes the pixel value at position (x, y) in the t-th image of the extracted sequence;
2.2 apply morphological processing to the projected two-dimensional plane image to expand the small target motion track in the plane;
2.3 compute the size of each connected region, remove invalid clutter interference and obtain the projection region of the motion track;
[formula image FDA0002386522180000012: connected-region size criterion relating the diagonal length L to w, v and t']
where w denotes the target width, v the target motion speed, t' the time length of the three-dimensional space-time sequence, and L the diagonal length of the connected region;
Step (3): local three-dimensional space-time track backtracking;
3.1 gate the current frame with the effective connected region of the two-dimensional projection plane;
3.2 compute the center-point position of the effective moving target in the current frame of the three-dimensional space-time sequence:
X_c = mean(X_f)
where X_c denotes the center-point position of the effective moving target in the current frame image, and mean(X_f) denotes the mean position of the connected region containing the moving target in the current frame image;
3.3 compute the center-point positions of the effective motion over the space-time sequence and reconstruct the motion point traces in the three-dimensional space-time sequence;
Step (4): effective track detection;
4.1 ratio of effective moving-target point traces in the track:
a ratio threshold is set for the target's effective motion point traces in the space-time sequence; tracks whose ratio exceeds the threshold are preliminarily selected as effective tracks;
4.2 compute the inter-frame relative displacement within the track:
S_k = sqrt((X_k - X_{k-1})^2 + (Y_k - Y_{k-1})^2)
where S_k denotes the relative distance of the moving target in the current frame, (X_k, Y_k) the center-point coordinates of the moving target in the current frame, and (X_{k-1}, Y_{k-1}) the center-point coordinates of the moving target in the previous frame;
according to the actual motion law of the object, its motion is approximately linear over a short time interval, so the direction and magnitude of the track displacement in the current frame should be consistent with the overall motion trend; the preliminary selection result of 4.1 is further screened on this basis;
4.3 compute the short-term cumulative displacement within the track:
the displacements of the moving target are accumulated over every frame of the current space-time sequence;
S = ∑_k S_k
where S denotes the cumulative displacement of the moving target;
a cumulative displacement threshold is set, and the selection result of 4.2 is further screened: tracks whose cumulative displacement exceeds the threshold are kept, invalid motion point traces are eliminated, and the effective motion point trace set is obtained;
Step (5): cluster the effective motion point trace set and obtain continuous effective motion tracks from the clustering results.
CN202010099912.4A 2020-02-18 2020-02-18 Tracking method before complex scene moving small target detection based on subspace projection Active CN111354012B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010099912.4A CN111354012B (en) 2020-02-18 2020-02-18 Tracking method before complex scene moving small target detection based on subspace projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010099912.4A CN111354012B (en) 2020-02-18 2020-02-18 Tracking method before complex scene moving small target detection based on subspace projection

Publications (2)

Publication Number Publication Date
CN111354012A true CN111354012A (en) 2020-06-30
CN111354012B CN111354012B (en) 2023-03-28

Family

ID=71194094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010099912.4A Active CN111354012B (en) 2020-02-18 2020-02-18 Tracking method before complex scene moving small target detection based on subspace projection

Country Status (1)

Country Link
CN (1) CN111354012B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150378361A1 (en) * 2014-06-30 2015-12-31 Collin Walker Systems and methods for controlling vehicle position and orientation
JP2017010224A (en) * 2015-06-19 2017-01-12 キヤノン株式会社 Object tracking apparatus, object tracking method, and program
US20190137604A1 (en) * 2017-11-09 2019-05-09 Vadum, Inc. Target Identification and Clutter Mitigation in High Resolution Radar Systems
CN108198207A (en) * 2017-12-22 2018-06-22 湖南源信光电科技股份有限公司 Multiple mobile object tracking based on improved Vibe models and BP neural network
CN110197472A (en) * 2018-02-26 2019-09-03 四川省人民医院 A kind of method and system for ultrasonic contrast image stabilization quantitative analysis

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
钟雷等 (Zhong Lei et al.): "Weak radar target detection based on DP-TBD under unknown strong clutter" *
陈华杰等 (Chen Huajie et al.): "Structural design and implementation of a SAR/GMTI moving target detection software platform" *

Also Published As

Publication number Publication date
CN111354012B (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN106846359B (en) Moving target rapid detection method based on video sequence
WO2021012757A1 (en) Real-time target detection and tracking method based on panoramic multichannel 4k video images
CN108304808A (en) A kind of monitor video method for checking object based on space time information Yu depth network
US7995800B2 (en) System and method for motion detection and the use thereof in video coding
CN104835147A (en) Method for detecting crowded people flow in real time based on three-dimensional depth map data
CN110930411B (en) Human body segmentation method and system based on depth camera
Liang et al. Aviation video moving-target detection with inter-frame difference
Brahme et al. An implementation of moving object detection, tracking and counting objects for traffic surveillance system
CN113763427A (en) Multi-target tracking method based on coarse-fine shielding processing
CN109063630B (en) Rapid vehicle detection method based on separable convolution technology and frame difference compensation strategy
Al-Ariny et al. An efficient vehicle counting method using mask r-cnn
Najeeb et al. Tracking ball in soccer game video using extended Kalman filter
CN111354012B (en) Tracking method before complex scene moving small target detection based on subspace projection
Kiratiratanapruk et al. Vehicle detection and tracking for traffic monitoring system
Almomani et al. Segtrack: A novel tracking system with improved object segmentation
CN114463800A (en) Multi-scale feature fusion face detection and segmentation method based on generalized intersection-parallel ratio
Russell et al. Vehicle detection based on color analysis
Algethami et al. Combining Accumulated Frame Differencing and Corner Detection for Motion Detection.
Zeppelzauer et al. A novel trajectory clustering approach for motion segmentation
Chandrasekhar et al. A survey of techniques for background subtraction and traffic analysis on surveillance video
Li et al. Effective moving objects detection based on clustering background model for video surveillance
Malavika et al. Moving object detection and velocity estimation using MATLAB
Suresh et al. A survey on occlusion detection
Kalith et al. Video Scene Segmentation: A Novel Method to Determine Objects
Zhen et al. Design of moving object detection algorithm based on computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant