CN111105444A - Continuous tracking method suitable for underwater robot target grabbing - Google Patents
Continuous tracking method suitable for underwater robot target grabbing
- Publication number
- CN111105444A (application CN201911408004.2A)
- Authority
- CN
- China
- Prior art keywords
- target
- tracking
- sample
- cyclic
- kernel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a continuous tracking method suitable for underwater robot target grabbing, belonging to the technical field of visual target tracking. The method comprises: establishing a kernel correlation motion model; extracting HOG features; establishing an appearance model; and updating and iterating the model. Based on kernel correlation filtering theory, the invention realizes dense sampling through cyclic shifts of the base sample so as to extract a richer training sample set; it establishes an appearance model of the tracked target by extracting HOG features and introduces a kernel function to solve the nonlinear regression problem and improve computational efficiency; and it judges whether tracking needs to be reinitialized according to the feedback result via a proposed self-discrimination mechanism based on system confidence, realizing continuous tracking of the target. The invention not only ensures stable tracking of the underwater target but also automatically detects occlusion and mistracking, re-identifies the target, and thus completes continuous tracking and reliable grabbing of the underwater target.
Description
Technical Field
The invention relates to a continuous tracking method suitable for underwater robot target grabbing, and belongs to the technical field of visual target tracking.
Background
Conventionally, marine organisms such as sea cucumbers, scallops and sea urchins have been harvested manually. Manual fishing is limited by diving time, diving depth, working conditions and fishing cost, and divers are prone to occupational diseases; underwater robots can reduce fishing cost and improve operational safety. The field of computer vision has developed rapidly over the last two decades: tracking methods have evolved from the earliest mean-shift and subspace algorithms, through sparse-representation theory, to today's mainstream correlation filtering and deep learning algorithms. Although target tracking algorithms have made great progress, challenges such as motion blur and occlusion remain.
Visual target tracking is one of the important research directions in computer vision and a key link in realizing autonomous target-grabbing operation by an underwater robot; it integrates advanced techniques and core ideas from pattern recognition, image processing, computing and related fields. Target tracking by the underwater robot builds on target recognition: given the recognition result in the first frame, it outputs the precise position coordinates of the determined target in each subsequent frame of a continuous video sequence. Only continuous, stable tracking of the underwater target can guarantee that the underwater robot achieves accurate and reliable grabbing. However, underwater target tracking often fails because the target resembles its surroundings or is occluded by the gripper or other parts of the robot, causing tracking and grabbing to fail.
The patent document "An underwater tracking method based on deep learning and monocular vision" (application number 201910474803.3) relates to a deep-learning-based underwater robot target tracking method, but deep-learning tracking is slow, whereas underwater robot grabbing control requires fast, high-frequency tracking feedback and in particular must handle occlusion of the target by the gripper; that patent therefore cannot support underwater robot target grabbing. The patent document "Method and system for detecting and tracking small underwater moving targets" (application number 201910595413.1) relates to tracking small underwater targets, but it mainly detects targets with underwater sonar, which differs greatly from visual detection and tracking. The patent document "An image recognition and tracking system" (application number 201710424070.3) relates to an image recognition and target tracking system for an underwater robot that fuses multi-cue features of the target image and analyzes their correlation with the test image; although the method demands few system resources and has low time complexity, the test samples overlap heavily and are therefore redundant, so the increased sample count raises the computational load and cannot meet real-time requirements.
Disclosure of Invention
The invention aims to provide a continuous tracking method suitable for underwater robot target grabbing, so as to solve the current problem in underwater target tracking that tracking and grabbing fail because the target resembles its surroundings or is occluded by the gripper or other parts of the robot.
The object of the invention is achieved by a continuous tracking method suitable for underwater robot target grabbing, which specifically comprises the following steps:
step 1, establishing a kernel correlation motion model;
the establishment of the kernel correlation motion model requires that a KCF method based on correlation filtering extracts a richer training sample set in a dense sampling mode; dense sampling is realized by cyclic shift of base samples, and a sample cyclic matrix is generated in the sampling process:
the dense sampling is to perform cyclic displacement on the region of interest through the characteristic of a cyclic matrix to provide a training sample for a filter; according to the kernel function theorem, the cyclic characteristic is still kept after the sample cyclic matrix is mapped to the kernel space;
step 2, HOG characteristic extraction;
step 2.1, dividing the sample region into a number of cell units and extracting a 32-dimensional feature from each, namely 3 × nOrients + 5 with nOrients = 9 being the number of bins into which the gradient direction is divided: three features per direction (two direction-sensitive and one direction-insensitive), 4 apparent-texture features, and a final all-zero feature;
step 2.2, concatenating the 31 features extracted from each cell unit; assuming the cell division result is m × n, the f-HOG extraction result is m × n × 31, and these 31 dimensions are called channels;
step 2.3, generating samples by cyclic shifts of the cell units, where the displacement applied at the corresponding position of each channel is the same; the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the source image features;
step 3, establishing an appearance model;
step 3.1, training the kernel-correlation-filtering tracker with a regularized least squares classifier; a classifier f(z) is trained on the generated sample set to minimize the regularization risk:

min_w Σᵢ (f(xᵢ) − yᵢ)² + λ‖w‖²
where yᵢ represents the desired regression output for training sample xᵢ, the overall y is expected to follow a Gaussian distribution, and λ is a regularization parameter that prevents overfitting; the objective function is f(z) = wᵀz, and the minimizer of the above formula has a closed-form solution, which is the correlation filter template:
w = (XᵀX + λI)⁻¹Xᴴy
where I is the identity matrix;
step 3.2, for the nonlinear problem, a kernel function method is introduced; assuming φ(·) is a mapping from the low-dimensional input space to a high-dimensional space, the kernel function κ(x, x′) = ⟨φ(x), φ(x′)⟩ represents the cross-correlation in the high-dimensional space, and the filter template coefficients w of the above formula are a linear combination of the sample set generated by cyclic shifts, i.e. w = Σᵢ αᵢφ(xᵢ).
This converts the problem of solving w into solving for the kernel regularized least squares classifier solution α = [α₀, …, α_{n−1}]ᵀ:
α = (K + λI)⁻¹y
where K is the kernel matrix with elements Kᵢⱼ = κ(Pⁱx, Pʲx); if the kernel function satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is a circulant matrix;
the above equation is transformed into a frequency domain calculation according to the properties of the circulant matrix:
wherein k isxxIs the first row K ═ C (K) of the kernel matrix Kxx) Namely, the training process of the filter only needs to calculate the autocorrelation of the base sample in the high-dimensional space;
in the test, K is usedZRepresenting a kernel matrix between a training sample and a test sample, wherein the training sample is generated by the cyclic shift of a training base sample, and the test sample is generated by the cyclic shift of a candidate base sample; finally, the response of each test sample is obtained:
finding the position of the maximum f (Z) is the tracking target;
step 4, updating and iterating the model;
aiming at the continuous-tracking requirement, a self-discrimination mechanism based on system confidence is proposed to judge whether target similarity or occlusion has caused mistracking; the self-discrimination mechanism is as follows:
where δ is an extremely small number preventing AS → ∞, and f_max, f_ave and f_xy denote, respectively, the maximum value and the average value of the current frame's response and the response value of the element in row x, column y; for a sharp, low-noise response map the AS index becomes large and the tracked target is considered accurate, otherwise the AS index drops markedly; when AS ≤ ζ₁·AS_ave, the current frame is judged to be occluded and the tracker is reinitialized by the detector; the mechanism judges whether the target is occluded and, when it is, stops model updates to improve tracking robustness.
Compared with the prior art, the invention has the following beneficial effects: the invention designs a continuous tracking method suitable for underwater robot target grabbing which, based on kernel correlation filtering theory, realizes dense sampling through cyclic shifts of the base sample so as to extract a richer training sample set, establishes an appearance model of the tracked target by extracting HOG features, and updates and iterates the model to realize continuous tracking of the target. The invention not only ensures stable tracking of the underwater target but also automatically detects occlusion and mistracking, re-identifies the target, and thus completes continuous tracking and reliable grabbing of the underwater target.
Drawings
FIG. 1 is a tracking flow diagram;
FIG. 2 is a mistracking diagram;
fig. 3 is a tracking grab diagram.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
A continuous tracking method suitable for underwater robot target grabbing mainly comprises: establishing a kernel correlation motion model, extracting HOG features, establishing an appearance model, and updating and iterating the model. Establishing the kernel correlation motion model requires the correlation-filtering-based KCF method to extract a richer training sample set by dense sampling. Dense sampling is achieved by cyclic shifts of the base sample, and a sample circulant matrix is generated in the sampling process.
Dense sampling cyclically shifts the region of interest using the properties of the circulant matrix to provide training samples for the filter. According to the kernel function theorem, the circulant property is preserved after the sample circulant matrix is mapped to the kernel space.
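The cyclic-shift construction can be sketched numerically. The following minimal illustration (our own, not part of the patent) builds the sample circulant matrix from a 1-D base sample and shows the FFT identity that makes dense sampling cheap:

```python
import numpy as np

def circulant_samples(base):
    # Row i is the base sample cyclically shifted by i positions;
    # stacking all n shifts yields the dense-sampling circulant matrix.
    n = len(base)
    return np.stack([np.roll(base, i) for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 4.0])
X = circulant_samples(x)

# A circulant matrix acts by cross-correlation, so X @ v can be computed
# in O(n log n) via the FFT instead of forming X explicitly:
v = np.array([1.0, 0.0, 0.0, 0.0])
fast = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(v)).real
```

This FFT diagonalization is exactly why the circulant sample set never has to be materialized during training.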
For HOG feature extraction, the image is divided into densely gridded cells of uniform size. Within an image, the appearance and shape of local objects can be described by the distribution of gradient directions. To reduce illumination effects, the orientation histograms of these cells are contrast-normalized over a larger image region. Finally, all cell histograms are concatenated to form the image features. The sample region is divided into a number of cells, and a 32-dimensional feature is extracted from each, namely 3 × nOrients + 5 with nOrients = 9 being the number of bins into which the gradient direction is divided: three features per direction (two direction-sensitive and one direction-insensitive), 4 apparent-texture features, and a final all-zero feature. Assuming the cell division result is m × n, the f-HOG extraction result is m × n × 31, and these 31 dimensions are called channels. After feature extraction the two-dimensional image matrix becomes an array of cell units; samples are generated by cyclic shifts of the cell units, each channel being shifted by the same displacement at its corresponding position, so the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the source image features.
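The per-channel cyclic shift can be illustrated with a small sketch of ours (a randomly filled stand-in replaces a real fHOG map): every channel of an m × n × 31 feature map is rolled by the same displacement.

```python
import numpy as np

def shift_feature_map(feat, dy, dx):
    # Roll every channel of an (m, n, c) feature map by the same (dy, dx):
    # a shifted sample's i-th channel is the cyclic shift of the base
    # sample's i-th channel, as described for cell-unit cyclic shifts.
    return np.roll(feat, shift=(dy, dx), axis=(0, 1))

# Stand-in for a 4 x 4 cell grid with 31 fHOG channels.
feat = np.random.default_rng(0).random((4, 4, 31))
sample = shift_feature_map(feat, 1, 2)
```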
An appearance model is established and the kernel-correlation-filtering tracker is trained with a regularized least squares classifier: a classifier f(z) is trained on the generated sample set to minimize the regularization risk, where yᵢ represents the desired regression output for the corresponding training sample, the overall y is expected to follow a Gaussian distribution, and λ is a regularization parameter preventing overfitting. The objective function is f(z) = wᵀz.
The minimizer of the above formula has a closed-form solution, which is the correlation filter template:
w = (XᵀX + λI)⁻¹Xᴴy (3)
wherein I is a unit matrix.
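As a sanity check on equation (3), the closed-form ridge solution can be computed directly with dense linear algebra (a sketch of ours; for real-valued features Xᴴ reduces to Xᵀ):

```python
import numpy as np

def ridge_template(X, y, lam=1e-2):
    # Closed-form regularized least squares: w = (X^T X + lam*I)^{-1} X^T y.
    # Solving the linear system is preferred over an explicit inverse.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)
```

With a tiny λ and well-conditioned data generated from known weights, the recovered w matches the generating weights, confirming the closed form.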
For the nonlinear problem, we introduce a kernel function method. Assuming φ(·) is a mapping from the low-dimensional input space to a high-dimensional space, the kernel function κ(x, x′) = ⟨φ(x), φ(x′)⟩ represents the cross-correlation in the high-dimensional space. The filter template coefficients w of the above formula are a linear combination of the sample set generated by cyclic shifts, i.e. w = Σᵢ αᵢφ(xᵢ).
This converts the problem of solving w into solving for the kernel regularized least squares classifier solution α = [α₀, …, α_{n−1}]ᵀ:
α = (K + λI)⁻¹y (4)
where K is the kernel matrix with elements Kᵢⱼ = κ(Pⁱx, Pʲx). If the kernel function satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is a circulant matrix.
The above equation can be transformed into a frequency-domain calculation according to the properties of circulant matrices:

α̂ = ŷ / (k̂ˣˣ + λ)

where kˣˣ is the first row of the kernel matrix K = C(kˣˣ); that is, training the filter only requires computing the autocorrelation of the base sample in the high-dimensional space.
In testing, Kᶻ denotes the kernel matrix between the training samples and the test samples. The training samples are generated by cyclic shifts of the training base sample, and the test samples by cyclic shifts of the candidate base sample. Finally, the response of every test sample is obtained as f(z) = F⁻¹(k̂ˣᶻ ⊙ α̂), and the position of the maximum of f(z) is the tracked target.
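The frequency-domain training and detection steps can be sketched for the linear-kernel case, where kˣˣ is simply the autocorrelation of the base sample (function names and the 1-D setting are our simplification, not the patent's implementation):

```python
import numpy as np

def train_alpha(x, y, lam=1e-2):
    # Circulant structure turns (K + lam*I)^{-1} y into an element-wise
    # division in the Fourier domain: alpha_hat = y_hat / (kxx_hat + lam).
    xf = np.fft.fft(x)
    kxx_hat = np.conj(xf) * xf          # FFT of the base-sample autocorrelation
    return np.fft.fft(y) / (kxx_hat + lam)

def detect(alpha_hat, x, z):
    # Response over all cyclic shifts of the candidate base sample z;
    # the argmax of the response is the predicted target shift.
    kxz_hat = np.conj(np.fft.fft(x)) * np.fft.fft(z)
    return np.real(np.fft.ifft(kxz_hat * alpha_hat))
```

Training on a base sample with a desired response peaked at zero shift, then detecting on a cyclically shifted copy, places the response peak at the shift amount.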
For the model update and iteration strategy, a self-discrimination mechanism based on system confidence is proposed for the continuous-tracking requirement, to judge whether target similarity or occlusion has caused mistracking. The self-discrimination mechanism is as follows:
where δ is an extremely small number, preventing AS → ∞; f. ofmax,faveAnd fxyRespectively representing the maximum value and the average value of the response of the current frame and the response values of the elements of the x-th row and the y-th column. For a sharp and low-noise response diagram, the AS index becomes larger, and the tracked target is considered to be accurate at the moment; otherwise, the AS index is reduced obviously. When AS is less than or equal to zeta1ASaveWhen the current frame is blocked, the tracker is reinitialized by the detector. The mechanism judges whether the target is shielded or not, and when the target is shielded, updating is stopped so as to improve the tracking robustness.
Fig. 1 is the tracking flow chart. First, the observation model is initialized at the first frame (either manually calibrated or given by the detection algorithm). The tracking box is expanded by a factor of 2.5 to form a search box, and a feature vector (matrix) is extracted within the search box as the base sample. The base sample is cyclically shifted to generate a large number of virtual samples as training samples, and the parameter vector α is then obtained by fast computation with the formula above. Features extracted from the predicted target region of the previous frame form the base sample; cyclically shifting it generates the test samples, the response of the trained classifier is computed to obtain the detection confidence map, and the position of maximum response is the most probable target location. A new search box is formed at the current position and features are extracted for the next input sample, so that training and detection alternate continuously to complete the tracking task.
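The flow of fig. 1 can be condensed into a toy 1-D tracking loop (our sketch with a linear kernel; the learning rate eta and λ are assumed values, and real use would operate on 2-D fHOG features):

```python
import numpy as np

def track(frames, init_pos, lam=1e-2, eta=0.02):
    # Train on the current base sample, detect the response peak in the
    # next frame, move to it, then blend the model with learning rate eta.
    n = len(frames[0])
    y = np.zeros(n)
    y[0] = 1.0                                       # desired response: peak at zero shift
    yf = np.fft.fft(y)
    xf = np.fft.fft(np.roll(frames[0], -init_pos))   # base sample centred on the target
    alpha_hat = yf / (np.conj(xf) * xf + lam)
    pos = init_pos
    for frame in frames[1:]:
        zf = np.fft.fft(np.roll(frame, -pos))        # candidate base sample at last position
        resp = np.real(np.fft.ifft(np.conj(xf) * zf * alpha_hat))
        pos = (pos + int(np.argmax(resp))) % n       # response peak gives the displacement
        xf_new = np.fft.fft(np.roll(frame, -pos))
        xf = (1 - eta) * xf + eta * xf_new           # linear-interpolation model update
        alpha_hat = (1 - eta) * alpha_hat + eta * (yf / (np.conj(xf_new) * xf_new + lam))
    return pos
```

Running this on a synthetic "target" (a bump drifting one pixel per frame) recovers the final position exactly, illustrating the train-detect-update cycle.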
As shown in fig. 2, the underwater robot moves through the pool, recognizes the scallop target, and then tracks and grabs it. The scallop target is first detected and identified by a small-target recognition algorithm and then tracked by the KCF-based tracking algorithm.
The mechanism judges whether to reinitialize tracking according to the feedback result, using the self-discrimination mechanism based on system confidence.
To further enhance the anti-occlusion behaviour of the algorithm, if over N consecutive frames the change in the pixel position of the target centre stays below a threshold, we believe tracking drift may have occurred and reinitialize the tracker; w and h are the width and height of the picture, respectively. Because the robot is always moving relative to the target during grabbing, if the centre of the target barely moves for N consecutive frames, the tracker has most likely locked onto the gripper or the robot body by mistake. The target recognition module is therefore called again to re-identify and re-track the target.
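The stationarity test described above can be sketched as follows; the patent does not give the threshold, so zeta and n here are assumed illustrative values:

```python
import numpy as np

def drift_suspected(centers, w, h, n=10, zeta=0.002):
    # If the target centre barely moves for n consecutive frames while the
    # robot is moving relative to the target, the tracker has probably
    # locked onto the gripper or the robot body; signal re-initialization.
    if len(centers) < n:
        return False
    recent = np.asarray(centers[-n:], dtype=float)
    # Total centre displacement over the last n frames, normalized by frame size.
    motion = np.abs(np.diff(recent, axis=0)) / np.array([w, h])
    return bool(motion.sum() < zeta * n)
```

A stationary centre trajectory trips the check while a steadily moving one does not, matching the intent of the re-identification trigger.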
Fig. 3 further shows the effect of the method on long-term continuous tracking. In the experiment the scallop target was tracked continuously for 128 seconds at 25 frames per second, i.e. 3200 frames, and the grab was completed. The target was occluded by the gripper during tracking; the method detected the occlusion, reinitialized target recognition, and resumed tracking, realizing continuous tracking of the target.
In summary, the invention provides a continuous tracking method suitable for underwater robot target grabbing which not only ensures stable tracking of the underwater target but also automatically detects occlusion and mistracking, re-identifies and re-tracks the target, and thus completes continuous tracking and reliable grabbing of the underwater target. Based on kernel correlation filtering theory, the invention realizes dense sampling through cyclic shifts of the base sample so as to extract a richer training sample set; establishes an appearance model of the tracked target by extracting HOG features and introducing a kernel function to solve the nonlinear regression problem and improve computational efficiency; and judges whether tracking needs to be reinitialized according to the feedback result via the proposed self-discrimination mechanism based on system confidence, realizing continuous tracking of the target.
Claims (1)
1. A continuous tracking method suitable for underwater robot target grabbing is characterized by comprising the following steps:
step 1, establishing a kernel correlation motion model;
the establishment of the kernel correlation motion model requires that a KCF method based on correlation filtering extracts a richer training sample set in a dense sampling mode; dense sampling is realized by cyclic shift of base samples, and a sample cyclic matrix is generated in the sampling process:
the dense sampling cyclically shifts the region of interest using the properties of the circulant matrix to provide training samples for the filter; according to the kernel function theorem, the circulant property is preserved after the sample circulant matrix is mapped to the kernel space;
step 2, HOG characteristic extraction;
step 2.1, dividing the sample region into a number of cell units and extracting a 32-dimensional feature from each, namely 3 × nOrients + 5 with nOrients = 9 being the number of bins into which the gradient direction is divided: three features per direction (two direction-sensitive and one direction-insensitive), 4 apparent-texture features, and a final all-zero feature;
step 2.2, concatenating the 31 features extracted from each cell unit; assuming the cell division result is m × n, the f-HOG extraction result is m × n × 31, and these 31 dimensions are called channels;
step 2.3, generating samples by cyclic shifts of the cell units, where the displacement applied at the corresponding position of each channel is the same; the i-th channel of every sample is obtained by cyclically shifting the i-th channel of the source image features;
step 3, establishing an appearance model;
step 3.1, training the kernel-correlation-filtering tracker with a regularized least squares classifier; a classifier f(z) is trained on the generated sample set to minimize the regularization risk:

min_w Σᵢ (f(xᵢ) − yᵢ)² + λ‖w‖²
where yᵢ represents the desired regression output for training sample xᵢ, the overall y is expected to follow a Gaussian distribution, and λ is a regularization parameter that prevents overfitting; the objective function is f(z) = wᵀz, and the minimizer of the above formula has a closed-form solution, which is the correlation filter template:
w = (XᵀX + λI)⁻¹Xᴴy
where I is the identity matrix;
step 3.2, for the nonlinear problem, a kernel function method is introduced; assuming φ(·) is a mapping from the low-dimensional input space to a high-dimensional space, the kernel function κ(x, x′) = ⟨φ(x), φ(x′)⟩ represents the cross-correlation in the high-dimensional space, and the filter template coefficients w of the above formula are a linear combination of the sample set generated by cyclic shifts, i.e. w = Σᵢ αᵢφ(xᵢ).
This converts the problem of solving w into solving for the kernel regularized least squares classifier solution α = [α₀, …, α_{n−1}]ᵀ:
α = (K + λI)⁻¹y
where K is the kernel matrix with elements Kᵢⱼ = κ(Pⁱx, Pʲx); if the kernel function satisfies κ(x, x′) = κ(Px, Px′), then the kernel matrix is a circulant matrix;
the above equation is transformed into a frequency-domain calculation according to the properties of circulant matrices:

α̂ = ŷ / (k̂ˣˣ + λ)

where kˣˣ is the first row of the kernel matrix K = C(kˣˣ); that is, training the filter only requires computing the autocorrelation of the base sample in the high-dimensional space;
in testing, Kᶻ denotes the kernel matrix between the training samples and the test samples, where the training samples are generated by cyclic shifts of the training base sample and the test samples by cyclic shifts of the candidate base sample; finally, the response of every test sample is obtained as

f(z) = F⁻¹(k̂ˣᶻ ⊙ α̂)

and the position of the maximum of f(z) is the tracked target;
step 4, updating and iterating the model;
aiming at the continuous-tracking requirement, a self-discrimination mechanism based on system confidence is proposed to judge whether target similarity or occlusion has caused mistracking; the self-discrimination mechanism is as follows:
where δ is an extremely small number preventing AS → ∞, and f_max, f_ave and f_xy denote, respectively, the maximum value and the average value of the current frame's response and the response value of the element in row x, column y; for a sharp, low-noise response map the AS index becomes large and the tracked target is considered accurate, otherwise the AS index drops markedly; when AS ≤ ζ₁·AS_ave, the current frame is judged to be occluded and the tracker is reinitialized by the detector; the mechanism judges whether the target is occluded and, when it is, stops model updates to improve tracking robustness.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911408004.2A CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911408004.2A CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111105444A true CN111105444A (en) | 2020-05-05 |
CN111105444B CN111105444B (en) | 2023-07-25 |
Family
ID=70424777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911408004.2A Active CN111105444B (en) | 2019-12-31 | 2019-12-31 | Continuous tracking method suitable for grabbing underwater robot target |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111105444B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112100067A (en) * | 2020-09-10 | 2020-12-18 | 北京完美赤金科技有限公司 | Test method, system and storage medium based on regression analysis |
CN114371784A (en) * | 2022-01-14 | 2022-04-19 | 天津大学 | Brain-computer interface decoding method for steady-state visual evoked potential |
CN114708307A (en) * | 2022-05-17 | 2022-07-05 | 北京航天晨信科技有限责任公司 | Target tracking method, system, storage medium and device based on correlation filter |
2019-12-31: CN application CN201911408004.2A filed; granted as patent CN111105444B (status: Active)
Patent Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006318350A (en) * | 2005-05-16 | 2006-11-24 | Sony Corp | Object tracking method, program for object tracking method, recording medium where program for object tracking method is recorded, and object tracking device |
US20070139269A1 (en) * | 2005-12-16 | 2007-06-21 | Lucent Technologies Inc. | System and method for model-free position estimation and tracking |
US20130050502A1 (en) * | 2010-02-19 | 2013-02-28 | Kabushiki Kaisha Toshiba | Moving object tracking system and moving object tracking method |
CN103246896A (en) * | 2013-05-24 | 2013-08-14 | Chengdu Fangmi Technology Co., Ltd. | Robust real-time vehicle detection and tracking method |
US20150286859A1 (en) * | 2014-04-03 | 2015-10-08 | Avago Technologies General Ip (Singapore) Pte.Ltd. | Image Processor Comprising Gesture Recognition System with Object Tracking Based on Calculated Features of Contours for Two or More Objects |
JP2016162096A (en) * | 2015-02-27 | 2016-09-05 | Secom Co., Ltd. | Mobile object tracking device |
US20170018090A1 (en) * | 2015-07-15 | 2017-01-19 | Thomson Licensing | Method and apparatus for object tracking in image sequences |
CN106557774A (en) * | 2015-09-29 | 2017-04-05 | Nanjing University of Information Science and Technology | Real-time tracking method based on multi-channel kernel correlation filtering |
US20190034700A1 (en) * | 2016-02-03 | 2019-01-31 | Goertek Inc. | Method and device for face tracking and smart terminal |
US9874931B1 (en) * | 2016-02-22 | 2018-01-23 | Rockwell Collins, Inc. | Head-tracking system and method |
US20180232891A1 (en) * | 2017-02-13 | 2018-08-16 | Electronics And Telecommunications Research Institute | System and method for tracking multiple objects |
JP2018139086A (en) * | 2017-02-24 | 2018-09-06 | Mitsubishi Electric Corp | Correlation tracking device, correlation tracking method and correlation tracking program |
CN107644430A (en) * | 2017-07-27 | 2018-01-30 | SUN Zhanli | Target tracking based on adaptive feature fusion |
US20190139232A1 (en) * | 2017-11-03 | 2019-05-09 | Beijing Deephi Technology Co., Ltd. | Target tracking hardware implementation system and method |
CN108010067A (en) * | 2017-12-25 | 2018-05-08 | Beihang University | Visual target tracking method based on a combined decision strategy |
CN108288062A (en) * | 2017-12-29 | 2018-07-17 | The 27th Research Institute of China Electronics Technology Group Corporation | Target tracking method based on kernel correlation filtering |
US20200327681A1 (en) * | 2017-12-29 | 2020-10-15 | Huawei Technologies Co., Ltd. | Target tracking method and apparatus |
CN108090919A (en) * | 2018-01-02 | 2018-05-29 | South China University of Technology | Improved kernel correlation filtering tracking method based on super-pixel optical flow and adaptive learning factor |
US20190221002A1 (en) * | 2018-01-18 | 2019-07-18 | Autel Robotics Co., Ltd. | Target detection method and unmanned aerial vehicle |
CN108573217A (en) * | 2018-03-21 | 2018-09-25 | Nanjing University of Posts and Telecommunications | Compressive tracking method combining local structure information |
CN108765458A (en) * | 2018-04-16 | 2018-11-06 | Shanghai University | Scale-adaptive correlation-filter tracking of sea-surface targets for unmanned surface vehicles in high sea states |
CN108830879A (en) * | 2018-05-29 | 2018-11-16 | Shanghai University | Correlation-filter sea-surface target tracking method for unmanned surface vehicles in occlusion scenes |
CN109308713A (en) * | 2018-08-02 | 2019-02-05 | Harbin Engineering University | Improved kernel correlation filtering underwater target tracking method based on forward-looking sonar |
CN109360225A (en) * | 2018-10-16 | 2019-02-19 | Beijing Institute of Technology | Motion model optimization system and method |
CN109685073A (en) * | 2018-12-28 | 2019-04-26 | Nanjing Institute of Technology | Scale-adaptive target tracking algorithm based on kernel correlation filtering |
CN110569723A (en) * | 2019-08-02 | 2019-12-13 | Xi'an Technological University | Target tracking method combining feature fusion and model updating |
CN110533690A (en) * | 2019-08-08 | 2019-12-03 | Hohai University | Kernel correlation filtering underwater target tracking method with feature fusion and learning-rate optimization |
Non-Patent Citations (8)
Title |
---|
ECHO_HAO: "Notes on the principles of the KCF algorithm (KCF算法原理笔记)", 《HTTPS://BLOG.CSDN.NET/QQ_37392244/ARTICLE/DETAILS/103240929》 *
JOÃO F. HENRIQUES et al.: "High-Speed Tracking with Kernelized Correlation Filters", 《ARXIV:1404.7584》 *
JUN LIU et al.: "Anti-occlusion object tracking based on correlation filter" *
ZHOU Hao et al.: "Vision-based target capture control for a suction-type underwater robot harvesting marine organisms", 《Robot (机器人)》 *
CHENG Keyang et al.: "Robust KCF pedestrian tracking method for complex scenes", 《Journal of Nanjing University of Aeronautics & Astronautics (南京航空航天大学学报)》 *
LU Jiahui: "Research on key technologies of vision-based augmented reality" *
CHEN Zhiwang et al.: "Research on an improved anti-occlusion, scale-adaptive KCF tracking algorithm", 《Control and Decision (控制与决策)》 *
WEI Xun: "Research on moving target tracking algorithms based on mean shift" *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112100067A (en) * | 2020-09-10 | 2020-12-18 | Beijing Perfect Chijin Technology Co., Ltd. | Test method, system and storage medium based on regression analysis |
CN112100067B (en) * | 2020-09-10 | 2023-08-25 | Beijing Perfect Chijin Technology Co., Ltd. | Regression analysis-based test method, system and storage medium |
CN114371784A (en) * | 2022-01-14 | 2022-04-19 | Tianjin University | Brain-computer interface decoding method for steady-state visual evoked potentials |
CN114371784B (en) * | 2022-01-14 | 2023-11-03 | Tianjin University | Brain-computer interface decoding method for steady-state visual evoked potentials |
CN114708307A (en) * | 2022-05-17 | 2022-07-05 | Beijing Aerospace Chenxin Technology Co., Ltd. | Target tracking method, system, storage medium and device based on correlation filter |
Also Published As
Publication number | Publication date |
---|---|
CN111105444B (en) | 2023-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Huang et al. | Faster R-CNN for marine organisms detection and recognition using data augmentation | |
CN110232350B (en) | Real-time water surface multi-moving-object detection and tracking method based on online learning | |
CN107909082B (en) | Sonar image target identification method based on deep learning technology | |
CN108154118B (en) | Target detection system and method based on adaptive combined filtering and multi-stage detection | |
US20230169623A1 (en) | Synthetic aperture radar (sar) image target detection method | |
CN108549873B (en) | Three-dimensional face recognition method and three-dimensional face recognition system | |
Tao et al. | Automatic apple recognition based on the fusion of color and 3D feature for robotic fruit picking | |
Wang et al. | Real-time underwater onboard vision sensing system for robotic gripping | |
CN112184752A (en) | Video target tracking method based on pyramid convolution | |
CN108109162B (en) | Multi-scale target tracking method using self-adaptive feature fusion | |
CN111105444B (en) | Continuous tracking method suitable for underwater robot target grabbing | |
CN112285712A (en) | Method for improving detection precision of ship on shore in SAR image | |
CN112149591B (en) | SSD-AEFF automatic bridge detection method and system for SAR image | |
CN110610210B (en) | Multi-target detection method | |
CN109242019B (en) | Rapid detection and tracking method for small optical targets on the water surface | |
CN112183675B (en) | Low-resolution target tracking method based on a Siamese network | |
CN113705331B (en) | SAR ship detection method based on quaternary feature pyramid network | |
CN113822352A (en) | Infrared dim target detection method based on multi-feature fusion | |
CN107808391B (en) | Video dynamic target extraction method based on feature selection and smooth representation clustering | |
CN115375737A (en) | Target tracking method and system based on adaptive time and serialized space-time characteristics | |
CN112489088A (en) | Siamese-network visual tracking method based on memory units | |
Wei et al. | Novel green-fruit detection algorithm based on D2D framework | |
Wu et al. | A cascaded CNN-based method for monocular vision robotic grasping | |
CN112509014B (en) | Robust interpolation optical flow computation method with pyramid occlusion-detection block matching | |
CN114120129A (en) | Three-dimensional identification method for landslide slip surface based on unmanned aerial vehicle image and deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||