CN111724405A - Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering - Google Patents

Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering

Info

Publication number
CN111724405A
CN111724405A · Application CN202010485883.5A
Authority
CN
China
Prior art keywords
target
area
tracking
foreground
image
Prior art date
Legal status
Pending
Application number
CN202010485883.5A
Other languages
Chinese (zh)
Inventor
刘向荣
彭惠民
毛勇
程文志
黄静
Current Assignee
Xiamen University
Original Assignee
Xiamen University
Priority date
Filing date
Publication date
Application filed by Xiamen University filed Critical Xiamen University
Priority to CN202010485883.5A priority Critical patent/CN111724405A/en
Publication of CN111724405A publication Critical patent/CN111724405A/en
Legal status: Pending

Classifications

    • G06T7/136 — Image analysis; segmentation; edge detection involving thresholding
    • G06T7/194 — Image analysis; segmentation; edge detection involving foreground-background segmentation
    • G06T7/215 — Image analysis; analysis of motion; motion-based segmentation
    • G06T7/277 — Image analysis; analysis of motion involving stochastic approaches, e.g. using Kalman filters

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A long-time multi-target prawn tracking method based on boundary-constrained Kalman filtering, in the field of computer applications. 1) Divide the water tank into several areas with partition plates, each area forming a target tracking area for one shrimp; assign a number to each area and frame-select the foreground in the first frame; 2) separate background and foreground with the OTSU algorithm, compute the average gray value of the background area, fill the foreground area with it, and establish a background model; 3) establish and initialize a Kalman filter for each observation area; 4) detect the target position using the background model; 5) check which tracking area each observation belongs to and assign a tracking ID to each observation vector; 6) predict the target and determine its tracking result; 7) update the Kalman filter and take the corrected state vector as the target state value of the current frame; 8) read the next frame of image and repeat steps 4) to 7) until all images are processed. Tracking precision is improved and interference noise is reduced.

Description

Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering
Technical Field
The invention relates to the field of computer application, in particular to a long-time multi-target prawn tracking method based on boundary constraint Kalman filtering.
Background
In recent years, target detection and target tracking in computer vision have developed rapidly and have been successfully applied in many fields, such as pedestrian detection, pedestrian tracking, defect detection, disaster warning, and intelligent traffic. Tracking the motion trajectories of aquatic organisms effectively extracts their movement patterns and helps researchers study the motion characteristics and biological activity of animals under different environments, thereby advancing marine biology. Designing a suitable algorithm for a specific aquatic organism therefore has important guiding significance.
Among common multi-target tracking techniques, IDtracker, for example, detects the foreground by dynamic modeling and then constrains the detected blobs: when a blob is larger than a certain threshold it is considered an overlap of several targets, single targets are separated according to bidirectional blob variation, and identities are assigned along path trajectories. This kind of technique has limitations here: prawns can move in high-speed bounces, causing target loss. In long-duration monitoring experiments, external environmental variables such as illumination and temperature must be controlled; water-surface fluctuation, changing illumination conditions, and the uncertainty of target motion all have a large influence, and general techniques such as IDtracker cannot track prawns accurately over long periods.
Disclosure of Invention
The invention aims to provide a long-time multi-target prawn tracking method based on boundary-constrained Kalman filtering that supports long-duration motion-behavior experiments with multiple prawns and adapts to external factors such as a complex water-surface environment and changing illumination conditions.
The invention comprises the following steps:
1) dividing the water tank used for the experiment into several areas with partition plates, each area forming a target tracking area for one shrimp, assigning a number to each area to be tracked, and then frame-selecting the foreground;
2) separating the background and the foreground by using an OTSU algorithm, calculating an average gray value of a background area, filling the foreground area with the average gray value and establishing a background model;
3) establishing a Kalman filter for each observation area selected in the step 1), and initializing the Kalman filter;
4) detecting the position of the target by using a background difference method based on the background model established in the step 2);
5) checking a tracking area to which each observation value belongs according to the condition defined by the boundary, and then allocating a tracked ID to each observation vector;
6) predicting a target and determining a tracking result of the target;
7) updating a Kalman filter, calculating Kalman filtering gain, and taking the corrected state vector as a target state value of the current frame;
8) reading the next frame of image, and repeating steps 4) to 7) until all images are processed.
In step 1), the specific steps of dividing the water tank into areas, assigning numbers, and frame-selecting the foreground may be: first, the water tank used for the experiment is divided into several areas with partition plates, and the movement of one prawn is observed in each area; before tracking, the tracking areas are drawn with the mouse over the first frame image, in a human-computer interaction manner, following the partitions of the tank, each area corresponding to one tracking number (1-N); then all prawn foreground regions are selected with the mouse as observation regions on the image.
In step 2), the OTSU algorithm is the Otsu (maximum between-class variance) method, and its steps are: for an image I(x, y), denote the foreground/background segmentation threshold by T; the proportion of foreground pixels in the whole image by w0 and the average gray of the foreground by u0; the proportion of background pixels in the whole image by w1 and the average gray of the background by u1; the average gray of the whole image by u; and the between-class variance by g. Assume the image size is M × N, the number of pixels with gray value smaller than T is N0, and the number of pixels with gray value greater than T is N1. Then:
(1) Calculate w0 according to the formula w0 = N0 / (M × N);
(2) Calculate w1 according to the formula w1 = N1 / (M × N);
(3) Calculate u according to the formula u = w0·u0 + w1·u1;
(4) Calculate g according to the formula g = w0·(u0 − u)² + w1·(u1 − u)²;
(5) Repeat steps (1) to (4) over all candidate thresholds to obtain the threshold T that maximizes the between-class variance g;
(6) Binarize the gray-scale image according to the obtained threshold T: pixels with value greater than T become 255 and pixels with value smaller than T become 0.
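The six Otsu steps above amount to an exhaustive search over candidate thresholds for the one that maximizes g. A minimal NumPy sketch (the function names and formulation are illustrative, not from the patent):

```python
import numpy as np

def otsu_threshold(gray):
    """Exhaustively search the threshold T that maximizes the
    between-class variance g = w0*(u0-u)^2 + w1*(u1-u)^2."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    best_t, best_g = 0, -1.0
    for t in range(1, 256):
        w0 = hist[:t].sum() / total   # fraction of pixels below T
        w1 = 1.0 - w0                 # fraction at or above T
        if w0 == 0 or w1 == 0:
            continue
        u0 = (np.arange(t) * hist[:t]).sum() / hist[:t].sum()
        u1 = (np.arange(t, 256) * hist[t:]).sum() / hist[t:].sum()
        u = w0 * u0 + w1 * u1         # global mean gray
        g = w0 * (u0 - u) ** 2 + w1 * (u1 - u) ** 2
        if g > best_g:
            best_t, best_g = t, g
    return best_t

def binarize(gray, t):
    """Step (6): pixels at or above T become 255, below T become 0."""
    return np.where(gray >= t, 255, 0).astype(np.uint8)
```

On a bimodal image the search plateaus between the two modes; any threshold on the plateau separates them equally well.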
In step 2), the specific process of establishing the background model may be: graying the image from step 1) and recording it as O(x, y); cropping out, for each prawn, a foreground region image of length Lk and width Mk; binarizing each foreground-region image, which contains part of the background, with the OTSU algorithm to obtain an image mask for each shrimp, denoted Ik(x, y), where k indexes the k-th shrimp, the background part having value 0 and the foreground part value 255; and computing from the mask the total number of background pixels in the mask region:
Numk = Σ(x,y) [Ik(x, y) = 0]  (the count of background pixels over the Lk × Mk mask region)
then, the background mean gray value of the foreground region of the kth shrimp is:
meank = (1 / Numk) · Σ{(x,y): Ik(x, y) = 0} O(x, y)
Then, according to the foreground mask, meank is used to fill the positions of the original image where Ik(x, y) = 255; after all foreground regions are filled, the background model is obtained.
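The fill step can be sketched as follows, assuming the per-prawn boxes and OTSU masks from the previous steps are already available (the box format and all names are my assumptions, not the patent's):

```python
import numpy as np

def build_background(gray, boxes, masks):
    """gray: full grayscale frame O(x, y).
    boxes: one (x0, y0, x1, y1) rectangle per shrimp.
    masks: one binary mask per box (255 = foreground, 0 = background).
    Fill each foreground region with the mean gray of the background
    pixels inside the same box, yielding the background model."""
    bg = gray.astype(np.float64).copy()
    for (x0, y0, x1, y1), mask in zip(boxes, masks):
        patch = bg[y0:y1, x0:x1]          # view into the model
        bg_pixels = patch[mask == 0]      # the Num_k background pixels
        mean_k = bg_pixels.mean()         # background mean gray value
        patch[mask == 255] = mean_k       # overwrite the shrimp pixels
    return bg.astype(np.uint8)
```

Filling each foreground region with its own box's background mean keeps the model locally consistent, so later background differences are near zero wherever no shrimp is present.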
In step 3), the method for establishing a Kalman filter and initializing it may be: establish a Kalman filter with target state vector xk and target observation vector zk; the system state transition equation is
xk = F·xk−1, F = [1 0 Δt 0; 0 1 0 Δt; 0 0 1 0; 0 0 0 1] (constant-velocity model over the state (x, y, vx, vy))
where Δt is the time difference between two adjacent frames; the system observation matrix is
H = [1 0 0 0; 0 1 0 0] (the observation zk = (x, y)ᵀ is the target position)
The Kalman filtering covariance matrix P0 at the initial time, the system process noise matrix Q, and the system observation noise matrix R (each given as a matrix image in the original).
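The patent gives the filter matrices only as images; the sketch below assumes the standard constant-velocity model suggested by Δt, with state (px, py, vx, vy), and uses placeholder noise scales (all values are my assumptions):

```python
import numpy as np

def make_cv_kalman(dt):
    """Constant-velocity Kalman setup: state x = (px, py, vx, vy)^T,
    observation z = (px, py)^T. Noise scales are placeholders."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)   # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)   # observe position only
    P = np.eye(4) * 10.0                  # initial covariance (assumed)
    Q = np.eye(4) * 0.01                  # process noise (assumed)
    R = np.eye(2) * 1.0                   # observation noise (assumed)
    return F, H, P, Q, R
```

One filter per tracking region would be created with the same Δt, since all regions are imaged by the same camera.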
In step 4), the specific method for detecting the target position by background difference may be: subtract the background model from the current frame image to obtain a background difference image, apply OTSU binarization, obtain the detected foreground contours from the binarized image, and compute the center of each contour; each contour center is an observed prawn position. After detecting the target position by background difference, the background difference result image may be binarized and masked with the tracking-area mask from step 1), shielding the non-tracking-area part.
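Detection by background difference can be sketched without OpenCV: difference, binarize, then take one centroid per connected foreground blob. Here a fixed threshold and a small BFS labeler stand in for the patent's OTSU binarization and contour extraction; all names are illustrative:

```python
import numpy as np
from collections import deque

def detect_centers(frame, background, thresh=30):
    """Background difference: |frame - background|, binarize, then return
    (cx, cy, area) for each 4-connected foreground blob."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    fg = diff > thresh                    # stand-in for OTSU binarization
    labels = np.zeros(fg.shape, int)
    centers, nxt = [], 0
    for i, j in zip(*np.nonzero(fg)):
        if labels[i, j]:
            continue                      # pixel already labeled
        nxt += 1
        q, pts = deque([(i, j)]), []
        labels[i, j] = nxt
        while q:                          # BFS over one blob
            y, x = q.popleft()
            pts.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, mx = y + dy, x + dx
                if 0 <= ny < fg.shape[0] and 0 <= mx < fg.shape[1] \
                        and fg[ny, mx] and not labels[ny, mx]:
                    labels[ny, mx] = nxt
                    q.append((ny, mx))
        ys, xs = zip(*pts)
        centers.append((sum(xs) / len(xs), sum(ys) / len(ys), len(pts)))
    return centers
```

In the patent's pipeline the binarized difference image would additionally be masked with the tracking-area mask before labeling, so blobs outside any region never appear.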
In step 5), within the tracking area to which each observation belongs, the observation point with the largest contour area is taken as the observed prawn position, and non-maximum noise is shielded.
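The boundary check and the largest-area rule of step 5) can be sketched as follows (the rectangular region format and the (cx, cy, area) observation tuples are my assumptions):

```python
def assign_ids(centers, regions):
    """centers: list of (cx, cy, area) observations.
    regions: dict id -> (x0, y0, x1, y1) tank-partition boundaries.
    Keep, per region, only the observation with the largest contour
    area; smaller (non-maximum) blobs are treated as noise."""
    best = {}
    for cx, cy, area in centers:
        for rid, (x0, y0, x1, y1) in regions.items():
            if x0 <= cx < x1 and y0 <= cy < y1:   # boundary condition
                if rid not in best or area > best[rid][2]:
                    best[rid] = (cx, cy, area)
                break
    return best
```

Because each region holds exactly one shrimp, keeping the single largest blob per region both assigns the tracking ID and suppresses water-surface noise in one pass.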
In step 6), the specific method for predicting the target and determining its tracking result may be: when fewer than 6 frames have been read, the observed value zk is used as the tracking result of the target and the Kalman filter is updated with the current observation; when more than 6 frames have been read, the Kalman filter prediction is used as the tracking result. The prediction formula is xk = F·xk−1, where xk−1 is the state vector of the target at the previous moment and F is the system state transition matrix.
In step 7), the updating the kalman filter, calculating a kalman filter gain, and the specific method of using the modified state vector as the target state value of the current frame may be:
updating the Kalman filter according to the formula
K′ = P′k·Hkᵀ·(Hk·P′k·Hkᵀ + R)^(-1)
Calculating a Kalman filter gain, wherein
P′k = F·Pk−1·Fᵀ + Q
Pk−1 is the Kalman filter covariance matrix at time k−1. The target state vector is corrected according to the formula x′k = xk + K′·(zk − Hk·xk), the Kalman filter covariance matrix is updated according to Pk = P′k − K′·Hk·P′k, and the corrected state vector is taken as the target state value of the current frame.
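A sketch of the predict-correct cycle in the standard Kalman form (a generic implementation under the constant-velocity assumption, not the patent's exact parameterization):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict + correct cycle.
    Predict: x_k = F x_{k-1}, P'_k = F P F^T + Q.
    Correct: K' = P'_k H^T (H P'_k H^T + R)^-1,
             x'_k = x_k + K'(z_k - H x_k), P_k = P'_k - K' H P'_k."""
    x_pred = F @ x                        # predicted state
    P_pred = F @ P @ F.T + Q              # predicted covariance
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain K'
    x_new = x_pred + K @ (z - H @ x_pred) # corrected state
    P_new = P_pred - K @ H @ P_pred       # corrected covariance
    return x_new, P_new
```

With a near-zero observation noise R the corrected position snaps to the measurement; with a large R it stays close to the prediction, which is what lets the filter ride out the high-speed bounces mentioned in the background section.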
Compared with the prior art, the invention has the beneficial effects that:
the invention divides a physical water tank into n tracking areas and allocates a tracking id to each area. And then selecting the prawn foreground of the first frame, separating the background part from the foreground by using an OTSU algorithm, sampling the background to obtain an average gray value, filling a foreground area according to a result after OTSU binaryzation, and constructing a background model. And foreground detection is carried out by using a background difference method in the tracking process, a Kalman filter is initialized for the id of each region, and the current position is corrected by using the Kalman filter in combination with a historical observation value, so that the tracking precision is improved. In the id assignment process of the foreground, the id of each shrimp is assigned using the condition defined by the boundary. The mask of the region is used for shielding the non-tracking part, and interference noise is reduced.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The invention provides a long-time multi-target prawn tracking method based on boundary-constrained Kalman filtering that can sustain long-duration shooting experiments while simultaneously observing the motion behavior of several prawns; the flow chart is shown in FIG. 1. First, the water tank used for the experiment is divided into several areas with partition plates; each area receives a distinct number used as a tracking-area ID, and the movement of one prawn is observed in each area. Data acquisition can use any electronic device with a shooting function, and the method accepts data input as video or pictures. During data processing, the foreground is frame-selected in the first frame image, the background and foreground are separated with the OTSU algorithm, the average gray value of the background area is computed, the foreground area is filled with it, and a background model is established. During subsequent tracking, all target positions are detected by background difference. After the center position of each target is computed, it is assigned to a tracking domain according to the boundary defined for each area. The final target position is computed with the Kalman filtering algorithm, combining the detected and predicted values of the target.
The method specifically comprises the following steps:
step 1, firstly, a water tank used for an experiment is divided into a plurality of areas by baffles, the number of each area is different and is an ID (identity) used as a tracking area, and the motion condition of one prawn is observed in each area. Before the shrimp tracking is carried out, tracking areas are divided by a mouse in a man-machine interaction mode according to the partition of a water tank in a first frame image, and each area corresponds to one tracking number (1-N). And then selecting all prawn foreground regions by using an observation region of a mouse on the image.
Step 2: the image from step 1 is grayed and recorded as O(x, y). For each prawn, a foreground region image of length Lk and width Mk is cropped out. Each foreground-region image, which contains part of the background, is binarized with the OTSU algorithm to obtain an image mask for each shrimp, denoted Ik(x, y), where k indexes the k-th shrimp; the background part has value 0 and the foreground part has value 255. The total number of background pixels in the mask region is computed from the mask:
Numk = Σ(x,y) [Ik(x, y) = 0]  (the count of background pixels over the Lk × Mk mask region)
then the background mean gray value of the foreground region of the kth shrimp is:
meank = (1 / Numk) · Σ{(x,y): Ik(x, y) = 0} O(x, y)
then use mean from foreground maskkFilling in the original part Ik(x, y) 255 position. And after filling all the foreground areas, establishing a background model.
Step 3: a Kalman filter is established for each observation area selected in step 1 and initialized, with target state vector xk and target observation vector zk. The system state transition equation is
xk = F·xk−1, F = [1 0 Δt 0; 0 1 0 Δt; 0 0 1 0; 0 0 0 1] (constant-velocity model over the state (x, y, vx, vy))
Δ t is the time difference between two adjacent frames; system observation matrix
H = [1 0 0 0; 0 1 0 0] (the observation zk = (x, y)ᵀ is the target position)
The Kalman filtering covariance matrix P0 at the initial time, the system process noise matrix Q, and the system observation noise matrix R (each given as a matrix image in the original).
Step 4: the target position is detected using the background model established in step 2. The target detection method subtracts the background model from the current frame image to obtain a background difference image and then applies OTSU binarization. The detected foreground contours are obtained from the binarized image and the center of each contour is computed; each contour center is an observed prawn position.
Step 5: the tracking area to which each observation belongs is checked according to the condition defined by the boundary, and a tracking ID is assigned to each observation vector.
Step 6: when fewer than 6 frames have been read, the observed value zk is used as the tracking result of the target and the Kalman filter is updated with the current observation. When more than 6 frames have been read, the Kalman filter prediction is used as the tracking result of the target. The prediction formula is xk = F·xk−1, where xk−1 is the state vector of the target at the previous moment and F is the system state transition matrix.
Step 7: the Kalman filter is updated. According to the formula
K′ = P′k·Hkᵀ·(Hk·P′k·Hkᵀ + R)^(-1)
And calculating Kalman filtering gain. Wherein
P′k = F·Pk−1·Fᵀ + Q
Pk−1 is the Kalman filter covariance matrix at time k−1. The target state vector is corrected according to the formula x′k = xk + K′·(zk − Hk·xk). The Kalman filter covariance matrix is updated according to Pk = P′k − K′·Hk·P′k. The corrected state vector is taken as the target state value of the current frame, and the next frame image is read.
Step 8: the next frame image is read, and steps 4 to 7 are repeated until all images are processed.
Further, in step 4, after OTSU binarization of the background difference image in the target detection method, the tracking-area mask from step 1 is used to mask the background difference image, shielding the non-tracking-area part.
Further, in the step 5, the observation point with the largest outline area is used as the observation value of the position of the prawn in each tracking domain, so as to shield the non-maximum noise.
Further, in step 2, the OTSU algorithm is the Otsu (maximum between-class variance) method, and its steps are:
for image I (x, y), the threshold value of foreground and target segmentation is recorded as T, and the proportion of pixel points belonging to the foreground part in the whole image is recorded as w0Average gray scale of foreground is recorded as u0(ii) a The proportion of background pixel points in the whole picture is recorded as w1Average gray of background is recorded as u1The average gray scale of the whole image is recorded as u, the inter-class variance is recorded as g, the size of the image is assumed to be M × N, and the number of pixels in the image with the gray scale value smaller than T is N0The number of pixels with the pixel gray level larger than the threshold value T is N1Then:
(1) Calculate w0 according to the formula w0 = N0 / (M × N);
(2) Calculate w1 according to the formula w1 = N1 / (M × N);
(3) Calculate u according to the formula u = w0·u0 + w1·u1;
(4) Calculate g according to the formula g = w0·(u0 − u)² + w1·(u1 − u)²;
(5) Steps (1) to (4) are repeated over all candidate thresholds to obtain the threshold T that maximizes the between-class variance g.
(6) The grayscale image I(x, y) is binarized based on the obtained threshold T: pixels with value greater than T become 255 and pixels with value smaller than T become 0.
The method can accurately track the behavior trajectories of multiple prawns over long periods in a water-tank experimental environment. The invention first divides the water tank used in the experiment into several areas with partition plates, each area forming a target tracking area for one shrimp, and assigns a number to each area to be tracked. The foreground is then frame-selected, the background and foreground are separated with the OTSU algorithm, the average gray value of the background area is computed, the foreground area is filled with it, and a background model is established. All target positions are detected by background difference. After the center position of each target is computed, it is assigned to a tracking domain according to the boundary defined for each area. The accurate target position is computed with the Kalman filtering algorithm, combining the detected and predicted values of the target. Compared with the prior art, the method adapts to the changing conditions of long-duration experiments, runs in real time, and has high accuracy.

Claims (10)

1. A long-time multi-target prawn tracking method based on boundary constraint Kalman filtering is characterized by comprising the following steps:
1) dividing the water tank used for the experiment into several areas with partition plates, each area forming a target tracking area for one shrimp, assigning a number to each area to be tracked, and then frame-selecting the foreground;
2) separating the background and the foreground by using an OTSU algorithm, calculating an average gray value of a background area, filling the foreground area with the average gray value and establishing a background model;
3) establishing a Kalman filter for each observation area selected in the step 1), and initializing the Kalman filter;
4) detecting the position of the target by using a background difference method based on the background model established in the step 2);
5) checking a tracking area to which each observation value belongs according to the condition defined by the boundary, and then allocating a tracked ID to each observation vector;
6) predicting a target and determining a tracking result of the target;
7) updating a Kalman filter, calculating Kalman filtering gain, and taking the corrected state vector as a target state value of the current frame;
8) reading the next frame of image, and repeating steps 4) to 7) until all images are processed.
2. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 1), the specific steps of dividing the water tank into areas, assigning numbers, and frame-selecting the foreground are: first, the water tank used for the experiment is divided into several areas with partition plates, and the movement of one prawn is observed in each area; before tracking, the tracking areas are drawn with the mouse over the first frame image, in a human-computer interaction manner, following the partitions of the tank, each area corresponding to one tracking number (1-N); then all prawn foreground regions are selected with the mouse as observation regions on the image.
3. The method according to claim 1, wherein in step 2), the OTSU algorithm is the Otsu (maximum between-class variance) method, and its steps include: for an image I(x, y), denote the foreground/background segmentation threshold by T; the proportion of foreground pixels in the whole image by w0 and the average gray of the foreground by u0; the proportion of background pixels in the whole image by w1 and the average gray of the background by u1; the average gray of the whole image by u; and the between-class variance by g. Assume the image size is M × N, the number of pixels with gray value smaller than T is N0, and the number of pixels with gray value greater than T is N1. Then:
(1) Calculate w0 according to the formula w0 = N0 / (M × N);
(2) Calculate w1 according to the formula w1 = N1 / (M × N);
(3) Calculate u according to the formula u = w0·u0 + w1·u1;
(4) Calculate g according to the formula g = w0·(u0 − u)² + w1·(u1 − u)²;
(5) Repeat steps (1) to (4) over all candidate thresholds to obtain the threshold T that maximizes the between-class variance g;
(6) Binarize the gray-scale image according to the obtained threshold T: pixels with value greater than T become 255 and pixels with value smaller than T become 0.
4. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 2), the specific process of establishing the background model is: graying the image from step 1) and recording it as O(x, y); cropping out, for each prawn, a foreground region image of length Lk and width Mk; binarizing each foreground-region image, which contains part of the background, with the OTSU algorithm to obtain an image mask for each shrimp, denoted Ik(x, y), where k indexes the k-th shrimp, the background part having value 0 and the foreground part value 255; and computing from the mask the total number of background pixels in the mask region:
Numk = Σ(x,y) [Ik(x, y) = 0]  (the count of background pixels over the Lk × Mk mask region)
then, the background mean gray value of the foreground region of the kth shrimp is:
meank = (1 / Numk) · Σ{(x,y): Ik(x, y) = 0} O(x, y)
Then, according to the foreground mask, meank is used to fill the positions of the original image where Ik(x, y) = 255; after all foreground regions are filled, the background model is obtained.
5. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 3), the specific method for establishing the Kalman filter and initializing it is: establish a Kalman filter with target state vector xk and target observation vector zk; the system state transition equation is
xk = F·xk−1, F = [1 0 Δt 0; 0 1 0 Δt; 0 0 1 0; 0 0 0 1] (constant-velocity model over the state (x, y, vx, vy))
where Δt is the time difference between two adjacent frames; the system observation matrix is
H = [1 0 0 0; 0 1 0 0] (the observation zk = (x, y)ᵀ is the target position)
The Kalman filtering covariance matrix P0 at the initial time, the system process noise matrix Q, and the system observation noise matrix R (each given as a matrix image in the original).
6. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 4), the specific method for detecting the target position by background difference is: subtracting the background model from the current frame image to obtain a background difference image, applying OTSU binarization, obtaining the detected foreground contours from the binarized image, and computing the center of each contour; each contour center is an observed prawn position.
7. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 4), after the position of the target is detected by using a background difference method, a background difference result image is binarized, and the mask of the tracking area in step 1) is used to mask the background difference image to shield the non-tracking area.
8. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 5), the observation point with the largest outline area in the tracking area to which each observation value belongs is used as the observation value of the position of the prawn, so as to shield the non-maximum noise.
9. The long-time multi-target prawn tracking method based on boundary constraint Kalman filtering as claimed in claim 1, characterized in that in step 6), the specific method for predicting the target and determining its tracking result is: when fewer than 6 frames have been read, the observed value zk is used as the tracking result of the target and the Kalman filter is updated with the current observation; when more than 6 frames have been read, the Kalman filter prediction is used as the tracking result. The prediction formula is xk = F·xk−1, where xk−1 is the state vector of the target at the previous moment and F is the system state transition matrix.
10. The method for long-time multi-target prawn tracking based on boundary constraint Kalman filtering as claimed in claim 1, wherein in step 7), the specific method for updating the Kalman filter, calculating Kalman filtering gain, and taking the modified state vector as the target state value of the current frame is as follows:
updating the Kalman filter according to the formula
K′ = P′k·Hkᵀ·(Hk·P′k·Hkᵀ + R)^(-1)
Calculating a Kalman filter gain, wherein
P′k = F·Pk−1·Fᵀ + Q
Pk−1 is the Kalman filter covariance matrix at time k−1. The target state vector is corrected according to the formula x′k = xk + K′·(zk − Hk·xk), the Kalman filter covariance matrix is updated according to Pk = P′k − K′·Hk·P′k, and the corrected state vector is taken as the target state value of the current frame.
CN202010485883.5A 2020-06-01 2020-06-01 Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering Pending CN111724405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010485883.5A CN111724405A (en) 2020-06-01 2020-06-01 Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering


Publications (1)

Publication Number Publication Date
CN111724405A true CN111724405A (en) 2020-09-29

Family

ID=72565715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010485883.5A Pending CN111724405A (en) 2020-06-01 2020-06-01 Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering

Country Status (1)

Country Link
CN (1) CN111724405A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102222214A (en) * 2011-05-09 2011-10-19 苏州易斯康信息科技有限公司 Fast object recognition algorithm
CN104966304A (en) * 2015-06-08 2015-10-07 深圳市赛为智能股份有限公司 Kalman filtering and nonparametric background model-based multi-target detection tracking method
CN106780542A (en) * 2016-12-29 2017-05-31 北京理工大学 A robotic fish tracking method based on Camshift with an embedded Kalman filter
CN107122758A (en) * 2017-05-11 2017-09-01 南宁市正祥科技有限公司 A vehicle recognition and traffic flow detection method


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
J. Stander et al.: "Detection of moving cast shadows for object segmentation", IEEE Trans. Multimed. *
Lu Guanming et al.: "Adaptive background updating and moving object detection algorithm", Journal of Nanjing University of Posts and Telecommunications *
Yang Wei et al.: "Research on machine-vision-based detection and basic behavior recognition of captive porcupines", Fujian Journal of Agricultural Sciences *
Luo Guanghua: "Research on moving object detection and tracking methods in video surveillance", China Master's Theses Full-text Database (Information Science and Technology) *

Similar Documents

Publication Publication Date Title
CN107424171B (en) Block-based anti-occlusion target tracking method
CN109934224B (en) Small target detection method based on Markov random field and visual contrast mechanism
EP3176751B1 (en) Information processing device, information processing method, computer-readable recording medium, and inspection system
Tweed et al. Tracking Many Objects Using Subordinated Condensation.
CN114022759B (en) Airspace finite pixel target detection system and method integrating neural network space-time characteristics
CN111062974B (en) Method and system for extracting foreground target by removing ghost
CN109859250B (en) Aviation infrared video multi-target detection and tracking method and device
CN110827262B (en) Weak and small target detection method based on continuous limited frame infrared image
CN108876820B (en) Moving target tracking method under shielding condition based on mean shift
CN113379789B (en) Moving target tracking method in complex environment
CN110349188B (en) Multi-target tracking method, device and storage medium based on TSK fuzzy model
Kryjak et al. Real-time implementation of the ViBe foreground object segmentation algorithm
CN109902578B (en) Infrared target detection and tracking method
TWI729587B (en) Object localization system and method thereof
CN113537077A (en) Label multi-Bernoulli video multi-target tracking method based on feature pool optimization
CN109558877B (en) KCF-based offshore target tracking algorithm
KR101690050B1 (en) Intelligent video security system
CN115187884A (en) High-altitude parabolic identification method and device, electronic equipment and storage medium
CN111724405A (en) Long-time multi-target prawn tracking method based on boundary constraint Kalman filtering
CN107798690B (en) Method for vesicle motion tracking in living cells
CN108038872B (en) Dynamic and static target detection and real-time compressed sensing tracking research method
CN111161304B (en) Remote sensing video target track tracking method for rapid background estimation
CN110322474B (en) Image moving target real-time detection method based on unmanned aerial vehicle platform
CN113920391A (en) Target counting method based on generated scale self-adaptive true value graph
Dagar et al. Soft computing techniques for edge detection problem: a state-of-the-art review

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200929