CN110853076B - Target tracking method, device, equipment and storage medium - Google Patents

Target tracking method, device, equipment and storage medium

Info

Publication number
CN110853076B
CN110853076B (application CN201911092066.7A)
Authority
CN
China
Prior art keywords
frame
tracking
image
target
detection
Prior art date
Legal status
Active
Application number
CN201911092066.7A
Other languages
Chinese (zh)
Other versions
CN110853076A (en)
Inventor
王玉哲
谢云
Current Assignee
Chongqing Yifei Zhilian Technology Co ltd
Original Assignee
Chongqing Yifei Zhilian Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Yifei Zhilian Technology Co ltd
Priority to CN201911092066.7A
Publication of CN110853076A
Application granted
Publication of CN110853076B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a target tracking method, apparatus, device and storage medium, relating to the technical field of visual tracking. The method includes: acquiring a frame image to be detected, where the frame image to be detected contains a tracking frame and the image area framed by the tracking frame is the tracking target; acquiring a target detection frame from the frame image to be detected, where the image area framed by the target detection frame is a detection target; and judging whether the image area framed by the target detection frame meets a preset tracking requirement, and if so, updating the tracking frame and taking the image framed by the target detection frame as the current tracking target. Compared with the prior art, the method avoids the problems that small targets are difficult to identify and easily lost.

Description

Target tracking method, device, equipment and storage medium
Technical Field
The present application relates to the field of visual tracking technologies, and in particular, to a target tracking method, apparatus, device, and storage medium.
Background
Visual target tracking is an important research direction in computer vision with wide applications, such as video surveillance, human-computer interaction and autonomous driving.
In prior-art target tracking, a target is detected in an image sequence by a detection algorithm and accurately located, and the motion information of the target is then continuously updated as the target moves, so that the target is tracked continuously.
However, in the prior art only a tracking algorithm is used to track the target; under poor lighting, target rotation or occlusion, the target is easily lost, so that an effective and accurate tracking effect cannot be achieved.
Disclosure of Invention
An object of the present application is to provide a target tracking method, apparatus, device and storage medium to overcome the above deficiencies in the prior art and solve the problem that the target is easily lost during tracking.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a target tracking method, where the method includes:
acquiring a frame image to be detected; the frame image to be detected comprises a tracking frame, and an image area framed by the tracking frame is a tracking target;
acquiring a target detection frame according to the frame image to be detected, wherein an image area framed by the target detection frame is a detection target;
judging whether the image area framed by the target detection frame meets a preset tracking requirement or not according to the image area framed by the target detection frame, if so, then: and updating the tracking frame, and taking the image framed and selected by the target detection frame as the current tracking target.
By adopting the target tracking method provided by the application, a target detection frame can be obtained during tracking and it is judged whether the obtained target detection frame meets the preset tracking requirement; if it does, the tracking frame is updated and the image framed by the target detection frame is taken as the current tracking target. Because the tracking algorithm is combined with the detection algorithm during tracking, the tracking frame is updated with the target detection frame, the image framed by the target detection frame becomes the current tracking target, and the tracking frame then tracks this current tracking target. Dynamic tracking during target tracking is thereby realized, the tracking effect for small targets is improved, and the loss of small targets during tracking is reduced.
In a second aspect, another embodiment of the present application provides an apparatus for tracking a target, the apparatus including: the device comprises an acquisition module, a judgment module and an update module, wherein:
the acquisition module is used for acquiring a frame image to be detected; the frame image to be detected comprises a tracking frame, and an image area framed by the tracking frame is a tracking target; acquiring a target detection frame according to the frame image to be detected, wherein an image area framed by the target detection frame is a detection target;
the judging module is used for judging whether the image area framed and selected by the target detection frame meets a preset tracking requirement or not;
the update module is configured to, if satisfied: and updating the tracking frame, and taking the image framed and selected by the target detection frame as the current tracking target.
In a third aspect, another embodiment of the present application provides a target tracking device, including: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, and when the target tracking device is running, the processor communicates with the storage medium via the bus, and the processor executes the machine-readable instructions to perform the steps of the method according to any one of the first aspect.
In a fourth aspect, another embodiment of the present application provides a storage medium, on which a computer program is stored, and the computer program is executed by a processor to perform the steps of the method according to any one of the above first aspects.
The beneficial effect of this application is: with the target tracking method provided by the application, whether the obtained target detection frame meets the preset tracking requirement can be judged; if so, the tracking frame is updated and the image framed by the target detection frame is taken as the current tracking target. Because the tracking frame is updated with the target detection frame during tracking and the tracking target is continuously tracked according to the updated tracking frame, the tracking effect for small targets is improved and the loss of small targets during tracking is reduced.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of a target tracking method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a target tracking method according to another embodiment of the present application;
fig. 3 is a schematic flowchart of a target tracking method according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of a target tracking device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a target tracking device according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of a target tracking device according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of a target tracking device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments.
The target tracking method, device, equipment, storage medium and the like provided by the following embodiments of the present application can be applied to target tracking based on the computer vision field, for example, a high-altitude visual angle monitoring scene using an aircraft, or a monitoring scene using a ground camera.
Fig. 1 is a flowchart illustrating a target tracking method according to an embodiment of the present application, where the target tracking method is executed by a computer device with image processing capability. The computer device may be, for example, a terminal or server installed with a target tracking application. As shown in fig. 1, the method may include:
s101: and acquiring a frame image to be detected.
The frame image to be detected comprises a tracking frame, and an image area framed and selected by the tracking frame is a tracking target.
In each scheme provided by the application, the tracking target can be tracked according to the tracking frame with a preset tracking algorithm. The preset tracking algorithm may be the Efficient Convolution Operators (ECO) tracker. Using the ECO algorithm for target tracking improves tracking speed, applies to tracking most objects, and copes with the visual challenges posed by different objects. Of course, the tracking frame may also be obtained with other tracking algorithms, which are not described herein again.
It should be noted that the tracking target may be a tracking target selected based on a user operation, may also be a preset tracking target, or may also be a tracking target determined in another manner, which is not limited in this application.
The tracking target may be a vehicle, a person, or other movable target for tracking.
S102: and acquiring a target detection frame according to the frame image to be detected, wherein the image area framed by the target detection frame is a detection target.
In the method, a preset detection algorithm can be used to detect the frame image to be detected so as to generate target detection frames on it. At least one target detection frame may exist in the frame image to be detected. The number of target detection frames may be related to the number of detection targets: if there is one detection target, the frame image to be detected may contain one target detection frame; if there are multiple detection targets, the frame image to be detected may contain multiple target detection frames, and the image area framed by each target detection frame is a detection target.
S103: and judging whether the image area framed by the target detection frame meets the preset tracking requirement or not according to the image area framed by the target detection frame.
Since the image area framed by the target detection frame is the detection target, step S103 may determine whether the detection target meets the preset tracking requirement.
The preset tracking requirement may be obtained from the tracking target. For example, it may be judged whether the shape of the detection target in the image area framed by the target detection frame is similar to that of the preset tracking target; if the shapes are similar, the detection target is determined to meet the tracking requirement; otherwise, the detection target is determined not to meet the tracking requirement. Shape similarity may include: the shapes are the same, or the shape difference parameter is smaller than or equal to a preset difference value.
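As an illustration of one way such a shape check could be implemented in Python (the embodiment does not prescribe the metric), a minimal sketch follows; the aspect-ratio comparison and the threshold max_diff are assumptions:

```python
def shape_similar(det_box, target_shape, max_diff=0.3):
    """One possible 'shape similarity' test: compare the aspect ratio of the
    detected box (x, y, w, h) with that of the preset tracking target (w, h).
    Both the metric and the threshold max_diff are illustrative assumptions."""
    dw, dh = det_box[2], det_box[3]
    tw, th = target_shape
    if dh == 0 or th == 0:
        return False
    return abs(dw / dh - tw / th) <= max_diff
```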
It should be noted that the tracking requirement may also be some other tracking requirement, for example, a requirement determined based on other parameters of the tracking target, which is only an example and is not limited by the present application.
If yes, executing the following S104; otherwise, if not, returning to continue executing the S102-S103.
S104: and updating the tracking frame, and taking the image framed and selected by the target detection frame as the current tracking target.
If the detection target meets the preset tracking requirement, the tracking frame in the frame image to be detected can be updated to the position of the target detection frame, so that the image framed by the updated tracking frame is the image framed by the target detection frame; the image framed by the target detection frame is then determined as the current tracking target, thereby updating the tracking frame.
Optionally, the update to the tracking frame may be performed as follows: every preset number of frames, a target detection frame meeting the preset tracking requirement is acquired from the frame image to be detected, and the tracking frame is updated according to that target detection frame. In an embodiment of the present application, the preset number of frames may be set to 90; that is, S102-S104 above are performed once every 90 frames, so that the tracking frame is updated once according to a target detection frame meeting the preset tracking requirement. The smaller the preset number of frames, the shorter the interval and the higher the update frequency of the tracking frame, which improves the tracking effect but increases the power consumption of the tracking device; conversely, the larger the number of frames, the longer the interval and the lower the update frequency, so the target is more easily lost. Setting the preset number of frames to 90 therefore keeps the power consumption of the tracking process moderate while ensuring the update frequency of the tracking frame and preventing target loss.
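A minimal sketch of this periodic-update loop is given below; run_tracker, run_detector and meets_requirement are hypothetical callables standing in for the preset tracking algorithm, the preset detection algorithm and the tracking-requirement check, and only the 90-frame interval comes from the embodiment:

```python
def track_sequence(frames, init_box, run_tracker, run_detector, meets_requirement,
                   update_interval=90):
    """Yield one tracking box per frame. Every `update_interval` frames (90 in
    this embodiment) the detector is run, and if the resulting target detection
    box meets the tracking requirement it replaces the tracking box (S102-S104)."""
    track_box = init_box
    for idx, frame in enumerate(frames):
        track_box = run_tracker(frame, track_box)       # per-frame tracking, e.g. ECO
        if idx and idx % update_interval == 0:           # S102: obtain a detection box
            det_box = run_detector(frame)
            if det_box is not None and meets_requirement(frame, det_box):
                track_box = det_box                       # S104: update the tracking frame
        yield track_box
```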
It should be noted that, the specific frame number setting is not limited thereto, and the specific frame number setting may be adjusted according to the actual tracking requirement, which is not limited herein.
When only a tracking algorithm is used to track the target, the tracking frame may be lost because the moving tracking target is occluded or the like, and the target then disappears. If the tracked target is a car, its size on the frame image to be detected changes during a turn relative to normal driving, but the size of the tracking frame on the frame image is usually fixed. If the tracking frame is not changed and the size of the car differs greatly from the size of the tracking frame, the image area framed by the tracking frame contains not only the tracked target but also other information, so the tracking loses its pertinence and the target may be lost; alternatively, the image area framed by the tracking frame may cover only part of the tracking target, that is, the tracking frame does not fully enclose the tracking target, and the target may also be lost. Moreover, after the target car is occluded, it no longer actually appears in the detection frame image and the image area framed by the tracking frame is not the target car, that is, the tracking frame does not track the target car during the occlusion. After the occlusion ends, the position of the tracking frame is unchanged relative to its position before the occlusion, and if the occlusion lasts long, the tracking target may be lost.
With the target tracking method provided by this embodiment, while the tracking frame is used to track the tracking target, target detection can be performed on the frame image to obtain its target detection frame, so that the tracking algorithm and the detection algorithm are combined. If the image area framed by the target detection frame meets the preset tracking requirement, the tracking frame is updated and the image framed by the target detection frame is taken as the current tracking target, realizing dynamic tracking during target tracking and effectively avoiding target loss caused by losing the tracking frame.
The application combines a tracking algorithm and a detection algorithm to improve the tracking effect during tracking. The tracking algorithm may be the ECO tracking algorithm and the detection algorithm may be the Tiny YOLOv3 detection algorithm; combining the ECO tracking algorithm and the Tiny YOLOv3 detection algorithm during target tracking effectively improves the target tracking effect, especially for small targets.
In some possible implementation examples, the tracking algorithm and the detection algorithm may be integrated on the embedded TX2 platform, which enables real-time (30 fps, i.e. 30 frames per second) tracking of small targets. A small target may be a target of 40 × 20 pixels or 50 × 30 pixels. Of course, the small target is only the target with the minimum pixel size that can be tracked in real time by the target tracking method; besides real-time tracking of small targets, real-time tracking of targets of other pixel sizes can also be realized, which is not limited in the present application.
By adopting the target tracking method provided by the application, a target detection frame can be obtained during tracking and it is judged whether it meets the preset tracking requirement; if so, the tracking frame is updated and the image framed by the target detection frame is used as the current tracking target. The tracking algorithm and the detection algorithm are combined during tracking, the target detection frame is used to update the tracking frame, the image framed by the target detection frame is used as the current tracking target, and the tracking frame tracks this current tracking target, so dynamic tracking during target tracking is realized, the tracking effect for small targets is improved, and the loss of small targets during tracking is reduced.
Optionally, S102 in the above method may include: if a plurality of detection frames are acquired from the frame image to be detected, selecting the detection frame closest to the tracking frame among them as the target detection frame.
The detection frame and the tracking frame can be of any shape. Taking rectangles as an example, the distance between a detection frame and the tracking frame can be determined from the distance between their centers.
If a preset detection algorithm is used to detect the frame image to be detected and several detection frames are obtained, one of them can be determined as the target detection frame. Among the several detection frames, the image area framed by the detection frame closest to the tracking frame has the highest similarity to the tracked target, so selecting the closest detection frame as the target detection frame ensures that the selected target detection frame is more accurate.
In the method, after the target detection frame is selected from the several detection frames, whether the image area framed by the target detection frame meets the preset tracking requirement is judged, and if it does, the tracking frame is updated. This makes the updated tracking frame more accurate and avoids target loss, while also reducing the amount of computation during tracking and lowering power consumption.
In some other possible implementation manners, the target detection frame may not be selected from the multiple detection frames, and for each detection frame, the above-mentioned S103 is performed to determine whether each detection frame meets the preset tracking requirement, and then the tracking frame is updated under the condition that the image area framed by the detection frame meets the preset tracking requirement, so as to avoid the loss of the tracking target.
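As a sketch of the center-distance selection described above (boxes as simple (x, y, w, h) tuples; no particular tracker or detector is assumed):

```python
import math

def closest_detection(track_box, det_boxes):
    """Select, among several detection boxes, the one whose center is nearest
    to the center of the tracking box. Boxes are (x, y, w, h) tuples; the
    center-distance criterion follows the rectangle example above."""
    if not det_boxes:
        return None
    tx = track_box[0] + track_box[2] / 2
    ty = track_box[1] + track_box[3] / 2

    def center_dist(box):
        cx = box[0] + box[2] / 2
        cy = box[1] + box[3] / 2
        return math.hypot(cx - tx, cy - ty)

    return min(det_boxes, key=center_dist)
```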
Optionally, the updating of the tracking frame as shown above may include: updates to the tracking frame size and/or location.
For example, in an embodiment of the present application, the update process for the tracking frame is as follows: the position of the tracking frame on the frame image to be detected is updated according to the position of the target detection frame, so that the updated position of the tracking frame is the same as that of the target detection frame; and the size of the tracking frame is updated according to the size of the target detection frame, so that the updated size of the tracking frame is the same as that of the target detection frame. The specific update of the tracking frame is not limited to this example, and the application does not limit this.
Optionally, before the frame image to be detected is acquired in S101 in the method shown above, the method further includes: detecting a trigger operation, where the trigger operation includes an adjustment operation and/or a retrieving operation.
In one example, a tracking frame in a current frame image may be detected to determine a confidence of an image region framed by the tracking frame, and then a trigger operation may be detected according to the confidence. In other examples, the trigger operation may be detected according to a preset time interval, or the trigger operation may be detected according to a preset interval frame number.
If the trigger operation is detected based on the confidence, and the confidence of the image area framed by the tracking frame is the preset adjustment confidence, it can be determined that an adjustment operation is detected and the processing operation corresponding to the adjustment operation is executed, where the preset adjustment confidence is the preset maximum confidence. Otherwise, if the confidence of the image area framed by the tracking frame is the preset retrieving confidence, it is determined that a retrieving operation is detected and the processing operation corresponding to the retrieving operation is executed; that is, a confidence other than the preset maximum confidence is treated as the retrieving confidence.
The confidence of the image area framed by the tracking frame can be used to characterize how well the tracking frame is tracking the target. When the confidence is the preset maximum confidence, the tracking frame is tracking the target well and the image area framed by the tracking frame is the tracking target. Conversely, the lower the confidence, the worse the tracking state of the tracking frame on the target, and the target may even be lost.
In one implementation, if the detected trigger operation is an adjustment operation, acquiring a current frame image as a frame image to be detected, and determining a target detection frame according to the current frame image.
Once the adjustment operation is detected based on the current frame image, the current frame image may be determined to be the frame image to be detected, and the operations of determining the target detection frame in S102 and judging whether the preset tracking requirement is met in S103 are executed; if the requirement is met, S104 is executed to update the tracking frame.
When the adjustment operation is detected, in one specific implementation, an AI detection frame may be used to assist tracking; that is, the detection frame in the current frame image is an AI detection frame, and a detection frame meeting a preset requirement may be selected from the detection frames of the current frame image as the tracking frame. The preset requirement may be, for example, the closest distance to the current tracking frame. In this embodiment, template matching may not be performed on the detection frame.
In another specific implementation, in the target tracking process, a detection frame meeting a preset requirement may be selected from the detection frames of the current frame image as the tracking frame, where the preset requirement is the closest distance to the current tracking frame. In this embodiment, template matching may be further performed on the detection frame to determine whether the response value of the image area framed by the detection frame meets a preset tracking requirement.
Illustratively, between two updates of the tracking frame (which occur every preset number of frames), the tracking algorithm tracks according to the tracking frame. Specifically, the tracking algorithm may perform template matching on the tracking frame in the current frame image based on the tracking template and determine a response value (peak_value) of the tracking frame in the current frame image. The confidence of the tracking frame in the current frame image is then determined from the response value and the preset correspondence between response intervals and confidences. If the confidence of the tracking frame is the preset maximum confidence, the detected trigger operation is determined to be an adjustment operation, the current frame image is determined to be the frame image to be detected, the target detection frame is determined from the current frame image, and the tracking algorithm continues to track the tracking target according to the tracking frame in the current frame image.
The response value of the image area framed by the tracking frame may be obtained by processing that image area with the tracking algorithm and can be used to determine the confidence of the tracking frame in the current frame image. The confidence of the tracking frame is the confidence corresponding to the response interval in which the response value falls; if the response value is greater than or equal to a preset threshold, the tracking requirement is determined to be met, and if it is less than the preset threshold, the tracking requirement is determined not to be met. Different response intervals correspond to different confidences, and the confidence indicates how well the current tracking frame is tracking the target. In an embodiment of the present application, the confidences include 255, 128 and 0, where 255 is the preset maximum confidence, namely the adjustment confidence, indicating that the tracking state of the tracking frame in the current frame image is good and tracking can continue; 128 is the preset retrieving confidence, indicating that the tracking effect of the tracking frame in the current frame image is poor and the tracking frame needs to be adjusted; and 0 is the preset loss confidence, indicating that the tracking state of the tracking frame in the current frame image is bad, the tracking target is lost, and the tracking target needs to be reselected or a retrieving operation performed to restart tracking. The correspondence between response intervals and confidences is set according to user requirements, as long as each interval can be clearly mapped to a confidence; the application is not limited herein.
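A small sketch of this mapping follows; the levels 255/128/0 come from the embodiment, while the interval boundaries high and low are assumed values, since the interval-to-confidence correspondence is left to the user:

```python
ADJUST_CONF, RETRIEVE_CONF, LOSS_CONF = 255, 128, 0  # confidence levels in this embodiment

def confidence_from_response(peak_value, high=0.6, low=0.3):
    """Map a template-matching response value (peak_value) to a confidence level.
    `high` and `low` are illustrative interval boundaries, not prescribed values."""
    if peak_value >= high:
        return ADJUST_CONF     # good tracking state, continue tracking
    if peak_value >= low:
        return RETRIEVE_CONF   # poor tracking effect, tracking frame needs adjustment
    return LOSS_CONF           # target lost, reselect the target or perform retrieving
```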
In another implementation manner, if the triggering operation is a retrieving operation, an image with a first preset frame number is obtained, and the frame image to be detected and the target detection frame are determined according to the image with the first preset frame number.
If the detected trigger operation is a retrieving operation, the frame image to be detected and the target detection frame are determined from images of a first preset frame number. After they are determined, the operations of determining the target detection frame in S102 and judging whether the preset tracking requirement is met in S103 continue to be executed; if the requirement is met, S104 continues to be executed to update the tracking frame, thereby retrieving the tracking frame and hence the tracking target.
In some possible implementations of the target tracking method shown, judging whether the detection frame meets the preset tracking requirement may include: judging whether the response value of the image area framed by the detection frame is greater than or equal to a preset threshold; if it is greater than or equal to the preset threshold, the tracking requirement is determined to be met, and if it is less than the preset threshold, the tracking requirement is determined not to be met.
In a specific implementation process, the image area framed by the detection frame can be matched according to a preset detection template to obtain a response value of the image area framed by the detection frame, and then whether the detection frame meets a preset tracking requirement or not is determined based on the response value of the image area framed by the detection frame.
In other possible implementations of the target tracking method shown, judging whether the tracking frame meets the preset tracking requirement may include: judging whether the response value of the image area framed by the tracking frame is greater than or equal to a preset threshold; if it is greater than or equal to the preset threshold, the tracking requirement is determined to be met, and if it is less than the preset threshold, the tracking requirement is determined not to be met.
In a specific implementation process, the image area framed by the tracking frame can be matched according to a preset tracking template to obtain a response value of the image area framed by the tracking frame, and then whether the tracking frame meets a preset tracking requirement or not is determined based on the response value of the image area framed by the tracking frame.
It should be noted that, for the same target, the detection template used for the detection frame and the tracking template used for the tracking frame may be the same template. In both cases the response value of the currently framed image area with respect to the preset target is determined, and whether the preset tracking requirement is met is then judged from the relationship between the response value and the preset threshold: if the response value is greater than or equal to the preset threshold, the preset tracking requirement is met, and if it is less than the preset threshold, the requirement is not met.
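As an illustration, one common way to obtain such a response value is normalized cross-correlation between the framed image area and the template; the exact matching formula and the threshold are assumptions, since the patent does not prescribe them:

```python
import numpy as np

def response_value(patch, template):
    """Response value of a framed image area with respect to a template,
    computed here as normalized cross-correlation. Patch and template must
    have the same shape; the formula is an illustrative assumption."""
    p = patch.astype(np.float64) - patch.mean()
    t = template.astype(np.float64) - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom else 0.0

def meets_tracking_requirement(patch, template, threshold=0.5):
    """Preset-threshold check described above; threshold is an assumed value."""
    return response_value(patch, template) >= threshold
```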
In the actual target tracking process, the template may also be updated at preset intervals; for example, every 10 frames the frame images are fused to obtain an updated template.
Optionally, when the detected trigger operation is a retrieving operation, the first preset frame number may be adjusted for different application scenarios. In an embodiment of the present application, the first preset frame number may be 1500 frames, but the specific value is flexibly adjusted by the user according to different scenarios, and the application is not limited herein.
In the schemes provided by the application, when the detected trigger operation is an adjustment operation or a retrieving operation, the corresponding processing operation is executed. This updates the target detection frame under the different trigger operations, which ensures that the tracking frame and the tracking target are updated, prevents target loss, and improves tracking accuracy.
In some other possible implementation manners, if the triggering operation is a retrieving operation, the determining, according to the image with the first preset frame number, the frame image to be detected and the target detection frame may include:
and determining a frame image to be detected from the image with the first preset frame number.
In the method, a preset detection algorithm can be adopted to detect all the images with the first preset frame number so as to determine whether each image with the first preset frame number has a detection frame, and the image with the detection frame is determined as the frame image to be detected.
Optionally, the preset detection algorithm may be started after the tracking target is determined and tracking begins, i.e. the preset detection algorithm runs throughout the tracking process; alternatively, it need not be started when tracking starts and may be started only within a preset time period or when the tracking target meets a preset requirement. For example, the preset detection algorithm may be started when the size of the tracked target is smaller than a preset threshold, which ensures the tracking effect for small targets; when the detection algorithm is started can be designed according to user requirements, and the application does not limit this. When the frame images to be detected have been determined, their number is determined. If there is one frame image to be detected, i.e. it is unique, the target detection frame can be determined from that frame image.
If there are a plurality of frame images to be detected, that is, the frame images to be detected are not unique, in one embodiment, the detection frame with the shortest distance may be selected from the detection frames of the plurality of frame images to be detected as the target detection frame according to the distance between the tracking frame and the detection frame in each frame image to be detected.
The detection frame with the shortest distance is the detection frame, among the detection frames of the frame images to be detected, whose distance to the tracking frame is smallest. Using it as the target detection frame ensures that the image area framed by the target detection frame is closer to the detection target and that the target detection frame is more accurate; the tracking frame is then updated when the image area framed by the target detection frame meets the preset tracking requirement, and the tracking target is tracked continuously according to the updated tracking frame.
If there are a plurality of frame images to be detected, in another embodiment, the detection frame of the frame image to be detected closest to the current time may also be selected as the target detection frame, and the specific manner of selecting the target detection frame is designed according to the user's needs, which is not limited herein.
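A minimal sketch of this selection during retrieving is given below; it covers the two alternatives described above (closest center distance or most recent frame), with boxes as (x, y, w, h) tuples and no particular detector assumed:

```python
def pick_retrieval_detection(candidates, track_box, mode="distance"):
    """During a retrieving operation, choose the target detection box.
    `candidates` is a list of (frame_index, det_box) pairs for the frame images
    to be detected that contain a detection box; selection is by shortest center
    distance to the tracking box or by the most recent frame."""
    if not candidates:
        return None
    if mode == "recent":
        return max(candidates, key=lambda c: c[0])[1]
    tx = track_box[0] + track_box[2] / 2
    ty = track_box[1] + track_box[3] / 2

    def sq_dist(c):
        box = c[1]
        cx = box[0] + box[2] / 2
        cy = box[1] + box[3] / 2
        return (cx - tx) ** 2 + (cy - ty) ** 2

    return min(candidates, key=sq_dist)[1]
```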
With the scheme provided by this embodiment, when a retrieving operation is detected, the frame images to be detected can be determined from the images of the first preset frame number and the target detection frame can be determined from one or more of them. This improves the accuracy of the target detection frame during retrieving, improves the retrieving effect for the tracking frame and the tracking target, and effectively guarantees the target tracking effect.
Optionally, if the trigger operation detected in the method is an adjustment operation, the method may further include: judging whether the detection algorithm is started, if not, judging whether the image area framed by the tracking frame meets the preset tracking requirement, and if so, continuing tracking; if not, executing the retrieving operation.
If the triggering operation is detected to be the adjusting operation and the detection algorithm is started, the detection algorithm is adopted to determine the target detection frame in the frame image to be detected, and whether the preset tracking requirement is met or not is judged, and if the preset tracking requirement is met, the tracking frame is continuously updated in the above mode. Otherwise, if the triggering operation is detected to be the adjusting operation but the detection algorithm is not started, judging whether the image area framed by the tracking frame meets the preset tracking requirement or not, and if so, continuing to track; if not, executing the retrieving operation.
Optionally, if the trigger operation detected in the method is a retrieving operation, the method may further include: judging whether the detection algorithm is started; if not, acquiring images of a second preset frame number and determining the image area matched with the tracking template from those images; if the image area matched with the tracking template meets the tracking requirement, taking the image area matched with the tracking template as the current tracking target. The tracking template may be the tracking template used to determine the confidence of the image area framed by the tracking frame.
If the detected trigger operation is a retrieving operation and the detection algorithm is not started, image matching is performed between the images of the second preset frame number and the preset tracking template so as to update the tracking target. In the specific implementation, each of the frame images of the second preset frame number is matched against the tracking template, and whether the successfully matched image area meets the tracking requirement is judged; if some frame image contains an image area that matches the tracking template and meets the preset tracking requirement, that image area is taken as the current tracking target. If none of the images of the second preset frame number contains an image area that matches the tracking template, or the matched image area does not meet the preset tracking requirement, the tracking target is determined to be lost.
The second preset frame number may be the training frame number of the preset tracking template, i.e. the number of frame images used to train the tracking template. In an embodiment of the present application, the training frame number of the tracking template is set to 4 frames. Compared with the conventional technique, the preset tracking algorithm in this embodiment is adjusted from the existing tracking algorithm by changing the training frame number of the tracking template from 10 frames to 4 frames: the tracking template is trained on 4 frame images, which reduces the data volume of the template, speeds up the algorithm and improves the tracking effect.
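As a hedged illustration of the 4-frame template training, the sketch below fuses four frame patches into a template by simple averaging; the fusion rule is an assumption, since the embodiment only fixes the number of training frames:

```python
import numpy as np

def train_tracking_template(patches):
    """Fuse the first 4 frame patches (all of the same shape) into a tracking
    template, reflecting the 4-frame training described above. Averaging is an
    illustrative fusion rule, not the one prescribed by the embodiment."""
    patches = [np.asarray(p, dtype=np.float64) for p in patches[:4]]
    return np.mean(patches, axis=0)
```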
The tracking frame updated by template matching has a good tracking effect because its confidence is the maximum confidence; updating the tracking frame by template matching therefore further ensures the tracking effect.
By providing the operations to execute when the detection algorithm is not started during the adjustment operation and the retrieving operation, the embodiments of the application also ensure the tracking effect when the detection algorithm is not started.
Fig. 2 is a target tracking method provided in another embodiment of the present application. As shown in fig. 2, the method may further include: determining the image area framed by the tracking frame according to a wave gate (i.e. tracking gate) instruction.
The operation specifically includes S105: and acquiring a wave gate adjusting instruction, and adjusting the position and the size of a wave gate on the frame image to be detected according to the wave gate adjusting instruction.
Namely, according to an input wave gate adjusting instruction, adjusting a wave gate on a frame image to be detected; wherein, the wave gate adjusting instruction is used for adjusting the size and/or the position of the wave gate; and determining an object in the position of the wave gate on the first frame image as a tracking target.
Optionally, the adjustment of the wave gate may be performed by a user zooming on the touch screen with a finger, or may be performed by the user adjusting the display with a mouse, or may be performed by the user adjusting the wave gate with a drawing board connected to the host; the specific adjustment mode is designed according to the needs of the user, and the application is not limited herein.
Because the size and position of the wave gate can be adjusted according to the user's needs, excessive background area can be reduced and the tracking effect improved, while the visual effect for the user is also improved. This avoids the problem that a non-adjustable wave gate that is too large or too small for the tracking target leads to inaccurate selection of the tracking target, realizes accurate selection of the tracking target, and improves the user experience.
Optionally, if the tracking target needs to be switched during tracking, the user directly adjusts the wave gate through an external device and frames a new tracking target with the wave gate again, so that a new tracking frame is generated to track the new tracking target and the system enters the target tracking state. The external device can be any intelligent device with image processing capability, such as a tablet, a notebook, a smartphone or a palmtop computer.
S106: and generating a corresponding tracking frame according to the adjusted wave gate.
Optionally, in an embodiment of the present application, after the corresponding tracking frame is generated according to the adjusted wave gate, the preset detection algorithm is immediately started, and the tracking frame is updated in real time according to the method through the combination of the preset detection algorithm and the tracking algorithm.
Optionally, the wave gate, the detection frame and the tracking frame provided by the application may be rectangular or of other shapes; the shapes of the frames may be the same or different, and so may their colors, as long as the frames can be distinguished and the user can tell them apart visually. In an embodiment of the present application, the wave gate is a white rectangular frame, the detection frame is a red rectangular frame, and the tracking frame is a green rectangular frame formed by line segments; the specific shapes and colors are set according to user needs, and the application is not limited herein.
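A minimal sketch of adjusting the wave gate and generating the corresponding tracking frame (S105-S106) follows; the (dx, dy, scale) parameterization of the adjustment instruction is an assumption, since the actual instruction comes from the touch screen, mouse or drawing board:

```python
def adjust_gate(gate, dx=0.0, dy=0.0, scale=1.0):
    """Apply a wave gate adjustment instruction: translate the gate (x, y, w, h)
    by (dx, dy) and scale its size by `scale`."""
    x, y, w, h = gate
    new_w, new_h = w * scale, h * scale
    cx, cy = x + w / 2 + dx, y + h / 2 + dy
    return (cx - new_w / 2, cy - new_h / 2, new_w, new_h)

def tracking_frame_from_gate(gate):
    """S106: generate the corresponding tracking frame from the adjusted gate;
    here the tracking frame simply takes over the gate's position and size."""
    return tuple(gate)
```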
Fig. 3 is a schematic flow chart of a target tracking method according to an embodiment of the present application, which illustrates a complete schematic flow chart of the target tracking method according to the present application, and as shown in fig. 3, in an embodiment of the present application, the complete flow chart of the method is as follows:
s201: and adjusting the wave gate.
The adjustment of the wave gate is performed according to the received wave gate command, and the specific adjustment manner may be the manner given in fig. 2 above.
After the wave gate is adjusted according to the wave gate adjustment instruction, a corresponding tracking frame is generated, and S202 is executed: and tracking the target.
Wherein, the tracking of the target adopts the tracking algorithm in the above embodiment to track the tracked target, and in the tracking process, S203: and judging whether a preset detection algorithm is started or not.
If the preset detection algorithm is not started, an adjustment operation is triggered, and S204a is executed: and judging whether the image area framed by the tracking frame meets the preset tracking requirement or not. If yes, returning to S202, continuing to track the tracking target according to the current tracking frame, if not, triggering a retrieving operation, and executing S205: and (6) recovering.
If the preset detection algorithm is on, then S204b is executed: judging whether the image area framed by the target detection frame meets the preset tracking requirement, if so, executing S206: updating the tracking frame on the second frame image; if not, executing S205: and (6) recovering.
Judging whether the image area framed by the target detection frame meets the preset tracking requirement may be judging whether the confidence of the image area framed by the target detection frame is the preset maximum confidence: if it is, the preset tracking requirement is met; if it is not, the preset tracking requirement is not met.
After the retrieving operation is triggered, S207 is executed: and judging whether a preset detection algorithm is started or not.
If the predetermined detection algorithm is in the on state, then S208 is executed: judging whether a target detection frame exists in the image with the first preset frame number or not; if yes, executing S208a: updating the tracking frame according to the target detection frame; if there is no target detection box, then execute S208b: and updating the tracking frame according to the image with the second preset frame number and the tracking template.
If the preset detection algorithm is not turned on, then S208b is executed: and updating the tracking frame according to the image with the second preset frame number and the tracking template.
After S208b, S209 is executed for the updated tracking frame: judging whether the retrieving condition is met.
If yes, returning to S202, and continuing to track the tracking target according to the updated tracking frame; if not, indicating that the current tracking target is lost, executing S210: and determining that the tracking target is lost.
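A hedged sketch of this per-frame decision flow (S202-S210) is given below; tracker and detector are assumed wrappers around the preset tracking algorithm (e.g. ECO) and the preset detection algorithm (e.g. Tiny YOLOv3), and state is a hypothetical container for the tracking box, the detector switch and the image buffers used for retrieving:

```python
def tracking_step(frame, state, tracker, detector):
    """One frame of the S202-S210 flow; returns "tracking" or "target lost"."""
    box = tracker.track(frame, state.track_box)                       # S202
    if state.detector_on:                                             # S203: detector started
        det_box = detector.detect(frame)                              # S204b
        if det_box is not None and tracker.meets_requirement(frame, det_box):
            state.track_box = det_box                                 # S206: update tracking frame
            return "tracking"
    elif tracker.meets_requirement(frame, box):                       # S204a
        state.track_box = box
        return "tracking"
    # S205 / S207-S208: retrieving operation
    det_box = detector.detect_in(state.first_preset_frames) if state.detector_on else None
    if det_box is not None:
        state.track_box = det_box                                     # S208a
        return "tracking"
    matched_box = tracker.match_template(state.second_preset_frames)  # S208b
    if matched_box is None:                                           # S209: retrieving condition
        return "target lost"                                          # S210
    state.track_box = matched_box
    return "tracking"
```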
By adopting the target tracking method provided by the application, during tracking, every preset number of frames the response value of the detection frame closest to the center point of the tracking frame within the preset area is determined according to the detection algorithm, the tracking frame is updated according to that closest detection frame and its response value, and the tracking target is tracked continuously according to the updated tracking frame. Because the preset detection algorithm and the preset tracking algorithm are used together during tracking and the tracking frame is updated periodically by the detection algorithm, the tracking effect for small targets is improved and the loss of small targets during tracking is reduced.
Fig. 4 is a schematic structural diagram of a target tracking apparatus according to an embodiment of the present application, and as shown in fig. 4, the apparatus includes: an obtaining module 201, a judging module 202 and an updating module 203, wherein:
an obtaining module 201, configured to obtain a frame image to be detected; the frame image to be detected comprises a tracking frame, and an image area framed and selected by the tracking frame is a tracking target; and acquiring a target detection frame according to the frame image to be detected, wherein the image area framed and selected by the target detection frame is a detection target.
The determining module 202 is configured to determine whether the image area framed by the target detection frame meets a preset tracking requirement.
An update module 203, configured to, if satisfied: and updating the tracking frame, and taking the image framed and selected by the target detection frame as the current tracking target.
Optionally, the obtaining module 201 is further configured to, if a plurality of detection frames are obtained according to the frame image to be detected: and selecting the detection frame closest to the tracking frame from the plurality of detection frames as a target detection frame.
Optionally, the target tracking apparatus includes a detection module, which may be located before the acquisition module 201, for detecting the trigger operation.
Optionally, the triggering operation includes: an adjusting operation and/or a retrieving operation, wherein if the triggering operation is an adjusting operation, the obtaining module 201 is further configured to: and acquiring the current frame image as a frame image to be detected, and determining a target detection frame according to the current frame image.
If the triggering operation is a retrieving operation, the obtaining module 201 is further configured to: acquiring an image with a first preset frame number, and determining a frame image to be detected and a target detection frame according to the image with the first preset frame number.
Fig. 5 is a schematic structural diagram of a target tracking apparatus according to another embodiment of the present application, and as shown in fig. 5, the apparatus further includes: the determining module 204 is configured to determine a frame image to be detected from the image with the first preset frame number.
If the frame image to be detected is unique, the determining module 204 is further configured to determine the target detection frame according to the frame image to be detected.
If there are a plurality of frame images to be detected, the determining module 204 is further configured to: selecting a detection frame of a frame image to be detected with the nearest time to the current frame image as a target detection frame; or selecting the detection frame with the shortest distance from the detection frames of the frame images to be detected as the target detection frame according to the distance between the tracking frame and the detection frame of each frame image to be detected.
Optionally, the determining module 202 is further configured to, if the detection algorithm is not started, judge whether the image area framed by the tracking frame meets the preset tracking requirement; if it does, tracking continues; if not, the retrieving operation is executed.
Optionally, the obtaining module 201 is further configured to obtain an image with a second preset frame number if the detection algorithm is not started, and determine an image area matched with the tracking template according to the image with the second preset frame number; if the image area matched with the tracking template meets the tracking requirement, the following steps are carried out: and taking the image area matched with the tracking template as a current tracking target.
Fig. 6 is a schematic structural diagram of a target tracking apparatus according to another embodiment of the present application, and as shown in fig. 6, the apparatus further includes: a generation module 205, wherein:
the obtaining module 201 is further configured to obtain a gate adjustment instruction, and adjust the position and size of a gate on the frame image to be detected according to the gate adjustment instruction.
A generating module 205, configured to generate the corresponding tracking frame according to the adjusted wave gate.
Optionally, the determining module 202 is further configured to judge whether the response value of the image area framed by the detection frame or the tracking frame is greater than or equal to a preset threshold; if it is greater than or equal to the preset threshold, the preset tracking requirement is determined to be met, and if it is less than the preset threshold, the tracking requirement is determined not to be met.
Optionally, if the triggering operation detected by the detection module is an adjustment operation, the obtaining module 201 is further configured to, if multiple detection frames are obtained according to the current frame image, then: selecting a detection frame closest to the tracking frame from the plurality of detection frames as the target detection frame;
The determining module 202 is further configured to determine, based on the current state being the adjustment operation, that the tracking requirement is met.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 7 is a schematic structural diagram of a target tracking device according to another embodiment of the present application. The target tracking device may be integrated in a terminal device or in a chip of the terminal device, and the terminal device may be a computing device with an image processing function.
The target tracking device may include: a processor 501, a storage medium 502, and a bus 503.
The storage medium 502 is used for storing a program, and the processor 501 calls the program stored in the storage medium 502 to execute the above method embodiments. The specific implementation and technical effects are similar and are not described herein again.
Optionally, the present application further provides a program product, such as a storage medium, having a computer program stored thereon which, when executed by a processor, performs the above-described method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute some steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (12)

1. A method of target tracking, the method comprising:
acquiring a frame image to be detected; the frame image to be detected comprises a tracking frame, and an image area framed by the tracking frame is a tracking target;
acquiring a target detection frame according to the frame image to be detected, wherein an image area framed by the target detection frame is a detection target;
judging whether the image area framed by the target detection frame meets a preset tracking requirement or not according to the image area framed by the target detection frame, if so, then: updating the tracking frame, and taking the image framed by the target detection frame as a current tracking target; wherein the updating of the tracking frame comprises: updates to the size and/or location of the tracking frame;
before the frame image to be detected is obtained, the method further includes: detecting a trigger operation; the triggering operation comprises: an adjustment operation and/or a recovery operation;
if the triggering operation is the adjusting operation, determining whether to start a detection algorithm;
if the detection algorithm is started, then: acquiring a current frame image as the frame image to be detected, and determining the target detection frame according to the current frame image;
if the triggering operation is the recovery operation, determining whether to start the detection algorithm;
if the detection algorithm is started, acquiring images of a first preset frame number, and determining the frame image to be detected and the target detection frame according to the images of the first preset frame number;
acquiring a gate adjusting instruction, and adjusting the position and the size of a gate on the frame image to be detected according to the gate adjusting instruction;
generating the corresponding tracking frame according to the adjusted gate;
if the triggering operation is an adjustment operation, the method further includes:
if the detection algorithm is not started, judging whether the image area framed by the tracking frame meets the preset tracking requirement or not; if so, continuing tracking and starting the detection algorithm; if not, executing the recovery operation;
if the triggering operation is a recovery operation, the method further comprises:
if the detection algorithm is not started, acquiring an image with a second preset frame number, and determining an image area matched with the tracking template according to the image with the second preset frame number; if the image area matched with the tracking template meets the tracking requirement, the following steps are carried out: and taking the image area matched with the tracking template as a current tracking target, and starting the detection algorithm.
2. The method according to claim 1, wherein the operation of obtaining a target detection frame according to the frame image to be detected includes:
if a plurality of detection frames are acquired according to the frame image to be detected, then: and selecting the detection frame closest to the tracking frame from the plurality of detection frames as the target detection frame.
3. The method as claimed in claim 1, wherein the operation of determining the target detection frame according to the first preset number of frames of images comprises:
determining the frame image to be detected from the image with the first preset frame number;
if the frame image to be detected is unique, determining the target detection frame according to the frame image to be detected;
if the frame images to be detected are multiple, then:
selecting a detection frame of a frame image to be detected closest in time to the current frame image as the target detection frame; or,
and selecting the detection frame with the shortest distance from the detection frames of the frame images to be detected as the target detection frame according to the distance between the tracking frame and the detection frame of each frame image to be detected.
4. The method according to any one of claims 1-3, wherein said determining whether it meets a preset tracking requirement comprises:
judging whether the response value of the image area framed by the detection frame or the tracking frame is greater than or equal to a preset threshold or not; if so, determining that the tracking requirement is met; if the response value is smaller than the preset threshold, determining that the tracking requirement is not met.
5. The method according to claim 1, wherein if the triggering operation is the adjustment operation, the acquiring a current frame image as the frame image to be detected and the determining the target detection frame according to the current frame image comprise:
if a plurality of detection frames are acquired according to the current frame image, then: selecting a detection frame closest to the tracking frame from the plurality of detection frames as the target detection frame;
the operation of judging whether the target detection frame meets a preset tracking requirement comprises the following steps:
and determining that the tracking requirement is met on the basis that the current state is the adjustment operation.
6. An object tracking apparatus, characterized in that the apparatus comprises: the device comprises an acquisition module, a judgment module and an updating module, wherein:
the acquisition module is used for acquiring a frame image to be detected; the frame image to be detected comprises a tracking frame, and an image area framed by the tracking frame is a tracking target; acquiring a target detection frame according to the frame image to be detected, wherein an image area framed by the target detection frame is a detection target;
the judging module is used for judging whether the image area framed and selected by the target detection frame meets a preset tracking requirement or not;
the update module is configured to, if yes: updating the tracking frame, and taking the image framed and selected by the target detection frame as a current tracking target; wherein the updating of the tracking frame comprises: updates to the size and/or location of the tracking frame;
the target tracking apparatus further includes: the detection module can be positioned in front of the acquisition module and is used for detecting the trigger operation;
the triggering operation comprises the following steps: an adjustment operation and/or a recovery operation;
if the triggering operation is the adjustment operation, the obtaining module is further configured to obtain a current frame image as the frame image to be detected, and determine the target detection frame according to the current frame image;
if the triggering operation is the recovery operation, the obtaining module is further configured to obtain images of a first preset frame number, and determine the frame image to be detected and the target detection frame according to the images of the first preset frame number;
the acquisition module is further used for acquiring a gate adjustment instruction and adjusting the position and the size of a gate on the frame image to be detected according to the gate adjustment instruction;
the device further comprises a generating module, wherein the generating module is used for generating the corresponding tracking frame according to the adjusted gate;
if the trigger operation detected by the detection module is an adjustment operation, then:
the judging module is further used for judging, if the detection algorithm is not started, whether the image area framed by the tracking frame meets the preset tracking requirement or not; if so, continuing tracking and starting the detection algorithm; if not, executing the recovery operation;
if the triggering operation detected by the detection module is a recovery operation, then:
the acquisition module is further used for acquiring an image with a second preset frame number if the detection algorithm is not started, and determining an image area matched with the tracking template according to the image with the second preset frame number; if the image area matched with the tracking template meets the tracking requirement, the following steps are carried out: and taking the image area matched with the tracking template as a current tracking target, and starting the detection algorithm.
7. The object tracking device of claim 6,
the obtaining module is further configured to, if a plurality of detection frames are obtained according to the frame image to be detected, then: and selecting the detection frame closest to the tracking frame from the plurality of detection frames as a target detection frame.
8. The object tracking device of claim 6,
the device also includes: the determining module is used for determining a frame image to be detected from the images with the first preset frame number;
if the frame image to be detected is unique, the determining module is also used for determining a target detection frame according to the frame image to be detected;
if the number of the frame images to be detected is multiple, the determining module is further used for selecting the detection frame of the frame image to be detected, which is closest to the current frame image time, as the target detection frame; or selecting the detection frame with the shortest distance from the detection frames of the frame images to be detected as the target detection frame according to the distance between the tracking frame and the detection frame of each frame image to be detected.
9. The object tracking device of any one of claims 6-8,
the judging module is further used for judging whether the response value of the image area framed by the detection frame or the tracking frame is greater than or equal to a preset threshold or not; if the response value is greater than or equal to the preset threshold, determining that the preset tracking requirement is met; if the response value is smaller than the preset threshold, determining that the tracking requirement is not met.
10. The object tracking device of claim 6,
if the trigger operation detected by the detection module is an adjustment operation, then,
the obtaining module is further configured to, if a plurality of detection frames are obtained according to the current frame image, then: selecting a detection frame closest to the tracking frame from the plurality of detection frames as the target detection frame;
the judging module is further configured to determine that the tracking requirement is met on the basis that the current state is the adjustment operation.
11. An object tracking device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the target tracking device is operating, the processor executing the machine-readable instructions to perform the steps of the method of any one of claims 1-5.
12. A storage medium having stored thereon a computer program for performing the steps of the method according to any of the claims 1-5 when executed by a processor.
CN201911092066.7A 2019-11-08 2019-11-08 Target tracking method, device, equipment and storage medium Active CN110853076B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911092066.7A CN110853076B (en) 2019-11-08 2019-11-08 Target tracking method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911092066.7A CN110853076B (en) 2019-11-08 2019-11-08 Target tracking method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110853076A CN110853076A (en) 2020-02-28
CN110853076B true CN110853076B (en) 2023-03-31

Family

ID=69600991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911092066.7A Active CN110853076B (en) 2019-11-08 2019-11-08 Target tracking method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110853076B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402295B (en) * 2020-03-11 2023-08-08 桂林理工大学 Moving object identification method based on object detection and tracking
CN111444916A (en) * 2020-03-26 2020-07-24 中科海微(北京)科技有限公司 License plate positioning and identifying method and system under unconstrained condition
CN111429478B (en) * 2020-04-13 2022-08-26 展讯通信(上海)有限公司 Target tracking method and related equipment
CN111479062B (en) * 2020-04-15 2021-09-28 上海摩象网络科技有限公司 Target object tracking frame display method and device and handheld camera
CN111563913B (en) * 2020-04-15 2021-12-10 上海摩象网络科技有限公司 Searching method and device based on tracking target and handheld camera thereof
CN111680551B (en) * 2020-04-28 2024-06-11 平安国际智慧城市科技股份有限公司 Method, device, computer equipment and storage medium for monitoring livestock quantity
CN111652902B (en) * 2020-06-02 2023-03-28 浙江大华技术股份有限公司 Target tracking detection method, electronic equipment and device
CN111798482A (en) * 2020-06-16 2020-10-20 浙江大华技术股份有限公司 Target tracking method and device
CN111862154B (en) * 2020-07-13 2024-03-01 中移(杭州)信息技术有限公司 Robot vision tracking method and device, robot and storage medium
CN111994377B (en) * 2020-07-21 2022-04-08 浙江大华技术股份有限公司 Method and device for detecting packaging box process and computer equipment
CN112069879B (en) * 2020-07-22 2024-06-07 深圳市优必选科技股份有限公司 Target person following method, computer-readable storage medium and robot
CN111862624B (en) * 2020-07-29 2022-05-03 浙江大华技术股份有限公司 Vehicle matching method and device, storage medium and electronic device
CN112070035A (en) * 2020-09-11 2020-12-11 联通物联网有限责任公司 Target tracking method and device based on video stream and storage medium
CN112132136A (en) * 2020-09-11 2020-12-25 华为技术有限公司 Target tracking method and device
CN112163471A (en) * 2020-09-14 2021-01-01 深圳市诺龙技术股份有限公司 Congestion detection method and device
CN112132071A (en) * 2020-09-27 2020-12-25 上海眼控科技股份有限公司 Processing method, device and equipment for identifying traffic jam and storage medium
CN112509003B (en) * 2020-12-01 2023-05-12 华中光电技术研究所(中国船舶重工集团公司第七一七研究所) Method and system for solving target tracking frame drift
CN113808159A (en) * 2021-01-04 2021-12-17 北京沃东天骏信息技术有限公司 Target tracking method and device
CN113053104A (en) * 2021-02-24 2021-06-29 上海眼控科技股份有限公司 Target state determination method and device, computer equipment and storage medium
CN113223057A (en) * 2021-06-04 2021-08-06 北京奇艺世纪科技有限公司 Face tracking method and device, electronic equipment and storage medium
CN113255588B (en) * 2021-06-24 2021-10-01 杭州鸿泉物联网技术股份有限公司 Garbage cleaning method and device for garbage sweeper, electronic equipment and storage medium
CN114140494A (en) * 2021-06-30 2022-03-04 杭州图灵视频科技有限公司 Single-target tracking system and method in complex scene, electronic device and storage medium
CN114119674B (en) * 2022-01-28 2022-04-26 深圳佑驾创新科技有限公司 Static target tracking method and device and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007099762A1 (en) * 2006-03-01 2007-09-07 Nikon Corporation Object-seeking computer program product, object-seeking device, and camera
CN108269269A (en) * 2016-12-30 2018-07-10 纳恩博(北京)科技有限公司 Method for tracking target and device
CN107256561A (en) * 2017-04-28 2017-10-17 纳恩博(北京)科技有限公司 Method for tracking target and device
CN109754409A (en) * 2017-11-06 2019-05-14 北京航天长峰科技工业集团有限公司 A kind of monitor video pedestrian target matched jamming System and method for
CN108830219B (en) * 2018-06-15 2022-03-18 北京小米移动软件有限公司 Target tracking method and device based on man-machine interaction and storage medium
CN109492537B (en) * 2018-10-17 2023-03-14 桂林飞宇科技股份有限公司 Object identification method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhao Junqing, "Improvement and Implementation of Object Tracking Algorithms in Intelligent Video Analysis", China Master's Theses Full-text Database (Information Science and Technology), 2016, No. 3, p. I138-6361. *
Yan Ruoyi et al., "Object Tracking Algorithm Based on Parallel Tracking-Detection Framework and Deep Learning", Journal of Computer Applications, 2019, Vol. 39, No. 2, pp. 343-347. *

Also Published As

Publication number Publication date
CN110853076A (en) 2020-02-28

Similar Documents

Publication Publication Date Title
CN110853076B (en) Target tracking method, device, equipment and storage medium
EP3163498B1 (en) Alarming method and device
CN104853668B (en) For eyes and the leading position scan based on tiling image for watching search attentively
WO2020015468A1 (en) Image transmission method and apparatus, terminal device, and storage medium
CN107491755B (en) Method and device for gesture recognition
US9639759B2 (en) Video processing apparatus and video processing method
CN109167893B (en) Shot image processing method and device, storage medium and mobile terminal
JP2008113071A (en) Automatic tracking device
CN105556539A (en) Detection devices and methods for detecting regions of interest
CN110572636B (en) Camera contamination detection method and device, storage medium and electronic equipment
CN109711332B (en) Regression algorithm-based face tracking method and application
CN103810696A (en) Method for detecting image of target object and device thereof
CN110647818A (en) Identification method and device for shielding target object
CN112135041A (en) Method and device for processing special effects of human face and storage medium
CN111212222A (en) Image processing method, image processing apparatus, electronic apparatus, and storage medium
US20160267356A1 (en) Image processing apparatus and image processing method of performing image segmentation
CN104063041A (en) Information processing method and electronic equipment
CN112367486A (en) Video processing method and device
JPWO2018179119A1 (en) Video analysis device, video analysis method, and program
CN109040604B (en) Shot image processing method and device, storage medium and mobile terminal
US9183448B2 (en) Approaching-object detector, approaching object detecting method, and recording medium storing its program
CN111507999A (en) FDSST algorithm-based target tracking method and device
CN112565605B (en) Image display method and device and electronic equipment
CN104754248A (en) Method and device for acquiring target snapshot
CN114840086A (en) Control method, electronic device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant