CN112055158A - Target tracking method, monitoring device, storage medium and system - Google Patents


Info

Publication number
CN112055158A
CN112055158A
Authority
CN
China
Prior art keywords
tracking
target
image frame
tracking target
current image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011111982.3A
Other languages
Chinese (zh)
Other versions
CN112055158B (en)
Inventor
肖志峰
吴杰
冯军
李志杰
施柯柯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Keda Technology Co Ltd
Original Assignee
Suzhou Keda Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Keda Technology Co Ltd filed Critical Suzhou Keda Technology Co Ltd
Priority to CN202011111982.3A priority Critical patent/CN112055158B/en
Publication of CN112055158A publication Critical patent/CN112055158A/en
Application granted granted Critical
Publication of CN112055158B publication Critical patent/CN112055158B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to the technical field of security and surveillance, and in particular to a target tracking method, a monitoring device, a storage medium, and a system. The method includes: acquiring a current image frame and performing target recognition on the current image frame; when a tracking target exists in the current image frame, determining motion parameters of a local pan-tilt based on the position of the tracking target in the current image frame, the motion parameters including at least one of a motion direction, a motion speed, or a zoom; and controlling the motion of the local pan-tilt according to these motion parameters so as to track the tracking target. Because target tracking is performed within the local monitoring device, a single pan-tilt monitoring device can carry out target recognition and trajectory tracking over an area by itself. This avoids tracking a target through linkage among multiple monitoring devices, and therefore avoids the coordinate calibration that such linkage requires, while directly controlling the local pan-tilt ensures the real-time performance and accuracy of tracking.

Description

Target tracking method, monitoring device, storage medium and system
Technical Field
The invention relates to the technical field of security protection, in particular to a target tracking method, monitoring equipment, a storage medium and a system.
Background
The bullet-dome linkage, radar-dome linkage, and panoramic close-up tracking commonly used in the surveillance field all rely on a device such as a radar or a fixed camera to monitor an area while a pan-tilt camera performs close-up magnification. Linking multiple devices usually requires establishing a shared coordinate system among them, for example taking the monitoring platform as the coordinate origin and then calibrating the coordinates of each device in that system. Specifically, during monitoring linkage, each device sends its captured images to the monitoring platform; the platform determines which monitoring device is currently tracking the target and sends it a corresponding control instruction to drive the pan-tilt and magnify the target in close-up.
However, this approach requires establishing a coordinate system among multiple devices and performing the corresponding coordinate calibration. Whenever monitoring devices in an area are added or removed, every device must be recalibrated; and because the pan-tilt is controlled through the monitoring platform, tracking suffers from poor real-time performance and low target tracking accuracy.
Disclosure of Invention
In view of this, embodiments of the present invention provide a target tracking method, a monitoring device, a storage medium, and a system, so as to solve the problem that the real-time performance and accuracy of target tracking are low.
According to a first aspect, an embodiment of the present invention provides a target tracking method, including:
acquiring a current image frame, and performing target identification on the current image frame;
when a tracking target exists in the current image frame, determining a motion parameter of a local pan-tilt based on the position of the tracking target in the current image frame, wherein the motion parameter comprises at least one of a motion direction, a motion speed, or a zoom;
and controlling the motion of the local pan-tilt according to the motion parameters of the local pan-tilt so as to track the tracking target.
The target tracking method provided by the embodiment of the invention recognizes and tracks the target within the local monitoring device, so a single pan-tilt monitoring device can perform target recognition and trajectory tracking over an area. This avoids tracking the target through linkage among multiple monitoring devices, and therefore avoids the coordinate calibration that multi-device linkage requires; directly controlling the local pan-tilt ensures the real-time performance and accuracy of tracking.
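As a rough illustration of the steps above, the sketch below derives pan/tilt directions and a normalized speed from a detected bounding box. This is not the patent's actual implementation: the box format (x1, y1, x2, y2), the 2% dead zone around the frame center, and the speed normalization are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class MotionParams:
    pan_dir: int    # -1 = pan left, 0 = hold, +1 = pan right
    tilt_dir: int   # -1 = tilt up, 0 = hold, +1 = tilt down (image y grows downward)
    speed: float    # normalized to [0, 1]

def compute_motion_params(box, frame_w, frame_h):
    """Derive pan-tilt motion from the target box's offset from the frame center."""
    cx, cy = frame_w / 2, frame_h / 2
    bx = (box[0] + box[2]) / 2   # box center x
    by = (box[1] + box[3]) / 2   # box center y
    dx, dy = bx - cx, by - cy
    # small dead zone so the head does not jitter when the target is near center
    pan = 0 if abs(dx) < frame_w * 0.02 else (1 if dx > 0 else -1)
    tilt = 0 if abs(dy) < frame_h * 0.02 else (1 if dy > 0 else -1)
    # the farther the target is from center, the faster the head should move
    speed = min(1.0, (abs(dx) / cx + abs(dy) / cy) / 2)
    return MotionParams(pan, tilt, speed)
```

A target near the frame center yields zero direction (hold), while a target in a corner yields a large speed toward that corner.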
With reference to the first aspect, in a first implementation manner of the first aspect, the determining a motion parameter of a local pan/tilt head based on a position of the tracking target in the current image frame includes:
acquiring the central position of a current image frame;
and determining the movement direction and the movement speed based on the position relation between the tracking target and the central position.
According to the target tracking method provided by the embodiment of the invention, the motion direction and motion speed are determined from the positional relationship between the tracking target and the center of the current image frame, so that the tracking target can be kept at the center of the image during subsequent tracking, improving tracking accuracy and reliability.
With reference to the first implementation manner of the first aspect, in a second implementation manner of the first aspect, the determining the moving direction and the moving speed based on the position relationship between the tracking target and the center position includes:
determining the movement direction by using the angle of the tracking target relative to the central position;
acquiring the position deviation of the tracking target in the current image frame and the previous image frame;
and determining the movement speed by using the angle of the tracking target relative to the central position and the position deviation.
In the target tracking method provided by the embodiment of the invention, the farther the tracking target is from the center of the picture, the faster the pan-tilt should move; likewise, the larger the target's coordinate change between adjacent frames, the faster it should move. Combining both factors when determining the motion speed ensures the accuracy of the determined speed.
With reference to the first aspect, or the first implementation manner of the first aspect, or the second implementation manner of the first aspect, in a third implementation manner of the first aspect, the determining a motion parameter of the local pan/tilt head based on the position of the tracking target in the current image frame includes:
acquiring the proportion of the tracking target in the current image frame;
determining the zoom of the local pan/tilt head based on the ratio.
In the target tracking method provided by the embodiment of the invention, two requirements pull in opposite directions: to prevent the tracking target from leaving the picture when the pan-tilt moves quickly, the target's share of the picture should not be too large; yet to ensure recognition accuracy and video clarity, the target should appear as large as possible, at the cost of being easier to lose. Determining the zoom of the local pan-tilt from the target's proportion of the current image frame balances these two requirements.
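The patent does not give concrete thresholds; one simple way to realize this balance is to keep the target's share of the frame inside a band, zooming in when it is too small and out when it is too large. The band limits below are illustrative assumptions.

```python
def zoom_step(target_area, frame_area, lo=0.05, hi=0.25):
    """Return +1 to zoom in, -1 to zoom out, or 0 to hold, keeping the
    target's share of the frame inside the band [lo, hi] (assumed values)."""
    ratio = target_area / frame_area
    if ratio < lo:
        return 1    # target too small: zoom in for recognition accuracy and clarity
    if ratio > hi:
        return -1   # target too large: zoom out so fast motion does not lose it
    return 0
```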
With reference to the third implementation manner of the first aspect, in a fourth implementation manner of the first aspect, the method further includes:
judging whether the zooming meets a preset condition or not;
when the zoom does not meet the preset condition, reporting the parameters of the local pan-tilt to a monitoring platform so that the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracking target, wherein the parameters of the pan-tilt comprise the angle and the zoom of the pan-tilt.
According to the target tracking method provided by the embodiment of the invention, when the zoom does not meet the preset condition, the tracking target has exceeded the monitoring capability of the local device, which no longer has sufficient performance to track it. The motion parameters of the local pan-tilt are then reported to the monitoring platform, which calls other monitoring devices to track the target in linkage, so that a large area can be fully covered by multiple monitoring devices working together.
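A minimal sketch of this handoff check follows. The zoom range, report payload shape, and `report` callback are assumptions; the patent only states that the pan-tilt's angle and zoom are reported to the platform.

```python
def check_and_report(zoom, zoom_range, ptz, report):
    """If the required zoom falls outside the device's usable range, report
    the current pan-tilt parameters so the platform can hand off tracking.
    Returns True when a handoff was requested."""
    lo, hi = zoom_range
    if lo <= zoom <= hi:
        return False  # the local device can keep tracking on its own
    report({"pan": ptz["pan"], "tilt": ptz["tilt"], "zoom": zoom})
    return True
```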
With reference to the first aspect, in a fifth implementation manner of the first aspect, the controlling, according to the motion parameter of the local pan/tilt, the motion of the local pan/tilt to track the tracking target includes:
controlling the motion of the local holder by using the motion parameters;
and releasing the current image frame and acquiring the next image frame so as to track the tracking target.
According to the target tracking method provided by the embodiment of the invention, the storage space of local monitoring equipment can be saved by releasing the current image frame, and the tracking efficiency is improved.
With reference to the fifth implementation manner of the first aspect, in a sixth implementation manner of the first aspect, when the tracking mode currently enabled locally is alarm-linkage tracking, the controlling, according to the motion parameter of the local pan-tilt, the motion of the local pan-tilt to track the tracking target further includes:
performing alarm processing on the tracking target;
judging whether the tracking target meets the snapshot condition;
and when the tracked target meets the snapshot condition, snapshot is carried out on the tracked target, and the snapshot result is uploaded to a monitoring platform.
According to the target tracking method provided by the embodiment of the invention, when the currently enabled tracking mode is alarm-linkage tracking, alarm processing and subsequent snapshots are performed on the tracked target, so that intelligent linkage tracking can be achieved with a single monitoring device and the tracking result can meet users' needs.
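The patent leaves the snapshot condition unspecified. One plausible condition, shown purely as an assumed example, is that the target is large enough and close enough to the frame center for a clear capture; both thresholds below are invented for illustration.

```python
def should_snapshot(box, frame_w, frame_h, min_ratio=0.02, center_tol=0.15):
    """An assumed snapshot condition: the target box covers at least
    `min_ratio` of the frame and its center lies within `center_tol`
    of the frame center on both axes."""
    bw, bh = box[2] - box[0], box[3] - box[1]
    ratio = (bw * bh) / (frame_w * frame_h)
    bx = (box[0] + box[2]) / 2
    by = (box[1] + box[3]) / 2
    centered = (abs(bx - frame_w / 2) <= frame_w * center_tol and
                abs(by - frame_h / 2) <= frame_h * center_tol)
    return ratio >= min_ratio and centered
```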
With reference to the first aspect, in a seventh implementation manner of the first aspect, the method further includes:
when a tracking target does not exist in the current image frame, determining whether the tracking target is lost based on the tracking target in the previous image frame;
when the tracking target is lost, reporting the parameters of the local pan-tilt to a monitoring platform so that the monitoring platform calls other monitoring devices to perform linkage tracking on the tracking target, wherein the parameters of the pan-tilt comprise the angle and the zoom of the pan-tilt.
According to the target tracking method provided by the embodiment of the invention, when no tracking target exists in the current image frame, the parameters of the local pan-tilt are reported to the monitoring platform so that other monitoring devices can be called to track the target in linkage. This avoids false detection by a single monitoring device at a particular angle and improves the anti-interference capability of target tracking.
With reference to the first aspect, in an eighth implementation manner of the first aspect, the performing target recognition on the current image frame to determine a tracking target in the current image frame includes:
inputting the current image frame into at least one category detection model, and determining a detection result corresponding to each category detection model;
and determining a target class detection model based on the detection result corresponding to each class detection model so as to perform target identification on the subsequent image frame.
According to the target tracking method provided by the embodiment of the invention, multiple class detection models can run before tracking begins; the target class detection model is selected from their detection results, and only that model is enabled during subsequent tracking, which keeps target recognition fast. The tracking function is highly sensitive to the recognition time of a single image frame, so running only a single model during tracking markedly improves performance when local processing capacity is limited.
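This warm-up selection can be sketched as follows. The detector interface (each model returns a found flag and a confidence) is an assumption; the patent only says the target class model is chosen from the per-model detection results.

```python
def select_target_model(frame, models):
    """Run every class-detection model once on `frame` and keep the one
    with the highest-confidence hit; only that model runs on later frames.
    `models` maps a class name to a detector returning (found, confidence)."""
    best_name, best_conf = None, 0.0
    for name, detect in models.items():
        found, conf = detect(frame)
        if found and conf > best_conf:
            best_name, best_conf = name, conf
    return best_name  # None if no model detected anything
```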
According to a second aspect, an embodiment of the present invention further provides a target tracking apparatus, including:
the acquisition module is used for acquiring a current image frame and carrying out target identification on the current image frame;
a determining module, configured to determine, when a tracking target exists in the current image frame, a motion parameter of a local pan-tilt based on the position of the identified tracking target in the current image frame, where the motion parameter includes at least one of a motion direction, a motion speed, or a zoom;
and a control module, configured to control the motion of the local pan-tilt according to the motion parameters of the local pan-tilt so as to track the tracking target.
The target tracking apparatus provided by the embodiment of the invention recognizes and tracks the target within the local monitoring device, so a single pan-tilt monitoring device can perform target recognition and trajectory tracking over an area. This avoids tracking the target through linkage among multiple monitoring devices, and therefore avoids the coordinate calibration that multi-device linkage requires; directly controlling the local pan-tilt ensures the real-time performance and accuracy of tracking.
According to a third aspect, an embodiment of the present invention provides a monitoring device, including: a memory and a processor, the memory and the processor being communicatively connected to each other, the memory storing therein computer instructions, and the processor executing the computer instructions to perform the target tracking method according to the first aspect or any one of the embodiments of the first aspect.
According to a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to execute the target tracking method described in the first aspect or any one of the implementation manners of the first aspect.
According to a fifth aspect, an embodiment of the present invention further provides a target tracking system, including:
at least two monitoring devices according to the third aspect of the invention;
and a monitoring platform connected with the monitoring devices, configured to call other monitoring devices to perform linkage tracking on the tracking target based on the pan-tilt parameters of the monitoring device currently tracking the target.
According to the target tracking system provided by the embodiment of the invention, the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracked target by using the pan-tilt parameters of the current monitoring equipment, so that the false detection of a single monitoring equipment at a certain angle can be avoided, and the anti-interference performance of the target tracking system is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic diagram of a target tracking system according to an embodiment of the present invention;
FIG. 2 is a flow diagram of a target tracking method according to an embodiment of the invention;
FIG. 3 is a flow diagram of a target tracking method according to an embodiment of the invention;
FIG. 4 is a flow diagram of a target tracking method according to an embodiment of the invention;
FIG. 5 is a logical function diagram of a specific example according to an embodiment of the present invention;
FIG. 6 is a functional usage page diagram of target tracking according to an embodiment of the present invention;
FIG. 7 is a functional usage page diagram of target tracking according to an embodiment of the present invention;
FIG. 8 is a block diagram of a target tracking device according to an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of a monitoring device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a monitoring system, as shown in fig. 1, where the monitoring system includes a monitoring platform and at least two monitoring devices, and the monitoring platform is connected to each of the monitoring devices.
The monitoring device comprises an image sensor, a processor, and a pan-tilt. The image sensor collects images; the processor performs target recognition on the collected images and determines the motion parameters of the pan-tilt based on the recognition result so as to control its motion; and the pan-tilt drives the image sensor to move, so that the target is recognized and its trajectory is tracked.
The monitoring platform implements linkage tracking among the monitoring devices. For example, monitoring device 1 recognizes a tracking target, and its processor determines the motion parameters of its pan-tilt based on the position of the recognized target in the image; once the motion parameters are determined, the pan-tilt is driven to track the target. When the tracking target moves beyond the monitoring range of monitoring device 1, or monitoring device 1 no longer recognizes the target, monitoring device 1 uploads its current pan-tilt parameters to the monitoring platform. After receiving them, the monitoring platform may use the positional relationship of the monitoring devices to call another device to continue tracking the target.
As an optional implementation of this embodiment, if multiple monitoring devices are deployed in the environment, they may be coordinated and managed by a monitoring platform (connecting to the platform via GB28181 or SIP, or to an NVR via the ONVIF protocol) to form a network. The procedure may then be as follows:
(1) During tracking, if the current tracking target enters the defense area of another pan-tilt IPC, the current monitoring device reports its pan-tilt parameters (namely PTZ) and the target's features to the platform, and other pan-tilt cameras are scheduled for auxiliary recognition; this provides anti-interference and allows a lost target to be recovered in some scenes. (2) If the target moves beyond the current camera's monitoring range, another device capable of monitoring the target is scheduled in the same way, using the PTZ parameters and target features, to complete relay tracking.
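On the platform side, relay scheduling could be sketched as below. The coverage model (each device declares a pan-angle range it can see) is a deliberate simplification invented for this example; the patent only says the platform uses the devices' positional relationships and the reported PTZ.

```python
def schedule_relay(reported_ptz, devices):
    """Pick the first device whose declared pan coverage contains the
    reported pan angle; `devices` maps a device id to (pan_lo, pan_hi)
    in degrees. Returns None when no device can take over."""
    pan = reported_ptz["pan"] % 360
    for dev_id, (lo, hi) in devices.items():
        if lo <= pan <= hi:
            return dev_id
    return None
```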
The target tracking system described in this embodiment implements single-dome tracking with a single monitoring device: the components used for single-dome tracking can be fixed inside a single housing and used as a standard pan-tilt IPC monitoring device on its own, and a single device can serve as a tracking checkpoint in the manner described herein. This makes the system convenient to use, removes the coordinate calibration commonly required for multi-device linkage, and makes installation and deployment easier. If multiple devices are deployed in an area, tracking can be optimized and enhanced through coordination by the platform.
In accordance with an embodiment of the present invention, there is provided an object tracking method embodiment, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions and that, although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order different than here.
In this embodiment, a target tracking method is provided, which may be used in the monitoring device, and fig. 2 is a flowchart of the target tracking method according to the embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
and S11, acquiring the current image frame and carrying out target recognition on the current image frame.
The current image frame may be an original image frame acquired by the monitoring device in real time, or an image frame obtained by preprocessing the original frame. The preprocessing may include resolution conversion, format conversion, cropping of a region of interest, and so on. The specific preprocessing is not limited here and may be set according to actual conditions.
After acquiring the current image frame, the monitoring device performs target identification on the current image frame. For example, the detection model may be used to perform target recognition on the current image frame, and may also perform image processing on the current image frame to recognize a tracking target in the current image frame.
For example, if the object currently to be tracked is a person, the current image frame may be input to a pedestrian detection model; if the object currently to be tracked is a vehicle, the current image frame may be input to a vehicle detection model, and so on.
In this embodiment, the method for identifying the target is not limited at all, and only the monitoring device is required to identify the target.
And S12, judging whether the tracking target exists in the current image frame.
The monitoring device may perform target recognition on the current image frame, and the recognition result may be that a tracking target exists in the current image frame or that no tracking target exists in the current image frame.
When a tracking target exists in the current image frame, performing S13; otherwise, other operations are performed. Wherein, the other operations can be releasing the current image frame and waiting for the next image frame; or reporting the monitoring platform to enable the monitoring platform to call other monitoring equipment to perform linkage tracking on the tracked target.
And S13, determining the motion parameters of the local pan-tilt head based on the position of the tracking target in the current image frame.
Wherein the motion parameter comprises at least one of a motion direction, a motion speed or a zoom.
When a tracking target is present in the current image frame, the monitoring device may determine its position in the frame. That position depends on the relative position of the tracking target and the monitoring device, so to keep tracking the target, the relative position between the device and the target must be adjusted using the target's position in the current image frame. This adjustment is realized through the motion of the pan-tilt; it is therefore necessary to determine the motion parameters of the local pan-tilt.
To ensure the monitoring device can keep tracking the target, the target should ideally remain at the center of the images the device captures. The monitoring device therefore determines the motion parameters of the local pan-tilt from the tracking target's position in the current image frame.
This step will be described in detail below.
And S14, controlling the movement of the local holder according to the movement parameters of the local holder so as to track the tracking target.
After the monitoring device determines the motion parameters of the local pan-tilt, it controls the pan-tilt's motion so that the device's image sensor captures images facing the tracking target; clear images can then be acquired, which guarantees the accuracy of image recognition.
In some optional embodiments of this embodiment, the monitoring device may further perform processing such as capturing a snapshot and alerting on the tracking target. The specific setting can be performed according to the actual situation, and is not limited herein.
The target tracking method provided by this embodiment recognizes and tracks the target within the local monitoring device, so a single pan-tilt monitoring device can perform target recognition and trajectory tracking over an area. This avoids tracking the target through linkage among multiple monitoring devices, and therefore avoids the coordinate calibration that multi-device linkage requires; directly controlling the local pan-tilt ensures the real-time performance and accuracy of tracking.
In this embodiment, a target tracking method is provided, which may be used in the monitoring device, and fig. 3 is a flowchart of the target tracking method according to the embodiment of the present invention, as shown in fig. 3, where the flowchart includes the following steps:
and S21, acquiring the current image frame and carrying out target recognition on the current image frame.
Please refer to S11 in fig. 2 for details, which are not described herein.
And S22, judging whether the tracking target exists in the current image frame.
When a tracking target exists in the current image frame, performing S23; otherwise, other operations are performed. Wherein, the other operations can be releasing the current image frame and waiting for the next image frame; or reporting the monitoring platform to enable the monitoring platform to call other monitoring equipment to perform linkage tracking on the tracked target.
Please refer to S12 in fig. 2 for details, which are not described herein.
And S23, determining the motion parameters of the local pan-tilt head based on the position of the tracking target in the current image frame.
Wherein the motion parameter comprises at least one of a motion direction, a motion speed or a zoom.
Specifically, the step S23 includes the following steps:
s231, a center position of the current image frame is acquired.
After acquiring the current image frame, the monitoring device may determine the center position of the current image frame by using the size of the current image frame. Wherein the center position can be represented in the form of center point coordinates.
And S232, determining the movement direction and the movement speed based on the position relation between the tracking target and the central position.
After the monitoring apparatus identifies the tracking target in S21, the coordinate information of the tracking target may be determined, for example, the tracking target may be marked with the detection frame, and the position information of the tracking target may be indicated by using the coordinate information of the detection frame.
After determining the position information of the tracking target and the center position of the current image frame, the monitoring device can determine the moving direction and the moving speed based on the two positions. Wherein, the motion direction can be divided into a horizontal direction and a vertical direction; the determination of the movement speed also needs to be made in connection with the position of the tracking object in the previous image frame.
In an optional implementation manner of this embodiment, the step S232 includes the following steps:
(1) Determining the movement direction by using the angle of the tracking target relative to the central position.
After determining the position information of the tracking target in the current image frame and the center position of the current image frame, the monitoring device may determine the angle of the tracking target relative to the center position.
After the angle of the tracking target relative to the central position is determined, the horizontal angle and the vertical angle can be determined, so that the movement direction of the monitoring equipment can be obtained.
(2) Acquiring the position deviation of the tracking target between the current image frame and the previous image frame.
The monitoring device extracts the position information of the tracking target in the previous image frame. For ease of description, the position information in the previous image frame is referred to as first position information, and the position information in the current image frame is referred to as second position information.
By comparing the first position information with the second position information, the monitoring device can determine the position deviation between them. The position deviation is the motion displacement of the tracking target.
It should be noted that the previous image frame is an image of a previous frame adjacent to the current image frame, and the next image frame described below is an image of a next frame adjacent to the current image frame.
(3) Determining the movement speed by using the angle of the tracking target relative to the central position and the position deviation.
The farther the tracking target is from the center of the picture, the higher the required speed; likewise, the larger the coordinate change of the tracking target between adjacent frames, the higher the required speed. Combining the two when determining the movement speed therefore ensures the accuracy of the determined movement speed.
Specifically, corresponding weights, for example, a first weight and a second weight, respectively, may be set for the angle and the positional deviation of the tracking target with respect to the center position. And determining the movement speed by utilizing the angle of the tracking target relative to the central position, the position deviation and the corresponding weight thereof.
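The calculation of S231–S232 described above can be sketched as follows. This is an illustrative sketch only: the weight values, the normalization by frame size, and the coordinate conventions are assumptions, not values given in this disclosure.

```python
import numpy as np

def motion_direction_and_speed(box, prev_box, frame_size, w_angle=0.6, w_disp=0.4):
    """Sketch of S232: derive pan-tilt direction and speed from the target's
    offset from the frame center and its inter-frame displacement.
    box / prev_box: (x, y, w, h) detection frames; frame_size: (width, height).
    The weights w_angle / w_disp (first and second weight) are illustrative."""
    fw, fh = frame_size
    cx, cy = fw / 2.0, fh / 2.0                              # frame center (S231)
    tx, ty = box[0] + box[2] / 2.0, box[1] + box[3] / 2.0    # target center

    dx, dy = tx - cx, ty - cy
    # Horizontal / vertical components of the movement direction.
    direction = ("right" if dx > 0 else "left", "down" if dy > 0 else "up")
    angle_offset = np.hypot(dx / fw, dy / fh)    # normalized distance from center

    # Inter-frame displacement of the target (normalized).
    px, py = prev_box[0] + prev_box[2] / 2.0, prev_box[1] + prev_box[3] / 2.0
    displacement = np.hypot((tx - px) / fw, (ty - py) / fh)

    # Weighted combination: farther from center and larger displacement -> faster.
    speed = w_angle * angle_offset + w_disp * displacement
    return direction, speed
```

A target right-of and below the center with a small rightward displacement yields a ("right", "down") direction and a positive speed; a target exactly at the center with no displacement yields zero speed.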
And S233, acquiring the proportion of the tracking target in the current image frame.
To ensure that the tracking target is not thrown out of the picture because the pan-tilt moves too fast, the proportion of the tracking target in the picture should be small; however, to ensure the recognition accuracy and video clarity of the tracking target, the target should be as large as possible, which makes it easier to lose. Determining the zoom of the local pan-tilt according to the proportion of the tracking target in the current image frame therefore balances these two requirements.
The ratio of the tracking target in the current image frame may be the ratio of the long edge of the detection frame corresponding to the tracking target to the long edge of the current image frame; or the area ratio of the tracking target to the current image frame, etc.
And S234, determining the zoom of the local pan-tilt based on the proportion.
After the monitoring device obtains the proportion of the tracking target in the current image frame, it can determine the zoom of the local pan-tilt from this proportion.
For example, the long side of the target may be set to occupy about 1/4 to 1/5 of the picture length, and the magnification to apply is calculated from the current target size so as to bring the target to this design value.
Alternatively, the zoom value relative to the current magnification is 0.25/(long side of the detection frame corresponding to the tracking target/long side of the resolution of the current image frame).
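The zoom formula above can be sketched directly; the function name and the use of the frame's long resolution side are the only assumptions, the 0.25 target fraction comes from the example in the text:

```python
def zoom_factor(box, frame_size, target_fraction=0.25):
    """Sketch of S233/S234: choose the zoom of the local pan-tilt so that the
    long side of the detection frame ends up near `target_fraction` (1/4 in the
    example above) of the frame's long side. Returns the multiplier to apply
    to the current magnification."""
    box_long = max(box[2], box[3])       # long side of the detection frame
    frame_long = max(frame_size)         # long side of the current resolution
    ratio = box_long / frame_long        # current proportion of the target (S233)
    return target_fraction / ratio       # > 1 zoom in, < 1 zoom out (S234)
```

A target occupying 1/8 of the picture length yields a 2x zoom-in; one occupying half the picture yields a 0.5x zoom-out.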
And S24, judging whether the zooming meets the preset condition.
The preset condition may be a preset threshold. According to the current magnification of the step zoom motor of the image sensor and the pixel size of the tracking target in the current image frame, if the target size falls below a certain value at high magnification (generally the minimum value that guarantees the recognition accuracy of the algorithm, determined during model training and written into the application as a fixed value), the image sensor no longer has enough performance to track the target, and the target is considered to be out of the target range.
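The check described above can be sketched as follows. The 95 % headroom margin and the 32-pixel minimum are illustrative assumptions; the text only states that a fixed minimum size is determined during model training.

```python
def target_trackable(current_zoom, max_zoom, target_pixels, min_pixels=32):
    """Sketch of the S24 check: if the target's pixel size is already below the
    minimum the recognition model was trained for (min_pixels, assumed here)
    while the zoom motor is near its maximum magnification, the device has no
    headroom left and the target is treated as out of the target range."""
    at_max_magnification = current_zoom >= max_zoom * 0.95   # near the motor limit
    return not (at_max_magnification and target_pixels < min_pixels)
```

When this returns False the device would report its pan-tilt parameters to the platform (S25); otherwise it keeps tracking locally (S26).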
When the zoom does not satisfy the preset condition, executing S25; otherwise, S26 is executed.
And S25, reporting the parameters of the local holder to the monitoring platform, so that the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracked target.
When the zoom does not meet the preset condition, the tracking target exceeds the monitoring range of the monitoring equipment at the moment, the monitoring equipment reports the parameters of the local pan-tilt to the monitoring platform, and the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracking target by using the parameters.
When the zoom does not meet the preset condition, the tracking target has exceeded the monitoring capability of the local monitoring device, which no longer has enough performance to track the target. The motion parameters of the local pan-tilt are therefore reported to the monitoring platform so that other monitoring devices can be called for linkage tracking of the target; in this way, multiple monitoring devices can be combined to fully cover a large area.
And S26, controlling the movement of the local holder according to the movement parameters of the local holder so as to track the tracking target.
When the zoom meets the preset condition, the current tracking target is still within the target range of the monitoring device, which can continue to track it.
For the rest, please refer to S14 in the embodiment shown in fig. 2, which is not described herein again.
According to the target tracking method provided by the embodiment, the movement direction and the movement speed are determined by utilizing the position relationship between the tracking target and the center position of the current image frame, so that the tracking target can be positioned at the center position of the image frame in the subsequent tracking process, and the tracking accuracy and reliability are improved.
In this embodiment, a target tracking method is provided, which may be used in the monitoring device, and fig. 4 is a flowchart of the target tracking method according to the embodiment of the present invention, as shown in fig. 4, where the flowchart includes the following steps:
And S31, acquiring the current image frame and carrying out target recognition on the current image frame.
Specifically, the step S31 includes the following steps:
And S311, acquiring a current image frame.
An Image Signal Processing (ISP) module and a media control module in the monitoring device process the raw image from the image sensor. For example, on the device type used here (the design is based on a Hi3519A computing chip and a matching development board), the raw image file of the image sensor is acquired with the HiSilicon SDK; the ISP combines image parameters such as brightness, contrast, saturation, sharpness and shutter gain, applies noise reduction, anti-shake and other processing, and the media control module stores the result as frame data. It should be noted that frame data here refers to the picture data of each frame, not the same-named concept in the data link layer of network protocols; continuously played frame data forms a video, and the number of frames per second is the frame rate.
Before an original image is acquired, the lens of each monitoring device needs to be initialized, for example by zooming and focusing: after the motor is powered on, it automatically travels from the lowest magnification to the highest, finds the optocoupler stop during this travel, and initializes the motor coordinates with it as the center.
After initialization is complete, the PTZ position, i.e., Pan/Tilt/Zoom, initially tracked by the monitoring device is recorded as the horizontal angle, the vertical angle, and the Zoom, respectively. The focusing sensitivity of the camera stepping motor is increased and is set to be continuous automatic focusing, so that clear images can be obtained in the tracking process, and the tracking is relatively smooth and coherent. After the above steps are completed, the current image frame can be acquired.
S312, inputting the current image frame into at least one category detection model, and determining a detection result corresponding to each category detection model.
The class detection model is a model corresponding to each detection class, and may be, for example, a pedestrian detection model, a vehicle detection model, or the like.
After the monitoring equipment acquires the current image frame, the current image frame is input into at least one category detection model, so that detection results corresponding to the category detection models are obtained. For example, the current image frame is input into a pedestrian detection model and a vehicle detection model to determine which model is selected for recognition.
And S313, determining a target type detection model based on the detection result corresponding to each type detection model so as to perform target identification on the subsequent image frame.
Before tracking begins, both the pedestrian detection model and the vehicle detection model run, and whether the target is a person or a vehicle is determined against the result threshold; in the subsequent tracking process only the selected model is enabled, which guarantees the recognition speed. The tracking function is very sensitive to the recognition time of a single image frame, and under a given processor performance, running only a single model during tracking effectively improves the effect.
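The model-selection step of S312/S313 can be sketched as follows. The callable interface, the 0.5 threshold, and the model names are assumptions standing in for the category detection models described above:

```python
def select_category_model(frame, models, threshold=0.5):
    """Sketch of S312/S313: run every category detection model once, then keep
    only the model whose best detection clears the threshold, so subsequent
    frames pay for a single model. `models` maps a category name (e.g.
    "pedestrian", "vehicle") to a callable returning (score, box) pairs."""
    best_name, best_score = None, threshold
    for name, model in models.items():
        detections = model(frame)
        score = max((s for s, _ in detections), default=0.0)
        if score > best_score:
            best_name, best_score = name, score
    return best_name                 # None -> no tracking target in the frame
```

During tracking, only the returned model would be invoked on each subsequent frame.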
In some optional implementations of this embodiment, to ensure that the recovered or relayed target is the same tracking target, a feature integration algorithm is used in this embodiment for interference/loss coordination and relay tracking. Specifically, during recognition, the whole image is regarded as the main background coordinate system, the coordinates of each low-level feature are recorded, and adjacent features are combined through relational coding into a local feature map, forming a temporary object. The low-level features are bound to the high-level information, and joint search over this information ensures a comprehensive understanding of the underlying image information; a recognition network is then formed through subsequent learning and training.
Then, a framework is established and a calculation model is set up. The image is divided into square grids of equal size, and the texture-primitive features of each pixel are calculated with a texture-layout filter. A "feature bag" is created, in which each grid, i.e. each local element, is represented by a single visible pass. Enlarging the grid size yields the regional features, which are finally normalized into vectors. The vectors are classified and aggregated, and each image is assigned a cluster-center index; this index is the global feature of the image.
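The grid -> local feature -> cluster index -> global vector pipeline above can be illustrated with a deliberately simplified sketch. The (mean, std) cell descriptor and the tiny k-means are stand-ins for the texture-layout filters and the aggregation described in the text, not the actual method:

```python
import numpy as np

def grid_global_feature(image, grid=8, k=4, iters=10, seed=0):
    """Simplified sketch of the "feature bag": split the image into equal
    square grids, describe each cell by (mean, std) as a stand-in for
    texture-primitive features, cluster the cell descriptors with plain
    Lloyd k-means, and return the normalized histogram of cluster indices
    as the image's global feature vector."""
    h, w = image.shape
    ch, cw = h // grid, w // grid
    cells = np.array([
        [image[i*ch:(i+1)*ch, j*cw:(j+1)*cw].mean(),
         image[i*ch:(i+1)*ch, j*cw:(j+1)*cw].std()]
        for i in range(grid) for j in range(grid)
    ])
    rng = np.random.default_rng(seed)
    centers = cells[rng.choice(len(cells), size=k, replace=False)]
    for _ in range(iters):                    # plain Lloyd iterations
        labels = np.argmin(
            np.linalg.norm(cells[:, None] - centers[None], axis=2), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = cells[labels == c].mean(axis=0)
    hist = np.bincount(labels, minlength=k).astype(float)
    return hist / hist.sum()                  # normalized global feature
```

The returned vector is a non-negative distribution over cluster centers, analogous to the cluster-center index used as the global feature above.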
The multi-scale features are associated by a relational coding scheme. On the one hand, a longer coding scheme would manage more image information; on the other hand, it increases the computational cost. After balancing the two, an association function is introduced. Let o be a bottom-layer local feature, s_t the state at time t, and l the target; then:
f_i(o, s_i) = x_i(o, t)(s_i = l) (1)
where s_i = l denotes that the current state s_i is associated with the object l, and x_i(o, t) is a logical function that determines whether a particular set of low-level features exists. The related process of the function is shown in fig. 5.
There is also an interaction function representing the transition between objects, defined as follows:
g_j(s_{t-1}, s_t) = (s_{t-1} = l')(s_t = l) (2)
The interaction function represents the transition of the object label from the previous state to the current state, i.e., the transition scheme of the high-level information. Similar to the association function, the information-transfer scheme also uses a two-dimensional structure to encode horizontal and vertical relationship information. For the inference process, object classes are inferred from low-level features, so a specific feature sequence x(o, t) is used as the relational coding scheme for inference.
The probability distribution of the relation functions should contain the largest amount of information, i.e., the desired distribution p should be as general as possible; accordingly, expectation constraints are imposed on the characteristic functions:
E(f_i(o, s_i)) = α_i (3)
E(g_j(s_{t-1}, s_t)) = β_j (4)
Combining (1) and (2), and assuming p_x is an element of a finite-length vector P, the objective can be formalized as a convex optimization problem under linear constraints; the equations are listed, converted into a Lagrangian, and solved:
p(s | o) = (1/Z(o)) exp( Σ_i λ_i f_i(o, s_i) + Σ_j μ_j g_j(s_{t-1}, s_t) )
In the deep-learning stage, a posterior discriminant model is adopted and the weight parameters of the relation functions are estimated from K training image samples. To avoid overfitting, the probabilities are penalized with a Gaussian prior on the weights; the gradient of the log-likelihood is fed into the L-BFGS algorithm for iteration, and the recognition network is built from the weighted relation functions. Assuming the |L| object classes have been learned, the |L| × |L| information-transfer matrix can be constructed by summing the weighted interaction functions g_j(s_{t-1}, s_t) over the learning samples that satisfy s_{t-1} = l' and s_t = l.
And S32, judging whether the tracking target exists in the current image frame.
When a tracking target exists in the current image frame, S33 is performed; otherwise, S35 is performed.
And S33, determining the motion parameters of the local pan-tilt head based on the position of the tracking target in the current image frame.
Wherein the motion parameter comprises at least one of a motion direction, a motion speed or a zoom.
Please refer to S23 in fig. 3 for details, which are not described herein.
And S34, controlling the movement of the local holder according to the movement parameters of the local holder so as to track the tracking target.
It should be noted that, before the tracking starts, the tracking mode of the monitoring device being turned on needs to be determined. The tracking mode can be automatic tracking or alarm linkage tracking. The automatic tracking and the alarm linkage tracking are different in that after the alarm linkage tracking identifies the tracking target, the tracking target needs to be subjected to alarm processing, and when the tracking target meets the snapshot condition, the tracking target is subjected to snapshot and the like.
For example, fig. 6 shows a page schematic diagram for the function of target tracking, in which a user may configure a tracking manner and other parameters, and may specifically configure the tracking manner and other parameters according to an actual situation, which is not limited herein.
Specifically, the step S34 includes the following steps:
And S341, controlling the motion of the local holder by using the motion parameters.
After determining the motion parameters of the pan/tilt head in S33, the monitoring device controls the motion of the local pan/tilt head.
And S342, releasing the current image frame and acquiring the next image frame so as to track the tracking target.
The monitoring equipment releases the current image frame from the storage space and waits for acquiring the next image frame so as to track the tracking target. After the above processes are processed, the loop detection can be started to form a complete tracking process.
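The complete loop of fig. 4 (S31–S36) can be sketched as follows. All five callables are assumptions standing in for the device interfaces described in the text, not an actual device API:

```python
def tracking_loop(camera, detect, compute_motion, move_ptz, report_loss):
    """Sketch of the overall flow: acquire a frame, detect, then either drive
    the local pan-tilt or decide the target was lost and hand over to the
    monitoring platform."""
    had_target = False
    while True:
        frame = camera.next_frame()               # S31: acquire current frame
        if frame is None:                         # stream ended (for this sketch)
            break
        target = detect(frame)                    # S32: tracking target present?
        if target is not None:
            params = compute_motion(frame, target)  # S33: direction/speed/zoom
            move_ptz(params)                      # S34/S341: drive local pan-tilt
            had_target = True
        else:
            if had_target:                        # S35: present before -> lost
                report_loss()                     # S36: report PTZ params to platform
            had_target = False
        camera.release(frame)                     # S342: release, await next frame
```

Run against a short simulated stream, the loop drives the pan-tilt while the target is visible, reports a single loss when it disappears, and releases every frame.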
As an optional implementation manner of this embodiment, when the tracking manner of the monitoring device is alarm linkage tracking, the step S34 further includes the following steps:
(1) Performing alarm processing on the tracking target.
In the alarm linkage tracking mode, after the monitoring device identifies the tracking target, alarm processing is performed, such as alarm, subtitle addition, video linkage, platform reporting, and the like. And corresponding treatment can be specifically carried out according to requirements.
(2) Judging whether the tracking target meets the snapshot condition.
The snapshot condition may be, for example, that the zoom value has been detected several times before the snapshot is taken; this improves the quality of the snapshot image and avoids unnecessary snapshots.
When the tracking target meets the snapshot condition, executing (3); otherwise, S342 is executed.
For example, fig. 7 shows a page schematic diagram for the target tracking function, in which the user can set the linkage manner, the arming time, and so on. The corresponding settings can be made according to the actual situation.
(3) Snapping the tracked target, and uploading the snapshot result to the monitoring platform.
And the monitoring equipment takes a snapshot of the tracked target and uploads the snapshot result to the monitoring platform.
S35, it is determined whether the tracking target is lost based on the tracking target in the previous image frame.
If no tracking target exists in the current image frame, the monitoring device extracts the recognition result of the previous image frame. If a tracking target exists in the previous image frame but not in the current image frame, the tracking target can be determined to be lost; if no tracking target exists in either the previous or the current image frame, the monitoring device is most likely not in a tracking process. When the tracking target is lost, S36 is executed; otherwise, the current image frame is released.
And S36, reporting the parameters of the local holder to the monitoring platform, so that the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracked target.
And the parameters of the current holder comprise the angle and the zooming of the holder.
After the monitoring equipment determines that the tracking target is lost, reporting the parameters of the local pan-tilt to the monitoring platform, and calling other monitoring equipment by the monitoring platform by using the parameters of the pan-tilt of the monitoring equipment to perform linkage tracking on the tracking target.
According to the target tracking method provided by the embodiment, when the tracking mode started at present is alarm linkage tracking, alarm processing and subsequent snapshot are performed on the tracked target, so that intelligent linkage tracking can be realized by using a single monitoring device, and the tracking result can meet the user requirement; when no tracking target exists in the current image frame, the parameters of the local pan-tilt are reported to the monitoring platform so as to call other monitoring equipment to perform linkage tracking on the tracking target, thereby avoiding the false detection of a single monitoring equipment at a certain angle and improving the anti-interference performance of target tracking.
In this embodiment, a target tracking apparatus is further provided, and the apparatus is used to implement the foregoing embodiments and preferred embodiments, and the description of which has been already made is omitted. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
The present embodiment provides a target tracking apparatus, as shown in fig. 8, including:
an obtaining module 41, configured to obtain a current image frame and perform target identification on the current image frame;
a determining module 42, configured to determine, when a tracking target exists in the current image frame, a motion parameter of the local pan/tilt head based on a position of the identified tracking target in the current image frame, where the motion parameter includes at least one of a motion direction, a speed, or a zoom ratio;
and the control module 43 is configured to control the motion of the local pan/tilt according to the motion parameter of the local pan/tilt, so as to track the tracking target.
The target tracking device provided by the embodiment identifies and tracks the target in local monitoring equipment, so that the target identification and track tracking can be realized in a certain area by utilizing the independent holder monitoring equipment, the target tracking by adopting linkage among a plurality of monitoring equipment is avoided, the necessary coordinate calibration process of linkage of a plurality of equipment can be avoided, and the real-time performance and the accuracy of tracking can be ensured by directly controlling the local holder.
The target tracking apparatus in this embodiment is presented in the form of a functional unit, where the unit refers to an ASIC circuit, a processor and memory executing one or more software or fixed programs, and/or other devices that may provide the above-described functionality.
Further functional descriptions of the modules are the same as those of the corresponding embodiments, and are not repeated herein.
An embodiment of the present invention further provides a monitoring device, which has the target tracking apparatus shown in fig. 8.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a monitoring device according to an alternative embodiment of the present invention, and as shown in fig. 9, the monitoring device may include: at least one processor 51, such as a CPU (Central Processing Unit), at least one communication interface 53, memory 54, at least one communication bus 52. Wherein a communication bus 52 is used to enable the connection communication between these components. The communication interface 53 may include a Display (Display) and a Keyboard (Keyboard), and the optional communication interface 53 may also include a standard wired interface and a standard wireless interface. The Memory 54 may be a high-speed RAM Memory (volatile Random Access Memory) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. The memory 54 may alternatively be at least one memory device located remotely from the processor 51. Wherein the processor 51 may be in connection with the apparatus described in fig. 8, the memory 54 stores an application program, and the processor 51 calls the program code stored in the memory 54 for performing any of the above-mentioned method steps.
The communication bus 52 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 52 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 9, but this does not indicate only one bus or one type of bus.
The memory 54 may include a volatile memory, such as a random-access memory (RAM); it may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 54 may also comprise a combination of the above types of memories.
The processor 51 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of a CPU and an NP.
The processor 51 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
Optionally, the memory 54 is also used to store program instructions. The processor 51 may call program instructions to implement the target tracking method as shown in the embodiments of fig. 2 to 4 of the present application.
An embodiment of the present invention further provides a non-transitory computer storage medium, where computer-executable instructions are stored; the computer-executable instructions can execute the target tracking method in any of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above types of memories.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (12)

1. A target tracking method, comprising:
acquiring a current image frame, and performing target identification on the current image frame;
when a tracking target exists in the current image frame, determining a motion parameter of a local pan/tilt head based on the position of the tracking target in the current image frame, wherein the motion parameter comprises at least one of a motion direction, a motion speed or zooming;
and controlling the motion of the local holder according to the motion parameters of the local holder so as to track the tracking target.
2. The method of claim 1, wherein determining the motion parameter of the local pan/tilt head based on the position of the tracking target in the current image frame comprises:
acquiring the central position of a current image frame;
and determining the movement direction and the movement speed based on the position relation between the tracking target and the central position.
3. The method according to claim 2, wherein the determining the moving direction and the moving speed based on the positional relationship between the tracking target and the center position comprises:
determining the movement direction by using the angle of the tracking target relative to the central position;
acquiring the position deviation of the tracking target in the current image frame and the previous image frame;
and determining the movement speed by using the angle of the tracking target relative to the central position and the position deviation.
4. The method according to any one of claims 1-3, wherein said determining a motion parameter of a local pan/tilt head based on the position of said tracking target in said current image frame comprises:
acquiring the proportion of the tracking target in the current image frame;
determining the zoom of the local pan/tilt head based on the ratio.
5. The method of claim 4, further comprising:
judging whether the zooming meets a preset condition or not;
when the zoom does not meet the preset condition, reporting the parameters of the local pan-tilt to a monitoring platform so that the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracking target, wherein the parameters of the pan-tilt comprise the angle and the zoom of the pan-tilt.
6. The method according to claim 1, wherein the controlling the motion of the local pan/tilt head according to the motion parameter of the local pan/tilt head to track the tracking target comprises:
controlling the motion of the local holder by using the motion parameters;
and releasing the current image frame and acquiring the next image frame so as to track the tracking target.
7. The method according to claim 6, wherein, when the tracking mode currently turned on locally is alarm linkage tracking, the controlling the motion of the local pan/tilt head according to the motion parameters of the local pan/tilt head to track the tracking target further comprises:
performing alarm processing on the tracking target;
judging whether the tracking target meets the snapshot condition;
and when the tracked target meets the snapshot condition, snapshot is carried out on the tracked target, and the snapshot result is uploaded to a monitoring platform.
8. The method of claim 1, further comprising:
when a tracking target does not exist in the current image frame, determining whether the tracking target is lost based on the tracking target in the previous image frame;
when the tracking target is lost, reporting the parameters of the local cradle head to a monitoring platform so that the monitoring platform calls other monitoring equipment to perform linkage tracking on the tracking target, wherein the parameters of the cradle head comprise the angle and the zooming of the cradle head.
9. The method of claim 1, wherein the performing target recognition on the current image frame to determine a tracking target in the image to be processed comprises:
inputting the current image frame into at least one category detection model, and determining a detection result corresponding to each category detection model;
and determining a target class detection model based on the detection result corresponding to each class detection model so as to perform target identification on the subsequent image frame.
10. A monitoring device, comprising:
a memory and a processor communicatively coupled to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the object tracking method of any one of claims 1-9.
11. A computer-readable storage medium storing computer instructions for causing a computer to perform the object tracking method of any one of claims 1-9.
12. An object tracking system, comprising:
at least two monitoring devices as claimed in claim 10;
and the monitoring platform is connected with the monitoring equipment and used for calling other monitoring equipment to perform linkage tracking on the tracking target based on the holder parameters of the current monitoring equipment for tracking the tracking target.
CN202011111982.3A 2020-10-16 2020-10-16 Target tracking method, monitoring device, storage medium and system Active CN112055158B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011111982.3A CN112055158B (en) 2020-10-16 2020-10-16 Target tracking method, monitoring device, storage medium and system

Publications (2)

Publication Number Publication Date
CN112055158A (en) 2020-12-08
CN112055158B (en) 2022-02-22

Family

ID=73605296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011111982.3A Active CN112055158B (en) 2020-10-16 2020-10-16 Target tracking method, monitoring device, storage medium and system

Country Status (1)

Country Link
CN (1) CN112055158B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102143324A (en) * 2011-04-07 2011-08-03 天津市亚安科技电子有限公司 Method for automatically and smoothly tracking target by cradle head
KR20120050546A (en) * 2010-11-11 2012-05-21 주식회사 비츠로시스 System for controlling remotely device using pan tilt zoom camera
CN103607569A (en) * 2013-11-22 2014-02-26 广东威创视讯科技股份有限公司 Method and system for tracking monitored target in process of video monitoring
CN104159016A (en) * 2013-05-13 2014-11-19 浙江大华技术股份有限公司 Cradle head control system, method and device
US20150350606A1 (en) * 2014-05-29 2015-12-03 Abdullah I. Khanfor Automatic object tracking camera
CN205827430U (en) * 2016-04-19 2016-12-21 深圳正谱云教育技术有限公司 Automatic camera tracking system based on single-lens dynamic image recognition
CN206195969U (en) * 2016-12-06 2017-05-24 天津银箭科技有限公司 Integrated multi-camera real-time intelligent linkage tracking system
CN111093050A (en) * 2018-10-19 2020-05-01 浙江宇视科技有限公司 Target monitoring method and device
CN111314609A (en) * 2020-02-24 2020-06-19 浙江大华技术股份有限公司 Method and device for controlling pan-tilt tracking camera shooting

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651994A (en) * 2020-12-18 2021-04-13 零八一电子集团有限公司 Ground multi-target tracking method
CN112472138B (en) * 2020-12-18 2023-01-31 深圳市德力凯医疗设备股份有限公司 Ultrasonic tracking method, system, storage medium and ultrasonic equipment
CN112472138A (en) * 2020-12-18 2021-03-12 深圳市德力凯医疗设备股份有限公司 Ultrasonic tracking method, system, storage medium and ultrasonic equipment
CN114726967A (en) * 2020-12-21 2022-07-08 深圳市迪威泰实业有限公司 Binocular face recognition intelligent AI camera
CN114982217A (en) * 2020-12-30 2022-08-30 深圳市大疆创新科技有限公司 Control method and device of holder, movable platform and storage medium
WO2022141197A1 (en) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Method and device for controlling cradle head, movable platform and storage medium
CN112884809A (en) * 2021-02-26 2021-06-01 北京市商汤科技开发有限公司 Target tracking method and device, electronic equipment and storage medium
CN113114939B (en) * 2021-04-12 2022-07-12 南京博蓝奇智能科技有限公司 Target tracking method and system and electronic equipment
CN113114939A (en) * 2021-04-12 2021-07-13 南京博蓝奇智能科技有限公司 Target tracking method and system and electronic equipment
CN113296546A (en) * 2021-04-22 2021-08-24 杭州晟冠科技有限公司 Compensation method for positioning error of ship linkage tracking
CN113591651A (en) * 2021-07-22 2021-11-02 浙江大华技术股份有限公司 Image capturing method, image display method and device and storage medium
CN116744102A (en) * 2023-06-19 2023-09-12 北京拙河科技有限公司 Ball machine tracking method and device based on feedback adjustment
CN116744102B (en) * 2023-06-19 2024-03-12 北京拙河科技有限公司 Ball machine tracking method and device based on feedback adjustment

Also Published As

Publication number Publication date
CN112055158B (en) 2022-02-22

Similar Documents

Publication Publication Date Title
CN112055158B (en) Target tracking method, monitoring device, storage medium and system
US11847826B2 (en) System and method for providing dominant scene classification by semantic segmentation
US11809998B2 (en) Maintaining fixed sizes for target objects in frames
EP2549738B1 (en) Method and camera for determining an image adjustment parameter
CN107408303A (en) System and method for Object tracking
KR101530255B1 (en) Cctv system having auto tracking function of moving target
KR20010085779A (en) Visual device
JP2007074143A (en) Imaging device and imaging system
CN112602319B (en) Focusing device, method and related equipment
KR20140095333A (en) Method and apparratus of tracing object on image
Fawzi et al. Embedded real-time video surveillance system based on multi-sensor and visual tracking
CN114140745A (en) Method, system, device and medium for detecting personnel attributes of construction site
Wang et al. Object counting in video surveillance using multi-scale density map regression
CN115103120A (en) Shooting scene detection method and device, electronic equipment and storage medium
JP2020021368A (en) Image analysis system, image analysis method and image analysis program
US20230334774A1 (en) Site model updating method and system
US20230419505A1 (en) Automatic exposure metering for regions of interest that tracks moving subjects using artificial intelligence
US11790483B2 (en) Method, apparatus, and device for identifying human body and computer readable storage medium
JP6875646B2 (en) Image processing device and image processing program
CN117280708A (en) Shutter value adjustment of monitoring camera using AI-based object recognition
JP2005347942A (en) Monitor system, control program of monitor system and monitoring device
CN113691731B (en) Processing method and device and electronic equipment
CN114245023B (en) Focusing processing method and device, camera device and storage medium
US12033082B2 (en) Maintaining fixed sizes for target objects in frames
JP2024089489A (en) Information processing device, control method for information processing device, program, and information processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant