CN118172548A - Target detection tracking identification method and system based on multi-data fusion - Google Patents

Target detection tracking identification method and system based on multi-data fusion Download PDF

Info

Publication number
CN118172548A
CN118172548A CN202410606505.6A CN202410606505A CN118172548A CN 118172548 A CN118172548 A CN 118172548A CN 202410606505 A CN202410606505 A CN 202410606505A CN 118172548 A CN118172548 A CN 118172548A
Authority
CN
China
Prior art keywords
target
infrared
radar
processing data
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410606505.6A
Other languages
Chinese (zh)
Other versions
CN118172548B (en
Inventor
段卓镭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Institute of Technology
Original Assignee
Nanchang Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Institute of Technology filed Critical Nanchang Institute of Technology
Priority to CN202410606505.6A priority Critical patent/CN118172548B/en
Publication of CN118172548A publication Critical patent/CN118172548A/en
Application granted granted Critical
Publication of CN118172548B publication Critical patent/CN118172548B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Burglar Alarm Systems (AREA)

Abstract

The embodiment of the invention relates to the technical field of target detection, and particularly discloses a target detection tracking identification method and system based on multi-data fusion. According to the embodiment of the invention, the infrared processing data and the radar processing data are generated by carrying out real-time infrared monitoring and radar detection; judging whether a suspicious object to be detected exists or not; when a suspicious object to be detected exists, tracking shooting is carried out, and shooting processing data are obtained; judging whether a target to be checked is confirmed or not; when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out. The method can perform real-time infrared monitoring, radar detection and target detection, perform tracking shooting and confirmation recognition when a suspicious target to be detected is provided, and perform comprehensive action abnormality judgment and processing when the suspicious target to be detected is provided, so that the target detection precision is effectively improved, and the faults of target detection, tracking and recognition are avoided.

Description

Target detection tracking identification method and system based on multi-data fusion
Technical Field
The invention belongs to the technical field of target detection, and particularly relates to a target detection tracking identification method and system based on multi-data fusion.
Background
Target detection is an important technology in the field of computer vision, and is related technology of detecting the position and the size of a target object in an image in a given image or video, classifying or identifying the target object and the like. The basic principle of the object detection algorithm is to identify and locate an object by extracting features from image or video data and applying classifiers and frame regressors to the extracted features.
The target detection has wide application in the fields of artificial intelligence, automatic driving, security monitoring, image retrieval and the like. In the prior art, the mode of target detection is single, and the detection, tracking and identification of the target are usually carried out only through shooting and analysis. However, in a complex target detection environment, such a single target detection manner is easily affected by the environment, so that the target detection accuracy is not high, and the target detection tracking identification is easy to be in error.
Disclosure of Invention
The embodiment of the invention aims to provide a target detection tracking identification method and system based on multi-data fusion, and aims to solve the problems in the background technology.
In order to achieve the above object, the embodiment of the present invention provides the following technical solutions:
the target detection tracking identification method based on the multi-data fusion specifically comprises the following steps:
Performing real-time infrared monitoring and radar detection to obtain infrared monitoring data and radar detection data, and performing data preprocessing to generate infrared processing data and radar processing data;
performing target detection on the infrared processing data and the radar processing data, and judging whether a suspicious target to be detected exists or not;
When a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and carrying out tracking shooting and preprocessing to acquire shooting processing data;
According to the shooting processing data, confirming and identifying the suspicious object to be detected, and judging whether the suspicious object to be detected exists or not;
When the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out.
As a further limitation of the technical scheme of the embodiment of the present invention, the performing real-time infrared monitoring and radar detection to obtain infrared monitoring data and radar detection data, and performing data preprocessing to generate infrared processing data and radar processing data specifically includes the following steps:
Performing infrared monitoring to acquire infrared monitoring data in real time;
performing data preprocessing of temperature correction, background subtraction and threshold background separation on the infrared monitoring data to generate infrared processing data;
Radar detection is carried out, and radar detection data are obtained in real time;
and carrying out coordinate conversion, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data to generate radar processing data.
As a further limitation of the technical solution of the embodiment of the present invention, the target detection for the infrared processing data and the radar processing data, and the judging whether there is a suspicious target to be detected specifically includes the following steps:
performing target detection on the infrared processing data according to a preset infrared detection model to generate an infrared detection result;
Performing target detection on the radar processing data according to a preset radar detection model to generate a radar detection result;
comprehensively analyzing the infrared detection result and the radar detection result to determine the detection confidence;
and comparing the detection confidence with a preset standard confidence, and judging whether a suspicious object to be detected exists.
As a further limitation of the technical solution of the embodiment of the present invention, when a suspicious target to be detected is present, determining a suspicious target azimuth according to the infrared processing data and the radar processing data, and performing tracking shooting and preprocessing, and acquiring shooting processing data specifically includes the following steps:
When a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data;
determining a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
tracking shooting is carried out according to the target shooting angle, and tracking shooting data are obtained;
and carrying out data preprocessing of gray conversion, histogram equalization and image noise reduction on the tracking shooting data to generate shooting processing data.
As a further limitation of the technical solution of the embodiment of the present invention, the identifying the suspicious object to be inspected according to the shooting processing data, and determining whether the suspicious object to be inspected has the confirmation object to be inspected specifically includes the following steps:
performing feature recognition on the shooting processing data to acquire a plurality of shooting recognition features;
Importing a plurality of standard identification features;
And comparing the shooting identification features with the standard identification features, and judging whether the target to be detected is confirmed.
As a further limitation of the technical solution of the embodiment of the present invention, when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists is determined, and when the action abnormality exists, the abnormal action alarm is performed, which specifically includes the following steps:
Creating an action prediction task when the target to be detected is confirmed;
according to the action prediction task, the infrared processing data, the radar processing data and the shooting processing data are synthesized, and the follow-up action direction is predicted;
Judging whether the follow-up action direction has abnormal action or not according to the follow-up action direction;
when there is an abnormal action, an abnormal action alarm signal is generated and an abnormal action alarm is performed.
The system comprises an infrared monitoring radar detection unit, a target detection judging unit, a tracking shooting processing unit, a target confirmation identifying unit and an action abnormality judging unit, wherein:
the infrared monitoring radar detection unit is used for carrying out real-time infrared monitoring and radar detection, acquiring infrared monitoring data and radar detection data, and carrying out data preprocessing to generate infrared processing data and radar processing data;
The target detection judging unit is used for carrying out target detection on the infrared processing data and the radar processing data and judging whether a suspicious target to be detected exists or not;
The tracking shooting processing unit is used for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target to be detected exists, tracking shooting and preprocessing are carried out, and shooting processing data are obtained;
The target confirmation and identification unit is used for confirming and identifying the suspicious target to be detected according to the shooting processing data and judging whether the suspicious target to be detected is available;
And the action abnormality judging unit is used for integrating the infrared processing data, the radar processing data and the shooting processing data when the target to be detected is confirmed, judging whether the action abnormality exists or not, and alarming abnormal action when the action abnormality exists.
As a further limitation of the technical solution of the embodiment of the present invention, the infrared monitoring radar detection unit specifically includes:
The infrared monitoring module is used for carrying out infrared monitoring and acquiring infrared monitoring data in real time;
the infrared preprocessing module is used for preprocessing the infrared monitoring data for temperature correction, background subtraction and threshold background separation to generate infrared processing data;
The radar detection module is used for carrying out radar detection and acquiring radar detection data in real time;
And the radar preprocessing module is used for carrying out coordinate conversion, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data to generate radar processing data.
As a further limitation of the technical solution of the embodiment of the present invention, the target detection and judgment unit specifically includes:
The infrared target detection module is used for carrying out target detection on the infrared processing data according to a preset infrared detection model to generate an infrared detection result;
The radar target detection module is used for carrying out target detection on the radar processing data according to a preset radar detection model to generate a radar detection result;
the detection confidence determining module is used for comprehensively analyzing the infrared detection result and the radar detection result and determining the detection confidence;
And the confidence coefficient comparison module is used for comparing the detection confidence coefficient with a preset standard confidence coefficient and judging whether a suspicious object to be detected exists.
As a further limitation of the technical solution of the embodiment of the present invention, the tracking shooting processing unit specifically includes:
The target azimuth determining module is used for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target to be detected is provided;
the shooting angle determining module is used for determining a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
The tracking shooting module is used for tracking shooting according to the target shooting angle to acquire tracking shooting data;
and the shooting preprocessing module is used for carrying out data preprocessing of gray conversion, histogram equalization and image noise reduction on the tracking shooting data to generate shooting processing data.
Compared with the prior art, the invention has the beneficial effects that:
According to the embodiment of the invention, the infrared processing data and the radar processing data are generated by carrying out real-time infrared monitoring and radar detection; judging whether a suspicious object to be detected exists or not; when a suspicious object to be detected exists, tracking shooting is carried out, and shooting processing data are obtained; judging whether a target to be checked is confirmed or not; when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out. The method can perform real-time infrared monitoring, radar detection and target detection, perform tracking shooting and confirmation recognition when a suspicious target to be detected is provided, and perform comprehensive action abnormality judgment and processing when the suspicious target to be detected is provided, so that the target detection precision is effectively improved, and the faults of target detection, tracking and recognition are avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the following description will briefly introduce the drawings that are needed in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the present invention.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Fig. 2 shows a flow chart of infrared monitoring and radar detection in a method provided by an embodiment of the invention.
Fig. 3 shows a flowchart of determining whether a suspicious object to be inspected exists in the method according to the embodiment of the present invention.
Fig. 4 shows a flowchart of tracking shooting and preprocessing in the method provided by the embodiment of the invention.
Fig. 5 shows a flowchart of determining whether there is a confirmation target to be inspected in the method provided by the embodiment of the invention.
FIG. 6 shows a flow chart of an abnormal action alarm in the method provided by the embodiment of the invention.
Fig. 7 shows an application architecture diagram of a system provided by an embodiment of the present invention.
Fig. 8 shows a block diagram of a system infrared monitoring radar detection unit according to an embodiment of the present invention.
Fig. 9 is a block diagram showing a structure of an object detection judgment unit in the system according to the embodiment of the present invention.
Fig. 10 shows a block diagram of a tracking shooting processing unit in the system according to the embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It is understood that object detection has wide application in the fields of artificial intelligence, autopilot, security monitoring, image retrieval, etc. In the prior art, the target detection mode is single, and the detection, tracking and identification of the target are usually carried out only through shooting and analysis, but in a complex target detection environment, the single target detection mode is easily influenced by the environment, so that the target detection precision is low, and the target detection, tracking and identification are easy to miss.
In order to solve the problems, the embodiment of the invention acquires infrared monitoring data and radar detection data by carrying out real-time infrared monitoring and radar detection, and carries out data preprocessing to generate infrared processing data and radar processing data; performing target detection on the infrared processing data and the radar processing data, and judging whether a suspicious target to be detected exists or not; when a suspicious target to be detected is provided, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and carrying out tracking shooting and preprocessing to obtain shooting processing data; according to the shooting processing data, confirming and identifying the suspicious object to be detected, and judging whether the suspicious object to be detected exists or not; when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out. The method can perform real-time infrared monitoring, radar detection and target detection, perform tracking shooting and confirmation recognition when a suspicious target to be detected is provided, and perform comprehensive action abnormality judgment and processing when the suspicious target to be detected is provided, so that the target detection precision is effectively improved, and the faults of target detection, tracking and recognition are avoided.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Specifically, the target detection tracking identification method based on multi-data fusion specifically comprises the following steps:
Step S101, real-time infrared monitoring and radar detection are carried out, infrared monitoring data and radar detection data are obtained, data preprocessing is carried out, and infrared processing data and radar processing data are generated.
In the embodiment of the invention, the infrared monitoring data is obtained by carrying out real-time infrared monitoring on the area needing target detection tracking identification, then the infrared monitoring data is subjected to data preprocessing of temperature correction, background subtraction and threshold background separation, the accuracy of temperature measurement is ensured, the foreground target thermal signal is separated and the target area is further subdivided to generate infrared processing data, meanwhile, the area needing target detection tracking identification is subjected to real-time radar detection to obtain radar detection data, then the radar detection data is subjected to data preprocessing of coordinate conversion, ground clutter removal and radar point cloud clustering, the radar original coordinate system is converted into a uniform geographic or image coordinate system, the clutter cloud generated by ground reflection is removed, the potential target is primarily separated, and the radar processing data is generated.
Specifically, fig. 2 shows a flowchart of infrared monitoring and radar detection in the method provided by the embodiment of the invention.
In the preferred embodiment of the present invention, the performing real-time infrared monitoring and radar detection to obtain infrared monitoring data and radar detection data, and performing data preprocessing to generate infrared processing data and radar processing data specifically includes the following steps:
Step S1011, performing infrared monitoring to acquire infrared monitoring data in real time;
Step S1012, carrying out data preprocessing of temperature correction, background subtraction and threshold background separation on the infrared monitoring data to generate infrared processing data;
Due to the fact that the infrared monitoring equipment is different in characteristics, environment temperature changes or emissivity of a target object, temperature values in infrared monitoring data can deviate, and therefore temperature correction needs to be conducted on the original infrared monitoring data. Wherein the step of performing temperature correction is performed by the following formula:
Wherein, Representing corrected temperature value,/>Representing temperature values in raw infrared monitoring data,/>Representing the calibration coefficient,/>Representing the real-time temperature of the environment in which the infrared detection device is located,/>Representing the reference temperature/>Representing the emissivity of the target object.
It should be noted that, for the calibration coefficientsThe intensity for adjusting the temperature correction reflects the degree of calibration of the infrared sensor under different environmental conditions. Real-time temperature/>, for the environment in which the infrared detection device is locatedThis value is typically measured by an ambient temperature sensor on the device for taking into account the effect of ambient temperature on the accuracy of the infrared measurement. For reference temperature/>This value is the ambient temperature under certain conditions (e.g., equipment calibration) and is used as a reference point to correct the temperature value measured in real time. Emissivity for target object/>Emissivity is the efficiency with which the surface of an object radiates energy, and depends on the material and surface state of the object.
For background subtraction, a fixed thermal background may interfere with target detection when infrared monitoring is performed. To remove such background noise, a method of background modeling and subtracting the background from the original data is employed.
For threshold background separation, in order to further highlight the target object and suppress the background, a threshold method may be employed for background and target separation. Setting a threshold value for separating the background and the target, and comparing the temperature value of each pixel point with the threshold value for separating the background and the target to determine whether the pixel point belongs to the background or the target, wherein the finally generated infrared processing data only comprises the information of the target object, and the background information is effectively restrained.
Wherein the calculation formula of the threshold value for separating the background and the target is expressed as:
Wherein, Representing a threshold for separating background and objects,/>Representing the average temperature value of the local area/(Representing the threshold coefficient,/>Representing the standard deviation of the local area temperature.
It should be noted here that, for the threshold value for separating the background and the targetWhen the temperature value of a certain pixel point is higher (or lower, depending on the specific application scenario) than the threshold/>When the pixel is considered to be part of the target object; otherwise, it is considered as background. Average temperature value for local area/>The average of the temperature values of all pixels in a specific area is used for calculating the average temperature of a local area, so that the background temperature level of the area can be obtained, and the subsequent separation of the background and the target is facilitated. Standard deviation for local area temperature/>The standard deviation is a statistic for measuring the degree of dispersion of the data distribution and is used for reflecting the fluctuation condition of the temperature value in the local area. By introducing standard deviation, the threshold value can be set more flexibly and adaptively, so that the method is better suitable for the infrared image processing requirements under different scenes.
Step S1013, radar detection is carried out, and radar detection data are obtained in real time;
And step S1014, performing coordinate conversion, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data to generate radar processing data.
In step S1014, when the radar detection data is subjected to coordinate conversion, since the radar detection data is generally acquired in a polar coordinate system (distance and azimuth) of the radar itself. It is therefore necessary to convert these data into a cartesian coordinate system for the convenience of subsequent processing and analysis.
For ground clutter removal, ground clutter is echoes resulting from radar beams shining on the ground, typically representing low speed or stationary targets. In this embodiment, in order to remove clutter, the following filter calculation formula is adopted:
Wherein, Representing the filtering factor,/>Representing detected target speed,/>Representing detected target acceleration,/>Threshold representing speed,/>A threshold value representing acceleration. As a supplementary explanation, for the filter factor/>For determining whether to reserve the point. For Lei Dadian cloud clustering operations, clustering is to classify points belonging to the same target into one class. In this embodiment, a clustering algorithm based on distance and speed is used to calculate each point/>Calculate the dot/>And all other points/>Similarity of (2)The corresponding calculation formula is expressed as: /(I)
Wherein,Representing a dot/>And dot/>Similarity of/>Representing a dot/>And dot/>Spatial distance of/>Representing a dot/>And dot/>Velocity difference of/>Threshold representing spatial similarity,/>A threshold value representing a speed similarity.
Then, according to the similarity matrixThe point clouds are clustered using a clustering algorithm (e.g., DBSCAN or a variant thereof) to classify similar points into one class, thereby distinguishing different targets.
Further, the target detection tracking identification method based on multi-data fusion further comprises the following steps:
and step S102, performing target detection on the infrared processing data and the radar processing data, and judging whether a suspicious target to be detected exists.
According to the embodiment of the invention, according to a preset infrared detection model, the infrared temperature characteristics in infrared processing data are extracted, target detection is carried out, an infrared detection result is generated, according to a preset radar detection model, the geometric characteristics of point clouds in radar processing data are extracted, target detection is carried out, a radar detection result is generated, confidence analysis is carried out on the infrared detection result based on algorithms of edge detection, template matching and deep learning (such as U-Net, FCN and the like), confidence analysis is carried out on the radar detection result by utilizing a 3D target detection model such as PointPillars, SECOND and the like, comprehensive detection confidence is generated, and whether a suspicious target to be detected exists or not is judged by comparing the detection confidence with the preset standard confidence.
Specifically, fig. 3 shows a flowchart of determining whether a suspicious object to be inspected exists in the method provided by the embodiment of the present invention.
In the preferred embodiment of the present invention, the target detection for the infrared processing data and the radar processing data, and the judging whether there is a suspicious target to be detected specifically includes the following steps:
step S1021, performing target detection on the infrared processing data according to a preset infrared detection model to generate an infrared detection result;
Step S1022, performing target detection on the radar processing data according to a preset radar detection model to generate a radar detection result;
step S1023, comprehensively analyzing the infrared detection result and the radar detection result to determine the detection confidence;
In step S1023, the method for determining the detection confidence level specifically includes the following steps:
step S1023a, acquiring an infrared detection result and a radar detection result, wherein the infrared detection result comprises the confidence coefficient of the infrared detection result, and the radar detection result comprises the confidence coefficient of the radar detection result;
Step S1023b, calculating to obtain the detection confidence coefficient according to the confidence coefficient of the infrared detection result and the confidence coefficient of the radar detection result.
The calculation formula of the detection confidence coefficient is expressed as follows:
Wherein, Representing the confidence of the detection,/>Representing the confidence of the infrared detection result,/>Representing confidence of radar detection result,/>Representing the weight of infrared detection results in detection confidence calculation,/>Representing the weight of radar detection results in detection confidence calculation,/>Representing adjustment factors,/>The smaller value of the confidence coefficient of the infrared detection result and the confidence coefficient of the radar detection result is represented.
It should be noted here that for the detection confidenceThe method is a confidence value integrating the infrared detection result and the radar detection result and is used for judging whether a suspicious object to be detected exists or not, and the higher the confidence value is, the greater the possibility of detecting the object is. Confidence/>, for infrared detection resultsThe value output by the infrared detection model is a value which represents the confidence that the infrared sensor detects the target, the range of the value is 0, 1, and the larger the value is, the more confident the infrared sensor detects the target. Confidence for radar detection results/>The value output by the radar detection model is a value which indicates the confidence that the radar sensor detects the target, the value range is also 0 and 1, and the larger the value is, the more confident the radar sensor detects the target. For the adjustment factor/>Used for enhancing the fusion confidence coefficient when the confidence coefficient of the infrared and radar detection results is higher, and adjusting the factor/>Reflecting the contribution of consistency between the two detection results to the fusion confidence. When the confidence of the detection results of the infrared and the radar are higher and similar, the adjustment factor/>The final detection confidence will be improved.
Step S1024, comparing the detection confidence with a preset standard confidence, and judging whether a suspicious object to be detected exists.
Further, the target detection tracking identification method based on multi-data fusion further comprises the following steps:
Step S103, when a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and carrying out tracking shooting and preprocessing to acquire shooting processing data.
In the embodiment of the invention, when the detection confidence is greater than the preset standard confidence, the suspicious object to be detected is judged to exist, at the moment, the suspicious object azimuth is determined by carrying out object azimuth identification on the infrared processing data and the radar processing data, then the object shooting angle is determined according to the suspicious object azimuth and the preset shooting space position, tracking shooting control is carried out according to the object shooting angle, tracking shooting data is obtained, and then data preprocessing such as gray level conversion, histogram equalization and image noise reduction is carried out on the tracking shooting data, so that the calculation complexity is reduced, the information loss caused by uneven illumination is improved, the detection performance is improved, and the shooting processing data is generated.
Specifically, fig. 4 shows a flowchart of tracking shooting and preprocessing in the method provided by the embodiment of the invention.
In the preferred embodiment of the present invention, when a suspicious target to be detected is present, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and performing tracking shooting and preprocessing to obtain shooting processing data, wherein the method specifically includes the following steps:
step S1031, when a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data;
the step S1031 specifically includes the following sub-steps:
Step S1031a, obtaining pixel offset of a suspicious target highest temperature point in the infrared image relative to the center of the image and the focal length of the infrared image based on the infrared processing data;
Step S1031b, obtaining a distance between a suspicious target detected by the radar and an initial azimuth angle of the suspicious target detected by the radar based on the radar processing data;
Step S1031c, calculating and determining the target azimuth according to the pixel offset of the highest temperature point of the suspicious target in the infrared image relative to the center of the image, the focal length of the infrared image, the distance between the suspicious target detected by the radar and the initial azimuth of the suspicious target detected by the radar.
The calculation formula of the target azimuth is expressed as follows:
Wherein, Representing the distance between the suspicious object detected by the radar and the radar,/>Representing the initial azimuth of a suspicious object detected by radar,/>Representing the azimuth offset adjusted by the infrared data,/>Representing the azimuth angle of the suspicious target finally determined after infrared data adjustment;
Wherein, Representing the first calibration coefficient,/>Representing the second calibration factor,/>Representing the pixel offset in the horizontal direction of the highest temperature point of a suspicious object in an infrared image relative to the center of the image,/>Representing the pixel offset in the vertical direction of the highest temperature point of a suspicious object in an infrared image relative to the center of the image,/>Representing the focal length of the infrared image.
It should be noted that, for the azimuth offsetThis value represents the amount of adjustment made to the initial azimuth angle detected by the radar based on the infrared sensor data for improving the accuracy of the target azimuth. For the calibration coefficient/>And/>The two coefficients are offsets for converting pixel offsets in the infrared image into azimuth angles, reflecting the conversion relationship between the infrared image coordinate system and radar azimuth angles, and are usually determined by calibration through experiments or simulations. For pixel offset/>And/>These two values represent pixel offsets in the horizontal and vertical directions of the highest temperature point (or feature point) of the suspicious object in the infrared image relative to the center of the image, and the calculation of the offset can determine the position of the object in the image and adjust the azimuth angle detected by the radar accordingly. Focal length/>, for infrared imagesAnd the distance from the image plane to the focus is expressed and is used for converting the pixel offset into the angle offset, so that fusion of the infrared data and the radar data is realized.
Step S1032, determining a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
step S1033, tracking shooting is carried out according to the target shooting angle, and tracking shooting data are obtained;
step S1034, performing data preprocessing including gray conversion, histogram equalization and image noise reduction on the tracking shooting data, and generating shooting processing data.
Further, the target detection tracking identification method based on multi-data fusion further comprises the following steps:
Step S104, according to the shooting processing data, confirming and identifying the suspicious object to be detected, and judging whether the suspicious object to be detected is available.
In the embodiment of the invention, a plurality of shooting identification features are acquired by carrying out feature identification on shooting processing data, a plurality of standard identification features are imported, the plurality of shooting identification features are compared with the plurality of standard identification features, whether the object to be checked is confirmed is judged, and when the matching number of the plurality of shooting identification features and the plurality of standard identification features is larger than the preset standard number, the object to be checked is judged to be confirmed; and when the number of matches between the plurality of shooting identification features and the plurality of standard identification features is not greater than the preset standard number, judging that the object to be detected is not confirmed.
Specifically, fig. 5 shows a flowchart of determining whether there is a confirmation target to be inspected in the method provided in the embodiment of the present invention.
In a preferred embodiment of the present invention, the identifying the suspicious object to be inspected according to the shooting processing data, and determining whether the suspicious object to be inspected has the suspicious object to be inspected specifically includes the following steps:
Step S1041, performing feature recognition on the shooting processing data to obtain a plurality of shooting recognition features;
step S1042, importing a plurality of standard identification features;
step S1043, comparing the plurality of shot recognition features with the plurality of standard recognition features, and determining whether there is a confirmation target to be inspected.
The step S1043 specifically includes the following sub-steps:
step S1043a, obtaining a feature value of the shooting identification feature and a feature value of the standard identification feature;
Step S1043b, calculating to obtain a matching degree according to the characteristic value of the shooting identification characteristic and the characteristic value of the standard identification characteristic;
Wherein, the calculation formula of the matching degree is expressed as:
Wherein, Representing the degree of matching,/>Representing the total number of features of the shot recognition feature compared with the standard recognition feature,Represents the/>Characteristic value of individual shooting identification characteristic,/>Represents the/>Characteristic value of standard identification feature corresponding to individual shooting identification feature,/>Represents the/>Larger value of individual shooting identification feature and corresponding standard identification feature,/>Representing a ride-through operation.
It should be noted here that for the ride-through operationRepresents a continuous multiplication operation on the expression in brackets, from/>Riding/>. That is, the feature/>, is identified for each shotA matching degree factor/>, is calculatedThese factors are then all multiplied to give the final degree of matching/>
Step S1043c, when the matching degree is determined to be greater than the matching degree threshold, determining that the shooting identification feature matches the corresponding standard identification feature, so as to confirm the object to be detected.
Further, the target detection tracking identification method based on multi-data fusion further comprises the following steps:
Step S105, when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out.
In the embodiment of the invention, when the target to be detected is confirmed, an action prediction task is created, at the moment, according to the action prediction task, comprehensive action track analysis is carried out on the infrared processing data, the radar processing data and the shooting processing data, the follow-up action direction is predicted, whether the target to be detected is abnormal or not is judged according to the follow-up action direction, and when the target to be detected is abnormal, an abnormal alarm signal is generated, and abnormal action alarm is carried out.
Specifically, fig. 6 shows a flowchart of an abnormal action alarm in the method provided by the embodiment of the invention.
In a preferred embodiment of the present invention, when the target to be detected is confirmed, the method includes the steps of, in particular, integrating the infrared processing data, the radar processing data, and the photographing processing data, determining whether there is an abnormal behavior, and when there is an abnormal behavior, performing an abnormal behavior alarm:
step S1051, when the target to be detected is confirmed, creating an action prediction task;
in step S1051, the action prediction is set to be generated in succession according to the degree of urgency. In the present embodiment, the calculation formula of the priority of the action prediction task is expressed as:
Wherein, Representing the priority of the generated action prediction task,/>Representing the adjustment coefficient,/>Weight coefficients respectively representing infrared processing data, radar processing data and photographing processing data,/>Respectively representing target observation values extracted from infrared processing data, radar processing data and shooting processing data,/>Representing the influencing factors representing the context information.
It will be appreciated that the number of components,Higher values of (2) indicate more urgent tasks. For/>These coefficients may be adjusted according to the reliability and importance of the different data sources. For/>These observations may be feature vectors, motion parameters, etc. of the target to describe the state of the target. For/>Influence factors representing context information, including environmental conditions, historical data, other sensor information, etc., have an important impact on the creation of predictive tasks.
After the priority of the action prediction task is calculated, the action prediction task is generated in the order of the priority of the action prediction task from large to small.
Step S1052, according to the action prediction task, synthesizing the infrared processing data, the radar processing data and the shooting processing data, and predicting the subsequent action direction;
In this step, the calculation formula of the follow-up direction vector is expressed as:
Wherein, Representing a follow-up action direction vector,/>Representing the weighting coefficient,/>Representing feature vectors extracted from infrared-processed data,/>Representing a target velocity vector obtained from radar processing data,/>Representing the visual feature vector extracted from the photographing process data.
It should be noted that, for the feature vector extracted from the infrared processing dataThe vector contains key features extracted from the infrared data about the target, such as heat radiation pattern, temperature distribution or temperature gradient, etc., which information aids in analyzing the active state and energy output of the target. For target velocity vector/>, obtained from radar processing dataThe vector provides the speed and direction information of the target, and generally comprises the radial speed and azimuth angle of the target, which is an important basis for predicting the moving track of the target. For visual feature vector/>, extracted from shooting process dataThe vector contains features of the object extracted from the visual image, such as shape, size, movement pattern, etc., which are important for identifying the type and behavioral intention of the object and help to predict the direction of action of the object more accurately.
Step S1053, determining whether there is an abnormal action according to the follow-up action direction;
In step S1054, when there is an abnormal behavior, an abnormal behavior alarm signal is generated, and an abnormal behavior alarm is performed.
Further, fig. 7 shows an application architecture diagram of the system provided by the embodiment of the present invention.
In another preferred embodiment of the present invention, the target detection tracking recognition system based on multiple data fusion includes:
the infrared monitoring radar detection unit 101 is configured to perform real-time infrared monitoring and radar detection, acquire infrared monitoring data and radar detection data, perform data preprocessing, and generate infrared processing data and radar processing data.
In the embodiment of the invention, the infrared monitoring radar detection unit 101 acquires infrared monitoring data by carrying out real-time infrared monitoring on an area needing target detection tracking identification, carries out data preprocessing of temperature correction, background subtraction and threshold background separation on the infrared monitoring data, ensures temperature measurement accuracy, separates out a foreground target thermal signal and further subdivides a target area to generate infrared processing data, simultaneously carries out real-time radar detection on the area needing target detection tracking identification to acquire radar detection data, carries out data preprocessing of coordinate conversion, ground clutter removal and radar point cloud clustering on the radar detection data, converts a radar original coordinate system into a uniform geographic or image coordinate system, eliminates clutter cloud generated by ground reflection, and initially separates out potential targets to generate radar processing data.
Specifically, fig. 8 shows a block diagram of a structure of an infrared monitoring radar detection unit 101 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the infrared monitoring radar detection unit 101 specifically includes:
the infrared monitoring module 1011 is used for carrying out infrared monitoring and acquiring infrared monitoring data in real time;
An infrared preprocessing module 1012, configured to perform data preprocessing of temperature correction, background subtraction and threshold background separation on the infrared monitoring data, and generate infrared processing data;
the radar detection module 1013 is configured to perform radar detection and acquire radar detection data in real time;
The radar preprocessing module 1014 is configured to perform coordinate transformation, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data, and generate radar processing data.
Further, the target detection tracking identification system based on multi-data fusion further comprises:
and the target detection judging unit 102 is used for carrying out target detection on the infrared processing data and the radar processing data and judging whether a suspicious target to be detected exists.
In the embodiment of the present invention, the target detection determining unit 102 extracts the infrared temperature characteristic in the infrared processing data according to the preset infrared detection model, performs target detection, generates an infrared detection result, extracts the geometric characteristic of the point cloud in the radar processing data according to the preset radar detection model, performs target detection, generates a radar detection result, performs confidence analysis on the infrared detection result based on the algorithm of edge detection, template matching, deep learning (such as U-Net, FCN, etc.), performs confidence analysis on the radar detection result by using the 3D target detection model such as PointPillars, SECOND, etc., generates a comprehensive detection confidence, and determines whether a suspicious target to be detected exists by comparing the detection confidence with the preset standard confidence.
Specifically, fig. 9 shows a block diagram of the structure of the target detection determining unit 102 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the target detection determining unit 102 specifically includes:
the infrared target detection module 1021 is configured to perform target detection on the infrared processing data according to a preset infrared detection model, and generate an infrared detection result;
the radar target detection module 1022 is configured to perform target detection on the radar processing data according to a preset radar detection model, and generate a radar detection result;
the detection confidence determining module 1023 is used for comprehensively analyzing the infrared detection result and the radar detection result to determine the detection confidence;
the confidence coefficient comparing module 1024 is configured to compare the detection confidence coefficient with a preset standard confidence coefficient, and determine whether a suspicious target to be detected exists.
Further, the target detection tracking identification system based on multi-data fusion further comprises:
and the tracking shooting processing unit 103 is used for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target to be detected exists, and carrying out tracking shooting and preprocessing to acquire shooting processing data.
In the embodiment of the invention, when the detection confidence is greater than the preset standard confidence, it is determined that a suspicious target to be detected exists, at this time, the tracking shooting processing unit 103 identifies the target azimuth by carrying out target azimuth recognition on the infrared processing data and the radar processing data, determines the target shooting angle according to the suspicious target azimuth and the preset shooting space position, carries out tracking shooting control according to the target shooting angle, acquires tracking shooting data, carries out data preprocessing of gray level conversion, histogram equalization and image noise reduction on the tracking shooting data, reduces calculation complexity, improves information loss caused by uneven illumination, improves detection performance, and generates shooting processing data.
Specifically, fig. 10 shows a block diagram of the structure of the tracking shooting processing unit 103 in the system provided by the embodiment of the present invention.
In a preferred embodiment of the present invention, the tracking shooting processing unit 103 specifically includes:
the target position determining module 1031 is configured to determine, when a suspicious target to be detected is present, a suspicious target position according to the infrared processing data and the radar processing data;
The shooting angle determining module 1032 is configured to determine a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
the tracking shooting module 1033 is configured to perform tracking shooting according to the target shooting angle, and obtain tracking shooting data;
The shooting preprocessing module 1034 is configured to perform data preprocessing including gray level conversion, histogram equalization and image noise reduction on the tracking shooting data, and generate shooting processing data.
Further, the target detection tracking identification system based on multi-data fusion further comprises:
and the target confirmation and identification unit 104 is configured to confirm and identify the suspicious target to be detected according to the shooting processing data, and determine whether the suspicious target to be detected has the confirmation target to be detected.
In the embodiment of the present invention, the target confirmation identifying unit 104 obtains a plurality of shooting identifying features by performing feature identification on the shooting processing data, then introduces a plurality of standard identifying features, compares the plurality of shooting identifying features with the plurality of standard identifying features, determines whether the target to be checked is present, and determines that the target to be checked is present when the number of matches between the plurality of shooting identifying features and the plurality of standard identifying features is greater than a preset standard number; and when the number of matches between the plurality of shooting identification features and the plurality of standard identification features is not greater than the preset standard number, judging that the object to be detected is not confirmed.
And an abnormal behavior judging unit 105 for integrating the infrared processing data, the radar processing data and the photographing processing data when the object to be detected is confirmed, judging whether the object to be detected is abnormal in behavior, and giving an abnormal behavior alarm when the object to be detected is abnormal in behavior.
In the embodiment of the present invention, when the target to be detected is confirmed, the action abnormality determination unit 105 creates an action prediction task, and at this time, according to the action prediction task, performs comprehensive action trajectory analysis on the infrared processing data, the radar processing data, and the photographing processing data, predicts the subsequent action direction, determines whether there is an action abnormality according to the subsequent action direction, and generates an abnormality alarm signal when there is an action abnormality, and performs an abnormality action alarm.
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in various embodiments may include multiple sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, nor do the order in which the sub-steps or stages are performed necessarily performed in sequence, but may be performed alternately or alternately with at least a portion of the sub-steps or stages of other steps or other steps.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program for instructing relevant hardware, where the program may be stored in a non-volatile computer readable storage medium, and where the program, when executed, may include processes in the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous link (SYNCHLINK) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above-described embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above-described embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. The target detection tracking identification method based on the multi-data fusion is characterized by comprising the following steps of:
Performing real-time infrared monitoring and radar detection to obtain infrared monitoring data and radar detection data, and performing data preprocessing to generate infrared processing data and radar processing data;
performing target detection on the infrared processing data and the radar processing data, and judging whether a suspicious target to be detected exists or not;
When a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and carrying out tracking shooting and preprocessing to acquire shooting processing data;
According to the shooting processing data, confirming and identifying the suspicious object to be detected, and judging whether the suspicious object to be detected exists or not;
when the target to be detected is confirmed, the infrared processing data, the radar processing data and the shooting processing data are integrated, whether the action abnormality exists or not is judged, and when the action abnormality exists, abnormal action alarm is carried out;
The method for generating the infrared processing data and the radar processing data specifically comprises the following steps of:
Performing infrared monitoring to acquire infrared monitoring data in real time;
performing data preprocessing of temperature correction, background subtraction and threshold background separation on the infrared monitoring data to generate infrared processing data;
Radar detection is carried out, and radar detection data are obtained in real time;
Performing coordinate conversion, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data to generate radar processing data;
The target detection is carried out on the infrared processing data and the radar processing data, and the judging whether the suspicious target to be detected exists specifically comprises the following steps:
performing target detection on the infrared processing data according to a preset infrared detection model to generate an infrared detection result;
Performing target detection on the radar processing data according to a preset radar detection model to generate a radar detection result;
comprehensively analyzing the infrared detection result and the radar detection result to determine the detection confidence;
comparing the detection confidence with a preset standard confidence, and judging whether a suspicious object to be detected exists or not;
The method for comprehensively analyzing the infrared detection result and the radar detection result and determining the detection confidence comprises the following steps:
acquiring an infrared detection result and a radar detection result, wherein the infrared detection result comprises the confidence coefficient of the infrared detection result, and the radar detection result comprises the confidence coefficient of the radar detection result;
calculating the confidence coefficient of the detection result according to the confidence coefficient of the infrared detection result and the confidence coefficient of the radar detection result to obtain a detection confidence coefficient;
the calculation formula of the detection confidence coefficient is expressed as follows:
Wherein, Representing the confidence of the detection,/>Representing the confidence of the infrared detection result,/>Representing confidence of radar detection result,/>Representing the weight of infrared detection results in detection confidence calculation,/>Representing the weight of radar detection results in detection confidence calculation,/>Representing adjustment factors,/>The smaller value of the confidence coefficient of the infrared detection result and the confidence coefficient of the radar detection result is represented.
2. The method for detecting, tracking and identifying a target based on multi-data fusion according to claim 1, wherein in the step of performing temperature correction on the infrared monitoring data, the step of performing temperature correction is performed by the following formula:
Wherein, Representing corrected temperature value,/>Representing temperature values in raw infrared monitoring data,/>Representing the calibration coefficient,/>Representing the real-time temperature of the environment in which the infrared detection device is located,/>Representing the reference temperature/>Representing the emissivity of the target object;
In the step of separating the background of the threshold value from the infrared monitoring data, the background is separated from the target by setting a threshold value for separating the background from the target;
wherein the calculation formula of the threshold value for separating the background and the target is expressed as:
Wherein, Representing a threshold for separating background and objects,/>Representing the average temperature value of the local area/(Representing the threshold coefficient,/>Representing the standard deviation of the local area temperature.
3. The method for detecting, tracking and identifying a target based on multiple data fusion according to claim 2, wherein in the step of performing ground clutter removal on the radar detection data, the following filter calculation formula is adopted:
Wherein, Representing the filtering factor,/>Representing detected target speed,/>Representing detected target acceleration,/>Threshold representing speed,/>A threshold value representing acceleration;
In the operation of carrying out radar point cloud clustering on the radar detection data, a clustering algorithm based on distance and speed is adopted to calculate each point Calculate the dot/>And all other points/>Similarity/>The corresponding calculation formula is expressed as:
Wherein, Representing a dot/>And dot/>Similarity of/>Representing a dot/>And dot/>Spatial distance of/>Representing a dot/>And dot/>Velocity difference of/>Threshold representing spatial similarity,/>A threshold value representing a speed similarity.
4. The method for detecting, tracking and identifying a target based on multi-data fusion according to claim 3, wherein when the target to be detected is suspicious, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data, and performing tracking shooting and preprocessing, and acquiring shooting processing data specifically comprises the following steps:
When a suspicious target to be detected exists, determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data;
determining a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
tracking shooting is carried out according to the target shooting angle, and tracking shooting data are obtained;
and carrying out data preprocessing of gray conversion, histogram equalization and image noise reduction on the tracking shooting data to generate shooting processing data.
5. The method for detecting, tracking and identifying a target based on multi-data fusion according to claim 4, wherein the method for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target is detected comprises the following steps:
acquiring pixel offset of a suspicious target highest temperature point in the infrared image relative to the center of the image and the focal length of the infrared image based on the infrared processing data;
acquiring a distance between a suspicious target detected by the radar and an initial azimuth angle of the suspicious target detected by the radar based on the radar processing data;
Calculating and determining to obtain a target azimuth according to pixel offset of a highest temperature point of a suspicious target in an infrared image relative to the center of the image, the focal length of the infrared image, the distance between the suspicious target detected by a radar and the initial azimuth of the suspicious target detected by the radar;
the calculation formula of the target azimuth is expressed as follows:
Wherein, Representing the distance between the suspicious object detected by the radar and the radar,/>Representing the initial azimuth of a suspicious object detected by radar,/>Representing the azimuth offset adjusted by the infrared data,/>Representing the azimuth angle of the suspicious target finally determined after infrared data adjustment;
Wherein, Representing the first calibration coefficient,/>Representing the second calibration factor,/>Representing the pixel offset in the horizontal direction of the highest temperature point of a suspicious object in an infrared image relative to the center of the image,/>Representing the pixel offset in the vertical direction of the highest temperature point of a suspicious object in an infrared image relative to the center of the image,/>Representing the focal length of the infrared image.
6. The method for detecting, tracking and identifying a target based on multiple data fusion according to claim 5, wherein the step of identifying the suspicious target according to the photographing processing data, and determining whether the suspicious target has the identification target specifically comprises the following steps:
performing feature recognition on the shooting processing data to acquire a plurality of shooting recognition features;
Importing a plurality of standard identification features;
And comparing the shooting identification features with the standard identification features, and judging whether the target to be detected is confirmed.
7. The multi-data fusion-based object detection tracking recognition method according to claim 6, wherein the method of comparing the plurality of shot recognition features with the plurality of standard recognition features to determine whether there is a recognition object to be detected comprises the steps of:
acquiring the characteristic value of the shooting identification characteristic and the characteristic value of the standard identification characteristic;
Calculating to obtain a matching degree according to the characteristic value of the shooting identification characteristic and the characteristic value of the standard identification characteristic;
Wherein, the calculation formula of the matching degree is expressed as:
Wherein, Representing the degree of matching,/>Representing the total number of features of the shot recognition feature compared with the standard recognition feature,/>Represents the/>Characteristic value of individual shooting identification characteristic,/>Represents the/>Characteristic values of standard identification features corresponding to the individual photographing identification features,Represents the/>Larger value of individual shooting identification feature and corresponding standard identification feature,/>Representing a ride-through operation;
and when the matching degree is judged to be larger than the matching degree threshold, determining that the shooting identification feature is matched with the corresponding standard identification feature so as to confirm the object to be detected.
8. The method for detecting, tracking and identifying a target based on multiple data fusion according to claim 7, wherein the step of integrating the infrared processing data, the radar processing data and the photographing processing data to determine whether there is an abnormality in the behavior when there is a confirmation of the target to be detected, and performing an abnormal behavior alarm when there is an abnormality in the behavior specifically comprises the steps of:
Creating an action prediction task when the target to be detected is confirmed;
according to the action prediction task, the infrared processing data, the radar processing data and the shooting processing data are synthesized, and the follow-up action direction is predicted;
Judging whether the follow-up action direction has abnormal action or not according to the follow-up action direction;
when there is an abnormal action, an abnormal action alarm signal is generated and an abnormal action alarm is performed.
9. The method for detecting, tracking and identifying a target based on multiple data fusion according to claim 8, wherein in the step of creating an action prediction task with confirmation of a target to be detected, a calculation formula of a priority of the action prediction task is expressed as:
Wherein, Representing the priority of the generated action prediction task,/>Representing the adjustment coefficient,/>Weight coefficients respectively representing infrared processing data, radar processing data and photographing processing data,/>Respectively representing target observation values extracted from infrared processing data, radar processing data and shooting processing data,/>Representing an impact factor representing the context information;
in the step of predicting the follow-up direction by integrating the infrared processing data, the radar processing data, and the photographing processing data in accordance with the creation of the action prediction task, the calculation formula of the follow-up direction vector is expressed as:
Wherein, Representing a follow-up action direction vector,/>Representing the weighting coefficient,/>Representing feature vectors extracted from infrared-processed data,/>Representing a target velocity vector obtained from radar processing data,/>Representing the visual feature vector extracted from the photographing process data.
10. The target detection tracking recognition system based on multi-data fusion is characterized in that the target detection tracking recognition method based on multi-data fusion as claimed in any one of the claims 1 to 9 is applied, and the system comprises an infrared monitoring radar detection unit, a target detection judgment unit, a tracking shooting processing unit, a target confirmation recognition unit and an action abnormality judgment unit, wherein:
the infrared monitoring radar detection unit is used for carrying out real-time infrared monitoring and radar detection, acquiring infrared monitoring data and radar detection data, and carrying out data preprocessing to generate infrared processing data and radar processing data;
The target detection judging unit is used for carrying out target detection on the infrared processing data and the radar processing data and judging whether a suspicious target to be detected exists or not;
The tracking shooting processing unit is used for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target to be detected exists, tracking shooting and preprocessing are carried out, and shooting processing data are obtained;
The target confirmation and identification unit is used for confirming and identifying the suspicious target to be detected according to the shooting processing data and judging whether the suspicious target to be detected is available;
The action abnormality judging unit is used for integrating the infrared processing data, the radar processing data and the shooting processing data when the target to be detected is confirmed, judging whether the action abnormality exists or not, and alarming abnormal actions when the action abnormality exists;
The infrared monitoring radar detection unit specifically comprises:
The infrared monitoring module is used for carrying out infrared monitoring and acquiring infrared monitoring data in real time;
the infrared preprocessing module is used for preprocessing the infrared monitoring data for temperature correction, background subtraction and threshold background separation to generate infrared processing data;
The radar detection module is used for carrying out radar detection and acquiring radar detection data in real time;
The radar preprocessing module is used for carrying out coordinate conversion, ground clutter removal and data preprocessing of radar point cloud clustering on the radar detection data to generate radar processing data; the target detection judging unit specifically includes:
The infrared target detection module is used for carrying out target detection on the infrared processing data according to a preset infrared detection model to generate an infrared detection result;
The radar target detection module is used for carrying out target detection on the radar processing data according to a preset radar detection model to generate a radar detection result;
the detection confidence determining module is used for comprehensively analyzing the infrared detection result and the radar detection result and determining the detection confidence;
the confidence coefficient comparison module is used for comparing the detection confidence coefficient with a preset standard confidence coefficient and judging whether a suspicious object to be detected exists or not;
The tracking shooting processing unit specifically comprises:
The target azimuth determining module is used for determining the azimuth of the suspicious target according to the infrared processing data and the radar processing data when the suspicious target to be detected is provided;
the shooting angle determining module is used for determining a target shooting angle according to the suspicious target azimuth and a preset shooting space position;
The tracking shooting module is used for tracking shooting according to the target shooting angle to acquire tracking shooting data;
and the shooting preprocessing module is used for carrying out data preprocessing of gray conversion, histogram equalization and image noise reduction on the tracking shooting data to generate shooting processing data.
CN202410606505.6A 2024-05-16 2024-05-16 Target detection tracking identification method and system based on multi-data fusion Active CN118172548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410606505.6A CN118172548B (en) 2024-05-16 2024-05-16 Target detection tracking identification method and system based on multi-data fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410606505.6A CN118172548B (en) 2024-05-16 2024-05-16 Target detection tracking identification method and system based on multi-data fusion

Publications (2)

Publication Number Publication Date
CN118172548A true CN118172548A (en) 2024-06-11
CN118172548B CN118172548B (en) 2024-07-19

Family

ID=91355167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410606505.6A Active CN118172548B (en) 2024-05-16 2024-05-16 Target detection tracking identification method and system based on multi-data fusion

Country Status (1)

Country Link
CN (1) CN118172548B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006029754A (en) * 2004-07-22 2006-02-02 Toshiba Corp Flying body tracking method and device therefor
EP2000817A2 (en) * 2007-05-07 2008-12-10 Harris Corporation Systems and methods for improved target tracking for tactical imaging
CN105842683A (en) * 2016-05-27 2016-08-10 南京博驰光电科技有限公司 Unmanned aerial vehicle integrated defense system and method
CN106204629A (en) * 2016-08-17 2016-12-07 西安电子科技大学 Space based radar and infrared data merge moving target detection method in-orbit
CN106569511A (en) * 2016-11-01 2017-04-19 北京无线电测量研究所 Unmanned aerial vehicle trapping interception system based on electromagnetic interference and trapping interception method thereof
CN107846258A (en) * 2017-09-07 2018-03-27 新疆美特智能安全工程股份有限公司 A kind of unmanned plane system of defense
JP2018091713A (en) * 2016-12-02 2018-06-14 三菱電機株式会社 Tracking device and multi-sensor system
CN108447075A (en) * 2018-02-08 2018-08-24 烟台欣飞智能***有限公司 A kind of unmanned plane monitoring system and its monitoring method
CN108761403A (en) * 2018-05-31 2018-11-06 智飞智能装备科技东台有限公司 A kind of anti-unmanned plane system of defense based on radar
CN110375585A (en) * 2019-06-24 2019-10-25 湖北工业大学 A kind of flying object invasion reply system and method based on double capstan heads
WO2020108647A1 (en) * 2018-11-30 2020-06-04 杭州海康威视数字技术股份有限公司 Target detection method, apparatus and system based on linkage between vehicle-mounted camera and vehicle-mounted radar
US20210241026A1 (en) * 2020-02-04 2021-08-05 Nio Usa, Inc. Single frame 4d detection using deep fusion of camera image, imaging radar and lidar point cloud
CN113219454A (en) * 2021-04-22 2021-08-06 九州云(北京)科技发展有限公司 System and method for preventing vehicles and personnel from invading runway based on millimeter wave radar
CN114019502A (en) * 2021-11-08 2022-02-08 北京环境特性研究所 Integrated target reconnaissance system

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006029754A (en) * 2004-07-22 2006-02-02 Toshiba Corp Flying body tracking method and device therefor
EP2000817A2 (en) * 2007-05-07 2008-12-10 Harris Corporation Systems and methods for improved target tracking for tactical imaging
CN105842683A (en) * 2016-05-27 2016-08-10 南京博驰光电科技有限公司 Unmanned aerial vehicle integrated defense system and method
CN106204629A (en) * 2016-08-17 2016-12-07 西安电子科技大学 Space based radar and infrared data merge moving target detection method in-orbit
CN106569511A (en) * 2016-11-01 2017-04-19 北京无线电测量研究所 Unmanned aerial vehicle trapping interception system based on electromagnetic interference and trapping interception method thereof
JP2018091713A (en) * 2016-12-02 2018-06-14 三菱電機株式会社 Tracking device and multi-sensor system
CN107846258A (en) * 2017-09-07 2018-03-27 新疆美特智能安全工程股份有限公司 A kind of unmanned plane system of defense
CN108447075A (en) * 2018-02-08 2018-08-24 烟台欣飞智能***有限公司 A kind of unmanned plane monitoring system and its monitoring method
CN108761403A (en) * 2018-05-31 2018-11-06 智飞智能装备科技东台有限公司 A kind of anti-unmanned plane system of defense based on radar
WO2020108647A1 (en) * 2018-11-30 2020-06-04 杭州海康威视数字技术股份有限公司 Target detection method, apparatus and system based on linkage between vehicle-mounted camera and vehicle-mounted radar
CN110375585A (en) * 2019-06-24 2019-10-25 湖北工业大学 A kind of flying object invasion reply system and method based on double capstan heads
US20210241026A1 (en) * 2020-02-04 2021-08-05 Nio Usa, Inc. Single frame 4d detection using deep fusion of camera image, imaging radar and lidar point cloud
CN113219454A (en) * 2021-04-22 2021-08-06 九州云(北京)科技发展有限公司 System and method for preventing vehicles and personnel from invading runway based on millimeter wave radar
CN114019502A (en) * 2021-11-08 2022-02-08 北京环境特性研究所 Integrated target reconnaissance system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张卫东;刘笑成;韩鹏;: "水上无人***研究进展及其面临的挑战", 自动化学报, no. 05, 31 May 2020 (2020-05-31) *

Also Published As

Publication number Publication date
CN118172548B (en) 2024-07-19

Similar Documents

Publication Publication Date Title
CN111950329B (en) Target detection and model training method, device, computer equipment and storage medium
US10115209B2 (en) Image target tracking method and system thereof
CN109035299B (en) Target tracking method and device, computer equipment and storage medium
KR101995107B1 (en) Method and system for artificial intelligence based video surveillance using deep learning
US10388022B2 (en) Image target tracking method and system thereof
KR101764845B1 (en) A video surveillance apparatus for removing overlap and tracking multiple moving objects and method thereof
US9008440B2 (en) Component recognizing apparatus and component recognizing method
CN112149491A (en) Method for determining a trust value of a detected object
CN111160365A (en) Unmanned aerial vehicle target tracking method based on combination of detector and tracker
CN112989910A (en) Power target detection method and device, computer equipment and storage medium
CN116681730A (en) Target tracking method, device, computer equipment and storage medium
CN110567324A (en) multi-target group threat degree prediction device and method based on DS evidence theory
JP2023010697A (en) Contrastive predictive coding for anomaly detection and segmentation
CN117218380A (en) Dynamic target detection tracking method for unmanned ship remote sensing image
Bae et al. Image tracking algorithm using template matching and PSNF-m
CN114863201A (en) Training method and device of three-dimensional detection model, computer equipment and storage medium
CN111105436B (en) Target tracking method, computer device and storage medium
CN118172548B (en) Target detection tracking identification method and system based on multi-data fusion
Ojdanić et al. Parallel architecture for low latency UAV detection and tracking using robotic telescopes
CN111681266A (en) Ship tracking method, system, equipment and storage medium
CN109117850B (en) Method for identifying corresponding infrared target image by utilizing visible light target image
CN108985216B (en) Pedestrian head detection method based on multivariate logistic regression feature fusion
CN115063714A (en) Bird collision accident prevention target detection method based on improved YOLOv5s network
Ullah et al. Deep learning based wheat ears count in robot images for wheat phenotyping
Peruničić et al. Vision-based Vehicle Speed Estimation Using the YOLO Detector and RNN

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant