CN116883686A - Rocket recovery sub-level air recognition and tracking method, device and storage medium - Google Patents

Rocket recovery sub-level air recognition and tracking method, device and storage medium

Info

Publication number
CN116883686A
CN116883686A (Application CN202310920723.2A)
Authority
CN
China
Prior art keywords
recovery sub
level
rocket recovery
rocket
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310920723.2A
Other languages
Chinese (zh)
Other versions
CN116883686B (en)
Inventor
王健
布向伟
彭昊旻
姚颂
魏凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongfang Space Technology Shandong Co Ltd
Original Assignee
Dongfang Space Technology Shandong Co Ltd
Orienspace Hainan Technology Co Ltd
Orienspace Technology Beijing Co Ltd
Orienspace Xian Aerospace Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongfang Space Technology Shandong Co Ltd, Orienspace Hainan Technology Co Ltd, Orienspace Technology Beijing Co Ltd, Orienspace Xian Aerospace Technology Co Ltd
Priority to CN202310920723.2A priority Critical patent/CN116883686B/en
Publication of CN116883686A publication Critical patent/CN116883686A/en
Application granted
Publication of CN116883686B publication Critical patent/CN116883686B/en
Active legal status
Anticipated expiration legal status

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/15 - Correlation function computation including computation of convolution operations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/86 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using syntactic or structural representations of the image or video pattern, e.g. symbolic string recognition; using graph matching
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computational Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Algebra (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a rocket recovery sub-level air recognition and tracking method, device and storage medium, belonging to the technical field of target recognition and tracking. The method first fuses preprocessed multi-modal data and recognizes and classifies the rocket recovery sub-level with a KNN algorithm, and then tracks the motion trail of the target in real time using the SORT algorithm. The method can effectively and accurately recognize and classify the rocket recovery sub-level, improve the accuracy and reliability of target recognition, and effectively track the rocket recovery sub-level.

Description

Rocket recovery sub-level air recognition and tracking method, device and storage medium
Technical Field
The application relates to the technical field of target recognition and tracking, and in particular to a rocket recovery sub-level air recognition and tracking method, device and storage medium.
Background
In the technical field of rocket sub-level recovery air recognition and tracking, prior art schemes mainly concern target recognition and tracking. Conventional target recognition methods include feature-extraction and classification algorithms such as support vector machines (SVM), random forests, and convolutional neural networks (CNN). These methods build a model from a training dataset and classify the target according to the extracted features. In rocket sub-level recovery tasks, owing to the complexity of the environment and the particular nature of rocket motion, the target may be occluded by flames, smoke and other objects or blurred by high-speed motion, so the accuracy and reliability of rocket recovery sub-level recognition in the prior art cannot meet requirements.
Furthermore, common target tracking techniques include Kalman filtering and particle filtering, as well as deep-learning-based algorithms such as Siamese networks and MDNet. These methods estimate the position and motion trajectory of the target through state updates between the model and the observed data. However, given the high environmental complexity and the particular motion of the rocket recovery sub-level, using these methods alone may be affected by target occlusion, appearance change, data noise and other factors, making tracking unstable or causing it to fail.
In view of the foregoing, it is necessary to provide a new solution to the above-mentioned problems.
Disclosure of Invention
In order to solve the above technical problems, the application provides a rocket recovery sub-level air recognition and tracking method, device and storage medium, which can effectively and accurately recognize and classify the rocket recovery sub-level, improve the accuracy and reliability of target recognition, and effectively track the rocket recovery sub-level.
A rocket recovery sub-level air recognition and tracking method comprises the following steps:
acquiring data related to rocket recovery sub-level air identification and tracking, and preprocessing the acquired data;
fusing the preprocessed information to extract multi-mode characteristics of rocket recovery sub-stages;
carrying out rocket recovery sub-level tracking and data association by combining a KNN algorithm, comparing and matching target characteristics of a current frame with target characteristics of a previous frame, establishing a rocket recovery sub-level track, and updating a rocket recovery sub-level state;
based on target track and historical motion information of the rocket recovery sub-stage, predicting, locking and tracking rocket recovery sub-stage motion by utilizing a SORT algorithm;
the position, speed and attitude information of the rocket recovery sub-stage are updated in real time, and accurate feedback and decision basis are provided for rocket recovery sub-stage air identification and tracking.
Preferably, the data related to rocket recovery sub-level air identification and tracking comprises: image data, video data, distance data, and speed data.
Preferably, preprocessing the data includes denoising, filtering, and data correction.
Preferably, the multi-modal features in the multi-modal features of the extracted rocket recovery sub-stage include shape, color, texture, and velocity and acceleration of the rocket recovery sub-stage.
Preferably, in the fusing of the preprocessed information, the formula for performing information fusion calculation is as follows:
F = w_I·F_I + w_V·F_V + w_D·F_D + w_S·F_S
where F is the fused feature vector; w_I, w_V, w_D and w_S are the image, video, distance and speed feature weights used in fusion; and F_I, F_V, F_D and F_S are the vector representations of the image, video, distance and speed features, respectively.
Preferably, the performing rocket recovery sub-level tracking and data association by combining with the KNN algorithm, comparing and matching the target feature of the current frame with the target feature of the previous frame, establishing a trajectory of the rocket recovery sub-level, and updating the rocket recovery sub-level state includes:
rocket recovery sub-level detection and recognition are carried out based on the comprehensive feature vectors;
target positioning is carried out based on the comprehensive feature vector to obtain the rocket recovery sub-level position P;
Performing target speed estimation based on the speed feature vector to obtain a rocket recovery sub-level speed estimation value;
performing target tracking based on the rocket recovery sub-level position and the velocity estimation value, and determining the rocket recovery sub-level state;
and carrying out rocket recovery sub-level state comprehensive evaluation based on the fusion feature vector and the target detection result.
Preferably, the performing rocket recovery sub-level state comprehensive evaluation based on the fusion feature vector and the target detection result includes:
weighting calculation is carried out on the fusion feature vector;
normalizing the feature vector;
weighting calculation is carried out on the target detection result;
and comprehensive evaluation calculation is performed to determine the final fused recognition and positioning result of the rocket recovery sub-level.
Preferably, the comprehensive evaluation calculation formula is:
R = w_A·A + w_C·C + w_D·D
where R is the final fused target recognition and localization result, A is the accuracy, C is the confidence, D is the risk degree, and w_A, w_C and w_D are the accuracy, confidence and risk weights, respectively;
if the accuracy is not less than the accuracy threshold, the accuracy evaluation result is 1, otherwise, the accuracy evaluation result is 0;
if the confidence coefficient is not smaller than the confidence coefficient threshold value, the confidence coefficient assessment result is 1, otherwise, the confidence coefficient assessment result is 0;
if the risk level is not greater than the risk level threshold, the risk level evaluation result is 1, otherwise, the risk level evaluation result is 0.
According to another aspect of the present application, there is also provided a computing device including: a processor, a memory storing a computer program which, when executed by the processor, performs the rocket recovery sub-level air identification and tracking method of any one of claims 1 to 8.
According to another aspect of the application there is also provided a computer readable storage medium having stored thereon computer instructions which, when run on a computer, cause the computer to perform the rocket recovery sub-level air identification and tracking method according to any one of claims 1 to 8.
Compared with the prior art, the application has at least the following beneficial effects:
1. the rocket recovery sub-stage identification method and device can effectively accurately identify and classify rocket recovery sub-stages, and improve the accuracy and reliability of target identification.
2. The application can improve the stability and accuracy of rocket recovery sub-level tracking and ensure target tracking in the rocket sub-level recovery process in a high-speed dynamic environment.
3. The application can also improve the robustness of the rocket sub-level recovery air recognition and tracking system under the complex scene and target shielding condition.
Drawings
Some specific embodiments of the application will be described in detail hereinafter by way of example and not by way of limitation with reference to the accompanying drawings. The same reference numbers will be used throughout the drawings to refer to the same or like parts or portions. It will be appreciated by those skilled in the art that the drawings are not necessarily drawn to scale. In the accompanying drawings:
FIG. 1 is a schematic diagram of the overall flow of the rocket recovery sub-level air recognition and tracking method of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
As shown in fig. 1, a rocket recovery sub-level air recognition and tracking method comprises the following steps:
and S1, acquiring data related to rocket recovery sub-level air recognition and tracking, and preprocessing the acquired data.
Specifically, data from devices such as sensors, cameras and radars are collected, and the image data, video data, distance data and speed data are preprocessed. The preprocessing includes denoising, filtering and data correction, which improves data quality and accuracy.
And S2, fusing the preprocessed information, and extracting multi-mode characteristics of the rocket recovery sub-level.
The preprocessed information from the different devices (sensors, cameras and radars) is fused using a multi-modal data fusion technique. The extracted multi-modal features of the rocket recovery sub-level include its shape, color, texture, velocity and acceleration.
By comprehensively utilizing multi-mode data of images, videos, distances and speeds, the accuracy and the robustness of target detection and identification can be improved, and rocket sub-level targets can be accurately positioned and classified.
And S3, carrying out rocket recovery sub-level tracking and data association by combining a KNN algorithm, comparing and matching the target characteristics of the current frame with the target characteristics of the previous frame, establishing a rocket recovery sub-level track, and updating the rocket recovery sub-level state.
Specifically, the method comprises the following steps:
step S31, rocket recovery sub-level detection and recognition are carried out based on the comprehensive feature vectors;
step S32, performing target positioning based on the comprehensive feature vector to obtain the rocket recovery sub-level position P;
S33, estimating a target speed based on the speed feature vector to obtain a rocket recovery sub-level speed estimated value;
step S34, performing target tracking based on the rocket recovery sub-level position and the velocity estimation value, and determining the rocket recovery sub-level state;
and S35, comprehensively evaluating the state of the rocket recovery sub-stage based on the fusion feature vector and the target detection result.
Specifically, step S35 includes:
step S351, weighting calculation is carried out on the fusion feature vector;
step S352, normalizing the feature vector;
step S353, performing weighted calculation on the target detection result;
and step S354, performing comprehensive evaluation calculation to determine the final fused recognition and positioning result of the rocket recovery sub-level.
And S4, based on target track and historical motion information of the rocket recovery sub-stage, predicting, locking and tracking rocket recovery sub-stage motion by utilizing an SORT algorithm.
And S5, updating the position, speed and attitude information of the rocket recovery sub-stage in real time, and providing accurate feedback and decision basis for the aerial identification and tracking of the rocket recovery sub-stage.
As one embodiment of the application, the rocket recovery sub-stage is taken as a calculation target, and the implementation process of the rocket recovery sub-stage air recognition and tracking method is as follows:
first, the following parameters are defined:
image data (I): representing image data acquired by the camera.
Video data (V): representing continuous video data acquired by the camera.
Distance data (D): distance data representing the distance between the target and the rocket sub-level acquired by a sensor such as radar.
Speed data (S): data representing the relative velocity of the target and rocket sublevel acquired by a sensor such as radar.
Time (t): representing the time point of data acquisition in the rocket sub-level recovery time coordinate system.
Target position (P): representing the position of the target in the rocket-level coordinate system.
Target velocity (V_t): representing the velocity of the target in the rocket sub-level coordinate system.
Image features (F_I): representing the feature vector extracted from the image data.
Video features (F_V): representing the feature vector extracted from the video data.
Distance features (F_D): representing the feature vector extracted from the distance data.
Speed features (F_S): representing the feature vector extracted from the speed data.
Appearance features: representing the morphological features of the target, such as aspect ratio, geometry, etc.
Color features: representing the color histogram of the target.
Texture features: representing the texture characteristics of the target, such as texture frequency, contrast, etc.
Rocket sub-level tail-flame envelope features: representing the shape and envelope characteristics of the rocket sub-level tail flame.
Target state (X): the state vector representing the object includes information such as position, velocity, etc.
Weight (W): and the weights are used for representing the characteristics of different data sources and are used for fusing information of different data.
Color threshold (T_c): representing the threshold parameter for the color feature.
Texture threshold (T_t): representing the threshold parameter for the texture feature.
Envelope threshold (T_e): representing the threshold parameter for the rocket sub-level tail-flame envelope feature.
Fusion results (R): and representing the target recognition and positioning result after final fusion.
Data preprocessing:
a. Image data preprocessing: I' = PreprocessImage(I);
b. Video data preprocessing: V' = PreprocessVideo(V);
c. Distance data preprocessing: D' = PreprocessDistance(D);
d. Speed data preprocessing: S' = PreprocessSpeed(S).
wherein PreprocessImage (I) is a function of preprocessing the image data I.
PreprocessVideo (V) is a function of preprocessing the video data V.
PreprocessDistance (D) is a function of preprocessing the distance data D.
PreprocessSpeed(S) is a function for preprocessing the speed data S.
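The following is a minimal Python sketch of what the four preprocessing functions named above might look like, assuming the raw inputs arrive as NumPy arrays; the median-filter kernel, the moving-average window and the normalization step are illustrative assumptions, not operations fixed by the application.

```python
import numpy as np
from scipy.ndimage import median_filter


def preprocess_image(image: np.ndarray) -> np.ndarray:
    """PreprocessImage(I): denoise a frame and normalise it to [0, 1]."""
    denoised = median_filter(image, size=3)                  # suppress impulse noise
    return (denoised - denoised.min()) / (np.ptp(denoised) + 1e-8)


def preprocess_video(frames: np.ndarray) -> np.ndarray:
    """PreprocessVideo(V): apply the image pipeline to every frame."""
    return np.stack([preprocess_image(f) for f in frames])


def preprocess_distance(distances: np.ndarray) -> np.ndarray:
    """PreprocessDistance(D): clip invalid ranges and smooth the series."""
    d = np.clip(distances, 0.0, None)                        # ranges cannot be negative
    return np.convolve(d, np.ones(5) / 5.0, mode="same")     # moving-average filter


def preprocess_speed(speeds: np.ndarray) -> np.ndarray:
    """PreprocessSpeed(S): smooth the relative-velocity series."""
    return np.convolve(speeds, np.ones(5) / 5.0, mode="same")
```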
Information fusion:
the formula for information fusion calculation is as follows:
F = w_I·F_I + w_V·F_V + w_D·F_D + w_S·F_S
where F is the fused feature vector; w_I, w_V, w_D and w_S are the image, video, distance and speed feature weights used in fusion; and F_I, F_V, F_D and F_S are the vector representations of the image, video, distance and speed features, respectively.
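Read literally, the fusion step is a weighted sum of the four per-modality feature vectors. Below is a minimal sketch under the assumption that all four vectors have already been mapped to a common dimension; the weight values shown are placeholders, not values given by the application.

```python
import numpy as np


def fuse_features(f_img, f_vid, f_dist, f_speed,
                  w_img=0.4, w_vid=0.3, w_dist=0.2, w_speed=0.1):
    """F = w_I*F_I + w_V*F_V + w_D*F_D + w_S*F_S over equally sized vectors."""
    f_img, f_vid, f_dist, f_speed = map(np.asarray, (f_img, f_vid, f_dist, f_speed))
    return w_img * f_img + w_vid * f_vid + w_dist * f_dist + w_speed * f_speed


# Example: fuse four 8-dimensional feature vectors into one fused vector F.
fused = fuse_features(np.ones(8), np.ones(8), np.zeros(8), np.zeros(8))
```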
The target detection and identification formula is as follows:
DetectionResult = ObjectDetection(F);
wherein ObjectDetection (F) is a function for performing object detection based on the feature vector F, and DetectionResult is a returned detection result.
Target positioning and tracking includes:
a. Target positioning: P = ObjectLocalization(F);
b. Target velocity estimation: V_t = EstimateVelocity(F_S);
c. Target tracking: X = ObjectTracking(P, V_t);
the ObjectLocalization (F) is a function for performing target positioning based on the feature vector F, and the target position P can be returned by calculation.
EstimateVelocity(F_S) is a function for estimating the target velocity based on the speed feature F_S; the velocity estimate V_t is returned by calculation.
ObjectTracking(P, V_t) is a function for performing target tracking based on the target position P and the velocity estimate V_t; the target state X is returned by calculation.
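The three functions above amount to a small state-estimation pipeline: locate the target, estimate its velocity, and assemble the state vector X from position and velocity. The sketch below uses a finite-difference estimate between two successive positions as a stand-in for EstimateVelocity; the 0.1 s frame interval and the coordinates are purely illustrative.

```python
import numpy as np


def estimate_velocity(prev_position, position, dt):
    """Stand-in for EstimateVelocity: finite difference between successive positions."""
    return (np.asarray(position, float) - np.asarray(prev_position, float)) / dt


def object_tracking(position, velocity):
    """Stand-in for ObjectTracking(P, V_t): pack the state vector X = [P, V_t]."""
    return np.concatenate([np.asarray(position, float), np.asarray(velocity, float)])


p_prev, p_curr = np.array([120.0, 85.0]), np.array([123.0, 82.0])   # positions in two frames
v_est = estimate_velocity(p_prev, p_curr, dt=0.1)                   # -> [30.0, -30.0]
state = object_tracking(p_curr, v_est)                              # X = [x, y, vx, vy]
```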
The comprehensive evaluation process is as follows:
R = ObjectEvaluation(F, DetectionResult);
the objection evaluation (F, detectionResult) is a function for performing comprehensive evaluation based on the fusion feature vector F and the target detection result DetectionResult. The function comprehensively considers the weight and the threshold value of each feature, and calculates the evaluation result R of the target according to the set evaluation index. The evaluation result R can help the system to track the target in the rocket sublevel recovery process more accurately.
Specifically, the comprehensive evaluation includes the following procedures:
defining an evaluation index:
Accuracy: A;
Confidence: C;
Risk degree: D;
Feature weights: w_1, w_2, ..., w_n;
Evaluation index thresholds: T_A, T_C, T_D.
The weighted feature vector calculation formula is:
F_w = W·F, i.e. each component of the fused feature vector F is multiplied by its corresponding feature weight w_i;
The normalized feature vector calculation formula is:
F_n = Normalize(F_w).
The target detection result is then weighted in the same manner.
the comprehensive evaluation calculation formula is as follows:
R = w_A·A + w_C·C + w_D·D
where R is the final fused target recognition and localization result, A is the accuracy, C is the confidence, D is the risk degree, and w_A, w_C and w_D are the accuracy, confidence and risk weights, respectively.
During the calculation, if A ≥ T_A, then A = 1; otherwise A = 0.
If C ≥ T_C, then C = 1; otherwise C = 0.
If D ≤ T_D, then D = 1; otherwise D = 0.
Here T_A is the accuracy threshold, T_C is the confidence threshold and T_D is the risk threshold.
In addition, the method also comprises the step of updating the position, speed and attitude information of the rocket recovery sub-stage in real time:
R' = w_A·A' + w_C·C' + w_D·D'
where R' is the updated target recognition and localization result, and A', C' and D' are the accuracy, confidence and risk calculation results, respectively.
Through the above calculation, the comprehensive evaluation result R is obtained and used to track and lock onto the rocket sub-level recovery picture in real time. Dynamic tracking of the rocket sub-level is achieved by continuously updating the position and state of the target, and the accuracy, confidence and risk of the target are assessed according to the comprehensive evaluation result. This helps to improve dynamic lock-on during the rocket sub-level recovery process.
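Combining the threshold tests with the weighted sum above, the comprehensive evaluation reduces to three binary indicators that are weighted and summed. A minimal sketch follows; the threshold and weight values are illustrative assumptions only.

```python
def comprehensive_evaluation(accuracy, confidence, risk,
                             t_acc=0.8, t_conf=0.7, t_risk=0.3,
                             w_acc=0.4, w_conf=0.4, w_risk=0.2):
    """R' = w_A*A' + w_C*C' + w_D*D' with A', C', D' set to 0 or 1 by the thresholds."""
    a = 1 if accuracy >= t_acc else 0      # accuracy must reach its threshold
    c = 1 if confidence >= t_conf else 0   # confidence must reach its threshold
    d = 1 if risk <= t_risk else 0         # risk must stay at or below its threshold
    return w_acc * a + w_conf * c + w_risk * d


print(comprehensive_evaluation(0.92, 0.85, 0.10))   # -> 1.0, all three indicators pass
```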
In the rocket sub-level recovery air recognition and tracking method, the KNN algorithm is used to recognize the rocket sub-level target. Using the training data set, the KNN algorithm computes the similarity between known samples and the rocket sub-level data acquired through the wireless sensor network deployed in the pre-computed rocket sub-level landing area, and thereby determines the category of the target. The rocket sub-level can thus be rapidly and accurately distinguished from other targets, providing a starting point for subsequent target tracking.
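A minimal sketch of this KNN recognition step using scikit-learn; the feature dimension, the randomly generated training samples, the labels and k = 5 are placeholders standing in for the pre-computed landing-area data set described above.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Placeholder training set: fused feature vectors labelled 1 (rocket sub-level) or 0 (other target).
X_train = rng.random((200, 16))
y_train = rng.integers(0, 2, 200)

knn = KNeighborsClassifier(n_neighbors=5)    # k = 5 is an assumed value
knn.fit(X_train, y_train)

fused_feature = rng.random((1, 16))          # fused feature vector F of the current detection
is_rocket_sub_level = knn.predict(fused_feature)[0] == 1
```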
In the rocket sub-level recovery air recognition and tracking method, the SORT algorithm is used to track the recognized rocket sub-level target in real time. Combined with Kalman filtering, the SORT algorithm predicts the motion trajectory and position of the target, while an association step matches the currently observed target with the existing tracking result, ensuring tracking accuracy and stability. The SORT algorithm can reliably track the rocket sub-level target in real time in complex environments and provides key position and visual motion information for the recovery operation.
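A stripped-down sketch of the SORT idea follows: one constant-velocity Kalman filter per track, a predict step every frame, and an update once a detection has been associated with the track. The data-association step itself (IoU matching in the original SORT) is omitted, and the noise covariances and time step are assumed values, not parameters from the application.

```python
import numpy as np


class ConstantVelocityTrack:
    """Per-target Kalman filter over the state [x, y, vx, vy], as used in SORT-style trackers."""

    def __init__(self, x, y, dt=1.0):
        self.state = np.array([x, y, 0.0, 0.0])              # velocity initially unknown
        self.P = np.eye(4) * 10.0                            # state covariance
        self.F = np.array([[1, 0, dt, 0],                    # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                     # only the position is observed
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                            # process noise (assumed)
        self.R = np.eye(2) * 1.0                             # measurement noise (assumed)

    def predict(self):
        """Propagate the state one frame ahead and return the predicted position."""
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]

    def update(self, measurement):
        """Correct the state with an associated detection [x, y]."""
        z = np.asarray(measurement, dtype=float)
        innovation = z - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain
        self.state = self.state + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P


track = ConstantVelocityTrack(120.0, 85.0)
track.predict()                 # predicted position for the new frame
track.update([123.0, 82.0])     # correct with the associated detection
```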
By combining the KNN algorithm and the SORT algorithm, the method exploits the strengths of both in target recognition and real-time tracking. The KNN algorithm provides accurate target recognition capability, while the SORT algorithm ensures real-time tracking of the target, so that accurate recognition and stable tracking of the rocket sub-level are realized. The combined algorithm improves the performance and efficiency of the system and provides reliable support for rocket sub-level recovery.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. The rocket recovery sub-level air recognition and tracking method is characterized by comprising the following steps of:
acquiring data related to rocket recovery sub-level air identification and tracking, and preprocessing the acquired data;
fusing the preprocessed information to extract multi-mode characteristics of rocket recovery sub-stages;
carrying out rocket recovery sub-level tracking and data association by combining a KNN algorithm, comparing and matching target characteristics of a current frame with target characteristics of a previous frame, establishing a rocket recovery sub-level track, and updating a rocket recovery sub-level state;
based on target track and historical motion information of the rocket recovery sub-stage, predicting, locking and tracking rocket recovery sub-stage motion by utilizing a SORT algorithm;
the position, speed and attitude information of the rocket recovery sub-stage are updated in real time, and accurate feedback and decision basis are provided for rocket recovery sub-stage air identification and tracking.
2. A rocket recovery sub-level aerial identification and tracking method as recited in claim 1, wherein said rocket recovery sub-level aerial identification and tracking related data comprises: image data, video data, distance data, and speed data.
3. A rocket recovery sub-level air identification and tracking method as recited in claim 2, wherein preprocessing said data comprises denoising, filtering, and data correction.
4. A rocket recovery sub-stage airborne identification and tracking method as recited in claim 1, wherein said extracting of multi-modal characteristics of rocket recovery sub-stages comprises rocket recovery sub-stage shape, color, texture, and velocity and acceleration.
5. The rocket recovery sub-level air recognition and tracking method according to claim 4, wherein in the process of fusing the preprocessed information, the formula for performing information fusion calculation is as follows:
F = w_I·F_I + w_V·F_V + w_D·F_D + w_S·F_S
where F is the fused feature vector; w_I, w_V, w_D and w_S are the image, video, distance and speed feature weights used in fusion; and F_I, F_V, F_D and F_S are the vector representations of the image, video, distance and speed features, respectively.
6. The rocket recovery sub-level air recognition and tracking method according to claim 5, wherein the steps of combining the KNN algorithm to perform rocket recovery sub-level tracking and data association, comparing and matching the target features of the current frame with the target features of the previous frame, establishing the trajectory of the rocket recovery sub-level, and updating the rocket recovery sub-level state comprise:
rocket recovery sub-level detection and recognition are carried out based on the comprehensive feature vectors;
target positioning is carried out based on comprehensive feature vectors, and rocket recovery sub-level positions are obtained
Performing target speed estimation based on the speed feature vector to obtain a rocket recovery sub-level speed estimation value;
performing target tracking based on the rocket recovery sub-level position and the velocity estimation value, and determining the rocket recovery sub-level state;
and carrying out rocket recovery sub-level state comprehensive evaluation based on the fusion feature vector and the target detection result.
7. The rocket recovery sub-level air recognition and tracking method according to claim 6, wherein said rocket recovery sub-level state comprehensive evaluation based on the fusion feature vector and the target detection result comprises:
weighting calculation is carried out on the fusion feature vector;
normalizing the feature vector;
weighting calculation is carried out on the target detection result;
and comprehensive evaluation calculation is performed to determine the final fused recognition and positioning result of the rocket recovery sub-level.
8. The rocket recovery sub-level aerial identification and tracking method of claim 7, wherein said comprehensive evaluation calculation formula is:
R = w_A·A + w_C·C + w_D·D
where R is the final fused target recognition and localization result, A is the accuracy, C is the confidence, D is the risk degree, and w_A, w_C and w_D are the accuracy, confidence and risk weights, respectively;
if the accuracy is not less than the accuracy threshold, the accuracy evaluation result is 1, otherwise, the accuracy evaluation result is 0;
if the confidence coefficient is not smaller than the confidence coefficient threshold value, the confidence coefficient assessment result is 1, otherwise, the confidence coefficient assessment result is 0;
if the risk level is not greater than the risk level threshold, the risk level evaluation result is 1, otherwise, the risk level evaluation result is 0.
9. A computing device, comprising: a processor, a memory storing a computer program which, when executed by the processor, performs the rocket recovery sub-level air identification and tracking method of any one of claims 1 to 8.
10. A computer readable storage medium having stored thereon computer instructions which, when run on a computer, cause the computer to perform a rocket recovery sub-level air identification and tracking method according to any one of claims 1 to 8.
CN202310920723.2A 2023-07-26 2023-07-26 Rocket recovery sub-level air recognition and tracking method, device and storage medium Active CN116883686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310920723.2A CN116883686B (en) 2023-07-26 2023-07-26 Rocket recovery sub-level air recognition and tracking method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310920723.2A CN116883686B (en) 2023-07-26 2023-07-26 Rocket recovery sub-level air recognition and tracking method, device and storage medium

Publications (2)

Publication Number Publication Date
CN116883686A true CN116883686A (en) 2023-10-13
CN116883686B CN116883686B (en) 2024-03-12

Family

ID=88261996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310920723.2A Active CN116883686B (en) 2023-07-26 2023-07-26 Rocket recovery sub-level air recognition and tracking method, device and storage medium

Country Status (1)

Country Link
CN (1) CN116883686B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100237197A1 (en) * 2005-07-15 2010-09-23 Rosenfield Gary C Rocket ejection delay apparatus and/or method
WO2019227352A1 (en) * 2018-05-30 2019-12-05 深圳市大疆创新科技有限公司 Flight control method and aircraft
CN114387304A (en) * 2021-12-31 2022-04-22 北京旷视科技有限公司 Target tracking method, computer program product, storage medium, and electronic device
CN114397913A (en) * 2021-12-15 2022-04-26 中国人民解放军军事科学院国防科技创新研究院 Rocket wreckage searching and positioning system and method
CN114435631A (en) * 2022-02-17 2022-05-06 广州大学 Autonomous control system of spacecraft
CN114612506A (en) * 2022-02-19 2022-06-10 西北工业大学 Simple, efficient and anti-interference high-altitude parabolic track identification and positioning method
CN114757974A (en) * 2022-04-24 2022-07-15 汕头大学 Trajectory tracking method and system for multi-rotor unmanned aerial vehicle
CN115731268A (en) * 2022-11-17 2023-03-03 东南大学 Unmanned aerial vehicle multi-target tracking method based on visual/millimeter wave radar information fusion

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100237197A1 (en) * 2005-07-15 2010-09-23 Rosenfield Gary C Rocket ejection delay apparatus and/or method
WO2019227352A1 (en) * 2018-05-30 2019-12-05 深圳市大疆创新科技有限公司 Flight control method and aircraft
CN114397913A (en) * 2021-12-15 2022-04-26 中国人民解放军军事科学院国防科技创新研究院 Rocket wreckage searching and positioning system and method
CN114387304A (en) * 2021-12-31 2022-04-22 北京旷视科技有限公司 Target tracking method, computer program product, storage medium, and electronic device
CN114435631A (en) * 2022-02-17 2022-05-06 广州大学 Autonomous control system of spacecraft
CN114612506A (en) * 2022-02-19 2022-06-10 西北工业大学 Simple, efficient and anti-interference high-altitude parabolic track identification and positioning method
CN114757974A (en) * 2022-04-24 2022-07-15 汕头大学 Trajectory tracking method and system for multi-rotor unmanned aerial vehicle
CN115731268A (en) * 2022-11-17 2023-03-03 东南大学 Unmanned aerial vehicle multi-target tracking method based on visual/millimeter wave radar information fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
崔乃刚; 吴荣; 韦常柱; 徐大富: "Double-power fixed-time convergent sliding mode control method for rocket vertical return" (火箭垂直返回双幂次固定时间收敛滑模控制方法), 哈尔滨工业大学学报 (Journal of Harbin Institute of Technology), no. 04

Also Published As

Publication number Publication date
CN116883686B (en) 2024-03-12

Similar Documents

Publication Publication Date Title
CN105405154B (en) Target object tracking based on color-structure feature
CN105023278B (en) A kind of motion target tracking method and system based on optical flow method
CN115240130A (en) Pedestrian multi-target tracking method and device and computer readable storage medium
CN105160649A (en) Multi-target tracking method and system based on kernel function unsupervised clustering
CN110555868A (en) method for detecting small moving target under complex ground background
CN111666860A (en) Vehicle track tracking method integrating license plate information and vehicle characteristics
CN113379789A (en) Moving target tracking method in complex environment
Ge et al. Multi-target tracking based on Kalman filtering and optical flow histogram
Hossain et al. Fast-D: When non-smoothing color feature meets moving object detection in real-time
Yuan et al. High Speed Safe Autonomous Landing Marker Tracking of Fixed Wing Drone Based on Deep Learning
CN114549549A (en) Dynamic target modeling tracking method based on instance segmentation in dynamic environment
CN113313733A (en) Hierarchical unmanned aerial vehicle target tracking method based on shared convolution
CN116883686B (en) Rocket recovery sub-level air recognition and tracking method, device and storage medium
CN117011341A (en) Vehicle track detection method and system based on target tracking
CN116665097A (en) Self-adaptive target tracking method combining context awareness
CN116862832A (en) Three-dimensional live-action model-based operator positioning method
CN115855018A (en) Improved synchronous positioning and mapping method based on point-line comprehensive characteristics
Zhou et al. Road detection based on edge feature with GAC model in aerial image
Zhu et al. Moving vehicle detection and tracking algorithm in traffic video
Du CAMShift-Based Moving Object Tracking System
CN105654514A (en) Image target tracking method
Kainz et al. Estimating the Height of a Person from a Video Sequence
Sujatha et al. An innovative moving object detection and tracking system by using modified region growing algorithm
Sun et al. Research on wear recognition of electric worker’s helmet based on neural network
CN109544601A (en) A kind of object detecting and tracking method based on on-line study

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20231027

Address after: No. 1, Haixiang Middle Road, Fengcheng Street, Haiyang City, Yantai City, Shandong Province, 265100

Applicant after: Dongfang space technology (Shandong) Co.,Ltd.

Address before: No. 1, Haixiang Middle Road, Fengcheng Street, Haiyang City, Yantai City, Shandong Province, 265100

Applicant before: Dongfang space technology (Shandong) Co.,Ltd.

Applicant before: Oriental space technology (Beijing) Co.,Ltd.

Applicant before: Oriental space (Xi'an) Aerospace Technology Co.,Ltd.

Applicant before: Orient Space (Hainan) Technology Co.,Ltd.

GR01 Patent grant