CN110781949B - Asynchronous serial multi-sensor based track data fusion method and storage medium


Info

Publication number: CN110781949B
Application number: CN201911011310.2A
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN110781949A (application publication)
Inventors: 李海鹏, 陈文强
Applicant/Assignee: Fujian Hante Cloud Intelligent Technology Co., Ltd.
Legal status: Active (application granted)

Classifications

    • G PHYSICS » G06 COMPUTING; CALCULATING OR COUNTING » G06F ELECTRIC DIGITAL DATA PROCESSING » G06F 18/00 Pattern recognition » G06F 18/20 Analysing
        • G06F 18/23 Clustering techniques
        • G06F 18/24 Classification techniques
        • G06F 18/25 Fusion techniques


Abstract

The invention relates to a track data fusion method based on asynchronous serial multi-sensors, and a storage medium. The method comprises the following steps: acquiring target data through multiple sensors, wherein the target data comprise basic data and auxiliary data, and the multiple sensors comprise main sensors and auxiliary sensors; selecting one type of sensor from the main sensors as the reference sensor; taking the reference data acquired by the reference sensor as a starting point, pre-fusing the target data acquired by the other sensors in serial order to obtain pre-fused data; matching and fusing the obtained pre-fused data with the historical track data to obtain track update data; and post-fusing the obtained track update data with the target data acquired by the other sensors to obtain the final track data. This solves the problem that existing fusion algorithms are essentially synchronous; at the same time the asynchronous serial fusion framework is modularized, achieving strong generality.

Description

Asynchronous serial multi-sensor based track data fusion method and storage medium
Technical Field
The invention relates to the technical field of track data fusion, and in particular to a track data fusion method based on asynchronous serial multi-sensors and a storage medium.
Background
Today, ADAS (Advanced Driver Assistance Systems) has become a major hot spot of technology development, and both OEMs and suppliers devote great effort to it. At the vehicle level, a complete vehicle equipped with ADAS is greatly improved in both handling stability and safety, and some regions have even made certain ADAS functions mandatory. ADAS technology can be roughly divided into three major modules: "perception", "decision" and "control". The perception system is mainly responsible for acquiring external environment information and transmitting it to the decision system for analysis and processing; the decision system then forms a control signal that is input to the control system, and the controlled result is captured again by the perception system, finally forming a "perception-decision-control-perception" closed loop. The accuracy and comprehensiveness of the data output by the perception system fundamentally affect decision and control, and in turn determine the quality of the product.
At present, in the design of the "perception system", developers generally tend to adopt heterogeneous multi-sensor data fusion, for the following reasons:
a. a single sensor has large limitations: its detection range cannot meet the requirements, and it outputs only a single type of data;
b. combining several cheap (low-precision) sensors to replace one expensive (high-precision) sensor effectively reduces cost;
c. a heterogeneous multi-sensor combination has strong robustness and can adapt to many different working environments;
d. multi-sensor setups make it easier to comply with the ISO 26262 functional safety standard (ASIL D rating).
For the above reasons, the "multi-sensor data fusion algorithm" is very important. The work of the fusion algorithm mainly consists of integrating and associating data that come from the same target, managing the existing tracks and recording the relevant historical data. However, each sensor has its own shortcomings, which create no small "development difficulty" for the fusion algorithm at the back end, including:
millimeter-wave radar: position and speed information is accurate, but there are many false-alarm points, false tracks and duplicated targets;
vision sensor: can detect target-type information, but jumps strongly in the depth direction;
laser radar (lidar): cannot detect speed information.
As for the current state of development, there are several mainstream fusion algorithms, including Nearest Neighbor matching (NN), Joint Probabilistic Data Association (JPDA) and Multiple Hypothesis Tracking (MHT). Their basic mechanism is as follows: predict the target position at the previous moment (Previous Location) to the current moment, and match and associate the Predicted Location with the target position at the current moment (Current Location) through an association rule, thereby updating the track, tracking the target and distinguishing valid points from invalid points. However, these algorithms all belong to the data-association part of a fusion algorithm; the overall framework of the fusion algorithm is rarely addressed and is mostly designed as "synchronous fusion", that is, only sensor data at the current moment are used, and data association is performed after temporal and spatial correction so as to update the data. Moreover, the steps and interfaces of such fusion algorithms are relatively messy, and their generality for heterogeneous sensors is poor.
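To make this predict-then-associate mechanism concrete, the following is a minimal Python sketch assuming a constant-velocity prediction and simple Euclidean-distance gating; the function names, the gate value and the greedy one-to-one assignment are illustrative choices, not part of any specific prior-art algorithm:

```python
import numpy as np

def predict_location(prev_pos, prev_vel, dt):
    """Propagate the Previous Location to the current moment (constant velocity)."""
    return prev_pos + prev_vel * dt

def nearest_neighbor_associate(predicted, detections, gate=2.0):
    """Greedily match each Predicted Location to the closest Current Location.

    Returns (track_index, detection_index) pairs; detections left unmatched are
    candidate new targets, tracks left unmatched are candidate losses.
    """
    pairs, used = [], set()
    for i, p in enumerate(predicted):
        best_j, best_d = None, gate
        for j, c in enumerate(detections):
            d = float(np.linalg.norm(p - c))
            if j not in used and d < best_d:
                best_j, best_d = j, d
        if best_j is not None:          # inside the association gate -> valid point
            pairs.append((i, best_j))
            used.add(best_j)
    return pairs
```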
Disclosure of Invention
Therefore, it is necessary to provide a track data fusion method based on asynchronous serial multi-sensors and a storage medium, to solve the problems that most existing track data fusion algorithms perform synchronous fusion, that their algorithm steps and interfaces are messy, and that their generality for heterogeneous sensors is poor.
In order to achieve the above object, the inventor provides a track data fusion method based on asynchronous serial multi-sensors, comprising the following steps:
acquiring target data through multiple sensors, wherein the target data comprise basic data and auxiliary data, and the multiple sensors comprise main sensors, used for acquiring basic data and auxiliary data, and auxiliary sensors, used for acquiring auxiliary data;
selecting one type of sensor from the main sensors as the reference sensor, wherein the target data acquired by the reference sensor are the reference data of the detected target;
taking the reference data acquired by the reference sensor as a starting point, pre-fusing the target data acquired by the other sensors in serial order to obtain pre-fused data;
matching and fusing the obtained pre-fused data with the historical track data to obtain track update data;
and post-fusing the obtained track update data with the target data acquired by the other sensors to obtain the final track data.
As a further optimization, the step of matching and fusing the obtained pre-fused data with the historical track data to obtain track update data specifically comprises:
judging whether the target data acquired by the reference sensor at the current moment lie within the timestamp threshold;
if not, clearing the historical track information and generating new track data from the target data currently acquired by the reference sensor;
if yes, obtaining historical track data from the historical tracks, and fusing the target data acquired by the reference sensor with the historical track data to obtain new track data.
As a further optimization, the method further comprises the following step:
if damage to the reference sensor is detected, switching another main sensor to serve as the new reference sensor, until all main sensors are damaged.
As a further optimization, the method further comprises the following step:
for the target data acquired by the multiple sensors, merging the data acquired by sensors of the same type to form a data layer.
As a further optimization, the method further comprises the following step:
respectively preprocessing the target data acquired by the main sensors, the target data acquired by the auxiliary sensors, and the historical track data.
The inventor also provides another technical solution: a storage medium storing a computer program which, when executed by a processor, performs the following steps:
acquiring target data through multiple sensors, wherein the target data comprise basic data and auxiliary data, and the multiple sensors comprise main sensors, used for acquiring basic data and auxiliary data, and auxiliary sensors, used for acquiring auxiliary data;
selecting one type of sensor from the main sensors as the reference sensor, wherein the target data acquired by the reference sensor are the reference data of the detected target;
taking the reference data acquired by the reference sensor as a starting point, pre-fusing the target data acquired by the other sensors in serial order to obtain pre-fused data;
matching and fusing the obtained pre-fused data with the historical track data to obtain track update data;
and post-fusing the obtained track update data with the target data acquired by the other sensors to obtain the final track data.
As a further optimization, when the computer program is executed by the processor and performs the step of matching and fusing the obtained pre-fused data with the historical track data to obtain track update data, it specifically performs the following steps:
judging whether the target data acquired by the reference sensor at the current moment lie within the timestamp threshold;
if not, clearing the historical track information and generating new track data from the target data currently acquired by the reference sensor;
if yes, obtaining historical track data from the historical tracks, and fusing the target data acquired by the reference sensor with the historical track data to obtain new track data.
Further preferably, the computer program, when executed by the processor, further performs the following step:
if damage to the reference sensor is detected, switching another main sensor to serve as the new reference sensor, until all main sensors are damaged.
Further preferably, the computer program, when executed by the processor, further performs the following step:
for the target data acquired by the multiple sensors, merging the data acquired by sensors of the same type to form a data layer.
Further preferably, the computer program, when executed by the processor, further performs the following step:
respectively preprocessing the target data acquired by the main sensors, the target data acquired by the auxiliary sensors, and the historical track data.
Different from the prior art, in the above technical solution the sensors are divided into main sensors and auxiliary sensors according to the characteristics of the different sensors, one type of sensor is selected from the main sensors as the reference sensor, and the target data acquired by the reference sensor serve as the reference data of the detected target at the current moment. The reference data are then fused, in serial order, with the target data acquired by the other sensors and with the historical track data, the fusion content comprising the iterative update of the basic data and the addition of other supplementary information, so as to obtain the output track data. Because the target data acquired by the sensors are fused within an asynchronous serial fusion framework that follows the characteristics of the different sensors, the problem that existing fusion algorithms are essentially synchronous is solved; at the same time, the asynchronous serial fusion framework is modularized, achieving strong generality.
Drawings
FIG. 1 is a schematic flowchart of a method for asynchronous serial multi-sensor based track data fusion according to an embodiment;
FIG. 2 is a schematic diagram of one embodiment of a sensor class;
FIG. 3 is a flow diagram illustrating a timeline flow for track prediction according to an embodiment;
FIG. 4 is a flowchart illustrating the preprocessing of the data layers according to an embodiment.
Detailed Description
To explain the technical contents, structural features, objects and effects of the technical solution in detail, the following description is given with reference to the accompanying drawings and in conjunction with specific embodiments.
Referring to FIG. 1, the present embodiment provides a track data fusion method based on asynchronous serial multi-sensors, comprising the following steps:
step S110: acquiring target data through a multi-sensor, wherein the target data comprises basic data and auxiliary data, the multi-sensor comprises a main sensor and an auxiliary sensor, the main sensor is used for acquiring the basic data and the auxiliary data, and the auxiliary sensor is used for acquiring the auxiliary data;
referring to the sensor classification shown in fig. 2, the sensors are classified by the description objects based on the sensor output data, and the main classification principle is as follows: sensors that can describe the properties (position, speed, acceleration, azimuth, target type, etc.) of a potential moving object (vehicle, cyclist, pedestrian, etc.) are classified into a first category, such as millimeter wave Radar, Camera, Lidar, etc.; sensors that can describe other traffic information as well as the own vehicle information (lane lines, traffic signs, own vehicle speed, own vehicle yaw rate, etc.) are classified into a second category, such as high precision MAPs HD Map, RTK, Camera, inertial navigation unit IMU, high precision MAPs HD Map, real time differential positioning system RTD-GPS, etc. Meanwhile, the second type of sensor has the main function of auxiliary calculation of the output data of the first type of sensor, so that the target expression information is richer. The first sensor is a main sensor, and the second sensor is an auxiliary sensor. In fig. 2, the sensors in the lower half belong to a "first Type of Sensor (Type i Sensor)" and are formed by an "capital english letter + arabic numeral subscript", where the capital english letter indicates the Type of the Sensor (the same letter indicates the same Type of Sensor), and the numeral indicates the number of the same Type of Sensor. The first half of the sensors belong to the "Type II Sensor", and different Greek characters represent different types of sensors. In addition, each type of sensor in the type is only required to be loaded, so that the problem of the number of the sensors in the same type is solved.
Based on the above classification rules, some sensors may be assigned to both the first and the second category according to the types of data they output. A vision sensor, for example, outputs detected target attribute information, lane-line information and traffic-sign information, and is therefore placed in both sensor categories at the same time. In software terms, however, this classification conflict does not affect the fusion framework of this patent, because the input of the framework is organised not per sensor but per "data type", such as speed, position, angle and type.
This patent divides the sensor output data into "basic data" (Fundamental Data) and "auxiliary data" according to the physical information they describe. Basic data describe individual information of the target attributes (such as position, speed and type); with high probability there is repetition, i.e. several sensors can output the same basic data. Auxiliary data mainly describe shared information and assist calculations on the basic data (such as traffic information and ego-vehicle information), so that the target can carry more information content; auxiliary data are not repeated, i.e. each type of auxiliary data is output by only one sensor. Meanwhile, the first category of sensors (main sensors) can output both basic data and auxiliary data, while the second category (auxiliary sensors) outputs only auxiliary data. For example, the ego vehicle's millimeter-wave radar (first category) detects the target vehicle ahead and outputs its position and speed information (basic data); combined with the lane-line information (auxiliary data) output by the camera (second category), the lane in which the target vehicle is located (newly generated information content) can be determined.
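This two-way classification can be expressed as a small data model; the following Python sketch is illustrative only (class and field names are the editor's assumptions, not taken from the patent):

```python
from dataclasses import dataclass, field
from enum import Enum

class SensorClass(Enum):
    TYPE_I = "main"         # describes target attributes (Radar, Camera, Lidar)
    TYPE_II = "auxiliary"   # describes traffic / ego info (HD Map, IMU, RTK-GPS)

@dataclass
class TargetData:
    basic: dict = field(default_factory=dict)      # per-target: position, speed, type...
    auxiliary: dict = field(default_factory=dict)  # shared: lane lines, ego speed...

@dataclass
class Sensor:
    name: str
    sensor_class: SensorClass

    def read(self) -> TargetData:
        # A Type I (main) sensor may fill both fields;
        # a Type II (auxiliary) sensor fills only `auxiliary`.
        raise NotImplementedError
```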
Step S120: selecting one type of sensor from the main sensors as the reference sensor, wherein the target data acquired by the reference sensor are the reference data of the detected target. Taking the accuracy and stability of the basic data (position, speed, angle, acceleration, etc.) output by the main sensors as the main criterion, a unique type is selected from the several main sensors to serve as the "reference sensor". Meanwhile, in the track-management link, only data from the reference sensor are allowed to generate a new track; the other main sensors are not authorised to generate tracks. In addition, if the reference sensor is damaged and can no longer send data normally, the next type of main sensor is automatically sought as the reference sensor, until all main sensors have failed.
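A minimal sketch of this reference-sensor selection and fallback logic (the `is_healthy` check is an assumed hook, e.g. validation of the sensor's output data):

```python
def select_reference(main_sensors):
    """Pick the first working main sensor as the reference sensor.

    `main_sensors` is assumed to be ordered by the accuracy and stability
    of their basic data. Returns None once all main sensors have failed.
    """
    for sensor in main_sensors:
        if sensor.is_healthy():   # assumed hook: checks the sensor's output data
            return sensor
    return None                   # all main sensors damaged: system cannot work
```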
Step S130: taking the reference data acquired by the reference sensor as a starting point, pre-fusing the target data acquired by the other sensors in serial order to obtain pre-fused data;
Step S140: matching and fusing the obtained pre-fused data with the historical track data to obtain track update data;
Step S150: post-fusing the obtained track update data with the target data acquired by the other sensors to obtain the final track data.
The data fusion comprises track data fusion and auxiliary data fusion, and can be divided into three parts: pre-fusion, main fusion and post-fusion. Pre-fusion and post-fusion belong to the auxiliary data fusion, while the main fusion is the track data fusion. Track data fusion refers to the mutual fusion between the currently detected target data (the target data comprise basic data and auxiliary data) and the historical tracks (the track data likewise comprise basic data and auxiliary data); the track basic data are mainly updated with the target basic data. For example, when a target vehicle appears in front of the ego vehicle, a main sensor detects it and outputs basic data such as position and speed; these are matched against the track data and, after confirming that the data come from the same target, the track basic data are updated with the current target basic data. The update methods include weighted addition and the like, and the concrete method can be chosen according to actual needs. In addition, to ensure the comprehensiveness of the output information, the target auxiliary information should be added to the track output result. Meanwhile, after the track data have been updated, the related track-management work must be completed, i.e. generating new tracks, destroying lost tracks, and so on.
The auxiliary data fusion refers to data fusion across different types of sensors and is generally located after the track data fusion, but part of it takes place before the main fusion. For example, when matching against historical data, the dynamic/static state of a target is needed, so it must be computed in advance from data such as the ego speed and yaw rate; through such auxiliary calculations, the fused output data can carry more information. Because auxiliary data involve no data repetition, links such as matching and basic-data updating are not needed. Taking the calculation of the lane in which a target is located as an example: the "lane line" is shared information identical for every target, so no "matching" link is required; the "lane of the target" is auxiliary information and, owing to its discreteness, its state can be updated directly, without issues such as "weighted addition".
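A compact sketch may make this three-stage data flow concrete; the helper bodies below are placeholders over assumed dict-based data, and only the pre-fusion → main fusion → post-fusion ordering is taken from the text:

```python
def pre_fuse(reference, sensor_frame):
    # Auxiliary fusion: attach extra physical information to the reference data.
    return {**reference, **sensor_frame.get("auxiliary", {})}

def main_fuse(pre_fused, history_tracks):
    # Track fusion: match detections with historical tracks and update them
    # (matching, weighted update and track management happen here).
    return history_tracks  # placeholder: would return the updated tracks

def post_fuse(tracks, sensor_frames):
    # Auxiliary fusion again, now enriching the updated track data.
    for frame in sensor_frames:
        tracks = [dict(t, **frame.get("auxiliary", {})) for t in tracks]
    return tracks

def fuse_one_cycle(reference, other_frames, history_tracks):
    """Asynchronous serial cycle: pre-fusion -> main fusion -> post-fusion."""
    pre_fused = reference
    for frame in other_frames:          # serial order, reference data first
        pre_fused = pre_fuse(pre_fused, frame)
    updated = main_fuse(pre_fused, history_tracks)
    return post_fuse(updated, other_frames)
```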
By dividing the sensors into main and auxiliary sensors according to their characteristics, selecting one type of main sensor as the reference sensor, taking the target data acquired by the reference sensor as the reference data of the detected target at the current moment, and fusing them in serial order with the target data acquired by the other sensors and with the historical track data (the fusion content comprising the iterative update of the basic data and the addition of other supplementary information) to obtain the output track data, the method fuses the sensor data within an asynchronous serial fusion framework. This solves the problem that existing fusion algorithms are essentially synchronous, and the modularization of the "asynchronous serial" fusion framework achieves strong generality.
In this embodiment, partial software or hardware failures are quite likely during operation, leaving some sensors unable to output their data normally. Therefore, to strengthen the robustness of the fusion algorithm, the framework decouples the main data fusion from the auxiliary data fusion: when an auxiliary sensor fails and its auxiliary data can no longer be received, the detected target information can still be output (the main data such as position, speed and angle remain normal), only the partial information computed from the auxiliary data is lost; when all main sensors fail, the system cannot work, but if only some of them fail (main sensors that work normally still exist), the fusion system switches data sources and keeps working normally.
Referring to the track-prediction timeline shown in FIG. 3, the method of this embodiment adopts an asynchronous serial multi-sensor fusion architecture, which no longer forces data from different sources to be synchronised. Instead, the existing historical tracks are predicted to the current moment and a timestamp threshold is defined; data acquired by a sensor within the timestamp threshold participate in the fusion, i.e. the track update. The step of matching and fusing the obtained pre-fused data with the historical track data to obtain track update data specifically comprises:
judging whether the target data acquired by the reference sensor at the current moment lie within the timestamp threshold;
if not, clearing the historical track information and generating new track data from the target data currently acquired by the reference sensor;
if yes, obtaining historical track data from the historical tracks, and fusing the target data acquired by the reference sensor with the historical track data to obtain new track data.
The processing time of the main program in each frame is recorded as the reference time. On the one hand, the reference time is used to calculate the timestamp threshold of each data item; on the other hand, the difference between the reference times of two adjacent frames is used as the prediction step of the track prediction, given by: step = t_current - t_last. After the prediction step has been obtained, the historical track data output by the previous frame are predicted to the current moment through a prediction model, where the prediction model may be a Constant Velocity model, a Constant Acceleration model, a Constant Turning Rate model, an Interacting Multiple Model (IMM), or the like. The predicted track timestamp is taken as the reference time, and the interval of one data period before and after it as the timestamp threshold, where the data period is defined by the sensor itself. It is then checked whether the timestamp of the target data acquired by the reference sensor lies within the timestamp threshold. If the check is true, the historical track data are obtained from the historical tracks, and the target data acquired by the reference sensor are fused with them to obtain the new track data. If the timestamp of the reference sensor data lies outside the threshold, the historical track information is cleared immediately, a new track is generated from the current reference sensor data, and the "timestamp threshold" is updated with the current reference-sensor data timestamp as its reference. Moreover, the several timestamp screenings mentioned in this embodiment need not be designed uniformly; an appropriate threshold can be chosen for each individually.
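A sketch of the prediction step and the timestamp gate just described (the dict-based tracks and the constant-velocity model are illustrative choices; the text also allows CA, CTR and IMM models):

```python
def predict_tracks(tracks, t_current, t_last):
    """Predict last frame's tracks to the current moment (constant velocity)."""
    step = t_current - t_last            # prediction step between adjacent frames
    for trk in tracks:
        trk["pos"] = trk["pos"] + trk["vel"] * step
        trk["timestamp"] = t_current
    return tracks

def within_timestamp_threshold(data_ts, predicted_ts, period):
    """Gate: one data period before/after the predicted track timestamp."""
    return abs(data_ts - predicted_ts) <= period
```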
In this embodiment, the reference sensor may become damaged; to solve this problem, the method further comprises the following step:
if damage to the reference sensor is detected, switching another main sensor to serve as the new reference sensor, until all main sensors are damaged.
In an early stage, the basic data of the various sensors are compared, and the sensor with high accuracy and good stability is selected as the reference sensor. During operation of the fusion system, whether the reference sensor works normally can be confirmed by checking its output data; if the reference sensor is found to be damaged, another main sensor is switched in as the reference sensor, and its basic data serve as the detection-target reference data.
Referring to the preprocessing of the data layers shown in FIG. 4, in this embodiment, in order to meet the sensing-range requirement, the method further comprises the following step:
for the target data acquired by the multiple sensors, merging the data acquired by sensors of the same type to form a data layer.
Because the sensing range of a single sensor is small, several sensors of the same kind are usually adopted to cover the required sensing range, and before the data acquired by the sensors are fused, data from sensors belonging to the same type are merged to form a data layer. In this embodiment, the historical track data are defined as the first data layer, layers 2 to N are the data layers acquired by the main sensors, and the data acquired by the auxiliary sensors form one further layer per auxiliary sensor.
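The layering can be sketched as follows (the frame dictionary schema is an assumption for illustration):

```python
from collections import defaultdict

def build_data_layers(history_tracks, sensor_frames):
    """Layer 1: historical tracks; layers 2..N: main sensors merged per type;
    one further layer per auxiliary sensor."""
    main_by_type = defaultdict(list)
    auxiliary_layers = []
    for frame in sensor_frames:
        if frame["class"] == "main":
            main_by_type[frame["type"]].extend(frame["targets"])  # merge same type
        else:
            auxiliary_layers.append(frame["targets"])
    return [history_tracks] + list(main_by_type.values()) + auxiliary_layers
```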
In this embodiment, the method further comprises converting the data acquired by the sensors: the data are first converted into a rectangular or polar coordinate system, and at the same time the data acquired by the reference sensor are converted into the common coordinate system for subsequent matching.
In this embodiment, the sensing area of a sensor may further be divided into a region of interest and an invalid region according to the index requirements: data in the region of interest are retained, while data in the invalid region are discarded. The region-of-interest operation is redundant, i.e. region-of-interest judgments exist under polar coordinates, under rectangular coordinates, and before/after the merging of same-type sensors, but none of them is essential.
In this embodiment, the data acquired by a main sensor may contain multiple data points from the same target, and clustering is performed to avoid generating the same track repeatedly. For example, the millimeter-wave radar may feed back several identical target points, and lidar point-cloud data contain a large number of repeated points from the same target. The clustering methods include K-means, density clustering, grid clustering and the like.
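For illustration, a simple greedy distance-based clustering, a simpler stand-in for the K-means / density / grid methods named above, that collapses repeated points from one target into a single measurement:

```python
import numpy as np

def cluster_points(points, radius=1.5):
    """Merge detections closer than `radius` (metres, assumed) into one cluster."""
    clusters = []   # each cluster: {"centroid": ndarray, "members": [ndarray, ...]}
    for p in (np.asarray(q, dtype=float) for q in points):
        for c in clusters:
            if np.linalg.norm(p - c["centroid"]) < radius:
                c["members"].append(p)
                c["centroid"] = np.mean(c["members"], axis=0)
                break
        else:
            clusters.append({"centroid": p, "members": [p]})
    return [c["centroid"] for c in clusters]
```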
In this embodiment, the method further comprises the following step:
respectively preprocessing the target data acquired by the main sensors, the target data acquired by the auxiliary sensors, and the historical track data.
In this embodiment, since there are many targets, the number of targets needs to be checked to ensure that every target receives the corresponding operation in the loop. The preprocessing of the reference data acquired by the reference sensor proceeds as follows: first, a region-of-interest judgment is performed on the reference data; if the basic data acquired by the reference sensor lie within the region of interest, their coordinates are converted into the common coordinate system and the region of interest is judged again; after this second judgment, the converted reference data are merged by source type, and the region of interest is judged once more; the merged reference data are then clustered, and a timestamp-threshold judgment is performed on the clustered reference data. If they lie within the timestamp threshold, a CRC check is performed, and if the check passes, the reference data are output. If they do not lie within the timestamp threshold, the historical tracks are cleared, new tracks are generated from the reference data, the timestamp threshold is updated, and the new tracks are output after the CRC check.
The preprocessing of the target data acquired by the other main sensors is almost the same as that of the reference data acquired by the reference sensor: the data undergo sensor-data conversion, merging of same-type sensor data, region-of-interest judgment, clustering of same-target data, timestamp-threshold screening and target-count redundancy checking. Specifically: first, a region-of-interest judgment is performed on the target data acquired by the main sensor; if the data lie within the region of interest, they are coordinate-converted and the region of interest is judged again; after passing, the converted target data are merged by source type and the region of interest is judged once more; the merged target data are then clustered, and a timestamp-threshold judgment is performed on the clustered data. If the data lie within the timestamp threshold, a CRC check is performed and, if it passes, the target data acquired by the main sensor are output; data lying outside the timestamp threshold are discarded directly.
As for the preprocessing of the auxiliary data acquired by the auxiliary sensors, the main processing is timestamp screening: it is judged whether the auxiliary data lie within the timestamp threshold; if so, the auxiliary data are selected, and if not, the auxiliary data outside the timestamp threshold are discarded.
The preprocessing of the historical track data mainly consists of a rationality judgment on the historical tracks after track prediction: if an existing historical track is reasonable it is used, and unreasonable historical track data are discarded. The rationality judgment comprises: the predicted number of tracks equals the number of tracks output by the previous frame, and no track state has to be transformed.
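The four preprocessing procedures above can be viewed as ordered step pipelines; a minimal sketch follows, where every named step is an assumed callable on a data layer, mirroring the order given in the text:

```python
def preprocess(data, steps):
    """Run a data layer through an ordered chain of preprocessing steps."""
    for step in steps:
        data = step(data)
    return data

# Chains assumed from this section (every identifier below is illustrative):
# reference_steps = [roi_filter, to_common_frame, roi_filter, merge_same_type,
#                    roi_filter, cluster_same_target, timestamp_gate, crc_check]
# main_steps      = [roi_filter, convert_coordinates, roi_filter, merge_same_type,
#                    roi_filter, cluster_same_target, timestamp_gate, crc_check]
# auxiliary_steps = [timestamp_gate]
# track_steps     = [predict_to_current, rationality_check]
```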
In this embodiment, the pre-fusion belongs to the auxiliary fusion type, and its content mainly comprises two parts: reference-sensor-data/basic-data fusion and reference-sensor-data/auxiliary-data fusion. There is no mandatory order between the two parts; their main function is to add more physical information to the reference data.

Reference-sensor-data/basic-data fusion mainly refers to the fusion between the basic data in the reference sensor data and the basic data of the other main sensors. Both parts of data describe each detected target individually; therefore, this part must complete a target-matching link, and the fusion operation may only be performed after confirming that the information comes from the same target. Target matching involves a way of computing the matching degree (the credibility that two pieces of description information come from the same target), followed by weighted fusion: if a basic datum is a continuous variable, it is fused by weighted addition; if it is a discrete variable, its state can be updated directly. The matching degree can be computed in various ways, including dot product (Dot-Product), nearest neighbour (Euclidean distance), JPDA/MHT and residual (Mahalanobis distance); the concrete way can be chosen according to actual needs. Meanwhile, weighted fusion is not a hard requirement, and the weight coefficients can be generated in various ways according to the robustness/accuracy requirements. In the asynchronous serial fusion framework of this embodiment, the main sensor whose output data have the highest accuracy and best stability is selected as the reference sensor, and the other main sensors are only responsible for adding their discrete information to the reference sensor data; thus the continuous-variable weight of the reference sensor data is set to 1, while the continuous-variable weight of the other main sensors' output data is set to 0.

Reference-sensor-data/auxiliary-data fusion mainly refers to the fusion between the reference data acquired by the reference sensor and the auxiliary data output by the other sensors, where the reference data can only come from the reference sensor, but the auxiliary data can come from both main and auxiliary sensors. Auxiliary data mainly describe unified, non-individual information such as traffic information and ego-vehicle information, so this fusion needs no target-matching link; additional physical information about the targets is computed directly from the auxiliary data and attached to the main data stream.
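A sketch of the matching degree and the weighted update rule with the 1/0 weights just described; the Euclidean (nearest-neighbour) variant is shown, while dot-product, JPDA/MHT and Mahalanobis scores are the alternatives named in the text:

```python
import numpy as np

def matching_degree(a, b, scale=1.0):
    """Euclidean credibility that descriptions a and b come from one target."""
    return float(np.exp(-np.linalg.norm(np.asarray(a) - np.asarray(b)) / scale))

def fuse_basic(ref_value, other_value, w_ref=1.0, w_other=0.0, discrete=False):
    """Continuous variables: weighted addition; discrete variables: direct update.

    The default weights 1/0 reflect this embodiment: the reference sensor keeps
    the continuous data, other main sensors contribute only discrete information.
    """
    if discrete:
        return other_value if other_value is not None else ref_value
    return w_ref * ref_value + w_other * other_value
```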
The main data fusion belongs to track fusion and mainly operates on the reference data and the predicted track data, as detailed below:
(1) Target-data state definitions
Invalid target: the target data lie outside the ROI (region of interest), or the carried timestamp does not satisfy the reference timestamp threshold;
Selected target: the matching degree between the target data and some track is higher than the threshold;
Matched target: the target data have been successfully matched to some track and the track data updated;
Valid target: the remaining data points that have undergone none of the above operations are valid targets.
(2) Track state definitions
Empty track: a reset track cache slot;
Valid track: a track successfully matched and updated by the target data at the current moment;
Lost track: a track not matched and updated by the target data in the current frame;
Invalid track: a track not matched and updated by the target data for several consecutive frames, i.e. one whose number of lost frames exceeds the number of successfully matched frames.
(3) Related parameter definitions
Track visible count: the number of times the track has been updated by detected target data;
Track loss count: the number of times the track has failed to match a detected target;
Matching-degree threshold: when the matching degree between the reference data and a track is higher than a certain value, it is recorded at the corresponding position of the matching matrix, otherwise that position is reset;
Consecutive-loss threshold: when the number of consecutively lost frames of a lost track exceeds this threshold, the lost track is converted into an invalid track and then reset to an empty track.
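These states can be captured by two small enumerations; the naming is the editor's, while the four-plus-four states are taken from the definitions above:

```python
from enum import Enum, auto

class TargetState(Enum):
    INVALID = auto()    # outside the ROI, or timestamp outside the threshold
    SELECTED = auto()   # matching degree with some track above the threshold
    MATCHED = auto()    # successfully matched to a track, track data updated
    VALID = auto()      # untouched data point: candidate for a new track

class TrackState(Enum):
    EMPTY = auto()      # reset track cache slot
    VALID = auto()      # matched and updated by target data this frame
    LOST = auto()       # not matched in the current frame, prediction kept
    INVALID = auto()    # lost for more consecutive frames than the threshold
```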
The main data fusion process comprises track-data updating and track management. For track-data updating, the matching operation between the current target information and the track information is performed first, and every matching degree greater than the threshold is recorded; the target-data states are marked for the first time, i.e. the selected targets are marked. The pairs are then sorted from high to low matching degree, pairs with a high matching degree are updated preferentially per track, and successfully updated pairs no longer take part in the subsequent screening. There may, however, be the problems of "multiple targets matching the same track" and "multiple tracks matching the same target"; the concrete solution may be weight ratios, directly selecting the optimal match, or one defined by the developer according to the related requirements, and is not repeated here. Finally, the target-data states are updated a second time, i.e. the successfully matched targets are marked; meanwhile the first update of the track states must be completed, i.e. the valid tracks are marked. For track management, the visible count and loss count of each track are updated according to the track state (whether the current target data were matched successfully). The concrete update is: for a track matched successfully in this frame, the visible count is incremented and the loss count reset to zero; for a track whose matching failed in this frame, the loss count is incremented while the visible count stays unchanged. This logic is designed by the developer and is not a mandatory requirement. The track states are then updated a second time: tracks not matched in this frame are marked as lost tracks and their predicted values retained. Next, invalid tracks are selected and reset according to the comparison between the track loss count and the consecutive-loss threshold. Finally, the current target data in the valid state are filled into empty tracks as newly generated tracks, and the third target-state update and the third track-state update are performed, i.e. valid targets become matched targets and empty tracks become valid tracks.
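The counter updates of the track management can be sketched as below; the dict-based track and the concrete `loss_limit` value are assumptions, and, as the text notes, this logic is up to the developer:

```python
def manage_track(track, matched_this_frame, loss_limit=5):
    """Update visible/loss counts and the track state after main fusion."""
    if matched_this_frame:
        track["visible"] += 1
        track["lost"] = 0
        track["state"] = "valid"
    else:
        track["lost"] += 1            # visible count stays unchanged
        track["state"] = "lost"       # the prediction value is retained
        if track["lost"] > loss_limit:
            track["state"] = "invalid"   # will be reset to an empty track slot
    return track
```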
The post-fusion belongs to the auxiliary fusion type; its input data are the updated "new track data" together with the data of the other main sensors and the auxiliary sensors. The fusion content mainly comprises two parts: new-track-data/basic-data fusion and new-track-data/auxiliary-data fusion. There is no mandatory order between the two parts, and their main function is to add more physical information to the new track data. This module is located after the main fusion module in the asynchronous serial fusion framework; its basic flow is the same as that of the pre-fusion, with "reference data" simply replaced by "new track data".
In this embodiment, the pre-fusion link and the post-fusion link involve the fusion of several kinds of sensors. Neither the main sensors nor the auxiliary sensors have a fixed fusion order, which can be chosen according to actual needs, but the fused data must include the reference sensor data.
In another embodiment, a storage medium stores a computer program which, when executed by a processor, performs the steps of:
acquiring target data through a multi-sensor, wherein the target data comprises basic data and auxiliary data, the multi-sensor comprises a main sensor and an auxiliary sensor, the main sensor is used for acquiring the basic data and the auxiliary data, and the auxiliary sensor is used for acquiring the auxiliary data;
referring to the sensor classification shown in fig. 2, the sensors are classified by the description objects based on the sensor output data, and the main classification principle is as follows: sensors that can describe the properties (position, speed, acceleration, azimuth, target type, etc.) of a potential moving object (vehicle, cyclist, pedestrian, etc.) are classified into a first category, such as millimeter wave Radar, Camera, Lidar, etc.; sensors that can describe other traffic information as well as the own vehicle information (lane lines, traffic signs, own vehicle speed, own vehicle yaw rate, etc.) are classified into a second category, such as high precision MAPs HD Map, RTK, Camera, inertial navigation unit IMU, high precision MAPs HD Map, real time differential positioning system RTD-GPS, etc. Meanwhile, the second type of sensor has the main function of auxiliary calculation of the output data of the first type of sensor, so that the target expression information is richer. The first sensor is a main sensor, and the second sensor is an auxiliary sensor. In fig. 2, the sensors in the lower half belong to a "first Type of Sensor (Type i Sensor)" and are formed by an "capital english letter + arabic numeral subscript", where the capital english letter indicates the Type of the Sensor (the same letter indicates the same Type of Sensor), and the numeral indicates the number of the same Type of Sensor. The first half of the sensors belong to the "Type II Sensor", and different Greek characters represent different types of sensors. In addition, each type of sensor in the type is only required to be loaded, so that the problem of the number of the sensors in the same type is solved.
Based on the classification rules described above, some sensors may be assigned to both the first and second classes of sensors based on the type of data they output. Such as a visual sensor, whose output data includes detected object attribute information, lane line information, and traffic identification information, and thus will be divided into two different types of sensor categories at the same time. However, in terms of software, the above classification conflict does not affect the fusion algorithm framework of the present patent, because the fusion algorithm framework input is not in units of sensors, but in units of "data types", such as speed, position, angle, type, and the like.
The present patent divides the sensor output Data into "Fundamental Data" and "auxiliary Data" according to the physical information described by the sensor output Data. The "basic data" describes individual information (such as position, speed, type) of the target attribute, and there is a repetitive phenomenon in a large probability that a plurality of sensors can output the basic data. The auxiliary data mainly describes common information, and the auxiliary basic data is calculated (such as traffic information and vehicle information), so that the target can carry more information contents, and the auxiliary information does not have a repeated phenomenon, namely one type of auxiliary data can be output by only one sensor. Meanwhile, the first type of sensor (primary sensor) can output "basic data" and "auxiliary data", while the second type of sensor (auxiliary sensor) can output only "auxiliary data". For example, the millimeter wave radar (first type sensor) of the vehicle detects the target vehicle ahead and outputs the position and speed information (basic data) of the target vehicle, and the lane number (newly generated information content) of the target vehicle can be determined by combining the lane line information (auxiliary data) output by the camera (second type sensor).
Selecting a sensor of the same type from the main sensors as a reference sensor, wherein target data acquired by the reference sensor is reference data of a detection target; the method is characterized in that accuracy and stability of basic data (position, speed, angle, acceleration and the like) in data output by a main sensor are taken as main reference bases, a unique type is selected from a plurality of main sensors to be used as a 'reference sensor', meanwhile, in a track management link, only data from the reference sensor is allowed to generate a new track, and other main sensors are not authorized to generate the track, and in addition, if the reference sensor is damaged and cannot normally send data, the next type of main sensor is automatically searched to be used as the reference sensor until all the main sensors are completely damaged.
Pre-fusing the target data collected by other sensors in sequence according to a serial sequence by taking the reference data collected by the reference sensor as a starting point to obtain pre-fused data;
matching and fusing the obtained pre-fusion data and the historical track data to obtain track updating data;
and performing post-fusion on the obtained flight path updating data and target data acquired by other sensors to obtain final flight path data.
The data fusion comprises track data fusion and auxiliary data fusion and can be divided into three parts, namely pre-fusion, main fusion and post-fusion, wherein the pre-fusion and the post-fusion belong to the auxiliary data fusion, and the main fusion belongs to the track data fusion; the flight path data fusion refers to the mutual fusion between the current detection target data (the target data comprises basic data and auxiliary data) and the historical flight path (the flight path data comprises the basic data and the auxiliary data). Wherein, the track basic data is mainly updated by using the target basic data. For example, when a target vehicle appears in front of a vehicle, the target vehicle is detected by a main sensor, the main sensor outputs basic data such as position, speed and the like, the basic data is matched with track data, the track basic data is updated by using the current target basic data after the data are confirmed to come from the same target, the updating mode comprises weight addition and the like, and the specific mode can be selected according to actual needed polarity. In addition, in order to ensure the comprehensiveness of the output data information, target auxiliary information should be added to the track output result. Meanwhile, after the track data is updated, the relevant work of track management, namely generating a new track, destroying the lost track and the like, needs to be finished.
The auxiliary data fusion refers to data fusion among different types of sensors, and is generally located after track data fusion, but partial auxiliary data fusion exists before main fusion, for example, when historical data fusion is matched, a target dynamic and static state needs to be calculated, so that the target dynamic and static state needs to be calculated in advance by using data such as the speed of a vehicle and the yaw rate, and the fused output data can carry more information through auxiliary data calculation. Because the problem of data repetition does not exist in the auxiliary data fusion, links such as matching, basic data updating and the like are not needed. For example, in the case of calculating the lane where the target is located, "lane line" is the shared information that is not different from any target, so that a "matching" loop is not required, and the "lane where the target is located" is the auxiliary information, and because of the discreteness of the information, the state can be directly updated, and the problem of "weight addition" and the like does not exist.
By dividing the sensors into main sensors and auxiliary sensors according to the characteristics of the different sensors, meanwhile, one type of sensor is selected as a reference sensor from the main sensor, target data acquired by the reference sensor is used as reference data of a detection target at the current moment, and the target data and historical track data acquired by other sensors are sequentially fused according to a serial sequence, wherein the fusion content comprises iterative update of basic data and addition of other additional information so as to obtain output track data, the problem that the existing fusion algorithm is basically synchronous fusion is solved by fusing target data acquired by the sensors based on an asynchronous serial fusion algorithm framework according to the characteristics of different sensors, meanwhile, the 'asynchronous serial' fusion algorithm framework is modularized, so that the effect of strong universality is achieved.
In this embodiment, in the running process, a situation of partial software or hardware damage is highly likely to occur, so that some sensors cannot normally output related data. Therefore, in order to enhance the robustness of the fusion algorithm, the fusion algorithm framework mutually strips the main data fusion and the auxiliary data fusion: when the auxiliary sensor is damaged, under the condition that the auxiliary data cannot be normally received, the detection target information (main data such as position, speed, angle and the like are normal) can be still output, but partial information (information calculated by the auxiliary data) is lost; when the main sensors are completely damaged, the system cannot work, but if only part of the main sensors are completely damaged (the main sensors which can normally work still exist), the fusion system switches data and still keeps normal work.
Referring to the time axis flow of the track prediction shown in fig. 3, in the present embodiment, the method adopts an asynchronous serial multi-sensor fusion architecture, which does not force data from different sources to be synchronized any more, and further replaces: the method comprises the following steps of predicting the existing historical track to the current moment, defining a timestamp threshold, updating the track by data collected by a sensor within the timestamp threshold, namely fusing, running a computer program by a processor, and executing the following steps when the step of matching and fusing the obtained pre-fusion data with the historical track data to obtain track updating data is executed:
judging whether the target data collected by the reference sensor at the current moment is within the time stamp threshold value,
if not, removing historical track information, and generating new track data according to target data acquired by the current reference sensor;
if yes, historical track data are obtained according to the historical track, and target data acquired by the reference sensor and the historical track data are fused to obtain new track data.
By recording the processing time of the main program of each frame as a reference time, wherein the reference time can be used for calculating a timestamp threshold of each data on one hand, and on the other hand, the difference between the reference times of every two adjacent frames is used as a prediction step length of track prediction, the specific formula is as follows:step=tcurrent-tlast. After the prediction step length is obtained, the historical track data output from the previous frame is predicted to the current time through a prediction Model, wherein the prediction Model comprises a Constant Velocity Model (Constant Velocity Model), a Constant Acceleration Model (Constant Acceleration Model), a Constant Turning Rate Model (Constant Turning Rate Model), an interactive Multiple Model (interactive Multiple Model) and the like. And taking the predicted track time stamp as reference time, and taking the time of the front frame and the time of the rear frame as a time stamp threshold, wherein the data period is defined by the sensor or the sensor. Checking whether the timestamp of the target data collected by the reference sensor is within a timestamp threshold, and if so, checkingIf the verification result is true, obtaining historical track data according to the historical track, fusing target data acquired by the reference sensor with the historical track data to obtain new track data, if the timestamp of the reference sensor data is out of the threshold value, immediately clearing the historical track information, simultaneously generating a new track by using the current reference sensor data, and updating the 'timestamp threshold value' by taking the current reference sensor data timestamp as the reference. Moreover, the several timestamp screens mentioned in this embodiment are not necessarily designed in a uniform manner, and may be individually selected to have appropriate thresholds.
In this embodiment, there may be a case where the reference sensor is damaged, so to solve this problem, the following steps are further performed:
and if the damage of the reference sensor is detected, switching other main sensors to be used as new reference sensors until all the main sensors are damaged.
In an early stage, the basic data of the various sensors are compared, and the sensor with the highest accuracy and stability is selected as the reference sensor. While the fusion system is running, whether the reference sensor works normally can be confirmed by checking its output data; if the reference sensor is found to be damaged, another main sensor is switched in as the new reference sensor, and its basic data serve as the detection target reference data.
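A hedged sketch of this fallback logic follows; the health predicate and the assumption that the sensor list is pre-sorted by accuracy and stability are illustrative.

```python
def select_reference(main_sensors, is_healthy):
    """Return the first healthy main sensor as the reference sensor;
    main_sensors is assumed sorted best-first by accuracy and stability."""
    for sensor in main_sensors:
        if is_healthy(sensor):
            return sensor
    return None  # all main sensors damaged: the fusion system cannot run
```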
Referring to the data-layer preprocessing flow shown in Fig. 4, in this embodiment, to satisfy the required sensing range, the computer program is executed by the processor and further performs the following steps:
and for target data acquired by multiple sensors, combining the data acquired by the same type of sensors to form a data layer.
Because the sensing range of a single sensor is small, several sensors of the same type are usually deployed to cover the required sensing range. Before the collected data are fused, data from sensors of the same type are merged to form a data layer. In this embodiment, the historical track data is defined as the first data layer, layers 2 to N are the data layers collected by the main sensors, and the data collected by the auxiliary sensors form one layer per auxiliary sensor.
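A minimal sketch of this layering follows, assuming each main-sensor detection carries a `sensor_type` field and each auxiliary record a `sensor_id` field; these field names are assumptions.

```python
from collections import defaultdict

def build_data_layers(history_tracks, main_detections, aux_records):
    """Layer 1: historical tracks; layers 2..N: one per main-sensor type;
    then one layer per auxiliary sensor."""
    layers = [history_tracks]
    by_type = defaultdict(list)
    for det in main_detections:
        by_type[det["sensor_type"]].append(det)
    layers.extend(by_type.values())
    by_aux = defaultdict(list)
    for rec in aux_records:
        by_aux[rec["sensor_id"]].append(rec)
    layers.extend(by_aux.values())
    return layers
```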
In this embodiment, the method further includes converting the data collected by the sensors: the data are first converted into a rectangular or polar coordinate system, and the data collected by the reference sensor are likewise converted into a common coordinate system for subsequent matching.
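The two conversions could look as follows; the mounting-pose parameters (mount_x, mount_y, mount_yaw) are assumed extrinsics, not values given by the embodiment.

```python
import math

def polar_to_rect(r, theta):
    """Convert a (range, bearing) measurement to sensor-local x/y."""
    return r * math.cos(theta), r * math.sin(theta)

def to_common_frame(x, y, mount_x, mount_y, mount_yaw):
    """Rotate and translate a sensor-local point into the common frame."""
    xc = x * math.cos(mount_yaw) - y * math.sin(mount_yaw) + mount_x
    yc = x * math.sin(mount_yaw) + y * math.cos(mount_yaw) + mount_y
    return xc, yc
```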
In this embodiment, the sensing area of a sensor may further be divided, according to the index requirements, into a region of interest and an invalid region: data in the region of interest are retained, while data in the invalid region are discarded. The region-of-interest operation is applied redundantly, i.e., the region-of-interest judgment may be performed in polar coordinates, in rectangular coordinates, and before or after merging data from sensors of the same type, but it is not essential.
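A rectangular region of interest is one simple realization; the bounds below are placeholders for whatever the index requirements dictate.

```python
def filter_roi(points, x_min, x_max, y_min, y_max):
    """Keep points inside the region of interest; discard the invalid area."""
    return [p for p in points
            if x_min <= p["x"] <= x_max and y_min <= p["y"] <= y_max]
```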
In this embodiment, the data collected by a main sensor may contain multiple data points from the same target, so clustering is performed to avoid generating duplicate tracks for one target. For example, a millimeter-wave radar may feed back several points for the same target, and lidar point-cloud data may contain a large number of repeated points from one target. Applicable clustering methods include K-means, density clustering, grid clustering, and the like.
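As one possible stand-in for the clustering methods named above, a greedy distance-threshold clustering is sketched below; the merge radius is an assumed tuning parameter.

```python
import math

def cluster_points(points, radius=1.0):
    """Greedily merge points closer than `radius` so that one physical
    target yields a single representative (centroid) point."""
    clusters = []
    for p in points:
        for c in clusters:
            if math.hypot(p["x"] - c[0]["x"], p["y"] - c[0]["y"]) < radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [{"x": sum(q["x"] for q in c) / len(c),
             "y": sum(q["y"] for q in c) / len(c)} for c in clusters]
```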
In this embodiment, the computer program is executed by a processor, and further performs the steps of:
and respectively preprocessing the target data acquired by the main sensor, the target data acquired by the auxiliary sensor and the historical track data.
In this embodiment, because there can be many targets, the number of targets must be checked so that each target is handled by the corresponding operation in the loop. The reference-data preprocessing proceeds as follows: first, a region-of-interest judgment is made on the reference data collected by the reference sensor; if the basic data lie within the region of interest, their coordinates are converted into the common coordinate system and the region-of-interest judgment is made again. The converted reference data are then merged homologously and judged against the region of interest once more; the merged reference data are clustered, and the clustered reference data are subjected to the timestamp threshold judgment. If they fall within the timestamp threshold, a CRC (cyclic redundancy check) check is performed, and the reference data are output once the check passes. If they fall outside the timestamp threshold, the historical tracks are cleared, a new track is generated from the reference data, the timestamp threshold is updated, and the new track is output after the CRC check.
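Chaining the helpers sketched earlier gives one possible shape for this pipeline; `target_count_ok` stands in for the CRC-style target-number check, and the zero mounting pose is an assumption.

```python
def preprocess_reference(raw_points, frame_time, roi, t_ref, period, tracks):
    """Assumes filter_roi, to_common_frame, cluster_points and
    within_threshold from the earlier sketches are in scope."""
    pts = filter_roi(raw_points, *roi)                   # ROI check, sensor frame
    pts = [dict(zip(("x", "y"),
                    to_common_frame(p["x"], p["y"], 0.0, 0.0, 0.0)))
           for p in pts]                                 # convert to common frame
    pts = filter_roi(pts, *roi)                          # ROI check after conversion
    pts = cluster_points(pts)                            # homologous merge + clustering
    if not within_threshold(frame_time, t_ref, period):  # timestamp screening failed:
        tracks.clear()                                   # clear history, restart tracks,
        return pts, frame_time                           # reset the threshold reference
    return (pts if target_count_ok(pts) else []), t_ref

def target_count_ok(pts):
    """Placeholder for the target-number redundancy (CRC-style) check."""
    return len(pts) > 0
```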
The preprocessing of target data collected by the main sensors is almost identical to that of the reference data: the data undergo sensor data conversion, merging of same-type sensor data, region-of-interest judgment, clustering of same-target data, timestamp threshold screening, and the target-number redundancy check. Specifically: a region-of-interest judgment is first made on the target data collected by a main sensor; if the data lie within the region of interest, they are coordinate-converted and the region-of-interest judgment is made again. The converted target data are then merged homologously and judged once more; the merged target data are clustered, and the clustered target data are subjected to the timestamp threshold judgment. If they fall within the timestamp threshold, the CRC check is performed and, once it passes, the target data collected by the main sensor are output; data falling outside the timestamp threshold are simply discarded.
As for the preprocessing of auxiliary data collected by the auxiliary sensors, the main operation is timestamp screening: the auxiliary data are judged against the timestamp threshold, selected if they fall within it, and discarded otherwise.
The preprocessing of historical track data mainly consists of a rationality judgment on the historical tracks after track prediction: reasonable tracks are used as-is, while unreasonable track data are discarded. The rationality judgment requires that the number of predicted tracks equals the number of tracks output by the previous frame and that no track state has changed.
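The rationality test reduces to two comparisons, sketched below under the assumption that each track record carries a `state` field.

```python
def tracks_reasonable(predicted, last_output):
    """True if prediction preserved the track count and no track state changed."""
    if len(predicted) != len(last_output):
        return False
    return all(p["state"] == q["state"] for p, q in zip(predicted, last_output))
```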
In this embodiment, pre-fusion belongs to the auxiliary fusion type, and its content consists of two parts: reference sensor data-basic data fusion and reference sensor data-auxiliary data fusion. There is no mandatory order between the two parts, whose main function is to add more physical information to the reference data.

Reference sensor data-basic data fusion mainly refers to fusion between the basic data in the reference sensor data and the basic data from the other main sensors. The physical information contained in both parts describes individual detection targets, so this step must complete target matching and may perform the fusion operation only after confirming that the information comes from the same target. Target matching involves computing a matching degree (the confidence that two pieces of information describe the same target), followed by weighted fusion: if a basic quantity is a continuous variable, it is fused by weighted addition; if it is a discrete variable, its state can be updated directly. The matching degree can be computed in various ways, including dot product (Dot-Product), nearest neighbour (Euclidean distance), JPDA/MHT, and residual (Mahalanobis distance) calculations, with the specific method chosen according to actual needs. Weighted fusion is likewise not a hard requirement; the weight coefficients can be generated in various ways according to robustness/accuracy requirements. In the asynchronous serial fusion framework of this embodiment, the main sensor whose output data have the highest accuracy and best stability is selected as the reference sensor, and the other main sensors are only responsible for adding their discrete information to the reference sensor data; the continuous-variable weight in the reference sensor data is therefore set to 1, while the continuous-variable weight in the output data of the other main sensors is set to 0.

Reference sensor data-auxiliary data fusion mainly refers to fusion between the reference data collected by the reference sensor and the auxiliary data output by the other sensors; the reference data can only come from the reference sensor, whereas the auxiliary data may come from both main and auxiliary sensors. Auxiliary data mainly describe unified information not tied to an individual target, such as traffic information and self-vehicle information, so no target matching is needed in reference data-auxiliary data fusion: additional physical information about the targets is computed directly from the auxiliary data and attached to the main data stream.
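A minimal sketch of the matching and weighting described above, using the nearest-neighbour (Euclidean) option: the gating distance is an assumed parameter, and the default weights reflect the choice stated in this embodiment (continuous-variable weight 1 for the reference sensor, 0 for the other main sensors).

```python
import math

def match_degree(det, ref, gate=3.0):
    """Map Euclidean distance to a [0, 1] confidence that det and ref
    describe the same target (0 outside the gating distance)."""
    d = math.hypot(det["x"] - ref["x"], det["y"] - ref["y"])
    return max(0.0, 1.0 - d / gate)

def fuse_continuous(ref_val, other_val, w_ref=1.0, w_other=0.0):
    """Weighted addition for continuous variables (position, speed, ...)."""
    total = w_ref + w_other
    return (w_ref * ref_val + w_other * other_val) / (total if total else 1.0)

def fuse_discrete(ref_state, other_state):
    """Discrete variables from other main sensors update the state directly."""
    return other_state if other_state is not None else ref_state
```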
Main data fusion belongs to the track fusion type and mainly operates on the reference data and the predicted track data. It is described in detail below; the relevant states and parameters are defined first, with a minimal code restatement after the definitions:
(4) Target data state definitions
Invalid target: target data located outside the ROI (region of interest), or whose timestamp does not satisfy the reference timestamp threshold;
Selected target: target data whose matching degree with some track exceeds the threshold;
Matched target: target data successfully matched to a track, with the track data updated;
Valid target: the remaining data points that have undergone none of the above operations.
(5) Track state definitions
Empty track: a reset track cache slot;
Valid track: a track successfully matched and updated by target data at the current moment;
Lost track: a track not matched and updated by any target data in the current frame;
Invalid track: a track not matched and updated by target data for multiple consecutive frames, whose consecutive-loss count exceeds the allowed threshold.
(6) Related parameter definitions
Track visible count: the number of times the track has been updated by detected target data;
Track loss count: the number of times the track has failed to match a detected target;
Matching degree threshold: when the matching degree between the reference data and a track exceeds this value, it is recorded at the corresponding position of the matching matrix; otherwise that position is reset;
Consecutive-loss threshold: when the consecutive lost-frame count of a lost track exceeds this threshold, the lost track is converted into an invalid track and then reset to an empty track.
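Restating the definitions above as a data structure may make the bookkeeping concrete; the enum and field names are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TargetState(Enum):
    INVALID = auto()    # outside the ROI or outside the timestamp threshold
    SELECTED = auto()   # matching degree with some track above the threshold
    MATCHED = auto()    # successfully matched to a track, track updated
    VALID = auto()      # untouched data point, candidate for a new track

class TrackState(Enum):
    EMPTY = auto()      # reset track cache slot
    VALID = auto()      # matched and updated in the current frame
    LOST = auto()       # unmatched this frame, prediction retained
    INVALID = auto()    # lost for more consecutive frames than allowed

@dataclass
class TrackRecord:
    state: TrackState = TrackState.EMPTY
    visible_count: int = 0   # times updated by detected target data
    lost_count: int = 0      # consecutive frames without a match
```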
The main data fusion process comprises track data update and track management. For track data update, the current target information is first matched against the track information, and matching degrees greater than the threshold are recorded in the matching matrix. The target data states are marked for the first time, i.e., selected targets are marked. The pairings are then sorted by matching degree from high to low; for each track, the pairing with the highest matching degree is updated preferentially, and successfully updated pairings take no further part in the subsequent screening. Problems of "multiple targets matching the same track" and "multiple tracks matching the same target" may still arise; they can be resolved by weight ratios, by directly selecting the optimal pairing, or by rules defined by the developer according to the relevant requirements, which are not elaborated here. Finally, the target data states are updated for the second time, i.e., successfully matched targets are marked, and at the same time the track states must be updated for the first time, i.e., valid tracks are marked.

For track management, the visible count and loss count of each track are updated according to the track state (whether the current target data matched successfully). Specifically: for a track matched successfully in this frame, the visible count is incremented by one and the loss count is reset to zero; for a track whose matching failed in this frame, the loss count is incremented by one and the visible count is unchanged. This logic is designed by the developer and is not a mandatory requirement. The track states are then updated for the second time: tracks not matched in this frame are marked as lost tracks, and their predicted values are retained. Next, invalid tracks are selected and reset to zero by comparing each track's loss count against the consecutive-loss threshold. Finally, current target data in the valid state are filled into empty tracks as newly generated tracks, and the third target-state update and third track-state update are performed, i.e., valid target states are converted to matched targets and empty track states are converted to valid tracks.
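The count-update and state-transition rules of track management could then read as follows, building on the TrackRecord sketch above; LOSS_LIMIT is an assumed consecutive-loss threshold.

```python
LOSS_LIMIT = 5  # assumed consecutive-loss threshold

def manage_tracks(tracks, matched_ids):
    """Update visible/lost counts and track states after matching;
    matched_ids holds the indices of tracks updated in this frame."""
    for i, trk in enumerate(tracks):
        if i in matched_ids:
            trk.visible_count += 1        # matched: visible count + 1,
            trk.lost_count = 0            # loss count reset to zero
            trk.state = TrackState.VALID
        else:
            trk.lost_count += 1           # unmatched: loss count + 1,
            trk.state = TrackState.LOST   # visible count unchanged
            if trk.lost_count > LOSS_LIMIT:
                trk.state = TrackState.EMPTY   # invalid, then reset to empty
                trk.visible_count = trk.lost_count = 0
    return tracks
```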
Post-fusion belongs to the auxiliary fusion type; its input data are the updated "new track data" together with the data of the other main sensors and the auxiliary sensors. The fusion content consists of two parts: new track data-basic data fusion and new track data-auxiliary data fusion. There is no mandatory order between the two parts, whose main function is to add more physical information to the new track data. Since this module sits after the main fusion module in the asynchronous serial fusion framework, its basic flow is the same as that of pre-fusion, with "reference data" simply replaced by "new track data".
In this embodiment, the pre-fusion and post-fusion links involve fusing several kinds of sensors. Neither the main sensors nor the auxiliary sensors have a fixed fusion order, which can be chosen according to actual needs, but the fused data must include the reference sensor data.
It should be noted that, although the above embodiments are described herein, the invention is not limited to them. Any changes or modifications made to the embodiments described herein on the basis of the innovative concepts of the invention, and any equivalent structures or equivalent processes derived from the contents of this specification and the drawings, whether applied directly or indirectly to other related technical fields, likewise fall within the scope of protection of the invention.

Claims (8)

1. A flight path data fusion method based on asynchronous serial multi-sensors is characterized by comprising the following steps:
acquiring target data through a plurality of sensors, wherein the target data comprise basic data and auxiliary data, the plurality of sensors comprise a main sensor and an auxiliary sensor, the main sensor is used for acquiring the basic data and the auxiliary data, the auxiliary sensor is used for acquiring the auxiliary data, the basic data comprise position information, speed information, angle information and acceleration information, and the auxiliary data comprise traffic information and self-vehicle information;
selecting a sensor of the same type from the main sensors as a reference sensor, wherein target data acquired by the reference sensor is reference data of a detection target;
pre-fusing the target data collected by other sensors in sequence according to a serial sequence by taking the reference data collected by the reference sensor as a starting point to obtain pre-fused data;
matching and fusing the obtained pre-fusion data and the historical track data to obtain track updating data;
fusing the obtained flight path updating data with target data acquired by other sensors to obtain final flight path data;
the step of matching and fusing the obtained pre-fusion data and the historical track data to obtain track updating data specifically comprises the following steps:
predicting historical track data output by a previous frame to a current time through a prediction model, taking a predicted track timestamp as reference time, taking N frames of time before and after the reference time as a timestamp threshold, and judging whether target data acquired by a reference sensor at the current time is within the timestamp threshold;
if not, removing historical track information, and generating new track data according to target data acquired by the current reference sensor;
if yes, historical track data are obtained according to the historical track, and target data acquired by the reference sensor and the historical track data are fused to obtain new track data.
2. The asynchronous serial multi-sensor based track data fusion method according to claim 1, characterized by further comprising the following steps:
and if the damage of the reference sensor is detected, switching other main sensors to be used as new reference sensors until all the main sensors are damaged.
3. The asynchronous serial multi-sensor based track data fusion method according to claim 1, characterized by further comprising the following steps:
and for target data acquired by multiple sensors, combining the data acquired by the same type of sensors to form a data layer.
4. The asynchronous serial multi-sensor based track data fusion method according to claim 1, characterized by further comprising the following steps:
and respectively preprocessing the target data acquired by the main sensor, the target data acquired by the auxiliary sensor and the historical track data.
5. A storage medium storing a computer program, the computer program when executed by a processor performing the steps of:
acquiring target data through a plurality of sensors, wherein the target data comprise basic data and auxiliary data, the plurality of sensors comprise a main sensor and an auxiliary sensor, the main sensor is used for acquiring the basic data and the auxiliary data, the auxiliary sensor is used for acquiring the auxiliary data, the basic data comprise position information, speed information, angle information and acceleration information, and the auxiliary data comprise traffic information and self-vehicle information;
selecting a sensor of the same type from the main sensors as a reference sensor, wherein target data acquired by the reference sensor is reference data of a detection target;
pre-fusing the target data collected by other sensors in sequence according to a serial sequence by taking the reference data collected by the reference sensor as a starting point to obtain pre-fused data;
matching and fusing the obtained pre-fusion data and the historical track data to obtain track updating data;
fusing the obtained flight path updating data with target data acquired by other sensors to obtain final flight path data;
the computer program is run by a processor, and when the step of matching and fusing the obtained pre-fusion data and the historical track data to obtain track updating data is executed, the following steps are specifically executed:
predicting historical track data output by a previous frame to a current time through a prediction model, taking a predicted track timestamp as reference time, taking N frames of time before and after the reference time as a timestamp threshold, and judging whether target data acquired by a reference sensor at the current time is within the timestamp threshold;
if not, removing historical track information, and generating new track data according to target data acquired by the current reference sensor;
if yes, historical track data are obtained according to the historical track, and target data acquired by the reference sensor and the historical track data are fused to obtain new track data.
6. The storage medium of claim 5, wherein the computer program, when executed by the processor, further performs the steps of:
and if the damage of the reference sensor is detected, switching other main sensors to be used as new reference sensors until all the main sensors are damaged.
7. The storage medium of claim 5, wherein the computer program, when executed by the processor, further performs the steps of:
and for target data acquired by multiple sensors, combining the data acquired by the same type of sensors to form a data layer.
8. The storage medium of claim 5, wherein the computer program, when executed by the processor, further performs the steps of:
and respectively preprocessing the target data acquired by the main sensor, the target data acquired by the auxiliary sensor and the historical track data.
CN201911011310.2A 2019-10-23 2019-10-23 Asynchronous serial multi-sensor-based flight path data fusion method and storage medium Active CN110781949B (en)

Publications (2)

Publication Number Publication Date
CN110781949A CN110781949A (en) 2020-02-11
CN110781949B (en) 2020-12-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant