CN116340876A - Spatial target situation awareness method for local multisource data fusion - Google Patents
- Publication number
- CN116340876A (application number CN202310016534.2A)
- Authority
- CN
- China
- Prior art keywords
- track
- registration
- local
- target
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention belongs to the field of situation awareness and discloses a spatial target situation awareness method based on local multi-source data fusion, comprising the following steps: monitoring the surveillance space to acquire multi-source data on the targets within it; sequentially applying preprocessing, first-level processing, second-level processing, and third-level processing to the multi-source data, thereby realizing data registration, attribute fusion, and situation estimation respectively; and judging the risk level of the spatial target from the processing results. The method fully exploits the utility of the sensor data, shares the information and services of the different systems of the space monitoring network among space groups, and ultimately assists the combat commander in rapidly grasping the spatial situation so that rapid decisions can be made.
Description
Technical Field
The invention relates to the field of situation awareness, and in particular to a spatial target situation awareness method based on local multi-source data fusion.
Background
With the rapid development of multi-sensor technology, application-oriented multi-sensor information systems have emerged in large numbers. In such a system, information appears in many forms, its volume is huge, the relationships among pieces of information are complex, and the required processing speed is high. No single sensor can provide comprehensive and accurate information at all times, and it has become difficult to improve system performance simply by increasing the accuracy and capacity of individual sensors. What is needed instead is a method that integrates multiple sensors, none of which need be highly accurate on its own, and obtains, through their coordination and complementary performance, a comprehensive and correct understanding of the environment or of the characteristics of an object, thereby improving the robustness of the whole system. Multi-sensor information fusion arose in response to this need. It is in essence a technology for the fused processing of multiple kinds of information: by analyzing and intelligently synthesizing the information from different sensors, it obtains the best consistent estimate of the measured object and its properties, and thus produces more accurate and complete estimates and decisions than any single information source can.
Among methods for integrating sensor information, the prevailing approach is to introduce a situation awareness module: the overall situation is divided, according to actual requirements, into situations that are easy to understand, and the situation-understanding results of the auxiliary system are reported to the control center, relieving the burden on the operators. As a mathematical process, the situation awareness module in an auxiliary control system is an inference process. Current mainstream techniques include the following. A situation awareness network can be constructed by Bayesian inference; the resulting network architecture is clearly layered, but in actual use it faces the problem that the prior probabilities are difficult to determine. Algorithms based on fuzzy cognitive maps weaken the influence of sensor errors but fail to account for the randomness of the samples, and the variability of expert experience in expert systems likewise still lacks a reasonable treatment.
Although the methods produced by these studies complete their tasks well, in a real, complex battlefield environment an auxiliary command system cannot simply perform situation awareness directly on raw sensor information. How to guarantee the accuracy of situation awareness, and how to screen and rank information reasonably so that command intervention and decision making can proceed quickly and effectively, are problems that urgently need to be solved.
Disclosure of Invention
The invention aims to provide a spatial target situation awareness method based on local multi-source data fusion that solves the problems identified in the background art.
To achieve the above purpose, the invention provides the following technical solution. A spatial target situation awareness method based on local multi-source data fusion comprises the following steps:
S1: monitoring the surveillance space and acquiring multi-source data on the targets within it;
S2: sequentially applying preprocessing, first-level processing, second-level processing, and third-level processing to the multi-source data, thereby realizing data registration, attribute fusion, and situation estimation respectively;
S3: judging the risk level of the spatial target from the processing results.
Preferably, the multi-source data include target information and scene information. The target information includes: position, a three-dimensional point cloud of the target, velocity, acceleration, heading angle, azimuth angle, and range. The scene information includes a three-dimensional point cloud of the scene, a depth image, and a color image.
Preferably, the first-level processing comprises time registration, spatial registration, measurement association, tracking, and track fusion. The specific steps of time registration are as follows: 1) when the observation data of the multiple sensors are input to the registration processing unit, a simple preprocessing analysis is performed on the observations before registration to obtain the required information, such as the number of sensors, their sampling periods, and the sampling offsets between them; 2) a suitable registration algorithm and registration frequency are selected on the basis of this information, and the parameters are adjusted according to feedback from the registration so that it remains real-time; 3) the data to be registered are processed and fused using the selected registration algorithm and registration frequency; 4) registration evaluation indices such as real-time performance and registration accuracy are measured, and the analysis results are fed back to step 2) for real-time parameter calibration and updating.
Preferably, the specific steps of spatial registration are as follows: 1) the position of each sensor on the ellipsoidal Earth is described by its latitude, longitude, and altitude relative to the reference ellipsoid, and is converted into a mapped position and mapping function in a geographic coordinate system; 2) a first-order Taylor expansion of the mapping function yields an approximately linear model relating the measurements to the systematic biases; 3) least-squares estimation is applied to this linear model, and an accurate estimate of the systematic bias is obtained by calculation.
Preferably, the track fusion performs registration and fusion of the local tracks with the global track. Specifically, because each radar in the data acquisition equipment reports data at a different rate, and because a target that cannot be treated as a point particle may change attitude as well as position, the local tracks of a target at different time frames and the global track over a continuous period are not necessarily consistent. The local tracks must therefore be registered to the global track and the pose of the target judged, according to the formula:
M_P = F*M_S + T*(F*M_Sv + T*M_Sa/2)
where T is the track prediction time, i.e. the difference between the time of the local track and the time of the global track; M_P denotes the position of the target at each time frame; M_Sv denotes the velocity of each local track in the fused track; and M_Sa denotes the acceleration of each local track in the fused track. If no global track has been detected at the current moment, a fused track is established from the local track; otherwise the local track is matched and fused with the global track to form the fused track. The track at the next time frame, including the position coordinates and velocity of each target, is then predicted.
Preferably, the specific steps of the third-level processing include: 1) generating the element set of the current situation; 2) generating the hypothesis set of situations; 3) taking the situation hypothesis with the least uncertainty as the current situation; 4) estimating the degree to which the current situation supports achievement of the situation objective; 5) predicting the possible situation of the next period.
Compared with the prior art, the invention has the following beneficial effects: it fully exploits the utility of the sensor data, shares the information and services of the different systems of the space monitoring network among space groups, and ultimately assists the combat commander in rapidly grasping the spatial situation so that rapid decisions can be made.
Detailed Description
The technical solutions in the embodiments of the present invention will now be described clearly and completely in conjunction with those embodiments. Evidently, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art from the embodiments of the invention without inventive effort fall within the scope of protection of the invention.
The invention provides a spatial target situation awareness method based on local multi-source data fusion, comprising the following steps:
S1: monitoring the surveillance space and acquiring multi-source data on the targets within it;
S2: sequentially applying preprocessing, first-level processing, second-level processing, and third-level processing to the multi-source data, thereby realizing data registration, attribute fusion, and situation estimation respectively;
S3: judging the risk level of the spatial target from the processing results.
The multi-source data include target information and scene information. The target information includes: position, a three-dimensional point cloud of the target, velocity, acceleration, heading angle, azimuth angle, and range. The scene information includes a three-dimensional point cloud of the scene, a depth image, and a color image.
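As an illustrative sketch only, the multi-source data record described above might be represented as follows; the class and field names are hypothetical and do not come from the patent specification itself:

```python
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np


@dataclass
class TargetInfo:
    position: np.ndarray        # 3-D position of the target
    point_cloud: np.ndarray     # N x 3 target point cloud
    velocity: float
    acceleration: float
    heading_angle: float        # degrees
    azimuth_angle: float        # degrees
    range_m: float              # distance from the sensor, metres


@dataclass
class SceneInfo:
    point_cloud: np.ndarray     # M x 3 scene point cloud
    depth_image: np.ndarray
    color_image: np.ndarray


@dataclass
class MultiSourceData:
    targets: List[TargetInfo] = field(default_factory=list)
    scene: Optional[SceneInfo] = None


# One frame of multi-source data with a single observed target.
frame = MultiSourceData()
frame.targets.append(
    TargetInfo(np.zeros(3), np.zeros((8, 3)), 7.5, 0.2, 90.0, 45.0, 1200.0)
)
```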
the first-stage processing comprises the steps of time registration, spatial registration, measurement interconnection, tracking and track fusion, wherein the specific steps of the time registration are as follows: 1) When the observation data of the plurality of sensor data are input to the registration processing unit, simple preprocessing analysis is performed on the observation data before registration processing to obtain required information such as the number of sensors, sampling period, sampling interval between the sensors, and the like; 2) Selecting a proper registration algorithm and registration frequency according to the existing information, wherein the registration algorithm comprises an interpolation extrapolation method and a least square method, and simultaneously, parameters can be adjusted according to the registered feedback information so as to ensure the real-time registration; 3) Processing and fusing the data to be registered according to the selected registration algorithm and registration frequency; 4) Measuring the real-time equivalent registration evaluation index of the registration accuracy, feeding back the analysis result to the step 2) for parameter calibration and updating in real time;
the specific steps of the spatial registration are as follows: 1) Describing the position of the sensor on the elliptical earth by using the latitude, longitude and altitude of the earth relative to the reference earth, and converting the position of the sensor into a mapping position and a mapping function of a geographic coordinate system; 2) Performing first-order taylor expansion on the mapping function to approximately obtain a linear model between measurement and system deviation: 3) Carrying out least square estimation on the linear model, and obtaining an accurate estimation value of the system deviation through calculation;
the track fusion is to perform registration fusion of a local track and a whole track, and specifically comprises the following steps: because the time frequency of the data reported by each radar in the data acquisition equipment is different, under the condition that the target cannot be regarded as particles, the position transformation is considered, the gesture transformation can also occur, and the local track of the target at different time frames and the whole track of the continuous time period are not necessarily the same, so that the registration of the local track and the whole track is required, the pose of the target is judged, and the formula is as follows:
M_P = F*M_S + T*(F*M_Sv + T*M_Sa/2)
where T is the track prediction time, i.e. the difference between the time of the local track and the time of the global track; M_P denotes the position of the target at each time frame; M_Sv denotes the velocity of each local track in the fused track; and M_Sa denotes the acceleration of each local track in the fused track. If no global track has been detected at the current moment, a fused track is established from the local track; otherwise the local track is matched and fused with the global track to form the fused track. The track at the next time frame, including the position coordinates and velocity of each target, is then predicted.
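The prediction formula above can be exercised numerically as follows. The concrete choice of F (taken here as the identity, i.e. the local and fused frames are assumed already aligned) and the state values are illustrative assumptions; the patent does not specify them.

```python
import numpy as np


def predict_fused_position(F, M_S, M_Sv, M_Sa, T):
    """Constant-acceleration extrapolation of a local track to the
    global-track time, per M_P = F*M_S + T*(F*M_Sv + T*M_Sa/2)."""
    return F @ M_S + T * (F @ M_Sv + T * M_Sa / 2.0)


F = np.eye(2)                  # identity: frames assumed aligned
M_S = np.array([10.0, 5.0])    # local-track position
M_Sv = np.array([2.0, 0.0])    # local-track velocity
M_Sa = np.array([0.0, -1.0])   # local-track acceleration
T = 2.0                        # prediction interval (s)

M_P = predict_fused_position(F, M_S, M_Sv, M_Sa, T)
# Hand check: [10 + 2*2, 5 + 2*(0 + 2*(-1)/2)] = [14, 3]
```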
the specific steps of the third stage treatment include: 1) Generating an element set of the current situation; 2) Generating a hypothesis set of situations: 3) Forming a current situation by a least uncertainty situation assumption; 4) Estimating the support degree of the current situation on realizing the situation target; 5) The possible situation for the next period to occur is predicted.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions, and alterations may be made therein without departing from the principle and spirit of the invention, whose scope is defined by the appended claims and their equivalents.
Claims (6)
1. A spatial target situation awareness method for local multi-source data fusion, characterized by comprising the following steps:
S1: monitoring the surveillance space and acquiring multi-source data on the targets within it;
S2: sequentially applying preprocessing, first-level processing, second-level processing, and third-level processing to the multi-source data, thereby realizing data registration, attribute fusion, and situation estimation respectively;
S3: judging the risk level of the spatial target from the processing results.
2. The spatial target situation awareness method for local multi-source data fusion according to claim 1, characterized in that: the multi-source data include target information and scene information; the target information includes position, a three-dimensional point cloud of the target, velocity, acceleration, heading angle, azimuth angle, and range; the scene information includes a three-dimensional point cloud of the scene, a depth image, and a color image.
3. The spatial target situation awareness method for local multi-source data fusion according to claim 1, characterized in that: the first-level processing comprises time registration, spatial registration, measurement association, tracking, and track fusion, and the specific steps of time registration are as follows: 1) when the observation data of the multiple sensors are input to the registration processing unit, a simple preprocessing analysis is performed on the observations before registration to obtain the required information, such as the number of sensors, their sampling periods, and the sampling offsets between them; 2) a suitable registration algorithm and registration frequency are selected on the basis of this information, and the parameters are adjusted according to feedback from the registration so that it remains real-time; 3) the data to be registered are processed and fused using the selected registration algorithm and registration frequency; 4) registration evaluation indices such as real-time performance and registration accuracy are measured, and the analysis results are fed back to step 2) for real-time parameter calibration and updating.
4. The spatial target situation awareness method for local multi-source data fusion according to claim 3, characterized in that: the specific steps of spatial registration are as follows: 1) the position of each sensor on the ellipsoidal Earth is described by its latitude, longitude, and altitude relative to the reference ellipsoid, and is converted into a mapped position and mapping function in a geographic coordinate system; 2) a first-order Taylor expansion of the mapping function yields an approximately linear model relating the measurements to the systematic biases; 3) least-squares estimation is applied to this linear model, and an accurate estimate of the systematic bias is obtained by calculation.
5. The spatial target situation awareness method for local multi-source data fusion according to claim 1, characterized in that: the track fusion performs registration and fusion of the local tracks with the global track; specifically, because each radar in the data acquisition equipment reports data at a different rate, and because a target that cannot be treated as a point particle may change attitude as well as position, the local tracks of a target at different time frames and the global track over a continuous period are not necessarily consistent, so the local tracks must be registered to the global track and the pose of the target judged, according to the formula:
M_P = F*M_S + T*(F*M_Sv + T*M_Sa/2)
where T is the track prediction time, i.e. the difference between the time of the local track and the time of the global track; M_P denotes the position of the target at each time frame; M_Sv denotes the velocity of each local track in the fused track; and M_Sa denotes the acceleration of each local track in the fused track. If no global track has been detected at the current moment, a fused track is established from the local track; otherwise the local track is matched and fused with the global track to form the fused track. The track at the next time frame, including the position coordinates and velocity of each target, is then predicted.
6. The spatial target situation awareness method for local multi-source data fusion according to claim 1, characterized in that: the specific steps of the third-level processing include: 1) generating the element set of the current situation; 2) generating the hypothesis set of situations; 3) taking the situation hypothesis with the least uncertainty as the current situation; 4) estimating the degree to which the current situation supports achievement of the situation objective; 5) predicting the possible situation of the next period.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310016534.2A CN116340876A (en) | 2023-01-06 | 2023-01-06 | Spatial target situation awareness method for local multisource data fusion |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310016534.2A CN116340876A (en) | 2023-01-06 | 2023-01-06 | Spatial target situation awareness method for local multisource data fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116340876A true CN116340876A (en) | 2023-06-27 |
Family
ID=86890541
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310016534.2A Pending CN116340876A (en) | 2023-01-06 | 2023-01-06 | Spatial target situation awareness method for local multisource data fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116340876A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116842127A (en) * | 2023-08-31 | 2023-10-03 | 中国人民解放军海军航空大学 | Self-adaptive auxiliary decision-making intelligent method and system based on multi-source dynamic data |
CN116842127B (en) * | 2023-08-31 | 2023-12-05 | 中国人民解放军海军航空大学 | Self-adaptive auxiliary decision-making intelligent method and system based on multi-source dynamic data |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110866887A (en) | Target situation fusion sensing method and system based on multiple sensors | |
KR20200006167A (en) | Vessel automatic tracking method and system based on deep learning network and average movement | |
CN110361744B (en) | RBMCDA underwater multi-target tracking method based on density clustering | |
CN111859054B (en) | Meteorological satellite data processing method and device | |
CN111027692A (en) | Target motion situation prediction method and device | |
CN115342814B (en) | Unmanned ship positioning method based on multi-sensor data fusion | |
CN113075648B (en) | Clustering and filtering method for unmanned cluster target positioning information | |
CN102853836A (en) | Feedback weight fusion method based on track quality | |
CN116340876A (en) | Spatial target situation awareness method for local multisource data fusion | |
CN115451948A (en) | Agricultural unmanned vehicle positioning odometer method and system based on multi-sensor fusion | |
CN111999735A (en) | Dynamic and static target separation method based on radial velocity and target tracking | |
CN113486960A (en) | Unmanned aerial vehicle tracking method and device based on long-time memory neural network, storage medium and computer equipment | |
CN114463932A (en) | Non-contact construction safety distance active dynamic recognition early warning system and method | |
CN115855079A (en) | Time asynchronous perception sensor fusion method | |
CN117724059A (en) | Multi-source sensor fusion track correction method based on Kalman filtering algorithm | |
CN115900712A (en) | Information source reliability evaluation combined positioning method | |
CN117687416B (en) | Path planning method and system for river network water safety detection device | |
Li et al. | Adaptive multiframe detection algorithm with range-Doppler-azimuth measurements | |
CN117828527A (en) | Multi-source data fusion and situation generation method and system | |
CN102830391B (en) | Accuracy index calculating method of infrared search and track system | |
Ebert et al. | Deep radar sensor models for accurate and robust object tracking | |
CN106570536B (en) | A kind of positioning using TDOA aims of systems high precision tracking filtering method | |
Lei et al. | Multi-platform and multi-sensor data fusion based on ds evidence theory | |
CN114488247A (en) | Method for analyzing mobility of equipment based on high-precision Beidou differential positioning | |
CN114384509A (en) | Safe driving decision generation method supported by intelligent driving vehicle data |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |