CN111238467A - Bionic polarized light assisted unmanned combat aircraft autonomous navigation method - Google Patents
- Publication number: CN111238467A (application CN202010082963.6A)
- Authority: CN (China)
- Legal status: Granted (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/165: Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
- G01C21/20: Instruments for performing navigational calculations
(all under G01C21/00, Navigation, in section G, Physics; class G01, Measuring and Testing; subclass G01C, measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry)
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
Abstract
The invention discloses a bionic polarized light assisted autonomous navigation method for an unmanned combat aircraft. The method comprises: establishing a quaternion-based dynamics model of the BPNS/MIMU/VNS autonomous integrated navigation system according to the MIMU navigation subsystem; establishing a first observation model and a second observation model through the BPNS navigation subsystem and the VNS navigation subsystem, respectively; and acquiring the navigation information of each navigation subsystem and performing optimal estimation on the dynamics model, the first observation model and the second observation model with an improved federated CKF filtering method to obtain the final navigation information of the unmanned combat aircraft. The invention assists the traditional MIMU and VNS with the BPNS, introduces a robust factor into the filtering process to weaken the influence of abnormal observations, and effectively improves the sampling rate and accuracy of asynchronous observation processing through asynchronous time registration. The method therefore offers strong autonomy, high accuracy and good reliability, compensates for the shortcomings of existing autonomous navigation technology, and improves the autonomous precise navigation capability of the unmanned combat aircraft in a complex battlefield environment.
Description
[ technical field ]
The invention belongs to the technical field of navigation, guidance and control, and particularly relates to a bionic polarized light assisted unmanned combat aircraft autonomous navigation method.
[ background of the invention ]
As an unmanned platform that faces complex combat environments far from its control base station, an unmanned combat aircraft requires intelligent autonomous flight as one of its core capabilities, and the key enabler of that capability is autonomous precise navigation technology. Autonomous precise navigation is the fundamental guarantee that an unmanned combat aircraft completes its assigned missions and realizes its value, and it remains one of the main bottlenecks restricting the rapid development of such aircraft.
To meet mission requirements such as sustained long-endurance combat, high-maneuverability flight, and interference-resistant precise flight in a complex battlefield environment, an unmanned combat aircraft places very demanding requirements on the design of its navigation system. Mainstream navigation means are currently divided, according to their degree of autonomy, into autonomous and non-autonomous navigation. Autonomous navigation includes inertial navigation, astronomical navigation, visual navigation, and the like; the most important non-autonomous mode is satellite navigation. Because autonomous navigation exchanges no information with the outside world during the navigation phase, it is highly resistant to interference and holds unmatched advantages in the military field.
However, each existing autonomous navigation method has its own defects and limited range of application, and none alone can meet the precise-navigation requirements of an unmanned combat aircraft. For example, inertial navigation errors accumulate rapidly over time; astronomical navigation cannot obtain accurate starlight information in the low airspace where the unmanned combat aircraft flies; and visual navigation, influenced by factors such as viewpoint change and illumination intensity, is prone to algorithm failure caused by unstable image tracking.
[ summary of the invention ]
The invention aims to provide a bionic polarized light assisted autonomous navigation method for an unmanned combat aircraft, in which bionic polarized light assists the traditional navigation systems to form an integrated navigation system. It addresses the problems of traditional navigation methods, namely low autonomy, accuracy loss caused by time errors, and poor system robustness when observation information is disturbed, thereby improving the autonomous precise navigation capability of the unmanned combat aircraft in a complex battlefield environment.
The invention adopts the following technical scheme: a bionic polarized light assisted unmanned combat aircraft autonomous navigation method comprises the following steps:
establishing a quaternion-based dynamics model of the BPNS/MIMU/VNS autonomous integrated navigation system according to the MIMU navigation subsystem;
establishing a first observation model and a second observation model through a BPNS navigation subsystem and a VNS navigation subsystem respectively;
and acquiring navigation information of each navigation subsystem, and performing state estimation on the dynamics model, the first observation model and the second observation model with an improved federated CKF filtering method to obtain the final navigation information of the unmanned combat aircraft.
Further, the state estimation of the dynamics model, the first observation model and the second observation model with the improved federated CKF filtering method comprises the following steps:
performing state estimation on the dynamics model and the first observation model with an improved CKF filtering method to obtain the first navigation information of the unmanned combat aircraft;
performing state estimation on the dynamics model and the second observation model with the improved CKF filtering method to obtain the second navigation information of the unmanned combat aircraft;
and performing system state fusion on the first navigation information and the second navigation information with a federated filtering method to obtain the final navigation information of the unmanned combat aircraft.
Further, the quaternion-based dynamics model is

ẋ(t) = f(x(t)) + w(t),

wherein f(·) is the nonlinear function of the dynamics model, x(t) = [q_b^n; v^n; p^n; ε^b; ∇^b] is the system state vector, and w(t) is the dynamics noise;

q_b^n is the attitude quaternion from the body coordinate system b to the navigation coordinate system n; v^n = [v_E, v_N, v_U]^T represents the velocity of the unmanned combat aircraft in the navigation coordinate system n, with v_E, v_N and v_U its east, north and up components; p^n = [λ, L, h]^T collects the longitude λ, latitude L and altitude h of the unmanned combat aircraft in the navigation coordinate system n; ε^b is the gyro bias and ∇^b is the accelerometer bias.

The attitude equation is q̇_b^n = (1/2) q_b^n ⊗ ω_nb^b, wherein q̇_b^n is the differential of q_b^n, ω_nb^b is the angular rate of the unmanned combat aircraft relative to the navigation coordinate system n, and ⊗ represents quaternion multiplication.

The velocity equation is v̇^n = C_b^n f^b − (2ω_ie^n + ω_en^n) × v^n + g^n, wherein v̇^n is the differential of v^n, C_b^n represents the attitude rotation matrix, f^b is the specific force in the body coordinate system b measured by the accelerometer, ω_ie^n is the angular velocity of the Earth's rotation with respect to the inertial frame i, ω_en^n is the angular velocity of the navigation coordinate system n relative to the Earth coordinate system e, and g^n is the gravity acceleration vector.

The position equation is ṗ^n = M v^n, wherein ṗ^n is the differential of p^n and M is the position coefficient matrix

M = [ 1/((R_N + h) cos L)  0             0
      0                    1/(R_M + h)   0
      0                    0             1 ],

wherein R_M and R_N are the radii of curvature of the Earth's meridian circle and prime vertical circle, respectively.
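The dynamics model above can be sketched in code. The Python fragment below is an illustrative, simplified sketch only: the function names are ours, the position coefficient matrix M is replaced by the identity (Earth-curvature terms omitted), and the inertial biases are treated as random constants whose deterministic derivative is zero.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of two quaternions [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def quat_to_dcm(q):
    """Attitude rotation matrix C_b^n from a unit quaternion."""
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)],
    ])

def dynamics(x, f_b, w_nb_b, g_n=np.array([0.0, 0.0, -9.81]),
             w_ie_n=np.zeros(3), w_en_n=np.zeros(3)):
    """Deterministic part f(x) of the 16-dimensional state
    x = [q (4), v^n (3), p^n (3), gyro bias (3), accel bias (3)]."""
    q, v_n = x[0:4], x[4:7]
    # attitude: q_dot = 0.5 * q ⊗ [0, ω_nb^b]
    q_dot = 0.5 * quat_mult(q, np.concatenate(([0.0], w_nb_b)))
    # velocity: v_dot = C_b^n f^b - (2ω_ie^n + ω_en^n) × v^n + g^n
    v_dot = quat_to_dcm(q) @ f_b - np.cross(2*w_ie_n + w_en_n, v_n) + g_n
    # position: p_dot = M v^n, with M taken as identity in this sketch
    p_dot = v_n
    # biases modelled as random constants: zero deterministic derivative
    return np.concatenate((q_dot, v_dot, p_dot, np.zeros(6)))
```

For a stationary, level vehicle (identity quaternion, accelerometer reading the reaction to gravity), every component of the derivative vanishes, which is a quick sanity check of the signs above.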
Further, the first observation model is:
z1(k)=h1(x(k))+v1(k),
wherein z1(k) = [ψB, θB, γB]^T, with ψB, θB and γB the heading angle, pitch angle and roll angle of the unmanned combat aircraft measured by the BPNS navigation subsystem; v1(k) is the BPNS navigation subsystem's observation noise vector for the attitude; h1(·) is the corresponding nonlinear observation function; and x(k) is the discrete representation of x(t).
Further, the second observation model is:
z2(k)=h2(x(k))+v2(k),
wherein z2(k) = [ψV, θV, γV, λV, LV, hV]^T, with ψV, θV and γV the heading angle, pitch angle and roll angle, and λV, LV and hV the longitude, latitude and altitude of the unmanned combat aircraft measured by the VNS navigation subsystem; v2(k) is the VNS navigation subsystem's observation noise vector for the attitude and position; and h2(·) is the corresponding nonlinear observation function.
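Both observation functions are projections of the state onto attitude (and, for the VNS, position). A minimal sketch follows; the state layout (quaternion in x[0:4], position in x[7:10]) and the aerospace heading-pitch-roll Euler convention are our assumptions, and the helper names are hypothetical.

```python
import numpy as np

def quat_to_euler(q):
    """Heading ψ, pitch θ, roll γ (rad) from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    psi = np.arctan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))          # heading
    theta = np.arcsin(np.clip(2*(w*y - z*x), -1.0, 1.0))        # pitch
    gamma = np.arctan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))        # roll
    return np.array([psi, theta, gamma])

def h1(x):
    """BPNS observation: attitude only, z1 = [ψ_B, θ_B, γ_B]."""
    return quat_to_euler(x[0:4])

def h2(x):
    """VNS observation: attitude and position, z2 = [ψ_V, θ_V, γ_V, λ_V, L_V, h_V]."""
    return np.concatenate((quat_to_euler(x[0:4]), x[7:10]))
```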
Further, the improved federated CKF filtering method comprises:
initializing a kinetic model and a first observation model/a second observation model;
acquiring third navigation information of the unmanned combat aircraft; the third navigation information is initial navigation information or final navigation information of the unmanned combat aircraft at the previous moment;
generating fourth navigation information of the unmanned combat aircraft by combining the third navigation information and the dynamic model; the fourth navigation information is a navigation information predicted value;
acquiring fifth navigation information; the fifth navigation information is attitude information output by the BPNS navigation subsystem after asynchronous observation time registration and attitude and position information output by the VNS navigation subsystem after asynchronous observation time registration;
and generating final navigation information of the unmanned combat aircraft by combining the first observation model/the second observation model according to the fourth navigation information and the fifth navigation information.
Further, obtaining the fifth navigation information further includes:
judging whether the Mahalanobis-distance criterion of the innovation vector is less than or equal to a threshold, namely

βk = z̃kᵀ (Pzz,k)⁻¹ z̃k ≤ Th,

wherein βk is the Mahalanobis-distance criterion, z̃k = zk − ẑk|k−1 is the innovation vector, Pzz,k is the covariance matrix of the innovation vector, Th is a preset threshold, and m is the observation vector dimension (Th is typically taken from the χ² distribution with m degrees of freedom);

when βk > Th, constructing an iterative relationship for the robust factor according to the Mahalanobis-distance criterion and iteratively solving the robust factor κk; combined with the robust factor, the covariance matrix of the CKF innovation vector is updated to

P̃zz,k = (1/(2n)) Σ_{i=1}^{2n} (Zi,k|k−1 − ẑk|k−1)(Zi,k|k−1 − ẑk|k−1)ᵀ + κk R,

wherein Zi,k|k−1 are the cubature points selected by the CKF filter and propagated through the nonlinear observation function, ẑk|k−1 is the predicted observation, and R is the filter observation noise covariance matrix.
executing the remaining CKF algorithm steps to obtain the first/second navigation information;
and performing state fusion on the first navigation information and the second navigation information with a federated filtering method to obtain the final navigation information of the unmanned combat aircraft.
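The gating-and-inflation step above can be sketched as follows. This is an illustrative fragment with hypothetical names: the robust factor κ is passed in as a parameter here, since the text solves it iteratively in a separate step, and the threshold is left to the caller (e.g. a χ² quantile for the observation dimension).

```python
import numpy as np

def robust_innovation_cov(innovation, P_zz_star, R, threshold, kappa=1.0):
    """Apply the Mahalanobis-distance test β_k = z̃ᵀ P_zz⁻¹ z̃ ≤ T_h.
    Normal observation: return the standard innovation covariance P*_zz + R.
    Abnormal observation: inflate the noise term with κ, P̃_zz = P*_zz + κ R,
    which reduces the filter gain and down-weights the bad measurement."""
    P_zz = P_zz_star + R                       # nominal innovation covariance
    beta = float(innovation @ np.linalg.solve(P_zz, innovation))
    if beta <= threshold:
        return P_zz                            # standard CKF update
    return P_zz_star + kappa * R               # robust update (κ > 1)
```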
Further, solving for the robust factor comprises:
constructing a nonlinear equation according to the Mahalanobis-distance criterion:

g(κk) = z̃kᵀ (P̃zz,k(κk))⁻¹ z̃k − Th = 0,

wherein P̃zz,k(κk) represents the updated innovation vector covariance matrix after the robust factor is introduced;

solving the nonlinear equation by the Newton iteration method to obtain the iterative relationship of the robust factor:

κk(i+1) = κk(i) − g(κk(i)) / g′(κk(i)),

wherein i is the iteration number and the initial value of the iteration is κk(0) = 1.
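The Newton iteration above can be sketched directly, using the closed-form derivative g′(κ) = −z̃ᵀ S⁻¹ R S⁻¹ z̃ with S = P*_zz + κR. Names and iteration limits below are ours, not the patent's.

```python
import numpy as np

def solve_robust_factor(innov, P_star, R, threshold, iters=20):
    """Newton iteration for κ_k such that the inflated covariance
    P̃_zz(κ) = P*_zz + κ R satisfies g(κ) = z̃ᵀ P̃_zz(κ)⁻¹ z̃ - T_h = 0,
    starting from κ(0) = 1."""
    kappa = 1.0
    for _ in range(iters):
        S = P_star + kappa * R
        Sinv_z = np.linalg.solve(S, innov)
        g = float(innov @ Sinv_z) - threshold
        gp = -float(Sinv_z @ (R @ Sinv_z))     # g'(κ) = -z̃ᵀ S⁻¹ R S⁻¹ z̃
        if abs(gp) < 1e-12:
            break
        kappa_new = kappa - g / gp
        if abs(kappa_new - kappa) < 1e-10:
            kappa = kappa_new
            break
        kappa = kappa_new
    return kappa
```

In the scalar case with P* = R = 1 and innovation 3, the statistic 9/(1+κ) reaches a threshold of 1 at κ = 8, which the iteration recovers.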
Further, the acquiring of the fifth navigation information comprises:
determining the observation processing intervals [TΔ+sTΔ, TΔ+sTΔ+Ts], wherein s = 0, 1, 2, …, TΔ is the overlap interval and Ts is the segment interval;
within each processing interval, acquiring the observation set {zt1, zt2, ..., ztr} of the t-th sensor;
based on the observation set {zt1, zt2, ..., ztr}, generating by the least-squares method an observation estimate of the t-th navigation sensor in each processing interval, which serves as the fifth navigation information.
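The per-interval least-squares step can be sketched as below. The patent does not state the order of the fit, so a first-order (linear-in-time) model z(t) = a + b·t is assumed here; the function name and registration-time parameter are ours.

```python
import numpy as np

def register_observations(times, values, t_reg):
    """Least-squares time registration within one processing interval:
    fit z(t) = a + b*t to the samples {z_t1..z_tr} of one sensor and
    evaluate the fit at the common registration time t_reg."""
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    A = np.vstack([np.ones_like(times), times]).T   # design matrix [1, t]
    coef, *_ = np.linalg.lstsq(A, values, rcond=None)
    a, b = coef
    return a + b * t_reg
```

Because every sensor's samples in the interval are mapped to the same t_reg, the sub-filters afterwards receive synchronized observations.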
Further, the system state fusion comprises:
respectively acquiring the error covariance matrix of the first navigation information and the error covariance matrix of the second navigation information according to the dynamics model and the first observation model/second observation model, and establishing the weights of the first and second navigation information based on these error covariance matrices;
wherein x̂g is the final navigation information of the unmanned combat aircraft and Pg is the corresponding error covariance matrix:

Pg = (P1⁻¹ + P2⁻¹ + Pz⁻¹)⁻¹,
x̂g = Pg (P1⁻¹ x̂1 + P2⁻¹ x̂2 + Pz⁻¹ x̂z),

wherein P1 is the error covariance matrix of the first navigation information x̂1, P2 is the error covariance matrix of the second navigation information x̂2, and Pz and x̂z are the navigation-information error covariance matrix and the navigation information obtained after the time update of the main filter through the dynamics model.
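The information-weighted fusion described here is a standard federated-filter combination and can be sketched generically (function name ours; any number of local estimates, e.g. the two sub-filter outputs plus the main filter's prediction, can be passed in):

```python
import numpy as np

def federated_fuse(estimates, covariances):
    """Information-weighted fusion of local estimates:
        P_g = (Σ P_i⁻¹)⁻¹
        x̂_g = P_g Σ P_i⁻¹ x̂_i
    Lower-covariance (more informative) local estimates get larger weight."""
    infos = [np.linalg.inv(P) for P in covariances]
    P_g = np.linalg.inv(sum(infos))
    x_g = P_g @ sum(I @ x for I, x in zip(infos, estimates))
    return x_g, P_g
```

With two equally trusted scalar estimates the fused value is their mean and the fused variance is halved, matching intuition.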
The invention has the following beneficial effects: assisting the traditional MIMU and VNS with bionic polarized light to form an integrated navigation system improves the autonomy and reliability of the unmanned combat aircraft's navigation system; adding a robust factor in the filtering process adjusts the filter gain, weakens the influence of abnormal observations on the state estimate, effectively improves the robustness of the filter and raises the state-estimation accuracy of the navigation system; and asynchronous observation time registration with overlapping segment intervals effectively improves the sampling rate and accuracy of asynchronous observation processing and reduces the asynchronous time errors of the integrated navigation system. The method therefore offers strong autonomy, high accuracy and good reliability, compensates for the shortcomings of existing autonomous navigation technology, improves the autonomous precise navigation capability of the unmanned combat aircraft in a complex battlefield environment, and has broad application prospects.
[ description of the drawings ]
FIG. 1 is a flow chart of an autonomous navigation method in an embodiment of the present invention;
FIG. 2 is a flowchart of the robust CKF based on Mahalanobis distance in an embodiment of the present invention;
FIG. 3 is a flow chart of a state fusion process in an embodiment of the present invention;
fig. 4 is a schematic diagram of asynchronous observation time registration in the embodiment of the present invention.
[ detailed description ]
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The BPNS is a new, forward-looking autonomous navigation mode: attitude information of the unmanned combat aircraft can be obtained by measuring the polarization characteristics of natural light. Compared with other, more mature navigation methods, the BPNS has unique advantages: (a) high autonomy and good concealment, free from man-made interference; (b) a short navigation-solution time and good real-time performance; (c) navigation errors that do not accumulate over time, hence high accuracy.
Therefore, the method can make up the defects of the existing autonomous navigation mode, a novel bionic technology is added for the navigation of the unmanned combat aircraft, and the types of autonomous navigation systems of the unmanned combat aircraft are expanded. However, since the BPNS mainly outputs the attitude angle of the carrier, the six-degree-of-freedom navigation parameters of the unmanned combat aircraft cannot be provided; and BPNS is susceptible to weather, and atmospheric polarization modes in rainy or windy and sandy weather become more complex, which in turn affects navigation accuracy.
Therefore, the single navigation system has respective advantages and disadvantages, and the single navigation system is difficult to meet the requirements of long endurance and high performance navigation. The integrated navigation technology based on the information fusion principle provides a new way for improving the information redundancy and the overall performance of the aircraft navigation system. However, the existing combined navigation technology still has defects and shortcomings, and cannot meet the requirements of unmanned combat aircrafts.
For example, the combination of satellite navigation (GPS, BeiDou) and inertial navigation is the mainstream integrated navigation technology adopted by most unmanned aerial vehicles, but in wartime this combination carries a great risk if satellite navigation fails. When the unmanned combat aircraft flies in low airspace, an inertial/astronomical integrated navigation system is limited by star visibility, so this mode is rarely adopted; inertial/VNS integrated navigation has strong autonomy and resistance to electromagnetic interference, but its navigation accuracy drops sharply when it is affected by factors such as lighting.
In conclusion, integrated navigation built from BPNS-assisted traditional navigation sensors is a forward-looking new-concept navigation method and an extension and expansion of existing autonomous navigation methods; it represents the development trend of autonomous navigation technology for unmanned combat aircraft and can provide a new way to realize their autonomous, precise navigation.
The invention aims to provide a bionic polarized light assisted autonomous navigation method for an unmanned combat aircraft that addresses the problems of traditional navigation methods, namely low autonomy, accuracy loss caused by time errors, and low system robustness when observation information is disturbed, thereby improving the autonomous precise navigation capability of the unmanned combat aircraft in a complex battlefield environment.
Specifically, the invention discloses a bionic polarized light assisted unmanned combat aircraft autonomous navigation method, which comprises the following steps:
A quaternion-based dynamics model of the BPNS/MIMU/VNS autonomous integrated navigation system is established according to the MIMU navigation subsystem (i.e. the micro inertial navigation system; the BPNS is the bionic polarized light navigation system and the VNS is the visual navigation system). A first observation model and a second observation model are established through the BPNS navigation subsystem and the VNS navigation subsystem, respectively. The navigation information of each subsystem is then acquired, and optimal estimation is performed on the dynamics model, the first observation model and the second observation model with an improved federated CKF filtering method (i.e. an improved federated cubature Kalman filtering method) to obtain the final (optimal) navigation information of the unmanned combat aircraft.
According to the method, the robust factor is added in the filtering process, so that the filtering gain can be adjusted, the influence of abnormal observation on state estimation is weakened, the robustness of a filter can be effectively improved, and the state estimation precision of the integrated navigation system is improved; and by means of asynchronous observation time registration, overlapping subsection intervals and a time registration method, sampling rate and precision of asynchronous observation processing are effectively improved, and asynchronous time errors of the integrated navigation system are reduced, so that the method has the advantages of being strong in autonomy, high in precision, good in reliability and the like, can make up for the defects of the existing autonomous navigation technology, improves autonomous accurate navigation capability of the unmanned combat aircraft in a complex battlefield environment, and has wide application prospects.
In the embodiment of the present invention, as shown in FIG. 1, performing optimal estimation on the dynamics model, the first observation model and the second observation model with the improved federated CKF filtering method comprises:
performing state estimation on the dynamics model and the first observation model with an improved CKF filtering method (i.e. an improved cubature Kalman filtering method) to obtain the first navigation information of the unmanned combat aircraft; performing state estimation on the dynamics model and the second observation model with the improved CKF filtering method to obtain the second navigation information; and performing system state fusion on the first and second navigation information with a federated filtering method to obtain the final (optimal) navigation information of the unmanned combat aircraft.
The embodiment designs a distributed data fusion structure with three-layer fusion characteristics aiming at a BPNS/MIMU/VNS combined navigation system. In the first layer, time registration is carried out on asynchronous observation output by BPNS, MIMU and VNS by adopting a time registration method based on segment overlapping; in the second layer, an improved CKF algorithm is adopted to obtain the local state estimation of the MIMU/BPNS and MIMU/VNS combined navigation subsystem in a parallel mode; and in the third layer, dynamic fusion is carried out on the local state estimation through federal filtering, and the global optimal state estimation of the integrated navigation system is obtained.
Because the MIMU has the advantages of comprehensive navigation parameters, strong maneuvering tracking capability, timely and continuous output, good anti-interference performance and the like, when the BPNS/MIMU/VNS full-autonomous integrated navigation system is designed, the MIMU is used as a basic system, and the navigation information output by the BPNS and the VNS is organically integrated with the navigation information of the MIMU respectively, so that the defects of each subsystem can be overcome, and the problem that the accurate navigation of the unmanned combat aircraft cannot be realized due to the failure of the integrated navigation subsystem caused by some objective conditions is solved. The BPNS/MIMU/VNS fully-autonomous combined navigation system has higher navigation precision, better reliability and stronger autonomy due to more redundant information.
In this embodiment, the navigation coordinate system adopts an east-north-sky geographic coordinate system, the MIMU outputs information such as the position, speed, and attitude of the unmanned aerial vehicle, the BPNS outputs attitude information, and the VNS outputs attitude and position information. Firstly, designing MIMU/BPNS and MIMU/VNS combined navigation subsystems based on MIMU, establishing quaternion-based combined navigation local filter nonlinear system models (dynamic models and observation models), and calculating to obtain two groups of local estimation of the state of the combined navigation system. And then, sending the two groups of local estimation into a main filter for global information fusion to obtain global optimal estimation of the system state. And finally, taking the global optimal estimation of the system state as the output of the integrated navigation system, namely the navigation state of the unmanned combat aircraft at the current moment. The designed BPNS/MIMU/VNS full-autonomous combined navigation can realize the advantage complementation and cooperative work among the BPNS, the MIMU and the VNS, and provides a new way for the autonomous accurate navigation of the unmanned combat aircraft.
As a specific embodiment, the dynamics model is

ẋ(t) = f(x(t)) + w(t),

wherein f(·) is the nonlinear function of the dynamics model, x(t) = [q_b^n; v^n; p^n; ε^b; ∇^b] is the system state vector, and w(t) is the dynamics noise;

q_b^n is the attitude quaternion from the body coordinate system b to the navigation coordinate system n; v^n = [v_E, v_N, v_U]^T represents the velocity of the unmanned combat aircraft in the navigation coordinate system n, with v_E, v_N and v_U its east, north and up components; p^n = [λ, L, h]^T collects the longitude λ, latitude L and altitude h of the unmanned combat aircraft in the navigation coordinate system n; ε^b is the gyro bias and ∇^b is the accelerometer bias.

The attitude equation is q̇_b^n = (1/2) q_b^n ⊗ ω_nb^b, wherein q̇_b^n is the differential of q_b^n and ⊗ represents quaternion multiplication; ω_nb^b = ω_ib^b − C_n^b (ω_ie^n + ω_en^n), wherein ω_ib^b is the angular rate of the unmanned combat aircraft in the body coordinate system b measured by the gyros, ω_ie^n is the angular velocity of the Earth's rotation relative to the inertial frame i, and ω_en^n is the angular rate of the navigation coordinate system n relative to the Earth coordinate system e.

The velocity equation is v̇^n = C_b^n f^b − (2ω_ie^n + ω_en^n) × v^n + g^n, wherein v̇^n is the differential of v^n, C_b^n represents the attitude rotation matrix, f^b is the specific force in the body coordinate system b measured by the accelerometer, and g^n is the gravity acceleration vector.

The position equation is ṗ^n = M v^n, wherein ṗ^n is the differential of p^n and M is the position coefficient matrix

M = [ 1/((R_N + h) cos L)  0             0
      0                    1/(R_M + h)   0
      0                    0             1 ],

wherein R_M and R_N are the radii of curvature of the Earth's meridian circle and prime vertical circle, respectively.
The first observation model of this embodiment is:
z1(k)=h1(x(k))+v1(k),
wherein z1(k) = [ψB, θB, γB]^T, with ψB, θB and γB the heading angle, pitch angle and roll angle of the unmanned combat aircraft measured by the BPNS navigation subsystem; v1(k) is the observation noise vector for the attitude in the BPNS navigation subsystem; h1(·) is the corresponding nonlinear observation function; and x(k) is the discrete representation of x(t).
The second observation model of this embodiment is:
z2(k)=h2(x(k))+v2(k),
wherein z2(k) = [ψV, θV, γV, λV, LV, hV]^T, with ψV, θV and γV the heading angle, pitch angle and roll angle, and λV, LV and hV the longitude, latitude and altitude of the unmanned combat aircraft measured by the VNS navigation subsystem; v2(k) is the observation noise vector for the attitude and position in the VNS navigation subsystem; and h2(·) is the corresponding nonlinear observation function.
Specifically, the improved federated CKF filtering method comprises the following steps:
initializing the dynamics model and the first observation model/second observation model; acquiring the third navigation information of the unmanned combat aircraft, which is the initial navigation information or the navigation information at the previous moment; generating the fourth navigation information (the navigation information prediction) by combining the third navigation information and the dynamics model; acquiring the fifth navigation information, i.e. the attitude information output by the BPNS navigation subsystem and the attitude and position information output by the VNS navigation subsystem, both after asynchronous observation time registration; and generating the final navigation information of the unmanned combat aircraft from the fourth and fifth navigation information by combining the first observation model/second observation model and using the federated filtering technique.
In this embodiment, the system state fusion includes:
As shown in FIG. 3, the error covariance matrices of the first navigation information and of the second navigation information are obtained from the dynamics model and the first observation model/second observation model, respectively, and the weights of the first and second navigation information are established based on these error covariance matrices:
wherein x̂g is the final (optimal) navigation information of the unmanned combat aircraft and Pg is the corresponding error covariance matrix:

Pg = (P1⁻¹ + P2⁻¹ + Pz⁻¹)⁻¹,
x̂g = Pg (P1⁻¹ x̂1 + P2⁻¹ x̂2 + Pz⁻¹ x̂z),

wherein P1 is the error covariance matrix of the first navigation information x̂1, P2 is the error covariance matrix of the second navigation information x̂2, and Pz and x̂z are the navigation-information error covariance matrix and the navigation information obtained after the time update (one-step prediction) of the main filter through the dynamics model. After the final (optimal) navigation information and error covariance matrix of the unmanned combat aircraft are obtained, the initial values of the sub-filters and the main filter for the next moment are reset through federated reset feedback as

x̂i = x̂g,  Pi = βi⁻¹ Pg,  with Σ_{i=1}^{N} βi = 1,

wherein βi are the information distribution coefficients and N is the number of sub-filters.
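The reset-feedback step described here re-initialises every filter from the global estimate, with each covariance enlarged by the inverse of its information distribution coefficient. A minimal sketch (function name ours):

```python
import numpy as np

def federated_reset(x_g, P_g, betas):
    """Federated reset feedback: each filter restarts from the global
    estimate x̂_g with covariance β_i⁻¹ P_g, where the information
    distribution coefficients satisfy Σ β_i = 1."""
    assert abs(sum(betas) - 1.0) < 1e-12, "coefficients must sum to 1"
    return [(x_g.copy(), P_g / b) for b in betas]
```

Enlarging each reset covariance by β_i⁻¹ keeps the total information conserved across the sub-filters, which is what makes the federated structure consistent.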
In order to overcome the defects of existing time registration methods, this embodiment proposes processing the asynchronous observation data output by each navigation sensor with a segmented-overlap method.
The principle of segmented overlap is shown in FIG. 4. First, the segment interval T_s is determined. To avoid processing intervals that contain no sensor observations, the segment interval should be no less than the maximum sampling period T_max of the sensors in the integrated navigation system, i.e., T_s ≥ T_max.
Next, the overlap interval T_Δ is divided. In principle, the overlap interval should be as small as possible to increase the sampling rate. However, if the overlap interval is too small, data updates cannot occur between adjacent intervals and the computing load of the system increases. Therefore, the average of the sensor sampling periods is taken as the overlap interval; if the sampling rates differ greatly, the maximum period is excluded from the average to eliminate its influence. In summary, the overlap interval T_Δ can be selected as such an average, where n is the number of sensors, T_t is the sampling period of the t-th sensor, and T_min is the minimum sampling period among the sensors.
Thirdly, taking the starting value of the system observations as the starting time T_0, each observation processing interval [T_Δ + sT_Δ, T_Δ + sT_Δ + T_s], s = 0, 1, 2, …, is determined. Within each processing interval, the observation set {z_t1, z_t2, …, z_tr} of the t-th sensor is acquired.
Finally, based on the observation set {z_t1, z_t2, …, z_tr}, the observation estimate of the t-th navigation sensor in each processing interval is solved according to the least-squares principle. This yields, for every sensor, a series of observation estimates located at the same time instants and equally spaced by T_Δ, which serve as the filtered observation values of each sensor.
By using the overlapped segmentation interval, the provided time registration method can effectively improve the sampling rate and the precision of asynchronous observation processing.
Aiming at the limitations of existing time registration algorithms, this embodiment provides a time registration method based on segmented overlap. The output data of each navigation sensor are processed with segmented overlap, synchronizing the asynchronous observation information to a unified reference time scale and producing a series of equally spaced observation estimates located at the same time instants. The overlapped segment intervals improve the sampling rate and observation precision of the time registration method, which is the basis for optimally fusing the multi-source, asynchronous integrated navigation information.
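The segmented-overlap registration steps above can be sketched as follows for a single sensor, using a linear least-squares fit per processing interval. This is a simplified illustration: the "exclude the maximum period when sampling rates differ greatly" rule of thumb (here, max > 2× min) is an assumption, since the exact selection formula for T_Δ is not reproduced in the text.

```python
import numpy as np

def overlap_interval(periods):
    """Overlap interval T_delta: the mean of the sensor sampling periods;
    if rates differ greatly (assumed rule: max > 2x min), the maximum
    period is excluded from the average to remove its influence."""
    periods = np.asarray(periods, dtype=float)
    if periods.max() > 2.0 * periods.min():
        periods = np.delete(periods, periods.argmax())
    return periods.mean()

def register(times, values, t0, Ts, Td, n_intervals):
    """Segmented-overlap registration for one sensor: within each
    processing interval [t0 + s*Td, t0 + s*Td + Ts], fit a least-squares
    line z = a + b*t to the raw samples and evaluate it at the interval
    end, giving estimates at instants equally spaced by Td."""
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    out = []
    for s in range(n_intervals):
        lo, hi = t0 + s * Td, t0 + s * Td + Ts
        m = (times >= lo) & (times <= hi)          # samples in this interval
        A = np.column_stack([np.ones(m.sum()), times[m]])
        coef, *_ = np.linalg.lstsq(A, values[m], rcond=None)
        out.append((hi, coef[0] + coef[1] * hi))   # (registered time, estimate)
    return out
```

Running `register` over every sensor with a common `t0`, `Ts` and `Td` yields the synchronized, equally spaced observation series used as filter inputs.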
In the method of this embodiment, as shown in fig. 2, after obtaining the fifth navigation information, the method further includes:
The Mahalanobis distance is a discriminant criterion in statistics for detecting outliers in multivariate data samples. For a p-dimensional vector x = (x_1, x_2, …, x_p)^T with mean μ = (μ_1, μ_2, …, μ_p)^T and covariance matrix Σ, the Mahalanobis distance is defined as

d_M(x) = √((x − μ)^T Σ^{-1} (x − μ)).
For the designed BPNS/MIMU/VNS integrated navigation system, model errors in the MIMU/BPNS and MIMU/VNS integrated navigation subsystems are judged and detected separately.
According to the fifth navigation information, the m-dimensional innovation vector of the sub-filter is calculated as z̃_k = z_k − ẑ_{k|k−1}. For a Gaussian system without model errors, the innovation vector z̃_k obeys a zero-mean multivariate Gaussian distribution, z̃_k ~ N(0, P_zz,k), where P_zz,k is the innovation vector covariance matrix. Therefore, according to the Mahalanobis distance definition, there is

β_k = d_M²(z̃_k) = z̃_k^T P_zz,k^{-1} z̃_k.
As can be seen from statistical theory, β_k obeys a χ² distribution with m degrees of freedom. According to χ² test theory, given a significance level α (0 < α < 1), there is a critical value χ²_{m,α} such that P(β_k > χ²_{m,α}) = α.
Therefore, the following criterion can be established: if β_k ≤ χ²_{m,α}, the observation is judged normal; otherwise it is judged abnormal. Here χ²_{m,α} is the preset check threshold, namely the χ² check critical value corresponding to the significance check level α, which is obtained by querying a χ² distribution table.
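As a concrete illustration of this χ² check, the sketch below gates an innovation vector against a tabulated critical value. The function name and the small critical-value table are illustrative assumptions, not taken from the patent.

```python
import numpy as np

# chi-square critical values chi2_{m,alpha} for alpha = 0.01 (upper tail),
# as read from a standard chi-square distribution table
CHI2_001 = {1: 6.635, 2: 9.210, 3: 11.345, 6: 16.812}

def innovation_gate(innovation, Pzz, thresh):
    """Mahalanobis-distance fault check: for a fault-free Gaussian system,
    beta_k = z~^T Pzz^-1 z~ follows a chi2 distribution with m = dim(z~)
    degrees of freedom; beta_k > thresh flags an abnormal observation."""
    z = np.asarray(innovation, dtype=float)
    beta = float(z @ np.linalg.solve(np.asarray(Pzz, dtype=float), z))
    return beta, beta > thresh
```

For a 3-dimensional observation at α = 0.01, the threshold is `CHI2_001[3]` = 11.345; innovations with β_k above it trigger the robust update branch.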
Abnormal filter observations are identified by the Mahalanobis distance criterion as follows.

In response to β_k ≤ χ²_{m,α}, the CKF algorithm steps are executed to acquire the first/second navigation information.

In response to β_k > χ²_{m,α}, an iterative relationship for the robust factor is constructed according to the Mahalanobis distance criterion, the robust factor κ_k is solved iteratively, the covariance matrix of the CKF innovation vector is updated with the robust factor, and the CKF algorithm steps are then executed to acquire the first/second navigation information.
Solving the robust factor includes:
constructing a nonlinear equation according to the Mahalanobis distance criterion:

z̃_k^T [P̃_zz,k(κ_k)]^{-1} z̃_k = χ²_{m,α}

wherein P̃_zz,k(κ_k) represents the innovation vector covariance matrix updated after the robust factor is introduced.
Solving the nonlinear equation yields the iterative relationship of the robust factor:

κ_k(i+1) = κ_k(i) + (β_k(i) − χ²_{m,α}) / (z̃_k^T P̃_zz,k^{-1}(i) R P̃_zz,k^{-1}(i) z̃_k)

wherein R is the filter observation noise covariance matrix, i is the number of iterations, and the initial value is κ_k(0) = 1. The result of each iteration step is substituted into the criterion to calculate the Mahalanobis distance β_k; when β_k ≤ χ²_{m,α}, the iteration ends and the last iteration result is the determined robust factor; otherwise the iterative computation continues.
In addition, in this embodiment, the covariance matrix of the CKF innovation vector is updated to

P̃_zz,k = (1/2n) Σ_{j=1}^{2n} (z_{j,k|k−1} − ẑ_{k|k−1})(z_{j,k|k−1} − ẑ_{k|k−1})^T + κ_k R

wherein z_{j,k|k−1} are the cubature (volume) points selected by the CKF filter and propagated through the nonlinear observation function, ẑ_{k|k−1} is the predicted observation, n is the state dimension, and R is the filter observation noise covariance matrix.
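A minimal numerical sketch of this robust-factor iteration follows, under the assumption that the updated innovation covariance has the form S + κR, with S standing for the cubature-point spread term; the Newton step is reconstructed from that form and is not quoted from the patent.

```python
import numpy as np

def robust_factor(innov, S, R, chi2_thresh, max_iter=50, tol=1e-9):
    """Solve z~^T (S + k*R)^-1 z~ = chi2_thresh for the robust factor k
    by Newton's method, starting from k = 1.  S is the cubature-point
    part of the innovation covariance, R the observation noise covariance;
    k > 1 inflates R and deflates the weight of an abnormal observation."""
    z = np.asarray(innov, dtype=float)
    k = 1.0
    for _ in range(max_iter):
        Pzz = S + k * R
        w = np.linalg.solve(Pzz, z)          # Pzz^-1 z~
        beta = float(z @ w)                  # Mahalanobis criterion beta_k
        if beta <= chi2_thresh + tol:        # criterion satisfied -> stop
            break
        # Newton step: d(beta)/dk = -z~^T Pzz^-1 R Pzz^-1 z~
        k += (beta - chi2_thresh) / float(w @ (R @ w))
    return k
```

For a normal innovation the loop exits immediately with κ_k = 1, leaving the standard CKF update unchanged; for an abnormal one it returns the κ_k that pulls β_k back onto the χ² threshold.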
Claims (10)
1. A bionic polarized light assisted unmanned combat aircraft autonomous navigation method is characterized by comprising the following steps:
establishing a BPNS/MIMU/VNS autonomous integrated navigation system dynamic model based on quaternion according to the MIMU navigation subsystem;
establishing a first observation model and a second observation model through a BPNS navigation subsystem and a VNS navigation subsystem respectively;
and acquiring navigation information of each navigation subsystem, and performing state estimation on the dynamic model, the first observation model and the second observation model by using an improved federal CKF filtering method to obtain final navigation information of the unmanned combat aircraft.
2. The method of claim 1, wherein the performing state estimation on the dynamical model, the first observation model and the second observation model by using an improved federal CKF filtering method comprises:
carrying out state estimation on the dynamic model and the first observation model by using an improved CKF filtering method to obtain first navigation information of the unmanned combat aircraft;
carrying out state estimation on the dynamic model and the second observation model by using an improved CKF filtering method to obtain second navigation information of the unmanned combat aircraft;
and performing system state fusion on the first navigation information and the second navigation information by using a federal filtering method to obtain final navigation information of the unmanned combat aircraft.
3. The method for autonomous navigation of unmanned combat aircraft assisted by bionic polarized light according to claim 1 or 2, wherein the quaternion-based dynamic model is:

ẋ(t) = f(x(t)) + w(t)

wherein f(·) represents the nonlinear function of the dynamic model, x(t) is the system state vector, x = [(q_b^n)^T, (v^n)^T, (p^n)^T, (ε^b)^T, (∇^b)^T]^T, and w(t) is the dynamic noise;

q_b^n = [q_0, q_1, q_2, q_3]^T is the attitude quaternion from the body coordinate system b to the navigation coordinate system n; v^n = [v_E^n, v_N^n, v_U^n]^T represents the velocity of the unmanned combat aircraft in the navigation coordinate system n, v_E^n, v_N^n and v_U^n being respectively the east, north and up velocities of the unmanned combat aircraft; p^n = [λ, L, h]^T collects the longitude, latitude and altitude of the unmanned combat aircraft in the navigation coordinate system n, wherein λ is the longitude, L is the latitude and h is the altitude; ε^b is the gyroscope bias and ∇^b is the accelerometer bias;

q̇_b^n = (1/2) q_b^n ⊗ ω_nb^b is the differential of q_b^n, wherein ω_nb^b is the angular velocity of the unmanned combat aircraft relative to the navigation coordinate system n, expressed in the body coordinate system b, and ⊗ represents quaternion multiplication;

v̇^n = C_b^n f^b − (2ω_ie^n + ω_en^n) × v^n + g^n is the differential of v^n, wherein C_b^n represents the attitude rotation matrix, f^b is the specific force in the body coordinate system b measured by the accelerometer, ω_ie^n is the rotational angular velocity of the Earth relative to the inertial coordinate system i, ω_en^n is the angular velocity of the navigation coordinate system n relative to the Earth coordinate system e, and g^n is the gravity acceleration vector;

ṗ^n = M v^n is the differential of p^n, wherein M is the position coefficient matrix, M = diag(1/((R_N + h) cos L), 1/(R_M + h), 1), and R_M and R_N are respectively the radii of curvature of the Earth's meridian circle and prime vertical circle.
4. The method for autonomous navigation of unmanned combat aircraft assisted by bionic polarized light according to claim 3, wherein the first observation model is:
z1(k)=h1(x(k))+v1(k),
5. The method for autonomous navigation of unmanned combat aircraft assisted by bionic polarized light according to claim 4, wherein the second observation model is:
z2(k)=h2(x(k))+v2(k),
wherein z_2(k) = [ψ_V, θ_V, γ_V, λ_V, L_V, h_V]^T; ψ_V, θ_V and γ_V are respectively the heading angle, pitch angle and roll angle of the unmanned combat aircraft measured by the VNS navigation subsystem; λ_V, L_V and h_V are respectively the longitude, latitude and altitude of the unmanned combat aircraft measured by the VNS navigation subsystem; and v_2(k) is the observation noise vector for attitude and position in the VNS navigation subsystem.
6. the method of claim 5, wherein the improved federal CKF filtering method comprises:
initializing the kinetic model and first/second observation models;
acquiring third navigation information of the unmanned combat aircraft; the third navigation information is initial navigation information or final navigation information of the unmanned combat aircraft at the previous moment;
generating fourth navigation information of the unmanned combat aircraft by combining the third navigation information and the dynamic model; the fourth navigation information is a navigation information predicted value;
acquiring fifth navigation information; the fifth navigation information is attitude information output by the BPNS navigation subsystem after asynchronous observation time registration and attitude and position information output by the VNS navigation subsystem after asynchronous observation time registration;
and generating final navigation information of the unmanned combat aircraft by combining the first observation model/the second observation model according to the fourth navigation information and the fifth navigation information.
7. The method as claimed in claim 6, wherein after the fifth navigation information is obtained, the method further comprises:

judging whether the Mahalanobis distance criterion of the innovation vector is less than or equal to a threshold value, namely β_k ≤ χ²_{m,α},

wherein β_k = z̃_k^T P_zz,k^{-1} z̃_k is the Mahalanobis distance criterion, z̃_k is the innovation vector, P_zz,k is the covariance matrix of the innovation vector, χ²_{m,α} is a preset threshold value, and m is the observation vector dimension;

in response to β_k > χ²_{m,α}, constructing an iterative relationship of the robust factor according to the Mahalanobis distance criterion and iteratively solving the robust factor κ_k; combining the robust factor, the covariance matrix of the CKF innovation vector is updated to

P̃_zz,k = (1/2n) Σ_{j=1}^{2n} (z_{j,k|k−1} − ẑ_{k|k−1})(z_{j,k|k−1} − ẑ_{k|k−1})^T + κ_k R

wherein z_{j,k|k−1} are the volume (cubature) points selected by the CKF filter and propagated through the nonlinear observation function, ẑ_{k|k−1} is the predicted observation, n is the state dimension, and R is the filter observation noise covariance matrix;
Executing the CKF algorithm step to obtain first/second navigation information;
and performing state fusion on the first navigation information and the second navigation information by using a federal filtering method to obtain final navigation information of the unmanned combat aircraft.
8. The method of claim 7, wherein the solving of the robust factor comprises:

constructing a nonlinear equation according to the Mahalanobis distance criterion:

z̃_k^T [P̃_zz,k(κ_k)]^{-1} z̃_k = χ²_{m,α}

wherein P̃_zz,k(κ_k) represents the innovation vector covariance matrix updated after the robust factor is introduced;

solving the nonlinear equation by the Newton iteration method to obtain the iterative relationship of the robust factor:

κ_k(i+1) = κ_k(i) + (β_k(i) − χ²_{m,α}) / (z̃_k^T P̃_zz,k^{-1}(i) R P̃_zz,k^{-1}(i) z̃_k)

wherein i is the number of iterations and the initial value of the iteration is κ_k(0) = 1.
9. The method for autonomous navigation of unmanned combat aircraft according to claim 6 or 8, wherein the acquiring of the fifth navigation information comprises:

determining the observation processing intervals [T_Δ + sT_Δ, T_Δ + sT_Δ + T_s], wherein s is 0 or a positive integer, T_Δ is the overlap interval, and T_s is the segment interval;

within each processing interval, respectively acquiring the observation set {z_t1, z_t2, …, z_tr} of the t-th sensor;

based on the observation set {z_t1, z_t2, …, z_tr}, generating the observation estimate of the t-th navigation sensor in each processing interval by the least-squares method, the observation estimates serving as the fifth navigation information.
10. The method for autonomous navigation of unmanned combat aircraft assisted by bionic polarized light according to claim 2 or 9, wherein the system state fusion comprises:

obtaining the error covariance matrix of the first navigation information and the error covariance matrix of the second navigation information according to the dynamic model together with the first observation model and the second observation model, respectively, and establishing the weights of the first navigation information and the second navigation information based on these error covariance matrices:

P_g = (P_1^{-1} + P_2^{-1} + P_z^{-1})^{-1}

x̂_g = P_g (P_1^{-1} x̂_1 + P_2^{-1} x̂_2 + P_z^{-1} x̂_z)

wherein x̂_g is the final navigation information of the unmanned combat aircraft, P_g is the corresponding error covariance matrix, x̂_1 is the first navigation information, P_1 is its error covariance matrix, x̂_2 is the second navigation information, P_2 is its error covariance matrix, and x̂_z and P_z are the navigation information and the navigation information error covariance matrix obtained after the time update of the main filter through the dynamic model.
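As an illustration of the position kinematics ṗ^n = M v^n appearing in the dynamic model of claim 3, the sketch below builds the position coefficient matrix M for p^n = [λ, L, h]^T and v^n = [v_E, v_N, v_U]^T. The diagonal form of M and the WGS-84 ellipsoid constants are standard assumptions not spelled out in the claim.

```python
import numpy as np

# WGS-84 constants (not stated in the patent; standard values assumed)
A_E = 6378137.0           # semi-major axis, m
E2 = 6.69437999014e-3     # first eccentricity squared

def position_rate_matrix(L, h):
    """Position coefficient matrix M with dp/dt = M @ v^n, where
    p^n = [lon, lat, alt] and v^n = [v_E, v_N, v_U].
    R_M: meridian radius of curvature; R_N: prime-vertical radius."""
    s = np.sin(L)
    RM = A_E * (1.0 - E2) / (1.0 - E2 * s**2) ** 1.5
    RN = A_E / np.sqrt(1.0 - E2 * s**2)
    return np.diag([1.0 / ((RN + h) * np.cos(L)),   # lon rate per v_E
                    1.0 / (RM + h),                 # lat rate per v_N
                    1.0])                           # alt rate per v_U
```

One Euler step of the position kinematics is then simply `p = p + position_rate_matrix(p[1], p[2]) @ v * dt`.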
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010082963.6A CN111238467B (en) | 2020-02-07 | 2020-02-07 | Bionic polarized light assisted unmanned combat aircraft autonomous navigation method |
ZA2021/02684A ZA202102684B (en) | 2020-02-07 | 2021-04-22 | Autonomous navigation method for unmanned combat aerial vehicle assisted by bionic polarization |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111238467A true CN111238467A (en) | 2020-06-05 |
CN111238467B CN111238467B (en) | 2021-09-03 |
Family
ID=70870540
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010082963.6A Active CN111238467B (en) | 2020-02-07 | 2020-02-07 | Bionic polarized light assisted unmanned combat aircraft autonomous navigation method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111238467B (en) |
ZA (1) | ZA202102684B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011104512A1 (en) * | 2011-06-17 | 2012-12-20 | Northrop Grumman Litef Gmbh | Light source module for tri-axes rotational measuring triad system used in navigation system, has light source that is integrated in substrate to emit polarized light on photoconductive fibers |
CN106168662A (en) * | 2016-07-26 | 2016-11-30 | 中国人民解放军海军航空工程学院 | The error registration method of passive sensor based on Maximum-likelihood estimation and device |
CN107144284A (en) * | 2017-04-18 | 2017-09-08 | 东南大学 | Inertial navigation combination navigation method is aided in based on the vehicle dynamic model that CKF is filtered |
CN108375381A (en) * | 2018-02-08 | 2018-08-07 | 北方工业大学 | Bionic polarization sensor multi-source error calibration method based on extended Kalman filtering |
CN108827322A (en) * | 2018-06-14 | 2018-11-16 | 上海卫星工程研究所 | A kind of more stellar associations are the same as DF and location observation system optimization design and appraisal procedure |
Non-Patent Citations (1)
Title |
---|
KONG, Xianglong: "Research on Inertial/Stereo Vision Integrated Navigation Methods Based on Multi-View Geometry", China Doctoral Dissertations Full-text Database, Social Science Series I *
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111504312A (en) * | 2020-07-02 | 2020-08-07 | 中国人民解放军国防科技大学 | Unmanned aerial vehicle pose estimation method based on visual inertial polarized light fusion |
CN111947653A (en) * | 2020-08-13 | 2020-11-17 | 北京航空航天大学 | Dual-mode inertial/visual/astronomical navigation method for lunar surface inspection tour detector |
CN112146655A (en) * | 2020-08-31 | 2020-12-29 | 郑州轻工业大学 | Elastic model design method for BeiDou/SINS tight integrated navigation system |
CN113380073B (en) * | 2021-06-03 | 2022-05-13 | 杭州电子科技大学 | Asynchronous filtering estimation method of flow management system based on event trigger mechanism |
CN113380073A (en) * | 2021-06-03 | 2021-09-10 | 杭州电子科技大学 | Asynchronous filtering estimation method of flow management system based on event trigger mechanism |
CN113819907A (en) * | 2021-11-22 | 2021-12-21 | 北京航空航天大学 | Inertia/polarization navigation method based on polarization and sun dual-vector switching |
CN113819907B (en) * | 2021-11-22 | 2022-02-11 | 北京航空航天大学 | Inertia/polarization navigation method based on polarization and sun dual-vector switching |
CN114543799A (en) * | 2022-03-31 | 2022-05-27 | 湖南大学无锡智能控制研究院 | Robust federated Kalman filtering method, device and system |
CN114543799B (en) * | 2022-03-31 | 2023-10-27 | 湖南大学无锡智能控制研究院 | Robust federal Kalman filtering method, device and system |
CN114995518A (en) * | 2022-07-27 | 2022-09-02 | 西北工业大学 | Master-slave cooperative guidance method for failure of slave aircraft GPS target positioning |
CN114995518B (en) * | 2022-07-27 | 2022-11-18 | 西北工业大学 | Master-slave cooperative guidance method for failure of slave aircraft GPS target positioning |
CN115574816A (en) * | 2022-11-24 | 2023-01-06 | 东南大学 | Bionic vision multi-source information intelligent perception unmanned platform |
CN115574816B (en) * | 2022-11-24 | 2023-03-14 | 东南大学 | Bionic vision multi-source information intelligent perception unmanned platform |
WO2024109002A1 (en) * | 2022-11-24 | 2024-05-30 | 东南大学 | Bionic-vision multi-source-information unmanned intelligent sensing platform |
CN116105743A (en) * | 2023-04-17 | 2023-05-12 | 山东大学 | Information factor distribution method of federal filtering system and underwater navigation system |
Also Published As
Publication number | Publication date |
---|---|
ZA202102684B (en) | 2021-05-26 |
CN111238467B (en) | 2021-09-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |