CN102506868A - SINS (strap-down inertial navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system - Google Patents


Info

Publication number
CN102506868A
Authority
CN
China
Prior art keywords
navigation
navigation system
sins
landform
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011103718640A
Other languages
Chinese (zh)
Other versions
CN102506868B (en)
Inventor
程农
胡海东
李威
杨霄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201110371864.0A priority Critical patent/CN102506868B/en
Publication of CN102506868A publication Critical patent/CN102506868A/en
Application granted granted Critical
Publication of CN102506868B publication Critical patent/CN102506868B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Navigation (AREA)

Abstract

The invention discloses a SINS (strapdown inertial navigation system)/SMANS (scene matching aided navigation system)/TRNS (terrain referenced navigation system) combined navigation method based on federated filtering, and a corresponding system. The method comprises the following steps: performing image matching for the scene matching aided navigation, determining the position of the aircraft through the affine transformation between a digital map and the captured image; performing terrain matching for the terrain referenced navigation and determining the position of the aircraft from elevation data; establishing an error model for the SINS and observation models for the SMANS and the TRNS; and fusing the outputs of the SINS, the SMANS and the TRNS to obtain the optimal estimate and calibrate the SINS. The system comprises an atmospheric inertial navigation system, a flight-path generator module, a SINS/SMANS integrated navigation subsystem, a SINS/TRNS integrated navigation subsystem and a federated filtering module. The method and system effectively improve the accuracy of the navigation system and offer high fault tolerance, autonomy and reliability.

Description

SINS/SMANS/TRNS combined navigation method and system based on federated filtering
Technical field
The present invention relates to the field of integrated navigation and positioning technology, and in particular to a SINS/SMANS/TRNS combined navigation method and system based on federated filtering.
Background art
Navigation is the task of correctly guiding an aircraft along a predetermined route so that it reaches its destination on time. To accomplish this, the aircraft's instantaneous geographic position, ground speed, attitude and heading must be known at all times; these quantities are the navigation parameters. On a manned aircraft they can be obtained by a navigator through instrument observation and computation, but as speed and range keep increasing, the demands on navigation grow ever higher; to relieve and replace the navigator's work, various navigation systems have appeared that supply the required navigation parameters automatically.
Inertial navigation is an autonomous navigation method. It completes the navigation task relying entirely on on-board equipment, without exchanging any optical or radio signals with the outside world. It is therefore well concealed, and its operation is not restricted by meteorological conditions. These unique advantages have earned inertial navigation systems wide use on missiles, ships, aircraft and spacecraft, and a prominent position in navigation technology. Inertial navigation determines the position, velocity and attitude of the carrier from the measurements provided by gyroscopes and accelerometers; by combining the two kinds of measurement, the translational motion of the carrier in the inertial frame can be determined and its position computed.
A strapdown inertial navigation system (SINS) removes most of the mechanical complexity of platform systems by mounting the sensors directly on (or fixing them to) the carrier's body. The potential benefits of this approach are lower cost, smaller size and higher reliability. Small, accurate strapdown inertial navigation systems can be installed on all kinds of aircraft; the main problems they bring are a sharp increase in computational complexity and the need for devices able to measure high rotation rates. Continuing progress in computer technology, combined with the development of suitable sensors, has made this design practical.
The drawback of strapdown inertial navigation is that the position estimate it provides drifts over time. Over long periods, the rate at which the navigation error grows is determined mainly by the initial-alignment accuracy, the defects of the inertial sensors used by the system, and the dynamics of the carrier's trajectory. Although more accurate sensors can improve precision, the cost of the inertial system then becomes very high, and the achievable improvement is limited. In recent years, a method suited to many applications for overcoming the error drift of SINS has been integrated navigation, in which one or more auxiliary navigation systems are used to correct the inertial navigation. Among these, the scene matching navigation system and the terrain referenced navigation system are two highly autonomous navigation systems.
Scene matching aided navigation (SMANS) obtains precise position information by matching, in real time, the terrain scene images collected in flight by an airborne or missile-borne image sensor against reference terrain scene images prepared in advance. Scene matching navigation is autonomous, can provide highly accurate guidance for an aircraft, has a navigation accuracy independent of flight distance, and is relatively inexpensive. In scene matching aided navigation, an imaging system builds a picture of the terrain below the aircraft as it flies forward; when a position fix is needed, part of the scanned image is stored to form a "scene" of the terrain below the aircraft. Through this process the image is converted into an array of "pixels", each holding a value representing the brightness of that part of the image. The captured scene is processed to remove noise and to enhance the features likely to provide navigation information, and a correlation algorithm then searches for recognizable patterns stored beforehand in a terrain-feature database. Once features in the scene have been matched with features in the database, a few calculations using the aircraft's attitude and height above terrain yield the position of the scene at the instant of capture.
Terrain referenced navigation (TRNS) has in recent years received wide attention and been successfully applied as an auxiliary navigation system; it too is an autonomous, covert, all-weather, low-level navigation technique whose positioning accuracy is independent of flight distance. The most common terrain reference system uses a radio altimeter, an airborne baro-inertial navigation system, and a stored terrain-profile map of the region the aircraft flies over. The radio altimeter measures the height above ground; combined with the inertial navigation system's estimate of altitude above sea level, the ground profile below the carrier's flight path can be reconstructed. The measured ground profile is then compared with the stored terrain map data to achieve a consistent match, from which the carrier's position can be determined.
Summary of the invention
(1) Technical problem to be solved
The technical problem to be solved by the present invention is to provide a SINS/SMANS/TRNS combined navigation method and system based on federated filtering that effectively improve the accuracy of the navigation system while offering high fault tolerance, high autonomy and high reliability.
(2) Technical solution
To solve the above problem, the present invention provides a SINS/SMANS/TRNS combined navigation method based on federated filtering, comprising the following steps:
S1: the scene matching aided navigation performs image matching using a corner-point method and determines the position of the aircraft through the attitude and altitude transformation of the aircraft;
S2: the terrain referenced navigation performs terrain matching using a terrain-matching method and finally determines the position of the aircraft from the actual relative height and the inertial-navigation relative height;
S3: establish the error model of the strapdown inertial navigation system and the observation models of the scene matching aided navigation and the terrain referenced navigation;
S4: fuse the outputs of the strapdown inertial navigation, the scene matching aided navigation and the terrain referenced navigation to obtain the optimal estimate, and correct the strapdown inertial navigation system with it.
Preferably, step S1 specifically comprises the following steps:
S11: input the aerial image and the candidate area image, extract feature points from each, and represent the images in scale space with three-dimensional localization of the features;
S12: construct descriptor vectors for the interest points from their dominant orientations, match feature points using the ratio of nearest to second-nearest descriptor distance, and reject mismatched point pairs;
S13: determine the homography matrix between the aerial image and the candidate area image from the successfully matched interest-point pairs; from the position and rotation relationship between the two images given by the homography matrix, determine the location of the aerial image within the candidate area map, and hence the flight position of the aircraft.
Preferably, the descriptor vector of an interest point may be 64-dimensional or 128-dimensional.
Preferably, step S2 specifically comprises the following steps:
S21: input the terrain elevation data and establish a linearized terrain model;
S22: compute the terrain elevation difference from the aircraft's horizontal position error, then compute the terrain relative-height difference from the aircraft's absolute-altitude difference and the terrain elevation difference;
S23: compare the computed terrain relative-height difference with the relative-height difference obtained at the observation point, apply the correction, and finally determine the position of the aircraft.
Preferably, step S4 specifically comprises the following steps:
S41: initialize the fused state and covariance matrix;
S42: form the inertial/scene-matching sub-filter and the inertial/terrain-referenced sub-filter, and allocate information to the state, covariance matrix and state-noise matrix of each sub-filter;
S43: perform separate time updates and measurement updates for the two sub-filters to obtain each sub-filter's estimate;
S44: fuse the estimates of all sub-filters into the globally optimal state estimate.
In another aspect, the present invention also provides an integrated navigation system implementing the above combined navigation method, comprising:
an atmospheric inertial navigation system, used to obtain and output the inertial navigation position information;
a flight-path generator module, used to simulate the aircraft trajectory and obtain the position, velocity and attitude of the aircraft;
a SINS/SMANS integrated navigation subsystem, comprising:
an image-sensor field-of-view and position-parameter computation module, used to compute the field of view and position parameters of the airborne image sensor from the position information output by the atmospheric inertial navigation system;
a digital reference map database, used to retrieve a suitable digital reference map according to the results of the image-sensor field-of-view and position-parameter computation module;
an image-sensor simulation module, used to generate, from the aircraft attitude and altitude transformation information obtained by the flight-path generator module, a simulated real-time image as taken by the airborne image sensor;
an image matching module, used to register the digital reference map obtained from the digital reference map database with the real-time image produced by the image-sensor simulation module, and to compute the real-time position of the aircraft;
a SINS/SMANS Kalman sub-filter module, used to fuse the inertial navigation position information output by the atmospheric inertial navigation system with the real-time aircraft position output by the image matching module;
a SINS/TRNS integrated navigation subsystem, comprising:
a laser altimeter, used to receive the actual absolute-altitude information obtained by the flight-path generator module and the actual terrain elevation obtained by the terrain matching module, and to obtain and output the measured relative height;
a terrain elevation database, used to provide terrain data;
a terrain matching module, used to output the actual terrain elevation, the terrain slopes and the inertial-navigation elevation, according to the inertial navigation position output by the atmospheric inertial navigation system, the actual position output by the flight-path generator module, and the terrain data provided by the terrain elevation database;
a SINS/TRNS Kalman sub-filter module, used to fuse the output signals of the laser altimeter, the terrain matching module and the atmospheric inertial navigation system;
a federated filtering module, used to fuse the signals output by the SINS/SMANS Kalman sub-filter module and the SINS/TRNS Kalman sub-filter module, and to use the finally fused position result to correct the errors of the atmospheric inertial navigation system.
(3) Beneficial effects
The present invention combines inertial navigation with scene matching aided navigation, suited to large flat or featureless regions, and with terrain referenced navigation, suited to rough terrain or regions of strong relief, into an integrated navigation system; the scene matching aided navigation and the terrain referenced navigation effectively correct the drift error of the inertial navigation, effectively improving the accuracy of the navigation system while offering high fault tolerance, high autonomy and high reliability.
Description of drawings
Fig. 1 is a flow chart of the combined navigation method according to an embodiment of the invention;
Fig. 2 is a flow chart of step S1 of the combined navigation method according to an embodiment of the invention;
Fig. 3 is a flow chart of step S2 of the combined navigation method according to an embodiment of the invention;
Fig. 4 is a flow chart of step S4 of the combined navigation method according to an embodiment of the invention;
Fig. 5 is a simplified schematic view of the federated filtering fusion structure of the integrated navigation system embodying the present method;
Fig. 6 is a structural schematic view of the integrated navigation system embodying the present method.
Embodiment
The present invention is described in detail below with reference to the accompanying drawings and embodiments.
Embodiment one:
As shown in Figure 1, the present embodiment describes a SINS/SMANS/TRNS combined navigation method based on federated filtering, comprising the following steps:
S1: the scene matching aided navigation performs image matching using a corner-point method and determines the position of the aircraft through the attitude and altitude transformation of the aircraft;
As shown in Figure 2, step S1 specifically comprises the following steps:
S11: input the aerial image and the candidate area image, extract feature points from each image (for example using a box-filter approximation of the Hessian matrix), and represent the images in scale space with three-dimensional localization of the features;
S12: construct descriptor vectors for the interest points from their dominant orientations, match feature points using the ratio of nearest to second-nearest descriptor distance, and reject mismatched point pairs with the RANSAC algorithm;
S13: determine the homography matrix between the aerial image and the candidate area image from the successfully matched interest-point pairs; from the position and rotation relationship between the two images given by the homography matrix, determine the location of the aerial image within the candidate area map, and hence the flight position of the aircraft.
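The distance-ratio test of step S12 can be sketched as follows; `match_features`, the 0.7 threshold, and the toy 2-D descriptors in the docstring example are illustrative assumptions (real descriptors would be the 64- or 128-dimensional vectors mentioned above):

```python
import math

def match_features(desc_a, desc_b, ratio=0.7):
    """Nearest/second-nearest distance ratio test of step S12 (sketch).

    desc_a, desc_b: lists of descriptor vectors (tuples of floats).
    A pair (i, j) is accepted only when the closest descriptor in
    desc_b is markedly closer than the runner-up; ambiguous matches
    are discarded before the RANSAC rejection stage."""
    matches = []
    for i, da in enumerate(desc_a):
        # Rank every candidate in desc_b by Euclidean distance to da.
        ranked = sorted((math.dist(da, db), j) for j, db in enumerate(desc_b))
        if len(ranked) >= 2 and ranked[0][0] < ratio * ranked[1][0]:
            matches.append((i, ranked[0][1]))
    return matches
```

The surviving pairs would then feed the homography estimation of step S13.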
S2: the terrain referenced navigation performs terrain matching using the SITAN (Sandia Inertial Terrain-Aided Navigation) method, and finally determines the position of the aircraft from the actual relative height and the inertial-navigation relative height;
As shown in Figure 3, step S2 specifically comprises the following steps:
S21: input the terrain elevation data and establish a linearized terrain model;
S22: compute the terrain elevation difference Δh_l from the aircraft horizontal position error (Δx, Δy); the terrain relative-height difference Δh_r can then be expressed as:
Δh_r = Δh − Δh_l
where Δh is the aircraft absolute-altitude difference;
S23: compare the computed terrain relative-height difference with the relative-height difference obtained at the observation point, correct the aircraft position, and finally determine the position of the aircraft.
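Steps S21–S23 can be sketched numerically. The function below is a hypothetical illustration: it assumes the linearized terrain model gives Δh_l as the slope-weighted position error, k_x·Δx + k_y·Δy, which is consistent with the TRNS observation row introduced later:

```python
def terrain_relative_height_diff(dh_abs, dx, dy, kx, ky):
    """Sketch of the SITAN linearization in steps S21-S23 (assumed form).

    dh_abs : aircraft absolute-altitude difference (delta-h)
    dx, dy : horizontal position errors of the aircraft
    kx, ky : local terrain slopes from the linearized terrain model

    The terrain elevation difference is delta-h_l = kx*dx + ky*dy,
    and the relative-height difference is delta-h_r = delta-h - delta-h_l."""
    dh_l = kx * dx + ky * dy
    return dh_abs - dh_l
```

Comparing this predicted Δh_r with the measured relative-height difference yields the correction of step S23.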
S3: establish the error model of the strapdown inertial navigation system and the observation models of the scene matching aided navigation and the terrain referenced navigation;
The error model of the strapdown inertial navigation system is:
Ẋ = FX + Gw
where X = [δL δλ δh δv_E δv_N]^T is the system state vector, F is the 5×5 system matrix, G is the system-noise input matrix, and w is the system-noise vector; δL, δλ, δh, δv_E and δv_N are respectively the latitude error, longitude error, height error, east-velocity error and north-velocity error. The non-zero elements of F include:
F(1,3) = −v_N/(R_N + h)²,  F(1,5) = 1/(R_M + h),  …,  F(5,5) = v_U/(R_M + h)
(the remaining non-zero elements of F appear only as equation images in the original publication and are not reproduced here).
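The legible entries of F can be collected into code. This is only a partial sketch of the 5×5 system matrix: it fills in the three elements that survive in the text and leaves the rest, published only as equation images, at zero; the function name and argument names are illustrative:

```python
def sins_error_system_matrix(v_n, v_u, h, R_M, R_N):
    """Partial sketch of the 5x5 SINS error-model matrix F (X' = F X + G w),
    with state X = [dL, dLambda, dh, dvE, dvN].

    Only F(1,3), F(1,5) and F(5,5), which are legible in the text, are
    filled in; the remaining non-zero elements would have to be taken
    from the original equation images."""
    F = [[0.0] * 5 for _ in range(5)]
    F[0][2] = -v_n / (R_N + h) ** 2  # F(1,3): latitude error vs height error
    F[0][4] = 1.0 / (R_M + h)        # F(1,5): latitude error vs north velocity
    F[4][4] = v_u / (R_M + h)        # F(5,5): north-velocity self-coupling
    return F
```

Here v_n and v_u are the north and up velocities, h the altitude, and R_M, R_N the Earth radii of curvature used in the patent's formulas.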
The observation model of the scene matching aided navigation is:
Z_SMANS = H_SMANS X + V_SMANS
H_SMANS = [I_2×2  0_2×3]
where X is the state vector of the strapdown INS error model, Z_SMANS is the observation of the scene matching aided navigation, and V_SMANS is its observation noise.
The observation model of the terrain referenced navigation is:
Z_TRNS = H_TRNS X + V_TRNS
H_TRNS = [−k_x  −k_y  1  0  0]
where Z_TRNS is the observation of the terrain referenced navigation, V_TRNS is its observation noise, and k_x, k_y are respectively the terrain slopes in the x and y directions.
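For the five-element state vector, the two observation matrices can be written out directly; a minimal sketch:

```python
def H_smans():
    """SMANS observes the two horizontal-position error states directly:
    H_SMANS = [I_2x2  0_2x3] over X = [dL, dLambda, dh, dvE, dvN]."""
    return [[1.0, 0.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0, 0.0]]

def H_trns(kx, ky):
    """TRNS observation row H_TRNS = [-kx, -ky, 1, 0, 0], where kx and ky
    are the local terrain slopes in the x and y directions."""
    return [[-kx, -ky, 1.0, 0.0, 0.0]]
```

The TRNS row couples the height-error state to the horizontal-position errors through the terrain slopes, which is what lets a height measurement correct the horizontal position.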
S4: fuse the outputs of the strapdown inertial navigation, the scene matching aided navigation and the terrain referenced navigation to obtain the optimal estimate, and correct the strapdown inertial navigation system.
The descriptor vector of an interest point may be 64-dimensional or 128-dimensional.
As shown in Figure 4, step S4 specifically comprises the following steps:
S41: define the fused state x̂_f and the fused covariance matrix P_f, and initialize each:
x̂_f(0) = x(0),  P_f(0) = P(0)
S42: form the SINS/SMANS sub-filter and the SINS/TRNS sub-filter, and allocate information to the state x̂_i, covariance matrix P_i and state-noise matrix Q_i of each sub-filter i:
x̂_i(k) = x̂_f(k)
P_i(k) = β_i⁻¹(k) P_f(k)
Q_i(k) = β_i⁻¹(k) Q_f(k)
where x̂_f, P_f and Q_f are respectively the fused state, covariance matrix and state-noise matrix, β_i is the information-allocation factor, k ≥ 1, i = 1, 2, and β_i satisfies:
β_1 + β_2 = 1.
S43: perform separate time updates and measurement updates for the inertial/scene-matching sub-filter and the inertial/terrain-referenced sub-filter to obtain each sub-filter's estimate:
(1) after the time update:
x(k|k-1)=Φ(k|k-1)x(k-1)
P(k|k-1)=Φ(k|k-1)P(k-1)Φ T(k|k-1)+Q(k-1)
x(k-1)=x(k|k-1)
P(k-1)=P(k|k-1)
where x is the filter state vector, P is the filter covariance matrix, and Φ is the discrete state-transition matrix corresponding to F;
(2) after the measurement update:
K(k)=P(k|k-1)H T(k)(H(k)P(k|k-1)H T(k)+R(k)) -1
x(k)=x(k|k-1)+K(k)(z(k)-H(k)x(k|k-1))
P(k)=(I-K(k)H(k))P(k|k-1)
where K is the Kalman gain matrix and H is the observation matrix.
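One cycle of the sub-filter recursion in step S43 (time update followed by measurement update) can be sketched with NumPy; `kalman_step` and its argument names are illustrative, not the patent's notation:

```python
import numpy as np

def kalman_step(x, P, Phi, Q, H, R, z):
    """One sub-filter cycle of step S43 (sketch).

    Time update:        x(k|k-1) = Phi x(k-1)
                        P(k|k-1) = Phi P(k-1) Phi^T + Q
    Measurement update: K = P H^T (H P H^T + R)^-1
                        x(k) = x(k|k-1) + K (z - H x(k|k-1))
                        P(k) = (I - K H) P(k|k-1)"""
    # Time update: propagate state and covariance through the dynamics.
    x = Phi @ x
    P = Phi @ P @ Phi.T + Q
    # Measurement update: blend in the SMANS or TRNS observation z.
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The same routine serves both sub-filters; only H, R and z differ (H_SMANS with position fixes, H_TRNS with the relative-height residual).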
S44: fuse the estimates of the sub-filters into the globally optimal state estimate:
P_f⁻¹(k) = P_m⁻¹(k) + Σ_{i=1}^{n} P_i⁻¹(k)
x̂_f(k) = P_f(k) [ P_m⁻¹(k) x̂_m(k) + Σ_{i=1}^{n} P_i⁻¹(k) x̂_i(k) ]
where x̂_m and P_m are the state estimate and covariance matrix of the master filter.
The federated filtering fusion structure implementing step S4 is shown in Figure 5: the signals obtained by the SINS and the SMANS are input to the SINS/SMANS sub-filter, the signals obtained by the SINS and the TRNS are input to the SINS/TRNS sub-filter, and after the updates described above the sub-filter outputs are fused and output.
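The fusion and information-distribution loop of Figure 5 can be sketched as below. The master-filter term P_m⁻¹ x̂_m is assumed absent (a common federated configuration in which the master performs no local filtering), and all names are illustrative:

```python
import numpy as np

def federated_fuse(sub_estimates, betas=(0.5, 0.5)):
    """Federated fusion of step S44 plus the information distribution of
    step S42 (sketch, assuming no master-filter contribution, P_m^-1 = 0):

        P_f^-1 = sum_i P_i^-1
        x_f    = P_f * sum_i P_i^-1 x_i

    Each sub-filter is then reset with x_i = x_f and P_i = beta_i^-1 P_f,
    where the information-allocation factors satisfy sum(betas) == 1."""
    infos = [np.linalg.inv(P) for _, P in sub_estimates]
    P_f = np.linalg.inv(sum(infos))
    x_f = P_f @ sum(I @ x for I, (x, _) in zip(infos, sub_estimates))
    # Information distribution back to the sub-filters for the next cycle.
    reset = [(x_f.copy(), P_f / b) for b in betas]
    return x_f, P_f, reset
```

With equal allocation factors β_1 = β_2 = 0.5, each sub-filter restarts from the fused state with an inflated covariance, which is what gives the federated structure its fault tolerance: a failed sub-filter can be dropped from the sum without re-deriving the others.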
Embodiment two:
As shown in Figure 6, the present embodiment describes an integrated navigation system implementing the above combined navigation method, comprising:
an atmospheric inertial navigation system, used to obtain and output the inertial navigation position information;
a flight-path generator module, used to simulate the aircraft trajectory and obtain the position, velocity and attitude of the aircraft;
a SINS/SMANS integrated navigation subsystem, comprising:
an image-sensor field-of-view and position-parameter computation module, used to compute the field of view and position parameters of the airborne image sensor from the position information output by the atmospheric inertial navigation system;
a digital reference map database, used to retrieve a suitable digital reference map according to the results of the image-sensor field-of-view and position-parameter computation module;
an image-sensor simulation module, used to generate, from the aircraft attitude and altitude transformation information obtained by the flight-path generator module, a simulated real-time image as taken by the airborne image sensor;
an image matching module, used to register the digital reference map obtained from the digital reference map database with the real-time image produced by the image-sensor simulation module, and to compute the real-time position of the aircraft;
a SINS/SMANS Kalman sub-filter module, used to fuse the inertial navigation position information output by the atmospheric inertial navigation system with the real-time aircraft position output by the image matching module;
a SINS/TRNS integrated navigation subsystem, comprising:
a laser altimeter, used to receive the actual absolute-altitude information obtained by the flight-path generator module and the actual terrain elevation obtained by the terrain matching module, and to obtain and output the measured relative height;
a terrain elevation database, used to provide terrain data;
a terrain matching module, used to output the actual terrain elevation, the terrain slopes and the inertial-navigation elevation, according to the inertial navigation position output by the atmospheric inertial navigation system, the actual position output by the flight-path generator module, and the terrain data provided by the terrain elevation database;
a SINS/TRNS Kalman sub-filter module, used to fuse the output signals of the laser altimeter, the terrain matching module and the atmospheric inertial navigation system;
a federated filtering module, used to fuse the signals output by the SINS/SMANS Kalman sub-filter module and the SINS/TRNS Kalman sub-filter module, and to use the finally fused position result to correct the errors of the atmospheric inertial navigation system.
The above embodiments are intended only to illustrate the present invention, not to limit it. Those of ordinary skill in the relevant art may make various changes and modifications without departing from the spirit and scope of the present invention; all equivalent technical solutions therefore also fall within the scope of the present invention, and the scope of patent protection of the present invention shall be defined by the claims.

Claims (6)

1. A SINS/SMANS/TRNS combined navigation method based on federated filtering, characterized by comprising the following steps:
S1: the scene matching aided navigation performs image matching using a corner-point method and determines the position of the aircraft through the attitude and altitude transformation of the aircraft;
S2: the terrain referenced navigation performs terrain matching using a terrain-matching method and finally determines the position of the aircraft from the actual relative height and the inertial-navigation relative height;
S3: establishing the error model of the strapdown inertial navigation system and the observation models of the scene matching aided navigation and the terrain referenced navigation;
S4: fusing the outputs of the strapdown inertial navigation, the scene matching aided navigation and the terrain referenced navigation to obtain the optimal estimate, and correcting the strapdown inertial navigation system.
2. The combined navigation method according to claim 1, characterized in that step S1 specifically comprises the following steps:
S11: inputting the aerial image and the candidate area image, extracting feature points from each, and representing the images in scale space with three-dimensional localization of the features;
S12: constructing descriptor vectors for the interest points from their dominant orientations, matching feature points using the ratio of nearest to second-nearest descriptor distance, and rejecting mismatched point pairs;
S13: determining the homography matrix between the aerial image and the candidate area image from the successfully matched interest-point pairs; determining, from the position and rotation relationship between the two images given by the homography matrix, the location of the aerial image within the candidate area map, and hence the flight position of the aircraft.
3. The combined navigation method according to claim 2, characterized in that the descriptor vector of an interest point may be 64-dimensional or 128-dimensional.
4. The combined navigation method according to claim 1, characterized in that step S2 specifically comprises the following steps:
S21: inputting the terrain elevation data and establishing a linearized terrain model;
S22: computing the terrain elevation difference from the aircraft's horizontal position error, then computing the terrain relative-height difference from the aircraft's absolute-altitude difference and the terrain elevation difference;
S23: comparing the computed terrain relative-height difference with the relative-height difference obtained at the observation point, applying the correction, and finally determining the position of the aircraft.
5. The combined navigation method as claimed in claim 1, characterized in that said step S4 specifically comprises the following steps:
S41: initializing the fused state and the covariance matrix;
S42: constructing an inertial/scene-matching-aided-navigation subfilter and an inertial/terrain-reference-navigation subfilter, and performing information distribution of the state, covariance matrix and state-noise matrix to each subfilter;
S43: performing the time update and the observation update separately for the inertial/scene-matching-aided-navigation subfilter and the inertial/terrain-reference-navigation subfilter, obtaining the estimated information of each subfilter;
S44: fusing the estimated information of all subfilters into globally optimal state-estimation information.
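The federated structure of steps S41-S44 can be sketched as below. This is an illustrative numpy fragment, not the patent's implementation: the local time/observation updates of step S43 are ordinary Kalman updates and are omitted; shown are the information-distribution step (S42, with sharing coefficients beta_i summing to 1) and the information-form fusion (S44).

```python
import numpy as np

def distribute(x_g, P_g, betas):
    """S42: information distribution. Each subfilter i receives the global
    state and an inflated covariance P_g / beta_i, with sum(beta_i) = 1."""
    return [(x_g.copy(), P_g / b) for b in betas]

def fuse(estimates):
    """S44: fuse subfilter estimates (x_i, P_i) into the global optimal
    estimate using the information (inverse-covariance) form:
    P_g^-1 = sum_i P_i^-1,  x_g = P_g * sum_i P_i^-1 x_i."""
    P_inv = sum(np.linalg.inv(P) for _, P in estimates)
    P_g = np.linalg.inv(P_inv)
    x_g = P_g @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return x_g, P_g
```

A useful sanity check on this structure: distributing a global estimate to the subfilters and fusing immediately (no local updates) must return the original global estimate unchanged.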
6. An integrated navigation system implementing the combined navigation method of any one of claims 1-5, characterized by comprising:
an atmosphere inertial navigation system, used to obtain and output inertial navigation position information;
a flight path generator module, used to simulate the aircraft flight path and obtain the position, velocity and attitude information of the aircraft;
a SINS/SMANS integrated navigation system, comprising:
an image sensor field-of-view and position parameter computing module, used to calculate the field of view and position parameters of the airborne image sensor from the position information output by said atmosphere inertial navigation system;
a digital reference map database, used to obtain a suitable digital reference map according to the calculation result of said image sensor field-of-view and position parameter computing module;
an image sensor simulation module, used to simulate and generate, according to the aircraft attitude and height conversion information obtained by said flight path generator module, the real-time image captured by the airborne image sensor;
an image matching module, used to register said digital reference map obtained from said digital reference map database with the real-time image simulated by said image sensor simulation module, and to calculate the real-time position of the aircraft;
a SINS/SMANS Kalman subfilter module, used to fuse the inertial navigation position information output by the atmosphere inertial navigation system with the aircraft real-time position information output by the image matching module;
a SINS/TRNS integrated navigation system, comprising:
a laser altimeter, used to receive the actual absolute altitude information obtained by the flight path generator module and the actual elevation information obtained by the terrain matching module, and to obtain and output the measured relative height;
a terrain elevation database, used to provide terrain data;
a terrain matching module, used to output actual elevation information, terrain slope and inertial navigation elevation information according to the inertial navigation position information output by the atmosphere inertial navigation system, the actual position information output by the flight path generator module, and the terrain data provided by said terrain elevation database;
a SINS/TRNS Kalman subfilter module, used to fuse the output signals of said laser altimeter, the terrain matching module and the atmosphere inertial navigation system;
a federated filtering module, used to fuse the signals output by the SINS/SMANS Kalman subfilter module and the SINS/TRNS Kalman subfilter module, the finally fused position result simultaneously being used to apply error correction to said atmosphere inertial navigation system.
CN201110371864.0A 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system Active CN102506868B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110371864.0A CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110371864.0A CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Publications (2)

Publication Number Publication Date
CN102506868A true CN102506868A (en) 2012-06-20
CN102506868B CN102506868B (en) 2014-03-12

Family

ID=46218975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110371864.0A Active CN102506868B (en) 2011-11-21 2011-11-21 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system

Country Status (1)

Country Link
CN (1) CN102506868B (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN103353310A (en) * 2013-06-01 2013-10-16 西北工业大学 Laser strapdown inertial navigation system
CN103591955A (en) * 2013-11-21 2014-02-19 西安中科光电精密工程有限公司 Combined navigation system
CN104704424A (en) * 2012-08-21 2015-06-10 视觉智能有限合伙公司 Infrastructure mapping system and method
CN105547300A (en) * 2015-12-30 2016-05-04 航天恒星科技有限公司 All-source navigation system and method used for AUV (Autonomous Underwater Vehicle)
US9797980B2 (en) 2002-09-20 2017-10-24 Visual Intelligence Lp Self-calibrated, remote imaging and data processing system
CN108230374A * 2016-12-21 2018-06-29 波音公司 Method and apparatus for enhancing raw sensor images by geo-registration
CN108496130A * 2017-05-31 2018-09-04 深圳市大疆创新科技有限公司 Flight control method and device, control terminal and control method therefor, and unmanned aerial vehicle
CN109029434A * 2018-06-29 2018-12-18 电子科技大学 Sandia inertial terrain-aided navigation method under adaptive scale
CN109214254A * 2017-07-07 2019-01-15 北京臻迪科技股份有限公司 Method and device for determining robot displacement
CN110388939A * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 Vehicle-mounted inertial navigation position error correction method based on aerial image matching
CN111854728A (en) * 2020-05-20 2020-10-30 哈尔滨工程大学 Fault-tolerant filtering method based on generalized relative entropy
WO2021016867A1 (en) * 2019-07-30 2021-02-04 深圳市大疆创新科技有限公司 Terminal device and data processing method therefor, and unmanned aerial vehicle and control method therefor
CN112859137A (en) * 2020-12-31 2021-05-28 国营芜湖机械厂 Airborne SINS/BDS/GNSS/TAN combined navigation semi-physical simulation system
CN113074722A * 2020-01-03 2021-07-06 上海航空电器有限公司 Positioning correction method for improving terrain reference navigation precision based on vision assistance technology
CN113155126A (en) * 2021-01-04 2021-07-23 航天时代飞鸿技术有限公司 Multi-machine cooperative target high-precision positioning system and method based on visual navigation
CN113406566A (en) * 2021-06-04 2021-09-17 广东汇天航空航天科技有限公司 Aircraft positioning method and device
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Unmanned aerial vehicle self-navigation based on terrain matching
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
US20100017046A1 (en) * 2008-03-16 2010-01-21 Carol Carlin Cheung Collaborative engagement for target identification and tracking
CN102506867A * 2011-11-21 2012-06-20 清华大学 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
US20100017046A1 (en) * 2008-03-16 2010-01-21 Carol Carlin Cheung Collaborative engagement for target identification and tracking
CN102506867A * 2011-11-21 2012-06-20 清华大学 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
WANG Yi et al.: "Inertial/satellite positioning/terrain matching/scene matching integrated navigation technology", in New Century, New Opportunities, New Challenges: Knowledge Innovation and High-Tech Industry Development (Vol. II), 31 December 2001 *
HAIDONG HU ET AL.: "A Novel Algorithm for SINS/CNS/GPS Integrated Navigation System", Joint 48th IEEE Conference on Decision and Control and 28th Chinese Control Conference, 18 December 2009 (2009-12-18) *
YANG Heng et al.: "An efficient image local feature matching algorithm", Journal of Northwestern Polytechnical University, vol. 28, no. 2, 30 April 2010 (2010-04-30) *
JIANG Chunhong et al.: "Application of information fusion technology in an INS/GPS/TAN/SMN four-sensor integrated system", Information and Control, vol. 30, no. 6, 31 December 2001 (2001-12-31) *
WANG Yi et al.: "Inertial/satellite positioning/terrain matching/scene matching integrated navigation technology", New Century, New Opportunities, New Challenges: Knowledge Innovation and High-Tech Industry Development (Vol. II), 31 December 2001 (2001-12-31) *
XIE Jianchun et al.: "A new composite terrain-aided navigation method", Computer Simulation, vol. 26, no. 3, 31 March 2009 (2009-03-31) *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system
US9797980B2 (en) 2002-09-20 2017-10-24 Visual Intelligence Lp Self-calibrated, remote imaging and data processing system
CN104704424A (en) * 2012-08-21 2015-06-10 视觉智能有限合伙公司 Infrastructure mapping system and method
CN104704424B (en) * 2012-08-21 2018-02-09 视觉智能有限合伙公司 infrastructure mapping system and method
CN102829785B (en) * 2012-08-30 2014-12-31 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN103353310B * 2013-06-01 2017-06-09 西北工业大学 Laser strapdown inertial navigation system
CN103353310A (en) * 2013-06-01 2013-10-16 西北工业大学 Laser strapdown inertial navigation system
CN103591955B (en) * 2013-11-21 2016-03-30 西安中科光电精密工程有限公司 Integrated navigation system
CN103591955A (en) * 2013-11-21 2014-02-19 西安中科光电精密工程有限公司 Combined navigation system
CN105547300A (en) * 2015-12-30 2016-05-04 航天恒星科技有限公司 All-source navigation system and method used for AUV (Autonomous Underwater Vehicle)
CN108230374A (en) * 2016-12-21 2018-06-29 波音公司 Enhance the method and apparatus of raw sensor image by geographic registration
CN108496130A * 2017-05-31 2018-09-04 深圳市大疆创新科技有限公司 Flight control method and device, control terminal and control method therefor, and unmanned aerial vehicle
CN109214254B * 2017-07-07 2020-08-14 北京臻迪科技股份有限公司 Method and device for determining displacement of robot
CN109214254A * 2017-07-07 2019-01-15 北京臻迪科技股份有限公司 Method and device for determining robot displacement
CN110388939A * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 Vehicle-mounted inertial navigation position error correction method based on aerial image matching
CN109029434A * 2018-06-29 2018-12-18 电子科技大学 Sandia inertial terrain-aided navigation method under adaptive scale
WO2021016867A1 (en) * 2019-07-30 2021-02-04 深圳市大疆创新科技有限公司 Terminal device and data processing method therefor, and unmanned aerial vehicle and control method therefor
CN113074722A * 2020-01-03 2021-07-06 上海航空电器有限公司 Positioning correction method for improving terrain reference navigation precision based on vision assistance technology
CN113074722B (en) * 2020-01-03 2024-06-11 上海航空电器有限公司 Positioning correction method for improving terrain reference navigation precision based on vision auxiliary technology
CN111854728A (en) * 2020-05-20 2020-10-30 哈尔滨工程大学 Fault-tolerant filtering method based on generalized relative entropy
CN111854728B (en) * 2020-05-20 2022-12-13 哈尔滨工程大学 Fault-tolerant filtering method based on generalized relative entropy
CN112859137A (en) * 2020-12-31 2021-05-28 国营芜湖机械厂 Airborne SINS/BDS/GNSS/TAN combined navigation semi-physical simulation system
CN113155126A (en) * 2021-01-04 2021-07-23 航天时代飞鸿技术有限公司 Multi-machine cooperative target high-precision positioning system and method based on visual navigation
CN113155126B (en) * 2021-01-04 2023-10-20 航天时代飞鸿技术有限公司 Visual navigation-based multi-machine cooperative target high-precision positioning system and method
CN113406566A (en) * 2021-06-04 2021-09-17 广东汇天航空航天科技有限公司 Aircraft positioning method and device
CN113406566B (en) * 2021-06-04 2023-09-19 广东汇天航空航天科技有限公司 Method and device for positioning aircraft
CN114111795A (en) * 2021-11-24 2022-03-01 航天神舟飞行器有限公司 Unmanned aerial vehicle self-navigation based on terrain matching

Also Published As

Publication number Publication date
CN102506868B (en) 2014-03-12

Similar Documents

Publication Publication Date Title
CN102506868B (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system
CN103697889B Unmanned aerial vehicle autonomous navigation and localization method based on multi-model distributed filtering
CN107727079B (en) Target positioning method of full-strapdown downward-looking camera of micro unmanned aerial vehicle
CN111102978B (en) Method and device for determining vehicle motion state and electronic equipment
Sim et al. Integrated position estimation using aerial image sequences
Conte et al. Vision-based unmanned aerial vehicle navigation using geo-referenced information
CN110487267B (en) Unmanned aerial vehicle navigation system and method based on VIO & UWB loose combination
CN101598556B (en) Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment
CN101858748B (en) Fault-tolerance autonomous navigation method of multi-sensor of high-altitude long-endurance unmanned plane
US7868821B2 (en) Method and apparatus to estimate vehicle position and recognized landmark positions using GPS and camera
JP4448187B2 (en) Image geometric correction method and apparatus
CN102506867B SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris corner matching and combined navigation system
CN102829785B (en) Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
KR20190051703A (en) Stereo drone and method and system for calculating earth volume in non-control points using the same
CN110160545B (en) Enhanced positioning system and method for laser radar and GPS
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN106352897B Silicon MEMS gyro error estimation and correction method based on a monocular vision sensor
Owens et al. Development of a signature-based terrain relative navigation system for precision landing
Tjahjadi et al. Single frame resection of compact digital cameras for UAV imagery
CN110388939A Vehicle-mounted inertial navigation position error correction method based on aerial image matching
Toth Sensor integration in airborne mapping
CN113253325A (en) Inertial satellite sequential tight combination lie group filtering method
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN110068325A Lever-arm error compensation method for a vehicle-mounted INS/vision integrated navigation system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant