CN109269511A - Curve matching visual navigation method for planetary landing in unknown environments - Google Patents

Curve matching visual navigation method for planetary landing in unknown environments

Info

Publication number
CN109269511A
Authority
CN
China
Prior art keywords
lander
coordinate system
circumstances
formula
state
Prior art date
Legal status
Granted
Application number
CN201811310255.2A
Other languages
Chinese (zh)
Other versions
CN109269511B (en)
Inventor
崔平远
高锡珍
朱圣英
刘阳
徐瑞
Current Assignee
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT
Priority to CN201811310255.2A
Publication of CN109269511A
Application granted
Publication of CN109269511B
Status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a curve-matching visual navigation method for planetary landing in an unknown environment, belonging to the field of deep space exploration. The method is implemented as follows: first, a lander kinematics model is established using inertial measurement information; then, a measurement model based on inter-frame curve matching is established from the sequence of images acquired during the lander's descent; finally, the absolute motion state of the lander is estimated in real time with a Kalman filter. This realizes curve-matching visual navigation for planetary landing in an unknown environment, improves navigation accuracy, guarantees the stability of the navigation system, and ensures a precise and safe landing. The invention can estimate the absolute motion state of the lander without prior map information, improving the stability of the navigation system. The invention is applicable not only to planetary landing missions but also to small-body landing missions.

Description

Curve matching visual navigation method for planetary landing in unknown environments
Technical field
The present invention relates to a curve-matching visual navigation method for planetary landing in an unknown environment, and belongs to the technical field of deep space exploration.
Background technique
Landing exploration and sample return are the main development directions of future deep space exploration. Future small-body and Mars exploration missions require the probe to be capable of pinpoint landing in regions of high scientific value. Because the target body is far from the Earth and the communication delay is severe, the probe must also be capable of autonomous navigation. Meanwhile, uncertainties such as insufficient prior knowledge of the target body's environment and environmental disturbances place even higher demands on the autonomous navigation system.
At present, landing missions mainly rely on dead-reckoning navigation based on an inertial measurement unit (IMU). However, this method cannot correct initial deviations, and the IMU suffers from random drift and measurement errors, so the accumulated error grows over time and can hardly meet the requirements of high-precision navigation. To overcome these shortcomings, autonomous visual navigation methods based on image information of celestial-surface features have increasingly become a research focus. Such methods fall into two classes: the first class assumes that the positions of surface landmark features are known; the second class does not. When no prior map information is available, i.e., when the positions of surface landmark features cannot be obtained, the first class is no longer applicable. Navigation methods with unknown landmark positions are further divided, according to the type of image feature used, into methods based on feature-point matching and methods based on curve matching; however, line-of-sight measurements of feature points are easily affected by noise. In view of this, it is necessary to design a fast and effective lander visual navigation method for estimating the lander motion state in an unknown environment, so as to ensure a precise and safe landing.
Summary of the invention
In order to solve the problem of autonomous navigation for planetary landing in an unknown environment, the object of the present invention is to provide a curve-matching visual navigation method for planetary landing in an unknown environment that combines inertial measurement information with a Kalman filter to estimate the absolute motion state of the lander in real time, realizes autonomous navigation for planetary landing in an unknown environment, and ensures a precise and safe landing.
The purpose of the present invention is achieved through the following technical solution.
The curve-matching visual navigation method for planetary landing in an unknown environment disclosed by the invention first establishes the lander kinematics model using inertial measurement information, then establishes a measurement model based on inter-frame curve matching from the sequence of images acquired during the lander's descent, and finally estimates the absolute motion state of the lander in real time with a Kalman filter, thereby realizing curve-matching visual navigation for planetary landing in an unknown environment, improving navigation accuracy, and guaranteeing the stability of the navigation system.
The curve-matching visual navigation method for planetary landing in an unknown environment disclosed by the invention comprises the following steps:
Step 1: establish the lander kinematics model.
In order to describe the position and attitude of the lander in flight and its relative geometric relationship to visual features on the target body's surface, and to define the equations of motion of the lander in a suitable reference frame, the following coordinate frames are first introduced: the landing-site frame {L}, the navigation camera frame {C}, and the lander body frame {B}. The position and attitude parameters of the lander are all described in the landing-site frame. The lander body frame coincides with the navigation camera frame, i.e., the installation matrix between the optical navigation camera and the lander is the identity matrix. Neglecting the effect of planetary rotation, the lander landing kinematics equations are established from the IMU measurements as:
where the measurement models of the inertial acceleration a_imu and angular velocity ω_imu are
where Lr and Lv respectively denote the position and velocity of the lander in the landing-site frame; q is the attitude quaternion and C(q) is the transformation matrix from the landing-site frame to the lander body frame; Lg is the gravitational acceleration and n_g is the gravitational-acceleration disturbance; b_a and b_ω denote the accelerometer and gyroscope biases; n_a and n_ω are the accelerometer and gyroscope measurement noises, respectively; n_wa and n_wω are the accelerometer and gyroscope bias-drift noises, respectively; La denotes the acceleration produced by the resultant of all non-gravitational forces acting on the lander; Bω denotes the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame; and for any angular velocity ω = [ω_x ω_y ω_z]^T, Ω(·) is defined as
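The kinematics and IMU measurement equations referenced above appear as images in the original filing and are not reproduced here. The following Python/NumPy sketch shows one standard strapdown form that is consistent with the symbols defined in this step; the quaternion convention (vector part first), the specific Ω(ω) matrix, the helper names, and the simple Euler integration are assumptions for illustration, not the patent's exact formulation.

```python
import numpy as np

def omega_matrix(w):
    """One common 4x4 Omega(w) for q_dot = 0.5 * Omega(w) * q with the
    quaternion stored vector-part-first; sign conventions vary."""
    wx, wy, wz = w
    return np.array([[0.0,  wz, -wy,  wx],
                     [-wz, 0.0,  wx,  wy],
                     [ wy, -wx, 0.0,  wz],
                     [-wx, -wy, -wz, 0.0]])

def C_from_q(q):
    """Rotation matrix C(q) from the landing-site frame {L} to the body frame {B}."""
    x, y, z, w = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y + z*w),     2*(x*z - y*w)],
                     [2*(x*y - z*w),     1 - 2*(x*x + z*z), 2*(y*z + x*w)],
                     [2*(x*z + y*w),     2*(y*z - x*w),     1 - 2*(x*x + y*y)]])

def propagate(r, v, q, b_a, b_w, a_imu, w_imu, g_L, dt):
    """One Euler step of the lander kinematics driven by IMU measurements:
       r_dot = v
       v_dot = C(q)^T (a_imu - b_a) + g_L     # specific force rotated into {L}
       q_dot = 0.5 * Omega(w_imu - b_w) * q
       b_a, b_w modelled as noise-driven constants; noise terms omitted here."""
    C = C_from_q(q)
    r_new = r + v * dt
    v_new = v + (C.T @ (a_imu - b_a) + g_L) * dt
    q_new = q + 0.5 * (omega_matrix(w_imu - b_w) @ q) * dt
    q_new /= np.linalg.norm(q_new)   # keep the quaternion normalised
    return r_new, v_new, q_new, b_a, b_w
```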
Step 2: establish the measurement model based on inter-frame curve matching, which is used to update the lander motion state.
The landing area is approximately planar; a crater is expressed in the landing-site frame {L} as
where [Lx Ly 1]^T denotes any point on the crater rim in the landing-site frame.
Using the pinhole camera model, the image point u = [u v]^T in the i-th descent image of any point LX = [Lx Ly Lz]^T in the landing plane is
where σ is a non-zero constant, f is the camera focal length, and R_i = C(q_i).
Since the landing area is approximately planar, Lz = 0 and formula (4) can be written as
where Lr_i^x, Lr_i^y, and Lr_i^z denote the position components of the lander in the landing-site frame.
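As a hedged illustration of formulas (4)-(7), whose images are not reproduced above, the sketch below builds the plane-induced homography that maps a point of the landing plane (Lz = 0) to its pixel coordinates in the i-th descent image. The intrinsic matrix K, the function names, and the exact sign and normalisation conventions are assumptions; the patent's own expressions for this mapping may differ in form.

```python
import numpy as np

def intrinsic_matrix(f, u0=0.0, v0=0.0):
    """Pinhole intrinsics with focal length f (in pixels) and principal point (u0, v0)."""
    return np.array([[f, 0.0, u0],
                     [0.0, f, v0],
                     [0.0, 0.0, 1.0]])

def plane_homography(K, R_i, r_i):
    """Homography H_i with sigma * [u, v, 1]^T = H_i @ [Lx, Ly, 1]^T for points of
    the landing plane (Lz = 0). R_i = C(q_i) rotates {L} into the camera frame and
    r_i is the lander/camera position expressed in {L} at imaging time i."""
    t_i = -R_i @ np.asarray(r_i, dtype=float)             # camera-frame translation
    return K @ np.column_stack((R_i[:, 0], R_i[:, 1], t_i))

def project_planar_point(H_i, Lx, Ly):
    """Image point u = [u, v]^T of the landing-plane point (Lx, Ly)."""
    p = H_i @ np.array([Lx, Ly, 1.0])
    return p[:2] / p[2]                                    # divide out the scale sigma
```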
The crater is expressed in the i-th descent image as
Then, from formula (3), formula (5), and formula (8), the crater image curve E_i is obtained as
Therefore, the measurement of the j-th crater in the i-th descent image is expressed as
where the crater boundary-curve parameters and the measurement noise are as given in formula (10); vech(·) denotes the vectorization form of a symmetric matrix, vec(·) denotes the vectorization form of an arbitrary matrix, and the matrix Η is the transition matrix between vech(·) and vec(·),
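Since the image of a crater rim is described by a symmetric 3 × 3 conic matrix, only six of its nine entries are independent; the transition matrix Η between vech(·) and vec(·) expresses exactly this. The sketch below gives one concrete construction (the standard duplication matrix); treating Η as this matrix is an assumption, since its definition appears only in the un-reproduced formula images.

```python
import numpy as np

def vec(A):
    """Column-stacking vectorisation of an arbitrary matrix."""
    return np.asarray(A).reshape(-1, order='F')

def vech(A):
    """Half-vectorisation of a symmetric matrix: lower triangle, column by column."""
    A = np.asarray(A)
    return np.concatenate([A[j:, j] for j in range(A.shape[0])])

def duplication_matrix(n=3):
    """Matrix D such that vec(A) = D @ vech(A) for every symmetric n x n matrix A."""
    D = np.zeros((n * n, n * (n + 1) // 2))
    k = 0
    for j in range(n):
        for i in range(j, n):
            E = np.zeros((n, n))
            E[i, j] = E[j, i] = 1.0   # symmetric basis element
            D[:, k] = vec(E)
            k += 1
    return D
```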
Since the absolute position information Q of the crater is unknown, formula (10) cannot be used directly for state estimation.
The crater Q is observed in two consecutive descent images; the crater image curves observed by the lander at times t_1 and t_2 are, respectively,
From formula (13) and formula (14), one obtains
The measurement model of the crater at time t_2 is
where the measurement noise is given by
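The idea behind formulas (13)-(16) is that the unknown crater conic Q can be eliminated by relating the image curves of the same crater in two consecutive descent images through their plane-induced homographies. The sketch below illustrates this standard conic-transfer relation, reusing the plane_homography helper defined earlier; the scale normalisation and the exact arrangement of the patent's measurement model are assumptions.

```python
import numpy as np

def image_conic(Q, H):
    """Image E of a plane conic Q (crater rim) under the homography H:
    image points satisfy u^T E u = 0 with E proportional to H^{-T} Q H^{-1}."""
    Hinv = np.linalg.inv(H)
    E = Hinv.T @ Q @ Hinv
    return E / np.linalg.norm(E)               # remove the arbitrary scale

def predict_conic_t2(E1, H1, H2):
    """Predict the crater image curve at t2 from the curve observed at t1,
    without knowing Q: with G = H2 @ inv(H1), E2 is proportional to
    G^{-T} E1 G^{-1}. Comparing vech of this prediction with the curve actually
    extracted at t2 yields a measurement that depends only on the two poses."""
    G = H2 @ np.linalg.inv(H1)
    Ginv = np.linalg.inv(G)
    E2 = Ginv.T @ E1 @ Ginv
    return E2 / np.linalg.norm(E2)
```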
Step 3: combine the lander kinematics model established in step 1 with the measurement model based on unknown curve features established in step 2, and estimate the absolute motion state of the lander in real time using a Kalman filter, thereby realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
To handle the nonlinearity of the measurement model based on unknown curve features established in step 2, the Kalman filter in step 3 is preferably the unscented Kalman filter (UKF).
When the unscented Kalman filter is selected in step 3, the specific implementation of step 3 is as follows:
Step 3.1: from the lander kinematics model established in step 1 using the inertial measurement information, the lander state equation is obtained as
where the dot denotes the time derivative of the state; Lr_c and the corresponding attitude quaternion denote the position and attitude of the lander at the previous imaging time; w denotes the state noise; and Q_k = E[w w^T] is the state-noise covariance matrix.
Step 3.2: from the measurement model based on unknown curve features established in step 2, the measurement equation is obtained as
z_k = h(vech(E_k)) + v_k (19)
where k = 1, 2, 3, ..., v_k is the measurement noise, and R_k = E{v_k (v_k)^T} is the measurement-noise covariance matrix.
Step 3.3: estimate the absolute motion state of the lander in real time using the unscented Kalman filter, realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
Step 3.3.1: initialize the lander motion state
where x_0 and x̂_0 respectively denote the lander initial state and its mean, and P_0 denotes the lander initial-state covariance matrix.
Step 3.3.2: compute the sigma sample points of the lander motion state
Step 3.3.3: establish the time-propagation equations of the lander motion state using formula (18).
Step 3.3.4: establish the measurement-update equations of the lander motion state using formula (19).
In the above formulas, n denotes the state dimension,
where 10^(-4) ≤ α ≤ 1, κ = 3 - n, and β = 2.
Step 3.3.5: obtain the real-time absolute motion state estimate of the lander from formulas (22) and (23), thereby realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
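As a minimal sketch of the unscented-filter machinery in steps 3.3.1-3.3.5, the Python code below generates the scaled sigma points and runs one predict/update cycle with the parameter ranges quoted in the text (10^(-4) ≤ α ≤ 1, κ = 3 - n, β = 2). The state-propagation function f (formula (18)) and the curve-matching measurement function h (formula (19)) are passed in as callables; a full implementation would also handle quaternion normalisation and the n = 23 state layout of the embodiment, both of which are omitted here.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, kappa=None, beta=2.0):
    """Scaled sigma points and weights for the unscented transform."""
    n = x.size
    if kappa is None:
        kappa = 3.0 - n
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    X = np.column_stack([x] + [x + S[:, i] for i in range(n)]
                            + [x - S[:, i] for i in range(n)])
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return X, Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF predict/update cycle with state model f and measurement model h."""
    X, Wm, Wc = sigma_points(x, P)
    Xp = np.column_stack([f(X[:, i]) for i in range(X.shape[1])])    # time update
    xp = Xp @ Wm
    Pp = Q + sum(Wc[i] * np.outer(Xp[:, i] - xp, Xp[:, i] - xp)
                 for i in range(Xp.shape[1]))
    Z = np.column_stack([h(Xp[:, i]) for i in range(Xp.shape[1])])   # measurement update
    zp = Z @ Wm
    Pzz = R + sum(Wc[i] * np.outer(Z[:, i] - zp, Z[:, i] - zp)
                  for i in range(Z.shape[1]))
    Pxz = sum(Wc[i] * np.outer(Xp[:, i] - xp, Z[:, i] - zp)
              for i in range(Z.shape[1]))
    K = Pxz @ np.linalg.inv(Pzz)                                     # Kalman gain
    x_new = xp + K @ (z - zp)
    P_new = Pp - K @ Pzz @ K.T
    return x_new, P_new
```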
Beneficial effects:
1. The curve-matching visual navigation method for planetary landing in an unknown environment disclosed by the invention provides a lander visual navigation method using inter-frame curve matching, which combines inertial measurement information with the unscented Kalman filter to estimate the absolute motion state of the lander in real time and can guarantee the real-time performance of the navigation system.
2. The curve-matching visual navigation method for planetary landing in an unknown environment disclosed by the invention uses inter-frame curve matching as the measurement model; therefore it can estimate the absolute motion state of the lander without prior map information, improving the stability of the navigation system.
3. Since planetary and small-body surfaces exhibit curve features, the curve-matching visual navigation method for planetary landing in an unknown environment disclosed by the invention is applicable not only to planetary landing missions but also to small-body landing missions.
Description of the drawings
Fig. 1 is a flowchart of the curve-matching visual navigation method for planetary landing in an unknown environment;
Fig. 2 shows the lander position estimation error and its 3σ standard deviation;
Fig. 3 shows the lander velocity estimation error and its 3σ standard deviation;
Fig. 4 shows the lander attitude estimation error and its 3σ standard deviation.
Specific embodiment
In order to better illustrate the objects and advantages of the present invention, the content of the invention is further explained below with reference to the accompanying drawings and an example.
As shown in Fig. 1, the curve-matching visual navigation method for planetary landing in an unknown environment disclosed in this example comprises the following specific steps:
Step 1: establish the lander kinematics model.
In order to describe the position and attitude of the lander in flight and its relative geometric relationship to visual features on the target body's surface, and to define the equations of motion of the lander in a suitable reference frame, the following coordinate frames are first introduced: the landing-site frame {L}, the navigation camera frame {C}, and the lander body frame {B}. The position and attitude parameters of the lander are all described in the landing-site frame. The lander body frame coincides with the navigation camera frame, i.e., the installation matrix between the optical navigation camera and the lander is the identity matrix. Neglecting the effect of planetary rotation, the lander landing kinematics equations are established from the IMU measurements as:
where the measurement models of the inertial acceleration a_imu and angular velocity ω_imu are
where Lr and Lv respectively denote the position and velocity of the lander in the landing-site frame; q is the attitude quaternion and C(q) is the transformation matrix from the landing-site frame to the lander body frame; Lg is the gravitational acceleration and n_g is the gravitational-acceleration disturbance; b_a and b_ω denote the accelerometer and gyroscope biases; n_a and n_ω are the accelerometer and gyroscope measurement noises, respectively; n_wa and n_wω are the accelerometer and gyroscope bias-drift noises, respectively; La denotes the acceleration produced by the resultant of all non-gravitational forces acting on the lander; Bω denotes the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame; and for any angular velocity ω = [ω_x ω_y ω_z]^T, Ω(·) is defined as
Step 2: establish the measurement model based on inter-frame curve matching, which is used to update the lander motion state.
The landing area is approximately planar; a crater is expressed in the landing-site frame {L} as
where [Lx Ly 1]^T denotes any point on the crater rim in the landing-site frame.
Using the pinhole camera model, the image point u = [u v]^T in the i-th descent image of any point LX = [Lx Ly Lz]^T in the landing plane is
where σ is a non-zero constant, f is the camera focal length, and R_i = C(q_i).
Since the landing area is approximately planar, Lz = 0 and formula (28) can be written as
where Lr_i^x, Lr_i^y, and Lr_i^z denote the position components of the lander in the landing-site frame.
The crater is expressed in the i-th descent image as
Then, from formula (27), formula (29), and formula (32), the crater image curve E_i is obtained as
Therefore, the measurement of the j-th crater in the i-th descent image is expressed as
where the crater boundary-curve parameters and the measurement noise are as given in formula (34); vech(·) denotes the vectorization form of a symmetric matrix, vec(·) denotes the vectorization form of an arbitrary matrix, and the matrix Η is the transition matrix between vech(·) and vec(·),
Since the absolute position information Q of the crater is unknown, formula (34) cannot be used directly for state estimation.
The crater Q is observed in two consecutive descent images; the crater image curves observed by the lander at times t_1 and t_2 are, respectively,
From formula (37) and formula (38), one obtains
The measurement model of the crater at time t_2 is
where the measurement noise is given by
Step 3: combine the lander kinematics model established in step 1 with the measurement model based on unknown curve features established in step 2, and estimate the absolute motion state of the lander in real time using a Kalman filter, thereby realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
To handle the nonlinearity of the measurement model based on unknown curve features established in step 2, the Kalman filter in step 3 is preferably the unscented Kalman filter (UKF).
When the unscented Kalman filter is selected in step 3, the specific implementation of step 3 is as follows:
Step 3.1: from the lander kinematics model established in step 1 using the inertial measurement information, the lander state equation is obtained as
where the dot denotes the time derivative of the state; Lr_c and the corresponding attitude quaternion denote the position and attitude of the lander at the previous imaging time; w denotes the state noise; and Q_k = E[w w^T] is the state-noise covariance matrix.
Step 3.2: from the measurement model based on unknown curve features established in step 2, the measurement equation is obtained as
z_k = h(vech(E_k)) + v_k (43)
where k = 1, 2, 3, ..., v_k is the measurement noise, and R_k = E{v_k (v_k)^T} is the measurement-noise covariance matrix.
Step 3.3: estimate the absolute motion state of the lander in real time using the unscented Kalman filter, realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
Step 3.3.1: initialize the lander motion state
where x_0 and x̂_0 respectively denote the lander initial state and its mean, and P_0 denotes the lander initial-state covariance matrix.
Step 3.3.2: compute the sigma sample points of the lander motion state
Step 3.3.3: establish the time-propagation equations of the lander motion state using formula (42).
Step 3.3.4: establish the measurement-update equations of the lander motion state using formula (43).
In the above formulas, n denotes the state dimension, with n = 23,
where 10^(-4) ≤ α ≤ 1, κ = 3 - n, and β = 2.
Step 3.3.5: obtain the real-time absolute motion state estimate of the lander from formulas (46) and (47), thereby realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
Numerical simulations were carried out in the MATLAB environment with Mars landing exploration as the background, using a single curve feature. The simulation ends when the lander reaches 100 m above the landing site; the landing time is 120 s. The navigation camera has a 45° field of view, a focal length of 14.6 mm, and a measurement noise of 1 pixel. The IMU is an LN-200 with a sampling frequency of 50 Hz. The lander initial state is shown in Table 1; the initial position error is 500 m in each axis, the initial velocity error is 1 m/s in each axis, and the initial attitude error is 1° in each axis. The process-noise covariance Q is
Q = diag([2.4×10^(-13) I  2.4×10^(-13) I  2.5×10^(-7) I  1.2×10^(-7) I  1.2×10^(-8) I])
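For reference, the sketch below assembles this process-noise covariance in NumPy, assuming each "I" denotes a 3 × 3 identity block (so Q is 15 × 15); the block ordering follows the diagonal as listed and is otherwise an assumption.

```python
import numpy as np

# Process-noise covariance of the simulation: five 3x3 blocks on the diagonal.
q_blocks = [2.4e-13, 2.4e-13, 2.5e-7, 1.2e-7, 1.2e-8]
Q = np.kron(np.diag(q_blocks), np.eye(3))   # 15 x 15 block-diagonal matrix
```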
Table 1. Simulation parameters
The above detailed description further explains the objects, technical solutions, and beneficial effects of the invention. It should be understood that the above is only a specific embodiment of the present invention and is not intended to limit its scope of protection; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (6)

1. A curve-matching visual navigation method for planetary landing in an unknown environment, characterized by comprising the following steps:
Step 1: establish the lander kinematics model;
Step 2: establish the measurement model based on inter-frame curve matching, which is used to update the lander motion state;
Step 3: combine the lander kinematics model established in step 1 with the measurement model based on unknown curve features established in step 2, and estimate the absolute motion state of the lander in real time using a Kalman filter, realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
2. The curve-matching visual navigation method for planetary landing in an unknown environment according to claim 1, characterized in that step 1 is specifically implemented as follows:
In order to describe the position and attitude of the lander in flight and its relative geometric relationship to visual features on the target body's surface, and to define the equations of motion of the lander in a suitable reference frame, the following coordinate frames are first introduced: the landing-site frame {L}, the navigation camera frame {C}, and the lander body frame {B}; the position and attitude parameters of the lander are all described in the landing-site frame; the lander body frame coincides with the navigation camera frame, i.e., the installation matrix between the optical navigation camera and the lander is the identity matrix; neglecting the effect of planetary rotation, the lander landing kinematics equations are established from the IMU measurements as:
where the measurement models of the inertial acceleration a_imu and angular velocity ω_imu are
where Lr and Lv respectively denote the position and velocity of the lander in the landing-site frame; q is the attitude quaternion and C(q) is the transformation matrix from the landing-site frame to the lander body frame; Lg is the gravitational acceleration and n_g is the gravitational-acceleration disturbance; b_a and b_ω denote the accelerometer and gyroscope biases; n_a and n_ω are the accelerometer and gyroscope measurement noises, respectively; n_wa and n_wω are the accelerometer and gyroscope bias-drift noises, respectively; La denotes the acceleration produced by the resultant of all non-gravitational forces acting on the lander; Bω denotes the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame; and for any angular velocity ω = [ω_x ω_y ω_z]^T, Ω(·) is defined as
3. The curve-matching visual navigation method for planetary landing in an unknown environment according to claim 2, characterized in that step 2 is specifically implemented as follows:
The landing area is approximately planar; a crater is expressed in the landing-site frame {L} as
where [Lx Ly 1]^T denotes any point on the crater rim in the landing-site frame;
Using the pinhole camera model, the image point u = [u v]^T in the i-th descent image of any point LX = [Lx Ly Lz]^T in the landing plane is
where σ is a non-zero constant, f is the camera focal length, and R_i = C(q_i);
Since the landing area is approximately planar, Lz = 0 and formula (4) can be written as
where Lr_i^x, Lr_i^y, and Lr_i^z denote the position components of the lander in the landing-site frame;
The crater is expressed in the i-th descent image as
Then, from formula (3), formula (5), and formula (8), the crater image curve E_i is obtained as
Therefore, the measurement of the j-th crater in the i-th descent image is expressed as
where the crater boundary-curve parameters and the measurement noise are as given in formula (10); vech(·) denotes the vectorization form of a symmetric matrix, vec(·) denotes the vectorization form of an arbitrary matrix, and the matrix Η is the transition matrix between vech(·) and vec(·),
Since the absolute position information Q of the crater is unknown, formula (10) cannot be used directly for state estimation;
The crater Q is observed in two consecutive descent images; the crater image curves observed by the lander at times t_1 and t_2 are, respectively,
From formula (13) and formula (14), one obtains
The measurement model of the crater at time t_2 is
where the measurement noise is given by
4. The curve-matching visual navigation method for planetary landing in an unknown environment according to claim 3, characterized in that, to handle the nonlinearity of the measurement model based on unknown curve features established in step 2, the Kalman filter in step 3 is the unscented Kalman filter.
5. The curve-matching visual navigation method for planetary landing in an unknown environment according to claim 4, characterized in that, when the unscented Kalman filter is selected in step 3, step 3 is specifically implemented as follows:
Step 3.1: from the lander kinematics model established in step 1 using the inertial measurement information, the lander state equation is obtained as
where the dot denotes the time derivative of the state; Lr_c and the corresponding attitude quaternion denote the position and attitude of the lander at the previous imaging time; w denotes the state noise; and Q_k = E[w w^T] is the state-noise covariance matrix;
Step 3.2: from the measurement model based on unknown curve features established in step 2, the measurement equation is obtained as
z_k = h(vech(E_k)) + v_k (19)
where k = 1, 2, 3, ..., v_k is the measurement noise, and R_k = E{v_k (v_k)^T} is the measurement-noise covariance matrix;
Step 3.3: estimate the absolute motion state of the lander in real time using the unscented Kalman filter, realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
6. The curve-matching visual navigation method for planetary landing in an unknown environment according to claim 5, characterized in that step 3.3 is specifically implemented as follows:
Step 3.3.1: initialize the lander motion state
where x_0 and x̂_0 respectively denote the lander initial state and its mean, and P_0 denotes the lander initial-state covariance matrix;
Step 3.3.2: compute the sigma sample points of the lander motion state
Step 3.3.3: establish the time-propagation equations of the lander motion state using formula (18);
Step 3.3.4: establish the measurement-update equations of the lander motion state using formula (19);
In the above formulas, n denotes the state dimension,
where 10^(-4) ≤ α ≤ 1, κ = 3 - n, and β = 2.
Step 3.3.5: obtain the real-time absolute motion state estimate of the lander from formulas (22) and (23), realizing autonomous navigation for planetary landing in an unknown environment and ensuring a precise and safe landing.
CN201811310255.2A 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment Active CN109269511B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811310255.2A CN109269511B (en) 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811310255.2A CN109269511B (en) 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment

Publications (2)

Publication Number Publication Date
CN109269511A true CN109269511A (en) 2019-01-25
CN109269511B CN109269511B (en) 2020-01-07

Family

ID=65192856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811310255.2A Active CN109269511B (en) 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment

Country Status (1)

Country Link
CN (1) CN109269511B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110702122A (en) * 2019-10-22 2020-01-17 北京理工大学 Comprehensive optimization method for autonomous optical navigation characteristics of extraterrestrial celestial body landing
CN111652896A (en) * 2020-05-29 2020-09-11 北京理工大学 Inertial navigation auxiliary meteorite crater coarse-to-fine detection method
CN112066999A (en) * 2020-09-16 2020-12-11 北京控制工程研究所 Method for determining gravity direction in real time in planet landing process
CN113022898A (en) * 2021-02-18 2021-06-25 北京理工大学 State estimation method for flexible attachment system in weak gravity environment
CN114485678A (en) * 2021-12-31 2022-05-13 上海航天控制技术研究所 Heaven and earth integrated lunar surface landing navigation method
CN114577205A (en) * 2022-02-10 2022-06-03 北京空间飞行器总体设计部 Planet soft landing autonomous navigation landmark optimization method based on sequence images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150253150A1 (en) * 2014-03-07 2015-09-10 Airbus Operations Sas Device for determining navigation parameters of an aircraft during a landing phase
CN105371853A (en) * 2014-08-06 2016-03-02 北京理工大学 Mars powered-descent phase navigation method based on TDS and an orbiter
CN106096621A (en) * 2016-06-02 2016-11-09 西安科技大学 Random feature point selection method for landing-area detection based on vector constraints
CN107144278A (en) * 2017-04-24 2017-09-08 北京理工大学 Lander visual navigation method based on multi-source features
CN108036785A (en) * 2017-11-24 2018-05-15 浙江大学 Aircraft position and attitude estimation method based on fusion of the direct method and inertial navigation


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Xu Xinghua et al., "Application of UKF-based line-of-sight measurement relative navigation in Mars landing", Proceedings of the 10th Annual Conference of the Deep Space Exploration Technology Committee of the Chinese Society of Astronautics *
Xu Chao, "Relative navigation method for planetary landing using binocular vision measurements", Journal of Astronautics *
Gao Xizhen, China Master's Theses Full-text Database, Information Science and Technology series, 15 August 2016 *
Gao Xizhen et al., "A lander pose estimation algorithm based on ellipses fitted to craters", Journal of Deep Space Exploration *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110702122A (en) * 2019-10-22 2020-01-17 北京理工大学 Comprehensive optimization method for autonomous optical navigation characteristics of extraterrestrial celestial body landing
CN110702122B (en) * 2019-10-22 2021-03-30 北京理工大学 Comprehensive optimization method for autonomous optical navigation characteristics of extraterrestrial celestial body landing
CN111652896A (en) * 2020-05-29 2020-09-11 北京理工大学 Inertial navigation auxiliary meteorite crater coarse-to-fine detection method
CN111652896B (en) * 2020-05-29 2023-06-23 北京理工大学 Method for detecting coarse-fine meteorite crater by inertial navigation assistance
CN112066999A (en) * 2020-09-16 2020-12-11 北京控制工程研究所 Method for determining gravity direction in real time in planet landing process
CN113022898A (en) * 2021-02-18 2021-06-25 北京理工大学 State estimation method for flexible attachment system in weak gravity environment
CN113022898B (en) * 2021-02-18 2022-05-17 北京理工大学 State estimation method for flexible attachment system in weak gravity environment
CN114485678A (en) * 2021-12-31 2022-05-13 上海航天控制技术研究所 Heaven and earth integrated lunar surface landing navigation method
CN114485678B (en) * 2021-12-31 2023-09-12 上海航天控制技术研究所 Navigation method for land, ground and lunar landing
CN114577205A (en) * 2022-02-10 2022-06-03 北京空间飞行器总体设计部 Planet soft landing autonomous navigation landmark optimization method based on sequence images
CN114577205B (en) * 2022-02-10 2023-06-06 北京空间飞行器总体设计部 Satellite soft landing autonomous navigation landmark optimization method based on sequence images

Also Published As

Publication number Publication date
CN109269511B (en) 2020-01-07

Similar Documents

Publication Publication Date Title
CN109269511A (en) The Curve Matching vision navigation method that circumstances not known lower planet lands
Mourikis et al. Vision-aided inertial navigation for spacecraft entry, descent, and landing
Trawny et al. Vision‐aided inertial navigation for pin‐point landing using observations of mapped landmarks
CN110095116A Localization method based on LIFT combining visual positioning and inertial navigation
CN105865454B UAV navigation method based on real-time online map generation
CN103411621B Optical-flow-field vision/INS integrated navigation method for indoor mobile robots
CN105953796A (en) Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
Johnson et al. A general approach to terrain relative navigation for planetary landing
CN101598556A Unmanned aerial vehicle vision/inertial integrated navigation method in an unknown environment
CN106767791A Inertial/visual integrated navigation method using a CKF based on particle swarm optimization
CN106672265B Small-body precision guidance and control method based on optical flow information
CN109443354B (en) Visual-inertial tight coupling combined navigation method based on firefly group optimized PF
CN109269512A Relative navigation method fusing planetary landing images with ranging
Williams et al. Feature and pose constrained visual aided inertial navigation for computationally constrained aerial vehicles
CN108917755B (en) Imaging seeker line-of-sight angle zero error estimation method and device
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
Trawny et al. Coupled vision and inertial navigation for pin-point landing
CN114964276A (en) Dynamic vision SLAM method fusing inertial navigation
CN112444245A (en) Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN111207773A (en) Attitude unconstrained optimization solving method for bionic polarized light navigation
CN103017773B Navigation method for the orbiting phase based on celestial-surface region features and natural satellite landmarks
Yang et al. Research on position and orientation measurement method for roadheader based on vision/INS
CN113375665B (en) Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling
Baheerathan et al. Image-aided inertial navigation for an Octocopter
Luo et al. An imu/visual odometry integrated navigation method based on measurement model optimization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant