CN110186465A - Monocular vision-based space non-cooperative target relative state estimation method - Google Patents

Monocular vision-based space non-cooperative target relative state estimation method

Info

Publication number
CN110186465A
CN110186465A · CN201910593684.3A
Authority
CN
China
Prior art keywords
target
axis
servicing spacecraft
equation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910593684.3A
Other languages
Chinese (zh)
Other versions
CN110186465B (en)
Inventor
孟中杰
郭新程
黄攀峰
张夷斋
张帆
刘正雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201910593684.3A priority Critical patent/CN110186465B/en
Publication of CN110186465A publication Critical patent/CN110186465A/en
Application granted granted Critical
Publication of CN110186465B publication Critical patent/CN110186465B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides a monocular vision-based relative state estimation method for space non-cooperative targets. Relative navigation during the final approach to a space non-cooperative target is realized with a monocular camera, and filter convergence is accelerated by adding constraints to the observation equation. The method establishes kinematics and dynamics models of the relative motion between the target and the servicing spacecraft, establishes an observation equation into which constraint relations among the target feature points are added, and designs a filter to estimate the relative state of the target. Because relative navigation in the final approach phase is realized with a monocular camera, the method has, compared with binocular (stereo) cameras and laser imaging radar, the advantages of small volume and mass, low power consumption, and low economic cost; moreover, adding the constraints satisfied by the relative features to the observation equation makes the filter converge faster.

Description

Monocular vision-based space non-cooperative target relative state estimation method
Technical field
The present invention relates to the technical field of spacecraft relative navigation, and in particular to a target relative state estimation method.
Background technique
To perform on-orbit servicing of a non-cooperative target, the relative position and attitude between the target and the servicing spacecraft must be estimated while closely approaching the space target. Space non-cooperative targets mostly share the following features: no dedicated optical markers on the surface, no communication link, and an unknown target model. This makes relative pose estimation during proximity operations on such targets very difficult.
Among the vision measurement sensors usable for relative navigation are monocular cameras, binocular (stereo) cameras, and laser imaging radar. Compared with the latter two, a monocular camera is technically simple to realize, small in volume and mass, low in power consumption, and more flexible and economical, but it has the defect that depth information cannot be obtained directly. Although the existing literature has demonstrated the observability of the position-related states when the camera is installed with an offset, estimating the relative state of a target with a monocular camera remains difficult.
In fact, the relative state estimation problem for an unknown non-cooperative target can be compared to the SLAM problem: the relative pose of the target is estimated while the target structure is simultaneously recovered, and for a tumbling target, quantities such as the target's principal axes of inertia generally also need to be estimated. To this end, one method in the literature takes the image coordinates of target surface feature points as observations and, by establishing a motion model of the feature points and a dynamics model of the target, designs a filtering algorithm to estimate the relative state of the target. The main remaining problem, however, is that the filter converges slowly.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention provides a monocular vision-based space non-cooperative target relative state estimation method. The technical problem to be solved is to realize relative navigation during the final approach to a space non-cooperative target using a monocular camera, and to accelerate filter convergence by adding constraints to the observation equation.
The technical solution adopted by the present invention comprises the following steps:
Step 1: establish the kinematics and dynamics models of the relative motion between the target and the servicing spacecraft;
The following coordinate systems are defined:
(1) Geocentric inertial frame I: the origin is at the Earth's center of mass; the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed set;
(2) LVLH frame H: the origin is at the servicing spacecraft's center of mass; the x-axis points from the Earth's center toward the spacecraft's center of mass, the z-axis is perpendicular to the orbital plane and aligned with the orbital angular momentum, and the y-axis completes the right-handed set;
(3) Servicing spacecraft body frame A: the origin is fixed at the servicing spacecraft's center of mass, and the axes coincide with the spacecraft's principal axes of inertia;
(4) Target body frame B: the origin is fixed at the target's center of mass, and the axes coincide with the target's principal axes of inertia; the minimum principal axis of inertia is taken as the x-axis and the maximum principal axis of inertia as the y-axis;
(5) Camera frame C: the origin is fixed at the camera's optical center; its position vector in the servicing spacecraft body frame is d, a known constant; the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane;
The relative attitude of the target body frame B with respect to the servicing spacecraft body frame A is represented by the quaternion q = [qv^T q4]^T, where qv = [q1 q2 q3]^T is the vector part and q4 is the scalar part; the rotation matrix corresponding to q is:
A(q) = (q4^2 − qv^T qv) I3 + 2 qv qv^T − 2 q4 [qv×] (1)
where [qv×] is the skew-symmetric matrix of the vector qv = [q1 q2 q3]^T:
[qv×] = [0 −q3 q2; q3 0 −q1; −q2 q1 0] (2)
and the multiplication b⊗q between quaternions b and q is defined by equation (3).
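For readers implementing the attitude representation above, the following is a minimal Python/NumPy sketch of the skew-symmetric matrix of equation (2), the rotation matrix of equation (1), and one common scalar-last quaternion product consistent with them; the function names are illustrative, and the composition convention of equation (3) should be checked against the patent's equation images.

import numpy as np

def skew(v):
    """Skew-symmetric matrix [v x] such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rotation_matrix(q):
    """Rotation matrix A(q) of equation (1) for a scalar-last unit quaternion
    q = [q1, q2, q3, q4], i.e. A(q) = (q4^2 - qv.qv) I + 2 qv qv^T - 2 q4 [qv x]."""
    qv, q4 = q[:3], q[3]
    return ((q4**2 - qv @ qv) * np.eye(3)
            + 2.0 * np.outer(qv, qv)
            - 2.0 * q4 * skew(qv))

def quat_multiply(b, q):
    """One common scalar-last quaternion product b (x) q, with the composition
    convention chosen so that A(b (x) q) = A(b) @ A(q); an assumed convention,
    since equation (3) is only given in the patent's images."""
    bv, b4 = b[:3], b[3]
    qv, q4 = q[:3], q[3]
    return np.concatenate([b4 * qv + q4 * bv - np.cross(bv, qv),
                           [b4 * q4 - bv @ qv]])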
Let ωs and ωt denote the angular velocities of the servicing spacecraft body frame A and the target body frame B, each expressed in its own body frame; the angular velocity ω of the target relative to the servicing spacecraft, expressed in the target body frame B, is:
ω = ωt − A(q) ωs (4)
The relative attitude kinematics equation is given by equation (5); the servicing spacecraft angular velocity ωs is measured by the spacecraft's inertial devices and used in the model equations as a known quantity;
The attitude dynamics equation of the target is:
Jt ω̇t = −ωt × (Jt ωt) + τt (6)
where ωt denotes the target angular velocity, τt is a torque noise term, and Jt is the target inertia matrix, Jt = diag(Jxx, Jyy, Jzz), with Jxx, Jyy, Jzz the moments of inertia about the target's principal x-, y-, and z-axes; introducing the inertia ratio parameters k1 and k2, equation (6) is converted to the parameterized form of equation (7);
Since the target's inertia ratios are constant, they must satisfy:
k̇1 = 0, k̇2 = 0 (8)
where k̇1 denotes the rate of change of k1 with time and k̇2 the rate of change of k2;
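The exact definitions of k1 and k2 appear only in the patent's equation images. As one illustration, assuming the normalization k1 = Jyy/Jxx and k2 = Jzz/Jxx (an assumption, not confirmed by the text), the parameterized attitude dynamics of equation (7) could be sketched as:

import numpy as np

def target_omega_dot(omega_t, k1, k2):
    """Euler's equations (6) rewritten with inertia ratios only, under the
    ASSUMED parameterization Jt = Jxx * diag(1, k1, k2); the torque noise
    tau_t is absorbed into the filter's process noise w(t) and omitted here."""
    wx, wy, wz = omega_t
    return np.array([
        (k1 - k2) * wy * wz,          # (Jyy - Jzz)/Jxx = k1 - k2
        ((k2 - 1.0) / k1) * wz * wx,  # (Jzz - Jxx)/Jyy = (k2 - 1)/k1
        ((1.0 - k1) / k2) * wx * wy,  # (Jxx - Jyy)/Jzz = (1 - k1)/k2
    ])

# Equation (8) then simply states dk1/dt = dk2/dt = 0 in the filter dynamics.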
Define ρ0 as the position vector of the target's center of mass relative to the servicing spacecraft's center of mass, expressed in the LVLH frame as [x y z]^T. Considering that ρ0 is very small when the servicing spacecraft closely approaches the target, the relative position dynamics equation is:
ẍ = 2θ̇c ẏ − 2(ṙc/rc)θ̇c y + θ̇c^2 x + 2(rc/pc)θ̇c^2 x
ÿ = −2θ̇c ẋ + 2(ṙc/rc)θ̇c x + θ̇c^2 y − (rc/pc)θ̇c^2 y
z̈ = −(rc/pc)θ̇c^2 z (9)
where θ̇c is the true anomaly rate of the servicing spacecraft orbit, rc is the servicing spacecraft orbit radius, pc is the semi-latus rectum, ẍ, ÿ, z̈ denote the second derivatives of x, y, z, and ṙc denotes the first derivative of rc; the orbit parameters θ̇c, rc, pc are obtained from the servicing spacecraft's orbit determination system and used in the model equations as known quantities;
Define p1, …, pN as the position vectors of the N feature points on the target relative to the target's center of mass, expressed in the target body frame; by the rigid-body assumption for the target:
ṗi = 0, i = 1, …, N (10)
where ṗi denotes the rate of change of pi with time;
Finally, the state to be estimated X is chosen to comprise the relative attitude q, the relative angular velocity ω, the inertia ratios k1 and k2, the relative position ρ0 and its rate, and the feature point positions p1, …, pN; the relative motion dynamics model is then written as:
Ẋ = f(X) + w(t) (11)
where f(X) is assembled from equations (5), (7), (8), (9), and (10), and w(t) is the system noise term;
Step 2: establish the observation equation, into which the constraint relations among the target feature points are added;
Define ρi as the position vector of feature point i on the target in the camera frame:
ρi = R_AC (R_HA ρ0 + A(q)^T pi − d) (12)
where R_AC denotes the rotation matrix from the servicing spacecraft body frame to the camera frame and is a known quantity; R_HA denotes the rotation matrix from the LVLH frame to the servicing spacecraft body frame and is obtained from the spacecraft's own measuring devices; and A(q) is related to the state q to be estimated and is given by equation (1);
The image projection coordinates (ui, vi) of feature point pi and ρi = [xi yi zi]^T satisfy:
ui = fx xi/zi + cx, vi = fy yi/zi + cy (13)
where fx, fy, cx, cy are the intrinsic parameters of the camera;
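A minimal sketch of the pinhole projection of equation (13), assuming ρi is already expressed in the camera frame and zi > 0:

import numpy as np

def project(rho_i, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame point rho_i = [xi, yi, zi]
    to pixel coordinates (ui, vi), per equation (13)."""
    xi, yi, zi = rho_i
    return np.array([fx * xi / zi + cx, fy * yi / zi + cy])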
Furthermore, considering feature points i, j, k, define the relative feature ρij as:
ρij = ρi − ρj (14)
ρjk and ρki are defined similarly; the projection of ρij in the image is yij = [uij vij]^T, where uij = ui − uj and vij = vi − vj, and the relation between yij and ρij is given by equation (15);
Since the vector sum satisfies ρij + ρjk + ρki = 0, combining equation (15), feature points i, j, k must satisfy the constraint of equation (16), denoted M(i, j, k) = yi,j,k;
The N feature points on the target are numbered 1, 2, …, N, and constraint equations formed from groups of three points are added to the observation equation; let s1, …, sm be a selection of point triplets that covers all feature points to be estimated, namely s1 = (1, 2, 3), s2 = (2, 3, 4), …, sN−1 = (N−1, N, 1);
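The triplet selection just described (consecutive numbering that wraps around so every feature point is covered) can be sketched as:

def constraint_triplets(n):
    """Return the N-1 point triplets s1=(1,2,3), s2=(2,3,4), ...,
    s_{N-1}=(N-1, N, 1), using 1-based feature-point numbers."""
    triplets = []
    for i in range(1, n):
        triplets.append((i, i % n + 1, (i + 1) % n + 1))
    return triplets

# Example: constraint_triplets(5) -> [(1,2,3), (2,3,4), (3,4,5), (4,5,1)]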
The final observation equation is:
z = h(X) + v (17)
where z is the vector of all observations and v is the observation noise; all components involving ρi and ρij are replaced using equations (12) and (14), which yields the mapping from the state to the observations;
Step 3: design the filter to realize the estimation of the target relative state;
Equation (11) describes the continuous-time system state equation; it is first converted to a discrete model, after which the estimation is carried out. The discrete model is:
X(k+1) = X(k) + f(X(k)) Δt + w(k) (18)
where X(k) is the state at time k and Δt is the filtering period;
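A sketch of one step of the discrete model (18); f stands for the stacked dynamics assembled from equations (5), (7), (8), (9) and (10), and the quaternion index range used for renormalization is an assumed state layout:

import numpy as np

def propagate(x, f, dt):
    """One step of the discrete model (18): X(k+1) = X(k) + f(X(k)) * dt.
    The process noise w(k) is handled statistically by the filter and is
    not sampled here."""
    x_next = x + f(x) * dt
    # If the state starts with the attitude quaternion, renormalize it after
    # the step (an implementation detail; indices 0:4 are an assumed layout).
    x_next[0:4] /= np.linalg.norm(x_next[0:4])
    return x_next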
With the system state equation and the observation equation given by equations (18) and (17) respectively, state estimation is carried out with an iterated extended Kalman filter; the estimation steps are as follows:
Step 3.1: initialization: set the initial state estimate X̂(0/0) and the error covariance matrix P(0);
Step 3.2: the state prediction is:
X̂(k+1/k) = X̂(k/k) + f(X̂(k/k)) Δt (19)
and the prediction error covariance matrix is:
P(k+1/k) = Φ(k) P(k/k) Φ(k)^T + Qk (20)
where P(k+1/k) denotes the prediction error covariance matrix and P(k/k) the filtering error covariance matrix at time k, Φ(k) = I + (∂f/∂X)Δt evaluated at X̂(k/k) is the state transition matrix, and Qk is the system noise covariance;
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:
X̂(k+1/k+1)0 = X̂(k+1/k) (21)
P(k+1/k+1)0 = P(k+1/k) (22)
Step 3.3.2: in the iterative process, the i-th iteration is:
K(k+1)i = P(k+1/k+1)i−1 H(k+1)i−1^T [H(k+1)i−1 P(k+1/k+1)i−1 H(k+1)i−1^T + Rk+1]^−1 (23)
X̂(k+1/k+1)i = X̂(k+1/k+1)i−1 + K(k+1)i r(k)i (24)
r(k)i = z(k+1) − ẑ(k+1)i−1 (25)
where H(k+1)i−1 is the Jacobian of the observation equation evaluated at the (i−1)-th iterate, Rk+1 is the measurement noise covariance, and K(k+1)i denotes the gain at the i-th iteration; r(k)i denotes the observation residual at the i-th iteration, ẑ(k+1)i−1 is obtained by substituting the (i−1)-th iteration result into observation equation (17), and z(k+1) is the vector of all observations at time k+1;
P(k+1/k+1)i = [I − K(k+1)i H(k+1)i−1] P(k+1/k+1)i−1 (26)
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum number of iterations or the state iteration error falls below the set threshold; the state iteration error is defined as:
εi = ||X̂(k+1/k+1)i − X̂(k+1/k+1)i−1||2 (27)
where || · ||2 denotes the vector 2-norm; supposing the iteration terminates at the n-th iteration, the iteration result at that point is output as the final estimate:
X̂(k+1/k+1) = X̂(k+1/k+1)n (28)
P(k+1/k+1) = P(k+1/k+1)n (29)
where X̂(k+1/k+1) is the final state estimate at time k+1.
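Condensing steps 3.1-3.3, the following is a sketch of one iterated-EKF measurement update in a common formulation; h is the observation function of equation (17), jac_h its Jacobian, and the residual definition follows the reconstruction of equations (23)-(27) above rather than the patent's original equation images.

import numpy as np

def iekf_update(x_pred, P_pred, z, h, jac_h, R, max_iter=5, tol=1e-6):
    """Iterated EKF measurement update (steps 3.3.1-3.3.3).
    x_pred, P_pred: predicted state and covariance from equations (19)-(20);
    z: stacked observation vector at time k+1; R: measurement noise covariance."""
    x_i = x_pred.copy()                          # equation (21)
    P_i = P_pred.copy()                          # equation (22)
    for _ in range(max_iter):
        H = jac_h(x_i)                           # Jacobian at previous iterate
        S = H @ P_i @ H.T + R
        K = P_i @ H.T @ np.linalg.inv(S)         # gain, equation (23)
        r = z - h(x_i)                           # observation residual, eq. (25)
        x_new = x_i + K @ r                      # state iterate, equation (24)
        P_i = (np.eye(len(x_i)) - K @ H) @ P_i   # covariance iterate, eq. (26)
        err = np.linalg.norm(x_new - x_i)        # state iteration error, eq. (27)
        x_i = x_new
        if err < tol:                            # termination test, step 3.3.3
            break
    return x_i, P_i                              # equations (28)-(29)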
The beneficial effects of the present invention are as follows: because relative navigation in the final approach to a space non-cooperative target is realized with a monocular camera, the method has, compared with binocular (stereo) cameras and laser imaging radar, the advantages of small volume and mass, low power consumption, and low economic cost; and because the constraints satisfied by the relative features are added to the observation equation, the filter converges faster.
Detailed description of the invention
Fig. 1 is a schematic diagram of the body coordinate systems and the measurement geometry of the servicing spacecraft and the target.
In the figure: 1 - servicing spacecraft; 2 - target satellite; 3 - monocular camera.
Specific embodiment
The present invention is further explained below with reference to the drawings and an embodiment.
On the basis of existing research, the present invention proposes a method that estimates the target relative state with a monocular camera and converges quickly. The steps of the embodiment are as follows:
Step 1: establish the kinematics and dynamics models of the relative motion between the target and the servicing spacecraft;
The following coordinate systems are defined:
(1) Geocentric inertial frame I: the origin is at the Earth's center of mass; the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed set;
(2) LVLH frame H: the origin is at the servicing spacecraft's center of mass; the x-axis points from the Earth's center toward the spacecraft's center of mass, the z-axis is perpendicular to the orbital plane and aligned with the orbital angular momentum, and the y-axis completes the right-handed set;
(3) Servicing spacecraft body frame A: the origin is fixed at the servicing spacecraft's center of mass, and the axes coincide with the spacecraft's principal axes of inertia;
(4) Target body frame B: the origin is fixed at the target's center of mass, and the axes coincide with the target's principal axes of inertia; the minimum principal axis of inertia is taken as the x-axis and the maximum principal axis of inertia as the y-axis;
(5) Camera frame C: the origin is fixed at the camera's optical center; its position vector in the servicing spacecraft body frame is d, a known constant; the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane;
The relative attitude of the target body frame B with respect to the servicing spacecraft body frame A is represented by the quaternion q = [qv^T q4]^T, where qv = [q1 q2 q3]^T is the vector part and q4 is the scalar part; the rotation matrix corresponding to q is:
A(q) = (q4^2 − qv^T qv) I3 + 2 qv qv^T − 2 q4 [qv×] (1)
where [qv×] is the skew-symmetric matrix of the vector qv = [q1 q2 q3]^T:
[qv×] = [0 −q3 q2; q3 0 −q1; −q2 q1 0] (2)
and the multiplication b⊗q between quaternions b and q is defined by equation (3);
Let ωs and ωt denote the angular velocities of the servicing spacecraft body frame A and the target body frame B, each expressed in its own body frame; the angular velocity ω of the target relative to the servicing spacecraft, expressed in the target body frame B, is:
ω = ωt − A(q) ωs (4)
The relative attitude kinematics equation is given by equation (5); the servicing spacecraft angular velocity ωs is measured by the spacecraft's inertial devices and used in the model equations as a known quantity;
The attitude dynamics equation of the target is:
Jt ω̇t = −ωt × (Jt ωt) + τt (6)
where ωt denotes the target angular velocity, τt is a torque noise term, and Jt is the target inertia matrix, Jt = diag(Jxx, Jyy, Jzz), with Jxx, Jyy, Jzz the moments of inertia about the target's principal x-, y-, and z-axes; since the target's moments of inertia are unknown, state estimation cannot be carried out directly with equation (6), and the inertia must be parameterized: introducing the inertia ratio parameters k1 and k2, equation (6) is converted to the parameterized form of equation (7);
Considering that the target is a rigid body whose mass and mass distribution do not change over a short time, the target's inertia ratios are constant and must satisfy:
k̇1 = 0, k̇2 = 0 (8)
where k̇1 denotes the rate of change of k1 with time and k̇2 the rate of change of k2;
Define ρ0 as the position vector of the target's center of mass relative to the servicing spacecraft's center of mass, expressed in the LVLH frame as [x y z]^T. Considering that ρ0 is very small when the servicing spacecraft closely approaches the target, the relative position dynamics equation is:
ẍ = 2θ̇c ẏ − 2(ṙc/rc)θ̇c y + θ̇c^2 x + 2(rc/pc)θ̇c^2 x
ÿ = −2θ̇c ẋ + 2(ṙc/rc)θ̇c x + θ̇c^2 y − (rc/pc)θ̇c^2 y
z̈ = −(rc/pc)θ̇c^2 z (9)
where θ̇c is the true anomaly rate of the servicing spacecraft orbit, rc is the servicing spacecraft orbit radius, pc is the semi-latus rectum, ẍ, ÿ, z̈ denote the second derivatives of x, y, z, and ṙc denotes the first derivative of rc; the orbit parameters θ̇c, rc, pc are obtained from the servicing spacecraft's orbit determination system and used in the model equations as known quantities;
Define p1, …, pN as the position vectors of the N feature points on the target relative to the target's center of mass, expressed in the target body frame; by the rigid-body assumption for the target:
ṗi = 0, i = 1, …, N (10)
where ṗi denotes the rate of change of pi with time;
Finally, the state to be estimated X is chosen to comprise the relative attitude q, the relative angular velocity ω, the inertia ratios k1 and k2, the relative position ρ0 and its rate, and the feature point positions p1, …, pN; the relative motion dynamics model is then written as:
Ẋ = f(X) + w(t) (11)
where f(X) is assembled from equations (5), (7), (8), (9), and (10), and w(t) is the system noise term;
Step 2: establish the observation equation, into which the constraint relations among the target feature points are added;
The observations are the coordinates of the target feature points extracted from the image; the number of feature points chosen and tracked is assumed to be N, and feature point occlusion and disappearance are not considered. Define ρi as the position vector of feature point i on the target in the camera frame; from Fig. 1:
ρi = R_AC (R_HA ρ0 + A(q)^T pi − d) (12)
where R_AC denotes the rotation matrix from the servicing spacecraft body frame to the camera frame and is a known quantity; R_HA denotes the rotation matrix from the LVLH frame to the servicing spacecraft body frame, is related to the attitude motion and orbital position of the servicing spacecraft itself, and can be obtained from the spacecraft's own measuring devices; and A(q) is related to the state q to be estimated and is given by equation (1);
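As an illustration of the frame chain just described, a sketch of computing ρi from the state; the matrix composition below is an assumption consistent with the where-clause of equation (12), not a confirmed reproduction of the patent's equation image.

import numpy as np

def feature_in_camera(rho0_lvlh, q, p_i, d, R_HA, R_AC, rotation_matrix):
    """Assumed realization of equation (12): map feature point p_i (target body
    frame B) into the camera frame C.
    rho0_lvlh: relative position in LVLH frame H; R_HA: rotation H -> A from
    onboard attitude; R_AC: known mounting rotation A -> C; d: camera position
    in frame A; rotation_matrix(q): A(q) mapping A -> B, per equation (1)."""
    p_in_A = rotation_matrix(q).T @ p_i      # target body B -> spacecraft body A
    rho0_in_A = R_HA @ rho0_lvlh             # LVLH H -> spacecraft body A
    return R_AC @ (rho0_in_A + p_in_A - d)   # shift to camera origin, rotate to C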
The image projection coordinates (ui, vi) of feature point pi and ρi = [xi yi zi]^T satisfy:
ui = fx xi/zi + cx, vi = fy yi/zi + cy (13)
where fx, fy, cx, cy are the intrinsic parameters of the camera;
Furthermore, considering feature points i, j, k, define the relative feature ρij as:
ρij = ρi − ρj (14)
ρjk and ρki can be defined similarly; the projection of ρij in the image is yij = [uij vij]^T, where uij = ui − uj and vij = vi − vj, and the relation between yij and ρij is given by equation (15);
Since the vector sum satisfies ρij + ρjk + ρki = 0, combining equation (15), feature points i, j, k must satisfy the constraint of equation (16), denoted M(i, j, k) = yi,j,k;
The N feature points on the target are numbered 1, 2, …, N, and constraint equations formed from groups of three points are added to the observation equation; let s1, …, sm be a selection of point triplets that covers all feature points to be estimated, namely s1 = (1, 2, 3), s2 = (2, 3, 4), …, sN−1 = (N−1, N, 1);
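A sketch of assembling the observation vector z of equation (17), with one constraint block per triplet; constraint_value is a hypothetical stand-in for the patent's M(i, j, k) of equation (16), whose exact expression is in the equation images, and project is the pinhole map of equation (13) with the camera intrinsics bound in.

import numpy as np

def observation(rho_all, triplets, project, constraint_value):
    """Stack the observation vector of equation (17).
    rho_all: list of camera-frame feature positions rho_1..rho_N;
    triplets: output of constraint_triplets(N), 1-based indices;
    constraint_value(rho_i, rho_j, rho_k): constraint term for one triplet."""
    parts = [project(rho) for rho in rho_all]          # feature projections
    for (i, j, k) in triplets:                         # constraint blocks
        parts.append(constraint_value(rho_all[i - 1],
                                      rho_all[j - 1],
                                      rho_all[k - 1]))
    return np.concatenate(parts)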
The final observation equation is:
z = h(X) + v (17)
where z is the vector of all observations and v is the observation noise; all components involving ρi and ρij are replaced using equations (12) and (14), which yields the mapping from the state to the observations;
Step 3: design the filter to realize the estimation of the target relative state;
Equation (11) describes the continuous-time system state equation; it is first converted to a discrete model, after which the estimation is carried out. The discrete model is:
X(k+1) = X(k) + f(X(k)) Δt + w(k) (18)
where X(k) is the state at time k and Δt is the filtering period;
With the system state equation and the observation equation given by equations (18) and (17) respectively, state estimation is carried out with an iterated extended Kalman filter; the estimation steps are as follows:
Step 3.1: initialization: set the initial state estimate X̂(0/0) and the error covariance matrix P(0);
Step 3.2: the state prediction is:
X̂(k+1/k) = X̂(k/k) + f(X̂(k/k)) Δt (19)
and the prediction error covariance matrix is:
P(k+1/k) = Φ(k) P(k/k) Φ(k)^T + Qk (20)
where P(k+1/k) denotes the prediction error covariance matrix and P(k/k) the filtering error covariance matrix at time k, Φ(k) = I + (∂f/∂X)Δt evaluated at X̂(k/k) is the state transition matrix, and Qk is the system noise covariance;
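A sketch of the prediction step of equations (19)-(20); the state transition matrix Φ(k) = I + (∂f/∂X)Δt is obtained here by finite differences, whereas the patent gives Φ analytically in its equation image.

import numpy as np

def predict(x_est, P_est, f, dt, Q, eps=1e-6):
    """EKF prediction: state via equation (19), covariance via equation (20),
    with Phi = I + (df/dX) * dt and the Jacobian of f approximated by
    forward differences."""
    n = len(x_est)
    x_pred = x_est + f(x_est) * dt               # equation (19)
    F = np.zeros((n, n))
    f0 = f(x_est)
    for j in range(n):                           # forward-difference Jacobian
        dx = np.zeros(n)
        dx[j] = eps
        F[:, j] = (f(x_est + dx) - f0) / eps
    Phi = np.eye(n) + F * dt
    P_pred = Phi @ P_est @ Phi.T + Q             # equation (20)
    return x_pred, P_pred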
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:
X̂(k+1/k+1)0 = X̂(k+1/k) (21)
P(k+1/k+1)0 = P(k+1/k) (22)
Step 3.3.2: in the iterative process, the i-th iteration is:
K(k+1)i = P(k+1/k+1)i−1 H(k+1)i−1^T [H(k+1)i−1 P(k+1/k+1)i−1 H(k+1)i−1^T + Rk+1]^−1 (23)
X̂(k+1/k+1)i = X̂(k+1/k+1)i−1 + K(k+1)i r(k)i (24)
r(k)i = z(k+1) − ẑ(k+1)i−1 (25)
where H(k+1)i−1 is the Jacobian of the observation equation evaluated at the (i−1)-th iterate, Rk+1 is the measurement noise covariance, and K(k+1)i denotes the gain at the i-th iteration; r(k)i denotes the observation residual at the i-th iteration, ẑ(k+1)i−1 is obtained by substituting the (i−1)-th iteration result into observation equation (17), and z(k+1) is the vector of all observations at time k+1;
P(k+1/k+1)i = [I − K(k+1)i H(k+1)i−1] P(k+1/k+1)i−1 (26)
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum number of iterations or the state iteration error falls below the set threshold; the state iteration error is defined as:
εi = ||X̂(k+1/k+1)i − X̂(k+1/k+1)i−1||2 (27)
where || · ||2 denotes the vector 2-norm; supposing the iteration terminates at the n-th iteration, the iteration result at that point is output as the final estimate:
X̂(k+1/k+1) = X̂(k+1/k+1)n (28)
P(k+1/k+1) = P(k+1/k+1)n (29)
where X̂(k+1/k+1) is the final state estimate at time k+1.
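Putting the pieces together, a hypothetical top-level filtering loop; predict and iekf_update refer to the sketches given earlier, and z_of(k) stands for the feature coordinates extracted from the image at step k.

import numpy as np

def run_filter(x0, P0, f, h, jac_h, Q, R, z_of, dt, n_steps):
    """Iterate prediction (step 3.2) and the iterated measurement update
    (step 3.3) over n_steps filtering cycles, starting from the
    initialization of step 3.1."""
    x, P = x0.copy(), P0.copy()
    history = []
    for k in range(n_steps):
        x, P = predict(x, P, f, dt, Q)                   # equations (19)-(20)
        x, P = iekf_update(x, P, z_of(k), h, jac_h, R)   # equations (21)-(29)
        history.append(x.copy())
    return np.array(history)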

Claims (1)

1. A monocular vision-based space non-cooperative target relative state estimation method, characterized by comprising the following steps:
Step 1: establish the kinematics and dynamics models of the relative motion between the target and the servicing spacecraft;
The following coordinate systems are defined:
(1) Geocentric inertial frame I: the origin is at the Earth's center of mass; the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed set;
(2) LVLH frame H: the origin is at the servicing spacecraft's center of mass; the x-axis points from the Earth's center toward the spacecraft's center of mass, the z-axis is perpendicular to the orbital plane and aligned with the orbital angular momentum, and the y-axis completes the right-handed set;
(3) Servicing spacecraft body frame A: the origin is fixed at the servicing spacecraft's center of mass, and the axes coincide with the spacecraft's principal axes of inertia;
(4) Target body frame B: the origin is fixed at the target's center of mass, and the axes coincide with the target's principal axes of inertia; the minimum principal axis of inertia is the x-axis and the maximum principal axis of inertia the y-axis;
(5) Camera frame C: the origin is fixed at the camera's optical center; its position vector in the servicing spacecraft body frame is d, a known constant; the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane;
The relative attitude of the target body frame B with respect to the servicing spacecraft body frame A is represented by the quaternion q = [qv^T q4]^T, where qv = [q1 q2 q3]^T is the vector part and q4 is the scalar part; the rotation matrix corresponding to q is:
A(q) = (q4^2 − qv^T qv) I3 + 2 qv qv^T − 2 q4 [qv×] (1)
where [qv×] is the skew-symmetric matrix of the vector qv = [q1 q2 q3]^T:
[qv×] = [0 −q3 q2; q3 0 −q1; −q2 q1 0] (2)
and the multiplication b⊗q between quaternions b and q is defined by equation (3);
Let ωs and ωt denote the angular velocities of the servicing spacecraft body frame A and the target body frame B, each expressed in its own body frame; the angular velocity ω of the target relative to the servicing spacecraft, expressed in the target body frame B, is:
ω = ωt − A(q) ωs (4)
The relative attitude kinematics equation is given by equation (5); the servicing spacecraft angular velocity ωs is measured by the spacecraft's inertial devices and used in the model equations as a known quantity;
The attitude dynamics equation of the target is:
Jt ω̇t = −ωt × (Jt ωt) + τt (6)
where ωt denotes the target angular velocity, τt is a torque noise term, and Jt is the target inertia matrix, Jt = diag(Jxx, Jyy, Jzz), with Jxx, Jyy, Jzz the moments of inertia about the target's principal x-, y-, and z-axes; introducing the inertia ratio parameters k1 and k2, equation (6) is converted to the parameterized form of equation (7);
Since the target's inertia ratios are constant, they must satisfy:
k̇1 = 0, k̇2 = 0 (8)
where k̇1 denotes the rate of change of k1 with time and k̇2 the rate of change of k2;
Define ρ0 as the position vector of the target's center of mass relative to the servicing spacecraft's center of mass, expressed in the LVLH frame as [x y z]^T; considering that ρ0 is very small when the servicing spacecraft closely approaches the target, the relative position dynamics equation is:
ẍ = 2θ̇c ẏ − 2(ṙc/rc)θ̇c y + θ̇c^2 x + 2(rc/pc)θ̇c^2 x
ÿ = −2θ̇c ẋ + 2(ṙc/rc)θ̇c x + θ̇c^2 y − (rc/pc)θ̇c^2 y
z̈ = −(rc/pc)θ̇c^2 z (9)
where θ̇c is the true anomaly rate of the servicing spacecraft orbit, rc is the servicing spacecraft orbit radius, pc is the semi-latus rectum, ẍ, ÿ, z̈ denote the second derivatives of x, y, z, and ṙc denotes the first derivative of rc; the orbit parameters θ̇c, rc, pc are obtained from the servicing spacecraft's orbit determination system and used in the model equations as known quantities;
Define p1, …, pN as the position vectors of the N feature points on the target relative to the target's center of mass, expressed in the target body frame; by the rigid-body assumption for the target:
ṗi = 0, i = 1, …, N (10)
i.e., the stacked feature-point vector [p1^T, …, pN^T]^T is constant in time;
Finally, the state to be estimated X is chosen to comprise the relative attitude q, the relative angular velocity ω, the inertia ratios k1 and k2, the relative position ρ0 and its rate, and the feature point positions p1, …, pN; the relative motion dynamics model is then written as:
Ẋ = f(X) + w(t) (11)
where f(X) is assembled from equations (5), (7), (8), (9), and (10), and w(t) is the system noise term;
Step 2: establish the observation equation, into which the constraint relations among the target feature points are added;
Define ρi as the position vector of feature point i on the target in the camera frame:
ρi = R_AC (R_HA ρ0 + A(q)^T pi − d) (12)
where R_AC denotes the rotation matrix from the servicing spacecraft body frame to the camera frame and is a known quantity; R_HA denotes the rotation matrix from the LVLH frame to the servicing spacecraft body frame and is obtained from the spacecraft's own measuring devices; and A(q) is related to the state q to be estimated and is given by equation (1);
The image projection coordinates (ui, vi) of feature point pi and ρi = [xi yi zi]^T satisfy:
ui = fx xi/zi + cx, vi = fy yi/zi + cy (13)
where fx, fy, cx, cy are the intrinsic parameters of the camera;
Furthermore, considering feature points i, j, k, define the relative feature ρij as:
ρij = ρi − ρj (14)
ρjk and ρki are defined similarly; the projection of ρij in the image is yij = [uij vij]^T, where uij = ui − uj and vij = vi − vj, and the relation between yij and ρij is given by equation (15);
Since the vector sum satisfies ρij + ρjk + ρki = 0, combining equation (15), feature points i, j, k must satisfy the constraint of equation (16), denoted M(i, j, k) = yi,j,k;
The N feature points on the target are numbered 1, 2, …, N, and constraint equations formed from groups of three points are added to the observation equation; let s1, …, sm be a selection of point triplets that covers all feature points to be estimated, namely s1 = (1, 2, 3), s2 = (2, 3, 4), …, sN−1 = (N−1, N, 1);
The final observation equation is:
z = h(X) + v (17)
where z is the vector of all observations and v is the observation noise; all components involving ρi and ρij are replaced using equations (12) and (14), which yields the mapping from the state to the observations;
Step 3: design the filter to realize the estimation of the target relative state;
Equation (11) describes the continuous-time system state equation; it is first converted to a discrete model, after which the estimation is carried out. The discrete model is:
X(k+1) = X(k) + f(X(k)) Δt + w(k) (18)
where X(k) is the state at time k and Δt is the filtering period;
With the system state equation and the observation equation given by equations (18) and (17) respectively, state estimation is carried out with an iterated extended Kalman filter; the estimation steps are as follows:
Step 3.1: initialization: set the initial state estimate X̂(0/0) and the error covariance matrix P(0);
Step 3.2: the state prediction is:
X̂(k+1/k) = X̂(k/k) + f(X̂(k/k)) Δt (19)
and the prediction error covariance matrix is:
P(k+1/k) = Φ(k) P(k/k) Φ(k)^T + Qk (20)
where P(k+1/k) denotes the prediction error covariance matrix and P(k/k) the filtering error covariance matrix at time k, Φ(k) = I + (∂f/∂X)Δt evaluated at X̂(k/k) is the state transition matrix, and Qk is the system noise covariance;
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:
X̂(k+1/k+1)0 = X̂(k+1/k) (21)
P(k+1/k+1)0 = P(k+1/k) (22)
Step 3.3.2: in the iterative process, the i-th iteration is:
K(k+1)i = P(k+1/k+1)i−1 H(k+1)i−1^T [H(k+1)i−1 P(k+1/k+1)i−1 H(k+1)i−1^T + Rk+1]^−1 (23)
X̂(k+1/k+1)i = X̂(k+1/k+1)i−1 + K(k+1)i r(k)i (24)
r(k)i = z(k+1) − ẑ(k+1)i−1 (25)
where H(k+1)i−1 is the Jacobian of the observation equation evaluated at the (i−1)-th iterate, Rk+1 is the measurement noise covariance, and K(k+1)i denotes the gain at the i-th iteration; r(k)i denotes the observation residual at the i-th iteration, ẑ(k+1)i−1 is obtained by substituting the (i−1)-th iteration result into observation equation (17), and z(k+1) is the vector of all observations at time k+1;
P(k+1/k+1)i = [I − K(k+1)i H(k+1)i−1] P(k+1/k+1)i−1 (26)
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum number of iterations or the state iteration error falls below the set threshold; the state iteration error is defined as:
εi = ||X̂(k+1/k+1)i − X̂(k+1/k+1)i−1||2 (27)
where || · ||2 denotes the vector 2-norm; supposing the iteration terminates at the n-th iteration, the iteration result at that point is output as the final estimate:
X̂(k+1/k+1) = X̂(k+1/k+1)n (28)
P(k+1/k+1) = P(k+1/k+1)n (29)
where X̂(k+1/k+1) is the final state estimate at time k+1.
CN201910593684.3A 2019-07-03 2019-07-03 Monocular vision-based space non-cooperative target relative state estimation method Active CN110186465B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910593684.3A CN110186465B (en) 2019-07-03 2019-07-03 Monocular vision-based space non-cooperative target relative state estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910593684.3A CN110186465B (en) 2019-07-03 2019-07-03 Monocular vision-based space non-cooperative target relative state estimation method

Publications (2)

Publication Number Publication Date
CN110186465A true CN110186465A (en) 2019-08-30
CN110186465B CN110186465B (en) 2022-08-05

Family

ID=67724752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910593684.3A Active CN110186465B (en) 2019-07-03 2019-07-03 Monocular vision-based space non-cooperative target relative state estimation method

Country Status (1)

Country Link
CN (1) CN110186465B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110823214A (en) * 2019-10-18 2020-02-21 西北工业大学 Method for estimating relative pose and inertia of space complete non-cooperative target
CN113295171A (en) * 2021-05-19 2021-08-24 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN116958263A (en) * 2023-08-09 2023-10-27 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104406598A (en) * 2014-12-11 2015-03-11 南京航空航天大学 Non-cooperative spacecraft attitude estimation method based on virtual sliding mode control
CN106548475A * 2016-11-18 2017-03-29 西北工业大学 A prediction method for the trajectory of a space non-cooperative spinning target
CN107167145A * 2017-05-25 2017-09-15 西北工业大学 An adaptive non-contact morphological parameter measurement method for inactive satellites
CN107449402A * 2017-07-31 2017-12-08 清华大学深圳研究生院 A method for measuring the relative pose of a non-cooperative target
CN107702709A * 2017-08-31 2018-02-16 西北工业大学 A time-frequency domain hybrid identification method for the motion and inertial parameters of a non-cooperative target
CN108804846A * 2018-06-20 2018-11-13 哈尔滨工业大学 A data-driven attitude controller design method for a spacecraft assembly with a non-cooperative target
CN108917772A * 2018-04-04 2018-11-30 北京空间飞行器总体设计部 A relative navigation state estimation method for non-cooperative targets based on image sequences

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104406598A (en) * 2014-12-11 2015-03-11 南京航空航天大学 Non-cooperative spacecraft attitude estimation method based on virtual sliding mode control
CN106548475A * 2016-11-18 2017-03-29 西北工业大学 A prediction method for the trajectory of a space non-cooperative spinning target
CN107167145A * 2017-05-25 2017-09-15 西北工业大学 An adaptive non-contact morphological parameter measurement method for inactive satellites
CN107449402A * 2017-07-31 2017-12-08 清华大学深圳研究生院 A method for measuring the relative pose of a non-cooperative target
CN107702709A * 2017-08-31 2018-02-16 西北工业大学 A time-frequency domain hybrid identification method for the motion and inertial parameters of a non-cooperative target
CN108917772A * 2018-04-04 2018-11-30 北京空间飞行器总体设计部 A relative navigation state estimation method for non-cooperative targets based on image sequences
CN108804846A * 2018-06-20 2018-11-13 哈尔滨工业大学 A data-driven attitude controller design method for a spacecraft assembly with a non-cooperative target

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DONG HAN et al.: "Trajectory prediction of space robot for capturing non-cooperative target", 2017 18th International Conference on Advanced Robotics (ICAR) *
宋亮 et al.: "Relative pose estimation of space debris" (对空间碎片的相对位姿估计), Journal of Astronautics (宇航学报) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110823214A (en) * 2019-10-18 2020-02-21 西北工业大学 Method for estimating relative pose and inertia of space complete non-cooperative target
CN110823214B (en) * 2019-10-18 2021-05-25 西北工业大学 Method for estimating relative pose and inertia of space complete non-cooperative target
CN113295171A (en) * 2021-05-19 2021-08-24 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN113295171B (en) * 2021-05-19 2022-08-16 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN116958263A (en) * 2023-08-09 2023-10-27 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process
CN116958263B (en) * 2023-08-09 2024-04-12 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process

Also Published As

Publication number Publication date
CN110186465B (en) 2022-08-05

Similar Documents

Publication Publication Date Title
CN104406598B Non-cooperative spacecraft attitude estimation method based on virtual sliding mode control
Broida et al. Spacecraft rendezvous guidance in cluttered environments via reinforcement learning
Crassidis et al. Survey of nonlinear attitude estimation methods
CN110186465A Monocular vision-based space non-cooperative target relative state estimation method
Aghili et al. Fault-tolerant position/attitude estimation of free-floating space objects using a laser range sensor
CN109634307A (en) A kind of compound Track In Track control method of UAV navigation
CN105353763B A finite-time control method for the relative orbit and attitude of a non-cooperative target spacecraft
CN111340868B (en) Unmanned underwater vehicle autonomous decision control method based on visual depth estimation
CN109269511B (en) Curve matching visual navigation method for planet landing in unknown environment
CN105865459A A small-celestial-body approach-phase guidance method considering view-angle constraints
CN110466805B (en) Planet landing guidance method based on optimized guidance parameters
CN114077258B (en) Unmanned ship pose control method based on reinforcement learning PPO2 algorithm
Zang et al. Standoff tracking control of underwater glider to moving target
CN110567462B (en) Identification method for three-axis rotational inertia ratio of approximate spinning non-cooperative spacecraft
CN106863297B A vision-based accurate approach method for a space tethered robot
CN115619828A (en) Space robot on-orbit capturing method based on simulated binocular vision measurement
Wang et al. Ego-motion estimation of a quadrotor based on nonlinear observer
CN108645400B (en) Inertial parameter identification method and system for space non-cooperative target relative navigation
Beutler et al. Gaussian filtering using state decomposition methods
CN112541266B (en) Small celestial body attachment convex track guidance method
Zhou et al. Energy-based trajectory tracking control of under-actuated unmanned surface vessels
CN114018250A (en) Inertial navigation method, electronic device, storage medium, and computer program product
Rathinama et al. Vision based state estimation using a graph-SLAM approach for proximity operations near an asteroid
Zuehlke Autonomous Space Surveillance for Arbitrary Domains
CN111221340A (en) Design method of migratable visual navigation based on coarse-grained features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant