CN102129665A - Image surface driving point-based imaging method of optical remote sensor

Image surface driving point-based imaging method of optical remote sensor

Info

Publication number
CN102129665A
Authority
CN
China
Prior art keywords: coordinate, image, image plane, point
Prior art date
Legal status: Granted
Application number
CN 201010558192
Other languages
Chinese (zh)
Other versions
CN102129665B (en
Inventor
张智 (Zhang Zhi)
刘兆军 (Liu Zhaojun)
周峰 (Zhou Feng)
Current Assignee: Beijing spaceflight Creative Technology Co., Ltd.
Original Assignee
Beijing Institute of Space Research Mechanical and Electricity
Application filed by Beijing Institute of Space Research Mechanical and Electricity filed Critical Beijing Institute of Space Research Mechanical and Electricity
Priority to CN 201010558192 priority Critical patent/CN102129665B/en
Publication of CN102129665A publication Critical patent/CN102129665A/en
Application granted granted Critical
Publication of CN102129665B publication Critical patent/CN102129665B/en
Legal status: Active


Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to an imaging method for an optical remote sensor based on image-plane driving points. The method comprises the steps of: establishing a geometric distortion model of the platform's optical payload by coordinate transformation, according to the physical model of the transformation from the ground coordinate system to the image-plane coordinate system; fitting a small number of image-plane driving points into an image by polynomial interpolation; and using the model to complete imaging under different attitudes and to analyze the image geometric distortion caused by those attitudes. The method thereby realizes an effective transformation between the object plane and the image plane, reduces the amount of computation while maintaining high precision, avoids the excessive computational complexity of the traditional precise geometric model, and compensates for the lack of physical meaning in rational function models.

Description

An imaging method for an optical remote sensor based on image-plane driving points
Technical field
The invention belongs to the field of optical remote sensing technology, and particularly relates to an imaging method for an optical remote sensor based on image-plane driving points.
Background art
High-resolution optical remote sensors place very harsh requirements on the platform. Existing transmission-type space remote sensors usually adopt linear-array CCD push-broom imaging. Push-broom imaging is very sensitive to the image geometric distortion caused by attitude instability: attitude variations in roll, pitch, and yaw distort the projection of the linear CCD array on the ground, and the effect is especially pronounced for high-resolution optical remote sensors. Analyzing the mapping between satellite platform motion errors and imaging requires establishing a geometric model from the ground coordinate system to the image coordinate system. Traditional methods all adopt a strict, precise geometric model that transforms every pixel one by one from the object plane to the image plane; such a model requires a considerable amount of computation.
Summary of the invention
The object of the invention is to overcome the above deficiency of the prior art and to provide an imaging method for an optical remote sensor based on image-plane driving points. The method takes a small sample of object points and, in conjunction with the interior and exterior orientation information, transforms them from the ground coordinate system to the image-plane coordinate system, obtaining a small number of image-plane driving points. These driving points are then used as interpolation nodes for polynomial fitting, so that an image is formed quickly, greatly reducing the amount of computation while maintaining high precision.
The above object of the present invention is achieved by the following technical solution:
An imaging method for an optical remote sensor based on image-plane driving points comprises the following steps:
(1) Establish the topological mapping relation from the object plane to the image plane in the Earth-imaging process of the optical remote sensor:

$$
\begin{bmatrix} x_p \\ y_p \\ z_p \\ 1 \end{bmatrix}
= \left[ M_{ground}^{orbit} \right] \left[ M_{orbit}^{platform} \right] \left[ M_{platform}^{camera} \right]
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
$$

where $(x_p, y_p, z_p)$ are coordinates in the image coordinate system, $(x_g, y_g, z_g)$ are coordinates in the ground coordinate system, and $z_p = z_g = 0$; $M_{ground}^{orbit}$ is the transition matrix from the ground coordinate system to the orbital coordinate system, $M_{orbit}^{platform}$ is the transition matrix from the orbital coordinate system to the satellite platform coordinate system, and $M_{platform}^{camera}$ is the transition matrix from the satellite platform coordinate system to the image coordinate system;
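As a sketch, the step-(1) chain of homogeneous transforms could be implemented as follows. The patent gives no code, so the function and argument names here are illustrative assumptions; the three transition matrices are multiplied in the order written in the formula above.

```python
import numpy as np

def ground_to_image(M_ground_orbit, M_orbit_platform, M_platform_camera, ground_pt):
    """Apply the step-(1) chain of 4x4 homogeneous transition matrices
    to a ground point (x_g, y_g, z_g)."""
    # Compose the three transition matrices in the order written in the patent.
    M = M_ground_orbit @ M_orbit_platform @ M_platform_camera
    # Lift the ground point to homogeneous coordinates and transform it.
    p = M @ np.array([ground_pt[0], ground_pt[1], ground_pt[2], 1.0])
    return p[:3]
```

With all three matrices set to the identity, the point maps to itself, which is a quick sanity check of the composition order.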
(2) Choose n small-sample point coordinates $(x_{pi}, y_{pi})$, $i = 1, \dots, n$, on the image plane as the image-plane driving points. The transformation between these n driving points and the n corresponding object-plane point coordinates $(x_{gi}, y_{gi})$ is:

$$
\begin{bmatrix} x_{pi} \\ y_{pi} \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\cdot
\begin{bmatrix} x_{gi} \\ y_{gi} \\ 0 \\ 1 \end{bmatrix},
\qquad i = 1, 2, \dots, n,
$$

where n is a positive integer and $n \geq 4$;
(3) Interpolate the n image-plane driving points obtained in step (2) to obtain the pixels between the driving points and form a complete image.
In the above imaging method, the way of choosing the n small-sample point coordinates $(x_{pi}, y_{pi})$ on the image plane in step (2) is: divide the image plane evenly into n equal parts and take the center point of each part as one small-sample point, obtaining n small-sample points in total.
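A minimal sketch of this driving-point selection, assuming a rectangular image plane split into an nx-by-ny grid of equal cells (the nx/ny split and the function name are illustrative assumptions, not from the patent):

```python
import numpy as np

def driving_points(width, height, nx, ny):
    """Divide a width-by-height image plane into nx*ny equal cells and
    return the centre of each cell as an image-plane driving point."""
    xs = (np.arange(nx) + 0.5) * (width / nx)   # cell-centre x coordinates
    ys = (np.arange(ny) + 0.5) * (height / ny)  # cell-centre y coordinates
    gx, gy = np.meshgrid(xs, ys)
    return np.stack([gx.ravel(), gy.ravel()], axis=1)
```

For a 100x100 image plane split 2x2 this yields the four cell centres (25, 25), (75, 25), (25, 75), (75, 75), matching the requirement n >= 4.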
In the above imaging method, the interpolation of the n image-plane driving points in step (3) is polynomial interpolation, as follows:

$$ I(x) = a_M x_{pn}^{M} + a_{M-1} x_{pn}^{M-1} + \cdots + a_2 x_{pn}^{2} + a_1 x_{pn} + a_0 $$
$$ I(y) = b_N y_{pn}^{N} + b_{N-1} y_{pn}^{N-1} + \cdots + b_2 y_{pn}^{2} + b_1 y_{pn} + b_0 $$

where M and N give the size of the image obtained after interpolation, $(x_{pn}, y_{pn})$ are the image-plane driving-point coordinates, $a_0 \sim a_M$ are the coefficients of the polynomial with $x_{pn}$ as the independent variable, and $b_0 \sim b_N$ are the coefficients of the polynomial with $y_{pn}$ as the independent variable.
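The interpolation step can be sketched as a one-dimensional polynomial fit through the driving-point coordinates, evaluated on a dense grid. This is a simplified illustration using NumPy's `polyfit`/`polyval`; reducing the patent's M, N image-size convention to a single sample count `M` per axis is my own assumption.

```python
import numpy as np

def interpolate_driving_points(xp, M):
    """Fit a polynomial through the driving-point coordinates xp
    (indexed 0..n-1) and evaluate it at M evenly spaced samples,
    yielding the pixel positions between the driving points."""
    idx = np.arange(len(xp))
    coeffs = np.polyfit(idx, xp, deg=len(xp) - 1)  # exact fit through n points
    dense = np.linspace(0, len(xp) - 1, M)
    return np.polyval(coeffs, dense)
```

Applying the same fit independently to the x and y driving-point coordinates gives both families of interpolated pixel positions.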
Compared with the prior art, the present invention has the following advantages:
(1) According to the geometric model of the transformation from the ground coordinate system to the image coordinate system, the present invention establishes a geometric distortion model of the platform's optical payload and uses polynomial interpolation to interpolate a small number of image-plane driving points into an image. The method performs an effective transformation between the object plane and the image plane, greatly reduces the amount of computation, and maintains high precision.
(2) The method completes imaging under different attitudes of the satellite platform, reflects the geometric distortion of the image caused by attitude, embodies the mapping between the geometric distortion of remote sensing images and the interior and exterior orientation elements of the satellite platform, and finally realizes the transformation from the object plane to the image plane.
(3) The imaging geometric model of the optical remote sensor considers many factors, including the interior orientation elements (detector size, focal length, etc.) and the exterior orientation elements (platform orbital altitude, attitude, the six orbital elements, etc.). Combined with the idea of polynomial interpolation, the small number of points obtained by the ground-to-image-plane coordinate transformation are interpolated into an image, which both avoids the excessive computational complexity of the traditional precise geometric model and compensates for the lack of physical meaning in rational function models.
Description of drawings
Fig. 1 is a diagram of the geometric topological relations of optical remote sensor imaging;
Fig. 2 is a schematic diagram of the correspondence among the coordinate systems in the present invention;
Fig. 3 is a flowchart of the optical remote sensor imaging method of the present invention;
Fig. 4 is an ideal image obtained by the method of the present invention;
Fig. 5 is an ideal image at 30° of yaw obtained by the method of the present invention.
Embodiment
The present invention is described in further detail below in conjunction with the drawings and specific embodiments:
Fig. 1 shows the geometric topological relations of optical remote sensor imaging. The theoretical basis of the linear optical remote sensor model is as follows: the imaging of any point P in space onto the image plane can be represented approximately by a pinhole model, that is, the projection of P onto the image plane is the point p, which is the intersection of the image plane with the line OP joining the camera optical center O and P.
In Fig. 1, (x, y, z) is the TDICCD remote sensor coordinate system, O is its origin, the blue arrow is the satellite flight direction, $\Psi_y$ is the side-sway angle, and $\Psi_x$ is the pitch angle. $\Delta p'$ is the ground offset error caused by attitude error.
The linear optical remote sensor imaging model writes the imaging geometry between image point and object point in the form of a perspective projection matrix:

$$
P = M(p' + \Delta p') \;\Rightarrow\;
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= M \begin{bmatrix} x_g + X_{\Delta p'} \\ y_g + Y_{\Delta p'} \\ z_g + Z_{\Delta p'} \\ 1 \end{bmatrix}
\tag{1}
$$

where $(x_p, y_p)$ are the coordinates of the point in the image pixel coordinate system, $(x_g, y_g, z_g)$ are the coordinates of the spatial point in the ground coordinate system, M is the perspective projection matrix, $Z_c$ is the coordinate of p in the camera coordinate system, and $z_p = z_g = 0$.
The imaging geometric model of the optical remote sensor must consider many factors, including detector size, focal length, platform orbital altitude, attitude, and the six orbital elements. In orbit, GPS is generally used to determine the current position of the satellite, and the star sensor together with gyroscopes determines a time series of attitude information.
The object-plane/image-plane transformation in the imaging process of a remote sensing satellite is as follows:

$$
\begin{bmatrix} x_p \\ y_p \\ z_p \\ 1 \end{bmatrix}
= \left[ M_{ground}^{orbit} \right] \left[ M_{orbit}^{platform} \right] \left[ M_{platform}^{camera} \right]
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
\tag{2}
$$

where $(x_p, y_p)$ are coordinates in the image coordinate system, $(x_g, y_g)$ are coordinates in the ground coordinate system, and $z_p = z_g = 0$; $M_{ground}^{orbit}$ is the transition matrix from the ground coordinate system to the orbital coordinate system, $M_{orbit}^{platform}$ is the transition matrix from the orbital coordinate system to the satellite platform coordinate system, and $M_{platform}^{camera}$ is the transition matrix from the satellite platform coordinate system to the image coordinate system. Fig. 2 is a schematic diagram of the correspondence among the coordinate systems in the present invention; it shows the complex mapping relations between the coordinate systems in the transformation from object to image. The modeling method of the present invention fully considers the whole physical process, computes accurately, has small error, and is a geometric modeling method close to the real situation.
The concrete form of $\left[ M_{ground}^{orbit} \right] \left[ M_{orbit}^{platform} \right] \left[ M_{platform}^{camera} \right]$ is as follows:

$$
\begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
=
\begin{bmatrix}
-\frac{f_o}{r_s} & 0 & 0 & 0 \\
0 & -\frac{f_o}{r_s} & 0 & 0 \\
0 & 0 & -\frac{f_o}{r_s} & -f_o \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos\psi_y & \sin\psi_y & 0 \\
0 & -\sin\psi_y & \cos\psi_y & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos\psi_x & 0 & -\sin\psi_x & 0 \\
0 & 1 & 0 & 0 \\
\sin\psi_x & 0 & \cos\psi_x & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos(\psi + \dot\psi t) & \sin(\psi + \dot\psi t) & 0 & 0 \\
-\sin(\psi + \dot\psi t) & \cos(\psi + \dot\psi t) & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
$$
\cdot
\begin{bmatrix}
\cos(\Phi + \dot\Phi t) & 0 & -\sin(\Phi + \dot\Phi t) & 0 \\
0 & 1 & 0 & 0 \\
\sin(\Phi + \dot\Phi t) & 0 & \cos(\Phi + \dot\Phi t) & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos(\theta + \dot\theta t) & \sin(\theta + \dot\theta t) & 0 \\
0 & -\sin(\theta + \dot\theta t) & \cos(\theta + \dot\theta t) & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos(\gamma_0 + \omega t) & 0 & \sin(\gamma_0 + \omega t) & 0 \\
0 & 1 & 0 & 0 \\
-\sin(\gamma_0 + \omega t) & 0 & \cos(\gamma_0 + \omega t) & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
$$
\cdot
\begin{bmatrix}
\cos(\pi - i_0) & \sin(\pi - i_0) & 0 & 0 \\
-\sin(\pi - i_0) & \cos(\pi - i_0) & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos\omega_e t & 0 & \sin\omega_e t & 0 \\
0 & 1 & 0 & 0 \\
-\sin\omega_e t & 0 & \cos\omega_e t & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos(\pi - i_0) & -\sin(\pi - i_0) & 0 & 0 \\
\sin(\pi - i_0) & \cos(\pi - i_0) & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\tag{3}
$$
$$
\cdot
\begin{bmatrix}
\cos\gamma_0 & 0 & -\sin\gamma_0 & 0 \\
0 & 1 & 0 & 0 \\
\sin\gamma_0 & 0 & \cos\gamma_0 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & R_e + h \\
0 & 0 & 0 & 1
\end{bmatrix}
$$

where $R_e$ is the Earth radius, h is the height of the ground target, H is the orbital altitude of the satellite, $r_s$ is the slant range (the distance from the satellite to the illuminated target), $\gamma_0$ is the geocentric angle from the satellite position to the ascending node, $i_0$ is the orbital inclination, $\omega_e$ is the Earth's rotational angular velocity ($7.2722 \times 10^{-5}$ rad/s), $\omega$ is the angular velocity of the satellite about the Earth's center in the orbital plane, t is time, $\theta$ is the roll angle and $\dot\theta$ the roll rate, $\Phi$ is the pitch angle and $\dot\Phi$ the pitch rate, $\Psi$ is the yaw angle and $\dot\Psi$ the yaw rate, $f_o$ is the focal length of the remote sensor, $\Psi_x$ is the forward-view angle of the sensor, and $\Psi_y$ is its side-view angle.
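The attitude-dependent factors of equation (3) can be sketched as elementary rotation matrices evaluated at the linearised angles (angle + rate · t). The sign conventions below follow the matrices in the equation; the function names are illustrative assumptions, since the patent gives no code.

```python
import numpy as np

def rot_z(a):
    # 4x4 homogeneous rotation about z, sign convention as in equation (3)
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c,  s, 0, 0],
                     [-s,  c, 0, 0],
                     [ 0,  0, 1, 0],
                     [ 0,  0, 0, 1.0]])

def rot_y(a):
    # 4x4 homogeneous rotation about y
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c, 0, -s, 0],
                     [ 0, 1,  0, 0],
                     [ s, 0,  c, 0],
                     [ 0, 0,  0, 1.0]])

def rot_x(a):
    # 4x4 homogeneous rotation about x
    c, s = np.cos(a), np.sin(a)
    return np.array([[1,  0, 0, 0],
                     [0,  c, s, 0],
                     [0, -s, c, 0],
                     [0,  0, 0, 1.0]])

def attitude_factor(psi, psi_rate, phi, phi_rate, theta, theta_rate, t):
    """Yaw, pitch and roll factors of equation (3), each evaluated at
    the linearised attitude angle (angle + rate * t)."""
    return (rot_z(psi + psi_rate * t)
            @ rot_y(phi + phi_rate * t)
            @ rot_x(theta + theta_rate * t))
```

With all angles and rates zero the product reduces to the identity, as expected for an undisturbed attitude.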
Fig. 3 shows the flowchart of the optical remote sensor imaging method of the present invention.
The coordinates of any point in the ground coordinate system and its coordinates in the image coordinate system can be put into one-to-one correspondence by coordinate transformation. Using this correspondence, a small sample of points, the image-plane driving points, is taken on the image plane, and the desired distorted image is then obtained by interpolating between these points. In the transformation between coordinate systems, only the two-dimensional situation is analyzed; the coordinate along the optical axis is not considered, i.e. $z_p = 0$. From formula (2), the mathematical expression of the coordinate transformation can be written as:

$$
\begin{bmatrix} x_p \\ y_p \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\cdot
\begin{bmatrix} x_g \\ y_g \\ 0 \\ 1 \end{bmatrix}
\tag{4}
$$
Choose n small-sample point coordinates $(x_{pi}, y_{pi})$, $i = 1, \dots, n$, on the image plane as the image-plane driving points. The transformation between these n driving points and the n corresponding object-plane point coordinates $(x_{gi}, y_{gi})$ is:

$$
\begin{bmatrix} x_{pi} \\ y_{pi} \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\cdot
\begin{bmatrix} x_{gi} \\ y_{gi} \\ 0 \\ 1 \end{bmatrix},
\qquad i = 1, 2, \dots, n,
\tag{5}
$$

where n is a positive integer and $n \geq 4$.
Using a small number of driving points greatly reduces the amount of computation in the transformation from the ground coordinates to the image plane. A suitable number of driving points reduces the computational load while guaranteeing the fitting accuracy of the geometrically distorted image.
The n image-plane driving points are then interpolated to obtain the pixels between the driving points and form a complete image, which can guide parameter design and parameter prediction for the satellite platform:
$$ I(x) = a_M x_{pn}^{M} + a_{M-1} x_{pn}^{M-1} + \cdots + a_2 x_{pn}^{2} + a_1 x_{pn} + a_0 $$
$$ I(y) = b_N y_{pn}^{N} + b_{N-1} y_{pn}^{N-1} + \cdots + b_2 y_{pn}^{2} + b_1 y_{pn} + b_0 \tag{6} $$

where M and N give the size of the image obtained after interpolation, $(x_{pn}, y_{pn})$ are the image-plane driving-point coordinates, $a_0 \sim a_M$ are the coefficients of the polynomial with $x_{pn}$ as the independent variable, and $b_0 \sim b_N$ are the coefficients of the polynomial with $y_{pn}$ as the independent variable.
When N = 1, the equations become:

$$ I(x) = a_1 x + a_0, \qquad I(y) = b_1 y + b_0 \tag{7} $$

The four driving points $(x_{p1}, y_{p1})$, $(x_{p2}, y_{p2})$, $(x_{p3}, y_{p3})$, $(x_{p4}, y_{p4})$ can be used to calculate the polynomial coefficients, and then all the pixels of the image plane can be obtained in turn by polynomial extrapolation.
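In the linear case the two coefficients are overdetermined by the four driving points, so one natural reading is a least-squares solve. The following sketch assumes the points are sampled at indices 0..3; that sampling convention and the function name are my own illustration.

```python
import numpy as np

def linear_fit(p):
    """Recover the coefficients of I(x) = a1*x + a0 from four
    driving-point coordinates p sampled at indices 0..3, by solving
    the overdetermined system [i, 1] @ [a1, a0] = p_i in the
    least-squares sense."""
    A = np.stack([np.arange(4), np.ones(4)], axis=1)  # design matrix [[i, 1]]
    (a1, a0), *_ = np.linalg.lstsq(A, np.asarray(p, dtype=float), rcond=None)
    return a1, a0
```

For driving points lying exactly on a line, e.g. p = [1, 3, 5, 7], the least-squares solution reproduces the line exactly (a1 = 2, a0 = 1).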
Fig. 4 shows the ideal image obtained by the method of the present invention; Fig. 5 shows the ideal image at 30° of yaw obtained by the method. As can be seen from the figures, the imaging method of the present invention reflects the image geometric distortion caused by changes of the platform attitude.
The above is only the best embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement that can readily occur to those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention.
Content not described in detail in the specification of the present invention belongs to the known techniques of those skilled in the art.

Claims (3)

1. An imaging method for an optical remote sensor based on image-plane driving points, characterized by comprising the following steps:
(1) establish the topological mapping relation from the object plane to the image plane in the imaging process of the optical remote sensor:

$$
\begin{bmatrix} x_p \\ y_p \\ z_p \\ 1 \end{bmatrix}
= \left[ M_{ground}^{orbit} \right] \left[ M_{orbit}^{platform} \right] \left[ M_{platform}^{camera} \right]
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\begin{bmatrix} x_g \\ y_g \\ z_g \\ 1 \end{bmatrix}
$$

where $(x_p, y_p, z_p)$ are coordinates in the image coordinate system, $(x_g, y_g, z_g)$ are coordinates in the ground coordinate system, and $z_p = z_g = 0$; $M_{ground}^{orbit}$ is the transition matrix from the ground coordinate system to the orbital coordinate system, $M_{orbit}^{platform}$ is the transition matrix from the orbital coordinate system to the satellite platform coordinate system, and $M_{platform}^{camera}$ is the transition matrix from the satellite platform coordinate system to the image coordinate system;
(2) choose n small-sample point coordinates $(x_{pi}, y_{pi})$, $i = 1, \dots, n$, on the image plane as the image-plane driving points, the transformation between these n driving points and the n corresponding object-plane point coordinates $(x_{gi}, y_{gi})$ being:

$$
\begin{bmatrix} x_{pi} \\ y_{pi} \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34} \\
m_{41} & m_{42} & m_{43} & m_{44}
\end{bmatrix}
\cdot
\begin{bmatrix} x_{gi} \\ y_{gi} \\ 0 \\ 1 \end{bmatrix},
\qquad i = 1, 2, \dots, n,
$$

where n is a positive integer and $n \geq 4$;
(3) interpolate the n image-plane driving points obtained in step (2) to obtain the pixels between the driving points and form a complete image.
2. The imaging method of an optical remote sensor based on image-plane driving points according to claim 1, characterized in that the way of choosing the n small-sample point coordinates $(x_{pi}, y_{pi})$ on the image plane in step (2) is: divide the image plane evenly into n equal parts and take the center point of each part as one small-sample point, obtaining n small-sample points in total.
3. The imaging method of an optical remote sensor based on image-plane driving points according to claim 1, characterized in that the interpolation of the n image-plane driving points in step (3) is polynomial interpolation, as follows:

$$ I(x) = a_M x_{pn}^{M} + a_{M-1} x_{pn}^{M-1} + \cdots + a_2 x_{pn}^{2} + a_1 x_{pn} + a_0 $$
$$ I(y) = b_N y_{pn}^{N} + b_{N-1} y_{pn}^{N-1} + \cdots + b_2 y_{pn}^{2} + b_1 y_{pn} + b_0 $$

where M and N give the size of the image obtained after interpolation, $(x_{pn}, y_{pn})$ are the image-plane driving-point coordinates, $a_0 \sim a_M$ are the coefficients of the polynomial with $x_{pn}$ as the independent variable, and $b_0 \sim b_N$ are the coefficients of the polynomial with $y_{pn}$ as the independent variable.
CN 201010558192 2010-11-22 2010-11-22 Image surface driving point-based imaging method of optical remote sensor Active CN102129665B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010558192 CN102129665B (en) 2010-11-22 2010-11-22 Image surface driving point-based imaging method of optical remote sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010558192 CN102129665B (en) 2010-11-22 2010-11-22 Image surface driving point-based imaging method of optical remote sensor

Publications (2)

Publication Number Publication Date
CN102129665A true CN102129665A (en) 2011-07-20
CN102129665B CN102129665B (en) 2012-11-14

Family

ID=44267740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010558192 Active CN102129665B (en) 2010-11-22 2010-11-22 Image surface driving point-based imaging method of optical remote sensor

Country Status (1)

Country Link
CN (1) CN102129665B (en)


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Xinmin, "*** geometric correction of remote sensing satellite CCD camera data", 《环境遥感》 (Remote Sensing of Environment), Vol. 9, No. 2, 31 May 1994, pp. 145-149; cited for claims 1-3. *
Luan Qingzu, Liu Huiping, Xiao Zhiqiang, "Comparison of orthorectification methods for remote sensing images", 《遥感技术与应用》 (Remote Sensing Technology and Application), Vol. 22, No. 6, December 2007, pp. 743-747; cited for claims 1-3. *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103674244A (en) * 2013-07-05 2014-03-26 北京师范大学 Rapid deconvolution two-dimensional fiber spectrum extraction method based on GPU
CN103674244B (en) * 2013-07-05 2015-10-21 北京师范大学 A kind of rapid deconvolution two-dimensional fiber spectrum extraction method based on GPU
CN112097798A (en) * 2020-11-12 2020-12-18 北京道达天际科技有限公司 High-precision calibration method and device for high resolution camera of high resolution six-satellite
CN112097798B (en) * 2020-11-12 2021-03-26 北京道达天际科技有限公司 High-precision calibration method and device for high resolution camera of high resolution six-satellite

Also Published As

Publication number Publication date
CN102129665B (en) 2012-11-14

Similar Documents

Publication Publication Date Title
CN103983254B (en) The motor-driven middle formation method of a kind of novel quick satellite
CN106124170B (en) A kind of camera optical axis direction computational methods based on high-precision attitude information
CN105627991B (en) A kind of unmanned plane image real time panoramic joining method and system
CN104848860B (en) A kind of agile satellite imagery process attitude maneuver planing method
CN105091906B (en) High-resolution optical, which pushes away, sweeps the weight imaging sensor bearing calibration of satellite stable state and system
CN103345737B (en) A kind of UAV high resolution image geometric correction method based on error compensation
CN102519433B (en) Method for inverting geometric calibrating parameter of satellite-borne linear array sensor by using RPC (Remote Position Control)
CN102508260B (en) Geometric imaging construction method for side-looking medium resolution ratio satellite
US9781378B2 (en) Coordinating image sensing with motion
CN101114022A (en) Navigation multiple spectrum scanner geometric approximate correction method under non gesture information condition
CN106125745B (en) A kind of satellite attitude control method to Spatial Cooperation target following imaging
CN102243074A (en) Method for simulating geometric distortion of aerial remote sensing image based on ray tracing technology
CN110030978B (en) Method and system for constructing geometric imaging model of full-link optical satellite
CN102426025B (en) Simulation analysis method for drift correction angle during remote sensing satellite attitude maneuver
CN103310487B (en) A kind of universal imaging geometric model based on time variable generates method
CN103914808A (en) Method for splicing ZY3 satellite three-line-scanner image and multispectral image
CN105004354A (en) Unmanned aerial vehicle visible light and infrared image target positioning method under large squint angle
CN111896009B (en) Method and system for correcting imaging sight line offset caused by satellite flight motion
CN103778610B (en) A kind of spaceborne line array sensor hangs down the geometry preprocess method of rail sweeping image
CN111538051B (en) Precise processing method for swing-scanning large-width optical satellite
CN111247389A (en) Data processing method and device for shooting equipment and image processing equipment
CN104144304A (en) High-resolution camera different-field-of-view integral time determination method
CN102129665B (en) Image surface driving point-based imaging method of optical remote sensor
CN105004321A (en) Unmanned plane GPS-supported bundle djustment method in consideration of non-synchronous exposal
CN103278140B (en) Coordinate back calculation method for TDICCD (time delay and integration charge coupled devices) linear array push-sweep sensor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20180703

Address after: 100076 Beijing Fengtai District East Highland Wanyuan Dongli 99

Patentee after: Beijing spaceflight Creative Technology Co., Ltd.

Address before: 100076 Beijing South Fengtai District Road 1 Dahongmen 9201 mailbox 5 boxes

Patentee before: Beijing Research Institute of Space Mechanical & Electrical Technology