CN105547286B - A kind of compound three visual fields star sensor star map simulation method - Google Patents

A kind of compound three visual fields star sensor star map simulation method

Info

Publication number
CN105547286B
Authority
CN
China
Prior art keywords
star
coordinate system
optical
stars
observation
Prior art date
Legal status
Active
Application number
CN201610015795.2A
Other languages
Chinese (zh)
Other versions
CN105547286A (en)
Inventor
吴峰
朱锡芳
相入喜
许清泉
李辉
邹全
Current Assignee
Jiangsu Zhixing Future Automobile Research Institute Co Ltd
Original Assignee
Changzhou Institute of Technology
Priority date
Filing date
Publication date
Application filed by Changzhou Institute of Technology filed Critical Changzhou Institute of Technology
Priority to CN201610015795.2A priority Critical patent/CN105547286B/en
Publication of CN105547286A publication Critical patent/CN105547286A/en
Application granted granted Critical
Publication of CN105547286B publication Critical patent/CN105547286B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G01C21/025 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of startrackers

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention discloses a composite three-field-of-view star sensor star map simulation method, comprising the following steps. Step 1: according to the limiting magnitude of the star sensor, select qualifying observation stars and extract their star number, magnitude, right ascension and declination data, storing them as an observation star database. Step 2: determine the observation stars within the fields of view of the three optical systems of the composite three-field-of-view star sensor, and calculate each observation star's incidence angles in the X and Y directions of its subsystem coordinate system. Step 3: according to the total number of pseudo-stars, randomly generate magnitude and field-position data for the pseudo-stars. Step 4: by ray tracing, simulate the imaging of the observation stars and pseudo-stars in each field of view by the corresponding optical system, and calculate the image-plane information. Step 5: calculate the brightness of each pixel of the digital star map and output the digital star map. The method is simple to operate, does not depend on machined and assembled hardware, is low in cost, and can provide abundant simulated star map data for other research on the composite three-field-of-view star sensor.

Description

Method for simulating star map of composite three-field-of-view star sensor
Technical Field
The invention belongs to the technical field of astronomical navigation, and relates to a technology for acquiring a simulated star map by using a computer to simulate the imaging process of a composite three-field-of-view star sensor.
Background
Effective attitude control is a necessary guarantee for the smooth flight of spacecraft such as satellites, and attitude measurement is a precondition of attitude control. A star sensor takes fixed stars as its working objects: it photographs and recognizes fixed stars to measure attitude information, and is therefore one of the most accurate attitude measurement instruments currently available. At present, single-field-of-view star sensors are mainly used; such star sensors generally have a large field of view, are easily affected by stray light such as sunlight, and have low reliability. In addition, the attitude measurement accuracy of the roll angle is lower than that of the pitch and yaw angles. A spacecraft is therefore often equipped with two or more star sensors to ensure higher reliability and measurement accuracy.
A multi-field-of-view star sensor combines several optical systems into one unit in a certain spatial arrangement. Each optical system images the star field within its own field of view onto its own image sensor, and the star maps output by the optical systems are combined for star map identification, attitude calculation and attitude output. Compared with a single-field-of-view star sensor, each optical system of a multi-field-of-view star sensor has a small field of view, so a higher attitude measurement accuracy is easier to obtain.
At present, the star map simulation method for a composite three-field-of-view star sensor is to photograph star field simulators with three single-field-of-view star sensors and to output the resulting star maps. Because the optical systems of the three fields of view of a composite three-field-of-view star sensor must satisfy a certain spatial geometric relationship and their imaging should remain synchronized, this simulation method requires that the simulated star field output by each multi-star simulator be consistent with the star field observed by the corresponding field of view of the composite three-field-of-view star sensor and be transformed synchronously. The system structure of this simulation method is complex, and its operation is cumbersome.
The star map identification method for a multi-field-of-view star sensor disclosed in publication CN 103363987A is in fact a star map identification method for two fields of view; its technical scheme converts the star image coordinates of the other field of view into the image space coordinates of the first field of view and then applies the star map identification method of a double-field-of-view star sensor. That method mainly addresses real-time star map identification, places high demands on the hardware, and cannot perform star map simulation.
Disclosure of Invention
In view of the problems in the prior art, the object of the invention is to provide a method for simulating the star map of a composite three-field-of-view star sensor, so as to supply abundant star map data for research on star image extraction, navigation star selection, star map identification and other technologies of the composite three-field-of-view star sensor, and to provide technical support for testing the imaging quality of the optical lens during the design stage of the optical system of the composite three-field-of-view star sensor.
The invention provides a method for simulating a star map of a composite three-field-of-view star sensor, which comprises the following steps of:
(1) First, according to the limiting magnitude of the star sensor, select from the original star catalogue the single stars whose magnitude is not greater than the limiting magnitude, the double stars whose combined magnitude is not greater than the limiting magnitude, and the variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude; extract the star number, magnitude, right ascension, declination and related data of the selected stars, and establish an observation star database.
(2) Then determine the observation stars within the fields of view of the three optical systems of the composite three-field-of-view star sensor, and calculate the incidence angles of each observation star in the X and Y directions of its subsystem coordinate system.
(3) Then, according to the total number of pseudo-stars, generate random data to simulate the magnitude and field position of each pseudo-star.
(4) Simulate, by ray tracing, the imaging of the observation stars and pseudo-stars in each field of view by the corresponding optical system, and calculate the image-plane information. The optical surfaces of the optical system may be spherical or aspherical. Let the coordinates of any point on the j-th optical surface be (x_j, y_j, z_j), j = 1, …, η, where η is the total number of optical surfaces; the coordinate point satisfies
z_j = ρ_j² / ( R_j·(1 + √(1 − (1 + K_j)·ρ_j²/R_j²)) ) + Σ_{i=1}^{N_j} A_{j,i}·ρ_j^i
where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and ρ_j = √(x_j² + y_j²) is the distance of the coordinate point from the axis. This equation determines the shape of the optical surface and is called the surface equation of the optical surface. When K_j = 0 and all A_{j,i} = 0, the optical surface is spherical.
The specific process of this step comprises the following sub-steps:
Step 1: from the incidence angle and incidence position of the incoming ray, calculate the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface. From the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface and the surface equation of the first optical surface, obtain the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface. This step corresponds to j = 1.
Step 2: calculate the direction cosine vector of the ray leaving the j-th optical surface, i.e. the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray incident on the (j+1)-th optical surface.
Step 3: if j < η, use an iterative approximation algorithm to calculate the position (x_{j+1}, y_{j+1}, z_{j+1}) of the incidence point of the ray on the (j+1)-th optical surface, then increase j by 1 and repeat Step 2; otherwise go to Step 4.
Step 4: calculate the position (x_image, y_image, z_image) at which the ray reaches the image plane.
(5) Calculate the brightness of each pixel of the digital star map and output the digital star map.
Compared with the prior art, the invention has the following characteristics:
(1) When the relative positions of the three optical systems of the composite three-field-of-view star sensor are changed, the star field imaging simulation can be carried out simply by modifying the corresponding parameters; the operation is simple and convenient.
(2) The invention needs neither a star field simulator nor a manufactured star sensor optical system; the star map simulation is realized by a computer program, places no special requirements on hardware equipment, and is low in cost.
(3) The output star map data can reflect the imaging quality of the optical system; the invention thus provides support for predicting and avoiding design errors of the optical system of the composite three-field-of-view star sensor, and provides a reference for improving the design quality of that optical system.
Drawings
FIG. 1 is a flow chart of a star map simulation according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of star imaging in the 1st subsystem coordinate system.
FIG. 3 is a diagram of the relationship between the body coordinate system and the subsystem coordinate system.
Fig. 4 is a schematic diagram of light propagating between adjacent optical surfaces of the optical system 2.
Fig. 5 shows a schematic view of an optical system in an embodiment.
FIG. 6 is a schematic diagram of the simulated star map output by the 1st subsystem in the embodiment.
FIG. 7 is a schematic diagram of the simulated star map output by the 2nd subsystem in the embodiment.
FIG. 8 is a schematic diagram of the simulated star map output by the 3rd subsystem in the embodiment.
FIG. 9 is a schematic diagram of the star map output by the Sky chart software corresponding to the 1st subsystem in the embodiment.
FIG. 10 is a schematic diagram of the star map output by the Sky chart software corresponding to the 2nd subsystem in the embodiment.
FIG. 11 is a schematic diagram of the star map output by the Sky chart software corresponding to the 3rd subsystem in the embodiment.
Detailed Description
The invention is further described with reference to the following figures and examples.
The principle of the embodiment is as follows. To carry out imaging simulation of the star field observed by the composite three-field-of-view star sensor, an inertial coordinate system and a body coordinate system are established, denoted O_i-X_iY_iZ_i and O_b-X_bY_bZ_b respectively. In addition, a coordinate system fixed to each optical system of the composite three-field-of-view star sensor, called the subsystem coordinate system O_k-X_kY_kZ_k, is established, where k = 1, 2, 3 corresponds to the respective optical system.
First, the subsystem coordinate systems are established. Each optical system is treated as an ideal imaging system, H_k and H_k′ being its object-space and image-space principal points and f its focal length. The origin O_k of the subsystem coordinate system is taken at the image-space principal point H_k′ of the optical system. The three axes X_k, Y_k, Z_k form a right-handed coordinate system, in which the X_k and Y_k axes lie in the image-space principal plane, parallel respectively to the rows and columns of the detector focal plane, and the Z_k axis lies along the optical axis with its positive direction pointing toward the object plane, as shown in Fig. 2.
If the observation star S is ideally imaged at S′, and S′ has coordinates (x_s, y_s, f) in the subsystem coordinate system, then the direction cosine vector of the observation star S in that subsystem coordinate system is given by formula (1).
Its field angles in the X_k and Y_k directions are then given by formula (2).
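As an illustration of relations (1) and (2), the following Python sketch computes a star's direction cosine vector and its per-axis field angles from the ideal image point; the function name, the assumption that the direction is taken along (x_s, y_s, f), and the use of arctan(x_s/f) and arctan(y_s/f) as the per-axis field angles are illustrative choices, since the patent gives the formulas only symbolically here.

```python
import numpy as np

def field_angles_from_image_point(x_s, y_s, f):
    """Direction cosines and field angles (XFLD, YFLD) of a star whose ideal image is at (x_s, y_s, f).

    Sketch only: the direction is assumed proportional to (x_s, y_s, f), and the per-axis
    field angle is taken as the arctangent of the transverse coordinate over the focal length.
    """
    v = np.array([x_s, y_s, f], dtype=float)
    v /= np.linalg.norm(v)                    # direction cosine vector in O_k-X_kY_kZ_k
    xfld = np.degrees(np.arctan2(x_s, f))     # field angle in the X_k direction
    yfld = np.degrees(np.arctan2(y_s, f))     # field angle in the Y_k direction
    return v, xfld, yfld

# Example: an image point 2 mm off-axis in X with f = 43.89 mm lies about 2.6 deg off-axis.
print(field_angles_from_image_point(2.0, 0.0, 43.89))
```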
Next, the body coordinate system is established. To facilitate manufacturing and assembly of the star sensor, the optical axes of the three optical systems are arranged symmetrically, with equal included angles between them. For this purpose, the present embodiment uses the coordinate system O_b-X_bY_bZ_b shown in Fig. 3 as the body coordinate system. Its origin O_b is taken at the intersection of the three axes Z_1, Z_2, Z_3; the Z_b axis makes the same included angle with each of Z_1, Z_2 and Z_3; and the angle between the X_1O_1Z_1 plane and the X_bO_bZ_b plane is τ. In the body coordinate system O_b-X_bY_bZ_b, the optical axes of the subsystem coordinate systems are spaced 120° apart in azimuth and have an elevation angle equal to the complement of that included angle. As can be seen from Fig. 3, this included angle (or, equivalently, the elevation angle) determines the positional relationship of the three optical systems with respect to each other.
If the orientation of the body coordinate system's Z_b axis in the inertial frame is (α_c, δ_c), and the structural parameters (the included angle between Z_b and each optical axis, and τ) have been given, then the inertial coordinate system O_i-X_iY_iZ_i can be brought into coincidence with the body coordinate system O_b-X_bY_bZ_b by three rotations. The inertial coordinate system is first rotated about the Z_i axis, from +X_i toward +Y_i, by α_c, giving an X′Y′Z′ coordinate system; the new coordinate system is rotated about the Y′ axis, from +Z′ toward +X′, by 90° − δ_c, giving an X″Y″Z″ coordinate system; this coordinate system is then rotated about the Z″ axis by θ, and the resulting coordinate system coincides with the body coordinate system. Here θ is determined by the actual orientation of the body coordinate system's X_b and Y_b axes. Similarly, the body coordinate system O_b-X_bY_bZ_b can be brought into coincidence with the subsystem coordinate system O_k-X_kY_kZ_k by two rotations. The body coordinate system is first rotated about the Z_b axis, from +X_b toward +Y_b, by τ + (k − 1)·120°, giving an X_k′Y_k′Z_k′ coordinate system; the new coordinate system is then rotated about the Y_k′ axis, from +Z_k′ toward +X_k′, by the included angle between Z_b and the optical axis, and the resulting coordinate system coincides with the subsystem coordinate system O_k-X_kY_kZ_k.
From the above relations, if a star S has coordinates (α, δ) in the inertial frame, then its direction cosine vector in the subsystem coordinate system O_k-X_kY_kZ_k is given by formula (3).
Conversely, if a vector has direction cosines {V_k1, V_k2, V_k3} in the subsystem coordinate system O_k-X_kY_kZ_k, then its components in the inertial frame are given by formula (4).
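The rotation sequence just described can be written as a product of elementary rotation matrices. The Python sketch below builds the inertial-to-body and body-to-subsystem transformations in that order; the function names, the parameter name phi_deg for the included angle between Z_b and each optical axis, and the assumption that every rotation is a passive right-handed rotation in the sense stated above are illustrative, so this is a sketch of formulas (3) and (4) rather than their exact reproduction.

```python
import numpy as np

def rot_z(a):
    """Passive rotation about Z by angle a (radians), +X turned toward +Y."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    """Passive rotation about Y by angle a (radians), +Z turned toward +X."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def inertial_to_subsystem(alpha_c, delta_c, theta, tau, phi_deg, k):
    """Matrix taking inertial-frame components into subsystem-k components (angles in degrees).

    alpha_c, delta_c : orientation of the Z_b axis in the inertial frame
    theta            : final rotation fixing the X_b, Y_b orientation
    tau, phi_deg     : structural angles of the three-field-of-view head (phi_deg is assumed
                       to denote the included angle between Z_b and each optical axis)
    """
    d = np.radians
    r_inertial_to_body = rot_z(d(theta)) @ rot_y(d(90.0 - delta_c)) @ rot_z(d(alpha_c))
    r_body_to_subsystem = rot_y(d(phi_deg)) @ rot_z(d(tau + (k - 1) * 120.0))
    return r_body_to_subsystem @ r_inertial_to_body

def star_direction_in_subsystem(alpha, delta, r_ik):
    """Direction cosines in subsystem coordinates of a star at (alpha, delta) in degrees."""
    a, dl = np.radians(alpha), np.radians(delta)
    v_inertial = np.array([np.cos(dl) * np.cos(a), np.cos(dl) * np.sin(a), np.sin(dl)])
    return r_ik @ v_inertial

# Sanity check: a star lying along Z_b maps to (0, 0, 1) in the body frame (phi_deg = 0, tau = 0, k = 1).
print(star_direction_in_subsystem(36.0, 30.0, inertial_to_subsystem(36.0, 30.0, 0.0, 0.0, 0.0, 1)))
```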
As shown in fig. 1, the process of the compound three-field-of-view star sensor star map simulation is as follows:
1. Establish the observation star database. The original star catalogue is processed according to the limiting magnitude of the star sensor: single stars whose magnitude is not greater than the limiting magnitude, double stars whose combined magnitude is not greater than the limiting magnitude, and variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude are selected; the star number, magnitude, right ascension and declination of each selected star are extracted and stored in the observation star database. In the database, the observation star records are arranged in descending order of declination.
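A minimal sketch of this database construction step, assuming the raw catalogue is already available in memory as records carrying star number, magnitude, right ascension and declination; the record layout, the field names and the flux-addition rule used for the combined magnitude of a double star are illustrative assumptions consistent with the description above.

```python
import math
from dataclasses import dataclass

@dataclass
class Star:
    star_id: int
    magnitude: float   # single star: catalogue magnitude; double star: combined; variable star: at maximum brightness
    ra_deg: float      # right ascension
    dec_deg: float     # declination

def combined_magnitude(m1, m2):
    """Combined magnitude of a double star from its two components (standard flux addition)."""
    return -2.5 * math.log10(10.0 ** (-0.4 * m1) + 10.0 ** (-0.4 * m2))

def build_observation_database(raw_stars, limiting_magnitude):
    """Keep stars not fainter than the limiting magnitude, sorted in descending order of declination."""
    selected = [s for s in raw_stars if s.magnitude <= limiting_magnitude]
    return sorted(selected, key=lambda s: s.dec_deg, reverse=True)

# Toy catalogue with the embodiment's limiting magnitude of 5.2.
catalogue = [Star(1, 2.1, 36.0, 30.0), Star(2, 6.3, 10.0, -5.0),
             Star(3, combined_magnitude(5.0, 5.5), 90.0, 41.0)]
print(build_observation_database(catalogue, 5.2))
```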
2. Given the direction (α_c, δ_c) of the Z_b axis of the body coordinate system and the angles characterizing the structure of the three-field-of-view system (the included angle between Z_b and each optical axis, and τ), determine the observation stars in the fields of view of the three optical systems of the composite three-field-of-view star sensor, and calculate the incidence angles XFLD and YFLD of each observation star in the X and Y directions of the field of view of the corresponding optical system.
First, the direction vector of the Z_k axis of the subsystem coordinate system O_k-X_kY_kZ_k in its own coordinate system is {V_k1, V_k2, V_k3} = {0, 0, 1}; its orientation {V_1, V_2, V_3} in the inertial frame is calculated according to formula (4), and the corresponding right ascension and declination (α_zk, δ_zk) are given by formula (5).
Only stars whose coordinates (α, δ) satisfy
|δ − δ_zk| ≤ w_m    (6)
can appear in the field of view of the k-th optical system, where w_m is the field angle corresponding to the diagonal of the image-plane detector of the optical system.
The position of the selected observation star is then transformed from the inertial frame to the subsystem frame according to equation (3).
Finally, the incidence angles of the observation stars in the subsystem coordinate system O_k-X_kY_kZ_k are calculated according to formula (2). If the maximum field angles of the optical system in the X_k and Y_k directions are w_A and w_B, only observation stars whose incidence angles satisfy
|XFLD| ≤ w_A and |YFLD| ≤ w_B    (7)
can be observed by the k-th optical system. In this way the observation stars in the field of view of each optical system are determined, and their incidence angles XFLD and YFLD are obtained at the same time.
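The selection just described amounts to a two-stage filter: a cheap declination window from formula (6), followed by the exact per-axis field-angle test of formula (7). The sketch below assumes the rotation matrix comes from a routine such as the inertial_to_subsystem sketch above and that the per-axis field angles are arctan(v_x/v_z) and arctan(v_y/v_z); both are illustrative assumptions.

```python
import numpy as np

def stars_in_field(stars, r_ik, dec_zk_deg, w_m, w_a, w_b):
    """Observation stars visible to the k-th optical system.

    stars      : iterable of (star_id, magnitude, ra_deg, dec_deg)
    r_ik       : 3x3 rotation taking inertial components to subsystem-k components
    dec_zk_deg : declination of the Z_k axis
    w_m        : field angle corresponding to the detector diagonal (coarse window, formula (6))
    w_a, w_b   : maximum field angles in the X_k and Y_k directions (formula (7))
    """
    visible = []
    for star_id, mag, ra, dec in stars:
        if abs(dec - dec_zk_deg) > w_m:            # coarse declination pre-selection
            continue
        a, d = np.radians(ra), np.radians(dec)
        v = r_ik @ np.array([np.cos(d) * np.cos(a), np.cos(d) * np.sin(a), np.sin(d)])
        xfld = np.degrees(np.arctan2(v[0], v[2]))  # assumed per-axis field-angle convention
        yfld = np.degrees(np.arctan2(v[1], v[2]))
        if abs(xfld) <= w_a and abs(yfld) <= w_b:  # exact field-of-view test
            visible.append((star_id, mag, xfld, yfld))
    return visible

# Toy example: identity rotation, so Z_k points at the celestial pole (dec_zk = 90 deg).
demo = [(1, 3.0, 10.0, 88.0), (2, 4.5, 200.0, 70.0)]
print(stars_in_field(demo, np.eye(3), dec_zk_deg=90.0, w_m=7.1, w_a=5.0, w_b=5.0))
```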
3. Randomly generate the magnitude and field-position data of the pseudo-stars according to the total number of pseudo-stars. When a pseudo-star appears in the field of view of the k-th optical system, its field angles in the X_k and Y_k directions are given by formula (8), where r and χ are random numbers in the interval [0, 1].
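A minimal sketch of the pseudo-star generation, written under two stated assumptions: since formula (8) is given only symbolically here, the random numbers r and χ are assumed to be mapped linearly onto the full field as XFLD = (2r − 1)·w_A and YFLD = (2χ − 1)·w_B, and the pseudo-star magnitude is assumed to be drawn uniformly up to the limiting magnitude.

```python
import random

def generate_pseudo_stars(n_pseudo, w_a, w_b, limiting_magnitude, seed=None):
    """Random magnitude and field position for each pseudo-star (illustrative mapping)."""
    rng = random.Random(seed)
    pseudo = []
    for _ in range(n_pseudo):
        r, chi = rng.random(), rng.random()               # random numbers in [0, 1]
        xfld = (2.0 * r - 1.0) * w_a                      # assumed form of formula (8)
        yfld = (2.0 * chi - 1.0) * w_b
        magnitude = rng.uniform(2.0, limiting_magnitude)  # assumed brightness model
        pseudo.append((magnitude, xfld, yfld))
    return pseudo

print(generate_pseudo_stars(3, w_a=5.0, w_b=5.0, limiting_magnitude=5.2, seed=1))
```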
4. Simulate, by ray tracing, the imaging of the observation stars and pseudo-stars in each field of view by the corresponding optical system, and calculate the image-plane information.
(1) Select the rays to be traced. Each ray represents a portion of the energy, and the rays should be evenly distributed. For the k-th optical system, the entrance pupil is divided by a square grid, and the rays emitted by each observation star in its field of view that pass through the centre of the entrance pupil and through the grid points inside the entrance pupil are selected for ray tracing and imaging simulation.
Taking the brightness of the observation star and the spectral response characteristics of the detector into account, each ray is given weights W_m and W_w, where W_m is proportional to the star's brightness and W_w is proportional to the detector's spectral response. Suppose each observation star emits a total of n_ray rays filling the entrance pupil; the initial energy of each ray is then W_m·W_w/n_ray.
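The per-ray energy bookkeeping can be sketched as follows; the text only requires W_m to be proportional to the star's brightness and W_w to the detector's spectral response, so the use of the standard magnitude scale 2.512^(−m) for W_m and of a sampled relative spectral response for W_w are illustrative assumptions.

```python
def ray_initial_energy(magnitude, spectral_response, n_ray):
    """Initial energy of one of the n_ray rays launched for an observation star.

    magnitude         : star magnitude (W_m taken as 2.512 ** -magnitude, an assumption)
    spectral_response : relative detector response at the sampled wavelength, 0..1 (W_w)
    n_ray             : total number of rays filling the entrance pupil for this star
    """
    w_m = 2.512 ** (-magnitude)   # brightness weight
    w_w = spectral_response       # spectral weight
    return w_m * w_w / n_ray

# Example: a magnitude 3 star, response 0.8 at the sampled wavelength, 441 pupil rays (21 x 21 grid).
print(ray_initial_energy(3.0, 0.8, 441))
```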
(2) Perform ray tracing on each optical surface and calculate the position and energy of the ray when it reaches the image plane. In this embodiment a coordinate system is established for each optical surface: for the j-th optical surface, the origin O_j of the coordinate system O_j-X_jY_jZ_j is at the vertex of the optical surface, and the Z_j axis points along the optical axis toward the image plane. As shown in Fig. 4, the distance from the vertex of the j-th optical surface to the vertex of the (j+1)-th optical surface is d_j. Let the coordinates of any point on the j-th optical surface be (x_j, y_j, z_j), j = 1, …, η, where η is the total number of optical surfaces; the coordinate point satisfies
z_j = ρ_j² / ( R_j·(1 + √(1 − (1 + K_j)·ρ_j²/R_j²)) ) + Σ_{i=1}^{N_j} A_{j,i}·ρ_j^i    (9)
where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and ρ_j = √(x_j² + y_j²) is the distance of the coordinate point from the axis. When K_j = 0 and all A_{j,i} = 0, the optical surface is spherical. The refractive indices in front of and behind the optical surface, along the direction of ray propagation, are n_j and n_{j+1} respectively, and the axial distance from this surface to the (j+1)-th optical surface is d_j.
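A sketch of the surface equation (9) in code, using the standard even-asphere sag form that is consistent with the quantities defined above; the function name, the representation z = sag(x, y) and the dictionary of aspheric coefficients are illustrative. The example uses the 1st optical surface of the embodiment described later.

```python
import math

def surface_sag(x, y, radius, conic=0.0, aspheric=None):
    """Sag z of an optical surface at (x, y), with the vertex at the origin.

    radius   : vertex radius of curvature R_j (math.inf for a plane)
    conic    : conic coefficient K_j
    aspheric : dict {order i: coefficient A_{j,i}} of polynomial terms A_{j,i} * rho**i
    The standard even-asphere form is assumed for formula (9).
    """
    rho2 = x * x + y * y
    if math.isinf(radius):
        z = 0.0
    else:
        c = 1.0 / radius
        z = c * rho2 / (1.0 + math.sqrt(1.0 - (1.0 + conic) * c * c * rho2))
    for order, coeff in (aspheric or {}).items():
        z += coeff * rho2 ** (order / 2.0)
    return z

# 1st optical surface of the embodiment: R = 19.44 mm, K = -0.41, A_{1,8} = 3.12e-12.
print(surface_sag(5.0, 0.0, 19.44, conic=-0.41, aspheric={8: 3.12e-12}))
```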
Step 1: from the incidence angle and incidence position of the incoming ray, calculate the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface. Then, from the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface and the surface equation of the first optical surface, obtain the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface. This step corresponds to j = 1.
For a ray with incidence angles XFLD and YFLD, the direction cosine vector of the ray reaching the 1st optical surface is given by formula (10).
Taking the coordinate system of the 1st optical surface as reference, let the coordinates of the ray on the entrance pupil plane be (x_ρ, y_ρ, d_ρ), where d_ρ is the distance from the entrance pupil to the 1st optical surface; the straight-line path of the ray then gives relation (11). Solving equations (9) and (11) simultaneously yields the coordinates (x_1, y_1, z_1) of the incidence point.
Step 2: calculate the direction cosine vector of the ray leaving the j-th optical surface, i.e. the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray incident on the (j+1)-th optical surface. If the ray meets the j-th optical surface at the incidence point (x_j, y_j, z_j), the normal direction of the optical surface at that point is given by formula (12).
The exit direction (l_{j+1}, m_{j+1}, p_{j+1}) of the ray is then obtained from formula (13), where μ is the angle between the incidence direction and the normal.
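Formula (13) is the vector form of the law of refraction. The sketch below applies the standard vector refraction relation at a surface, given the incident direction cosines, the surface normal and the refractive indices; it reproduces the usual textbook expression and is an illustration rather than the patent's exact formula.

```python
import numpy as np

def refract(direction, normal, n_in, n_out):
    """Refract a unit ray direction at a surface with unit normal (vector form of Snell's law).

    direction   : unit direction cosines (l, m, p) of the incident ray
    normal      : unit normal of the optical surface at the incidence point
    n_in, n_out : refractive indices before and after the surface along the ray path
    Returns the unit direction cosines of the refracted ray, or None for total internal reflection.
    """
    d = np.asarray(direction, dtype=float)
    n = np.asarray(normal, dtype=float)
    if np.dot(d, n) > 0.0:                     # orient the normal against the incoming ray
        n = -n
    eta = n_in / n_out
    cos_i = -np.dot(d, n)                      # cosine of the incidence angle mu
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None                            # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    t = eta * d + (eta * cos_i - cos_t) * n    # refracted direction
    return t / np.linalg.norm(t)

# Example: a ray 30 deg off the axis entering K9-like glass (n = 1.5168) through a plane surface.
d0 = np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))])
print(refract(d0, np.array([0.0, 0.0, 1.0]), 1.0, 1.5168))
```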
Step 3: if j < η, use an iterative approximation algorithm to calculate the position (x_{j+1}, y_{j+1}, z_{j+1}) of the incidence point of the ray on the (j+1)-th optical surface, then increase j by 1 and repeat Step 2; otherwise go to Step 4.
The approximation algorithm starts from initial coordinate values (x_{j+1,0}, y_{j+1,0}, z_{j+1,0}) given by
x_{j+1,0} = x_j,  y_{j+1,0} = y_j,  z_{j+1,0} = z_j − d_j    (14)
New coordinate values are then computed iteratively until the calculated z_{j+1,t+1} and z_{j+1,t} differ by a negligibly small amount; the resulting (x_{j+1,t+1}, y_{j+1,t+1}, z_{j+1,t+1}) is taken as the incidence point (x_{j+1}, y_{j+1}, z_{j+1}) of the ray on the (j+1)-th optical surface.
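The approximation algorithm can be read as a fixed-point iteration that alternates between the straight-line ray and the surface: starting from the point of formula (14), z is re-evaluated from the surface equation and x, y are re-evaluated from the ray until successive z values agree. The sketch below implements that reading; the iteration scheme and the sag callable (for example the surface_sag sketch above) are assumptions about how the unspecified iteration is carried out.

```python
def intersect_next_surface(point, direction, d_j, sag, tol=1e-9, max_iter=50):
    """Incidence point on the (j+1)-th optical surface, in that surface's coordinate system.

    point     : exit point (x_j, y_j, z_j) on the j-th surface, in the j-th surface frame
    direction : direction cosines (l, m, p) of the ray after the j-th surface
    d_j       : vertex-to-vertex distance between the two surfaces
    sag       : callable sag(x, y) -> z describing the (j+1)-th surface
    """
    x_j, y_j, z_j = point
    l, m, p = direction
    x, y, z = x_j, y_j, z_j - d_j              # start point, formula (14)
    for _ in range(max_iter):
        z_new = sag(x, y)                      # pull the point onto the surface
        t = (z_new - (z_j - d_j)) / p          # path length along the ray to reach that z
        x, y = x_j + l * t, y_j + m * t        # pull the point back onto the ray
        if abs(z_new - z) < tol:               # stop when successive z values agree
            return x, y, z_new
        z = z_new
    return x, y, z

# Example: an axial ray meeting a paraxial (parabolic) surface of 100 mm vertex radius, 5 mm away.
print(intersect_next_surface((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 5.0,
                             sag=lambda x, y: (x * x + y * y) / 200.0))
```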
Step 4: calculate the position (x_image, y_image, z_image) at which the ray reaches the image plane. Let the exit point of the ray on the last optical surface be (x_η, y_η, z_η), its direction cosine vector be (l_η, m_η, p_η), and the distance from that optical surface to the image plane be d_η; the image-plane position is then obtained by propagating the ray in a straight line over this distance.
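The final transfer is plain straight-line propagation from the last optical surface to the image plane. The sketch below assumes the image plane is the plane z = d_η in the last surface's coordinate system, which matches the definitions above, although the exact expression is not reproduced in the text.

```python
def propagate_to_image_plane(exit_point, direction, d_eta):
    """Position (x_image, y_image, z_image) at which the ray meets the image plane.

    exit_point : (x_eta, y_eta, z_eta), exit point on the last optical surface
    direction  : direction cosines (l_eta, m_eta, p_eta) of the ray leaving that surface
    d_eta      : distance from the last optical surface to the image plane
    Straight-line propagation onto the plane z = d_eta is assumed.
    """
    x, y, z = exit_point
    l, m, p = direction
    t = (d_eta - z) / p            # path parameter to the image plane
    return x + l * t, y + m * t, d_eta

# Example: a converging ray reaching the image plane 10.94 mm behind the last surface vertex.
print(propagate_to_image_plane((1.0, 0.0, 0.0), (-0.09, 0.0, 0.9959), 10.94))
```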
From the intersection points of the ray with the successive optical surfaces, the propagation distance of the ray in each medium is obtained, and the energy of the ray on reaching the image plane is obtained using the light-energy attenuation coefficients of the media. If the energy of a ray has been attenuated to σ times its initial value when it reaches the image plane, the energy arriving at the image plane is σ·W_m·W_w/n_ray. For each observation star in the field of view of the k-th optical system, ray tracing is performed over the spectral range to obtain the image formed by that star through the optical system.
5. Calculate the brightness of each pixel of the digital star map and output the digital star map. The star sensor collects the star image with a CCD or APS detector, which takes the pixel as its basic unit. From the position (x_image, y_image) and energy of each ray reaching the image plane, the pixel in which the ray lands is determined, and the energies of all rays reaching the same pixel are accumulated to obtain the light intensity of that pixel in the star map. The gray value of the brightest pixel is defined as 255, and the brightness values of the remaining pixels are scaled proportionally to obtain the corresponding gray values. The pixel positions and gray values are stored in a digital image format, and the digital star map is output.
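A sketch of this final rasterisation step: the energies of the rays are accumulated into the pixels they land in, and the accumulated intensities are scaled so that the brightest pixel has gray value 255. The pixel pitch, the centred detector origin and the rounding convention are illustrative assumptions.

```python
import numpy as np

def render_star_map(ray_hits, n_pixels=1024, pixel_size_mm=0.015):
    """Accumulate ray energies into pixels and scale the result to an 8-bit star map.

    ray_hits      : iterable of (x_image_mm, y_image_mm, energy) for all traced rays
    n_pixels      : the detector is n_pixels x n_pixels
    pixel_size_mm : pixel pitch; together with the centred origin this is an assumption
    """
    image = np.zeros((n_pixels, n_pixels), dtype=float)
    half = n_pixels / 2.0
    for x_mm, y_mm, energy in ray_hits:
        col = int(np.floor(x_mm / pixel_size_mm + half))    # image-plane x -> detector column
        row = int(np.floor(y_mm / pixel_size_mm + half))    # image-plane y -> detector row
        if 0 <= row < n_pixels and 0 <= col < n_pixels:
            image[row, col] += energy                       # accumulate energy per pixel
    if image.max() > 0:
        image = np.round(image * (255.0 / image.max()))     # brightest pixel -> gray value 255
    return image.astype(np.uint8)

# Example: two rays of one star land in the same pixel; a single ray of a fainter star lands elsewhere.
star_map = render_star_map([(0.001, 0.002, 0.8), (0.002, 0.001, 0.7), (1.0, -0.5, 0.3)])
print(star_map.max(), star_map[512, 512])
```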
Embodiment:
The field of view of each optical system of the composite three-field-of-view star sensor is 10° × 10°, the aperture is 27.3 mm, and the focal length is 43.89 mm. The optical system, shown in Fig. 5, comprises 5 lenses; the optical surface parameters are listed in Table 1, and the 1st and 7th optical surfaces are aspheric. The aspheric coefficients of the 1st optical surface are K_1 = −0.41 and A_{1,8} = 3.12×10^−12, the remaining coefficients being 0. The aspheric coefficients of the 7th optical surface are K_7 = −0.61, A_{7,4} = 4.85×10^−5, A_{7,6} = 3.80×10^−7 and A_{7,8} = 1.26×10^−9, the remaining coefficients being 0. The limiting magnitude is taken as 5.2, and the detector has 1024 × 1024 pixels.
TABLE 1 Optical system parameters (unit: mm)
Surface No.   Radius     Thickness   Material
1             19.44      6.00        Silica
2             219.08     9.00
3             16.90      6.00        K9
4             -40.89     1.00        ZF1
5             9.24       9.00
6             18.87      5.59        K9
7             22.07      3.96
8             -92.60     6.00        ZF1
9             -16.40     7.32
10            Infinity   10.94
The observation star database is established. When the right ascension and declination of the body coordinate system's Z_b axis are chosen as (36°, 30°), the included angle between the Z_1 axis and the Z_b axis is 45°, and the angle between the X_1O_1Z_1 plane and the X_bO_bZ_b plane is τ = 0, the Z_k axes of the subsystem coordinate systems lie at (341.43°, 41.28°), (36°, −15°) and (90.57°, 41.28°) respectively in the inertial coordinate system. The star images obtained by imaging the observation stars through the optical systems of the subsystems are shown in Figs. 6, 7 and 8; 5, 5 and 6 observation stars appear in the three fields of view respectively. Figs. 9, 10 and 11 show the star maps corresponding to the three fields of view output by the Sky chart software; comparison with Figs. 6, 7 and 8 shows that the star maps output by the star map simulation method of this embodiment are consistent with those of the Sky chart simulation software.
Table 2 lists the star image positions under ideal imaging of the observation stars and the star image positions obtained by processing Figs. 6, 7 and 8 with a star image extraction algorithm; for convenience these are called the ideal positions and the measured positions respectively. The data in the table show that the position error of the star map simulation is less than 0.4 pixel, which verifies that the composite three-field-of-view star sensor star map simulation method is correct.
TABLE 2 Simulated star image data
The invention can provide abundant simulated star map data for research on star image extraction, navigation star selection, star map identification and other technologies of the composite three-field-of-view star sensor. While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope of the invention is determined by the claims that follow.

Claims (2)

1. A method for simulating a star map of a composite three-field-of-view star sensor is characterized by comprising the following steps:
step 1, establishing an observation star database: according to the limiting magnitude of the star sensor, selecting from the original star catalogue the single stars whose magnitude is not greater than the limiting magnitude, the double stars whose combined magnitude is not greater than the limiting magnitude, and the variable stars whose magnitude at maximum brightness is not greater than the limiting magnitude, extracting the star number, magnitude, right ascension and declination data of the observation stars, and storing them as the observation star database;
step 2, determining the observation stars within the fields of view of the three optical systems of the composite three-field-of-view star sensor, and calculating the incidence angles of each observation star in the X and Y directions of the subsystem coordinate system to which it belongs;
step 3, randomly generating magnitude and field-position data of the pseudo-stars according to the total number of pseudo-stars;
step 4, simulating, by ray tracing, the imaging of the observation stars and pseudo-stars in each field of view by the corresponding optical system, and calculating the image-plane information;
step 5, calculating the brightness of each pixel of the digital star map, and outputting the digital star map;
in the observation star database, the observation star data are arranged in the descending order of declination;
the optical axes of the three optical systems are arranged symmetrically, and the included angles between the optical axes are equal; the subsystem coordinate systems are established first, and the body coordinate system is established next; the body coordinate system is O_b-X_bY_bZ_b, its origin O_b is taken at the intersection of the three axes Z_1, Z_2, Z_3, the Z_b axis makes the same included angle with each of Z_1, Z_2 and Z_3, and the angle between the X_1O_1Z_1 plane and the X_bO_bZ_b plane is τ; the inertial coordinate system is first rotated about the Z_i axis, from +X_i toward +Y_i, by α_c to obtain an X′Y′Z′ coordinate system; the new coordinate system is rotated about the Y′ axis, from +Z′ toward +X′, by 90° − δ_c to obtain an X″Y″Z″ coordinate system, which is then rotated about the Z″ axis by θ, and the resulting coordinate system coincides with the body coordinate system;
step 2 further comprises the following steps:
the direction (α_c, δ_c) of the Z_b axis of the body coordinate system and the included angle of the composite three-field-of-view system structure being known, step 2 comprises:
step 21, the direction vector of the Z_k axis of the subsystem coordinate system O_k-X_kY_kZ_k in its own coordinate system being {V_k1, V_k2, V_k3} = {0, 0, 1}, calculating its orientation {V_1, V_2, V_3} in the inertial frame according to the subsystem-to-inertial coordinate transformation, the corresponding right ascension and declination (α_zk, δ_zk) being such that:
δ_zk = sin⁻¹(V_3),
only stars whose coordinates (α, δ) satisfy
|δ − δ_zk| ≤ w_m    (3)
may appear in the field of view of the k-th optical system, where w_m represents the field angle corresponding to the diagonal of the image-plane detector of the optical system;
step 22, converting the positions of the observation stars from the inertial coordinate system to the subsystem coordinate system according to the corresponding coordinate transformation;
step 23, calculating the incidence angles of the observation stars in the subsystem coordinate system O_k-X_kY_kZ_k; if the maximum field angles of the optical system in the X_k and Y_k directions are w_A and w_B, only observation stars whose incidence angles in the X_k and Y_k directions do not exceed w_A and w_B respectively can be observed by the k-th optical system.
2. The method for simulating the star map of a composite three-field-of-view star sensor according to claim 1, characterized in that: the optical surfaces of the optical system are spherical or aspherical; the coordinates of any point on the j-th optical surface are (x_j, y_j, z_j), j = 1, …, η, where η is the total number of optical surfaces, and the coordinate point satisfies:
z_j = ρ_j² / ( R_j·(1 + √(1 − (1 + K_j)·ρ_j²/R_j²)) ) + Σ_{i=1}^{N_j} A_{j,i}·ρ_j^i
where R_j is the radius of curvature at the vertex of the j-th optical surface, K_j is the conic coefficient, A_{j,i} are the aspheric coefficients, N_j is the highest order of the aspheric coefficients, and ρ_j = √(x_j² + y_j²) is the distance from the coordinate point to the axis; when K_j = 0 and all A_{j,i} = 0, the optical surface is spherical;
step 4 further comprises:
step 41, calculating, according to the incidence angle and incidence position of the incoming ray, the direction cosine vector (l_1, m_1, p_1) of the ray incident on the 1st optical surface, and determining, according to the position of the ray on the entrance pupil, the distance from the entrance pupil to the first optical surface and the surface equation of the first optical surface, the position (x_1, y_1, z_1) of the incidence point of the ray on the first optical surface, this corresponding to j = 1;
step 42, calculating the direction cosine vector of the ray leaving the j-th optical surface, i.e. the direction cosine vector (l_{j+1}, m_{j+1}, p_{j+1}) of the ray incident on the (j+1)-th optical surface;
step 43, if j < η, calculating the position (x_{j+1}, y_{j+1}, z_{j+1}) of the incidence point of the ray on the (j+1)-th optical surface by an iterative approximation algorithm, then increasing j by 1 and repeating step 42; if j ≥ η, going to step 44;
step 44, calculating the position (x_image, y_image, z_image) at which the ray reaches the image plane.
CN201610015795.2A 2016-01-11 2016-01-11 A kind of compound three visual fields star sensor star map simulation method Active CN105547286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610015795.2A CN105547286B (en) 2016-01-11 2016-01-11 A kind of compound three visual fields star sensor star map simulation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610015795.2A CN105547286B (en) 2016-01-11 2016-01-11 A kind of compound three visual fields star sensor star map simulation method

Publications (2)

Publication Number Publication Date
CN105547286A CN105547286A (en) 2016-05-04
CN105547286B true CN105547286B (en) 2018-04-10

Family

ID=55826641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610015795.2A Active CN105547286B (en) 2016-01-11 2016-01-11 A kind of compound three visual fields star sensor star map simulation method

Country Status (1)

Country Link
CN (1) CN105547286B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107101637A (en) * 2017-05-27 2017-08-29 电子科技大学天府协同创新中心 Digital star chart emulation mode and device
CN107883947B (en) * 2017-12-28 2020-12-22 常州工学院 Star sensor star map identification method based on convolutional neural network
CN110926501B (en) * 2019-11-08 2022-03-22 中国科学院长春光学精密机械与物理研究所 Automatic calibration method and system for optical measurement equipment and terminal equipment
CN112697136B (en) * 2020-11-26 2023-12-05 北京机电工程研究所 Quick minimum area star map simulation method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102853851A (en) * 2012-09-17 2013-01-02 常州工学院 Imaging system and imaging method for stellar field of computer simulated star sensors
CN103344256A (en) * 2013-06-19 2013-10-09 哈尔滨工业大学 Laboratory testing method for multi-field-of-view star sensor
CN104061929A (en) * 2014-07-08 2014-09-24 上海新跃仪表厂 Common-light-path and multi-view-field star sensor and star attitude measurement method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102853851A (en) * 2012-09-17 2013-01-02 常州工学院 Imaging system and imaging method for stellar field of computer simulated star sensors
CN103344256A (en) * 2013-06-19 2013-10-09 哈尔滨工业大学 Laboratory testing method for multi-field-of-view star sensor
CN104061929A (en) * 2014-07-08 2014-09-24 上海新跃仪表厂 Common-light-path and multi-view-field star sensor and star attitude measurement method thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Design and application of star map simulation system for star sensors";Feng Wu 等,;《2013 International Conference on Optical Instruments and Technology: Optical Sensors and Applications》;20131231;第9044卷;1-9页 *
"变折射率介质中光线追迹通用算法的研究";黄战华 等,;《光学学报》;20050531;第25卷(第5期);589-592页 *
"基于粗测位置和方位的三视场快速星图识别方法";王昊京 等,;《中国光学》;20141031;第7卷(第5期);768-778页 *

Also Published As

Publication number Publication date
CN105547286A (en) 2016-05-04

Similar Documents

Publication Publication Date Title
Zhang Star identification
CN103913148B (en) Space flight TDI CCD camera full link numerical value emulation method
CN102607526B (en) Target posture measuring method based on binocular vision under double mediums
CN104406607B (en) The caliberating device of a kind of many visual fields complex optics sensor and method
CN104462776B (en) A kind of low orbit earth observation satellite is to moon absolute radiation calibration method
CN105547286B (en) A kind of compound three visual fields star sensor star map simulation method
CN111537003A (en) Starlight atmospheric refraction measurement correction method based on refraction surface collineation
CN104573251A (en) Method for determining full-field-of-view apparent spectral radiance of satellite-borne optical remote sensor
CN102261921B (en) Method for correcting influence of atmospheric refraction on precision of star sensor
CN105528500A (en) Imaging simulation method and system for decimeter-scale satellite-borne TDI CCD stereoscopic mapping camera
CN107655485A (en) A kind of cruise section independent navigation position deviation modification method
Gaudi Microlensing by exoplanets
CN102928201B (en) Target simulating system of dynamic selenographic imaging sensor
CN106679676A (en) Single-viewing-field multifunctional optical sensor and realization method
CN110146093A (en) Binary asteroid detection independently cooperates with optical navigation method
CN102927982A (en) Double-spectrum autonomous navigation sensor and design method of double-spectrum autonomous navigation sensor
CN103727937A (en) Star sensor based naval ship attitude determination method
CN105023281B (en) Asterism based on point spread function wavefront modification is as centroid computing method
CN105182678A (en) System and method for observing space target based on multiple channel cameras
CN106586041A (en) Simulation method of Mars object for deep space exploration
CN108225276B (en) Single-star imaging target motion characteristic inversion method and system
CN108154535B (en) Camera calibration method based on collimator
CN113218577A (en) Outfield measurement method for star point centroid position precision of star sensor
CN103743488B (en) Infrared imaging simulation method for globe limb background characteristics of remote sensing satellite
CN103234552A (en) Optical navigation target satellite analog simulation image generating method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201117

Address after: Jiangning District of Nanjing City, Jiangsu province 211111 streets moling Road No. 12 mo Zhou

Patentee after: Jiangsu Zhixing Future Automobile Research Institute Co., Ltd

Address before: 213022 No. 1, Wushan Road, Xinbei District, Jiangsu, Changzhou

Patentee before: CHANGZHOU INSTITUTE OF TECHNOLOGY

TR01 Transfer of patent right