CN114972875A - Multi-frame combination-based non-refracted star and refracted star classification method and device - Google Patents

Multi-frame combination-based non-refracted star and refracted star classification method and device Download PDF

Info

Publication number
CN114972875A
Authority
CN
China
Prior art keywords
star
frame
vector
refracted
refraction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210648818.9A
Other languages
Chinese (zh)
Inventor
Ma Yan (马岩)
Jiang Jie (江洁)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202210648818.9A priority Critical patent/CN114972875A/en
Publication of CN114972875A publication Critical patent/CN114972875A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V20/00 - Scenes; Scene-specific elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a multi-frame association-based classification method and device for non-refracted stars and refracted stars. The method comprises the following steps: performing inter-frame star point matching according to inter-frame star point displacement, and calculating an inter-frame attitude matrix; constructing multi-frame star point projection characteristics according to the inter-frame attitude matrix; and constructing a refracted star classifier using the different projection characteristic properties of refracted and non-refracted stars to obtain a classification result. The invention classifies non-refracted stars and refracted stars using only multi-frame star point position information; it achieves high classification accuracy and requires neither star map identification nor any prior pose information.

Description

Multi-frame combination-based non-refracted star and refracted star classification method and device
Technical Field
The invention relates to the technical field of refraction star navigation sensors, in particular to a multiframe association-based non-refraction star and refraction star classification method and device.
Background
The single-field-of-view refracted star astronomical navigation sensor is a high-precision position and attitude measuring instrument. The instrument uses a single sensor to observe non-refracted stars and refracted stars simultaneously, and thereby measures and calculates the attitude and the position. The method is fully autonomous, does not accumulate pose errors over time, and has high navigation precision, making it a navigation scheme with great potential at present. The classification and identification of the non-refracted stars and refracted stars observed simultaneously in the field of view is a precondition for realizing this high-precision navigation method. In recent years, refracted star navigation technology has developed to some extent, but most research focuses on methods for calculating the position of a carrier from the measured refraction angle, and neglects the importance of classifying the non-refracted stars and refracted stars that appear simultaneously in the field of view.
The existing methods for classifying non-refracted stars and refracted stars are mainly the following. Qian et al. (QIAN H M, SUN L, CAI J N, et al. A starlight refraction scheme with single star sensor used in autonomous satellite navigation system [J]. Acta Astronautica, 2014, 96: 45-52.) assume that a star map identification algorithm cannot identify refracted stars whose positions are shifted, and therefore treat the star points not identified by the star map identification algorithm as refracted stars. In fact, refracted stars with small refraction angles can be identified by most star map identification algorithms, since star map identification can tolerate a certain degree of position error. This method therefore introduces additional errors into the pose calculation and also affects the refraction angle measurement. Ning Xiaolin et al. (NING X, SUN X, FANG J, et al. Satellite stellar refraction navigation using star pixel coordinates [J]. Navigation, 2019, 66(1): 129-138.) specify that the boresight of the observer on their simulation platform always points in the direction of orbital motion, and directly identify the stars appearing in the upper half of the image as non-refracted stars and the stars in the lower half as refracted stars. However, this identification method is also not feasible, because when a satellite actually operates in orbit the boresight orientation does not always follow the expectation, owing to external environmental disturbances.
In summary, the existing non-refracted star and refracted star classification methods can only operate by relying on prior pose information of the navigation sensor together with star map identification of the non-refracted stars; these methods cannot operate normally when, in practical application, the prior pose information is lost or the star map identification algorithm fails.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The invention provides a multi-frame combination-based classification method and device for non-refracted stars and refracted stars, which at least solves the technical problem of classifying the non-refracted stars and refracted stars that appear simultaneously in the field of view of a single-field-of-view refracted star astronomical navigation sensor when the navigation sensor cannot obtain prior pose information or a star map identification result.
According to an aspect of the embodiments of the present invention, there is provided a method for classifying non-refracted stars and refracted stars based on multi-frame association, including: performing inter-frame star point matching according to inter-frame star point displacement, and calculating an inter-frame attitude matrix; constructing multi-frame star point projection characteristics according to the interframe attitude matrix; and constructing a refracted star classifier by using different projection characteristic properties between the refracted star and the non-refracted star, and obtaining a classification result.
Further, the step S100 includes:
Step S110, obtaining the camera coordinate system star vectors of all star points in the t-th frame image to form a set W_t = { w_t^i , i = 1, 2, …, N_t }, where w_t^i is a star vector, i is the star index, and N_t is the total number of stars in the t-th frame image; and obtaining the camera coordinate system star vectors of all star points in the (t-k)-th frame image to form a set W_{t-k} = { w_{t-k}^j , j = 1, 2, …, N_{t-k} }, where w_{t-k}^j is a star vector, j is the star index, and N_{t-k} is the total number of stars in the (t-k)-th frame image;
Step S120, finding all matched star points according to the star vectors in the t-th frame and the (t-k)-th frame;
Step S130, calculating an inter-frame attitude matrix according to the matched star points.
Further, the step S200 includes:
according to the inter-frame star point matching and inter-frame attitude matrix calculation method, respectively calculating the inter-frame attitude matrices between the t-th frame and each of the preceding K frames;
through the calculated attitude matrices between the t-th frame and the preceding K frames, projecting the star point position at any earlier time to the current time t;
and calculating the coordinates of the projected star vectors in the image coordinate system according to a pinhole camera model to obtain the two-dimensional coordinates of the projected star points, and constructing the star point projection characteristic P.
Further, the step S300 includes:
according to the star point projection characteristic P, calculating a linear regression matrix M, in which x̄ and ȳ are respectively the mean values of the projected x and y coordinates;
according to the linear regression matrix M, calculating the eigenvector corresponding to the minimum eigenvalue, denoted v, where v is a 2x1 column vector, and calculating its perpendicular vector v⊥ from v;
according to the vector v and the vector v⊥, respectively calculating the linear regression residuals e1 and e2, and calculating the refraction star index γ from e1 and e2;
and according to the refraction star index γ, classifying the star points into two categories, defining the star points with γ < T as non-refracted stars and the star points with γ ≥ T as refracted stars, where T is the classification threshold.
According to another aspect of the embodiments of the present invention, there is provided a multi-frame combination-based classification apparatus for non-refracted stars and refracted stars, including:
the matching module is used for matching star points among multiple frames and calculating an interframe attitude matrix;
the characteristic module is used for constructing star point projection characteristics according to the matched star point and the inter-frame attitude matrix;
and the classification module is used for classifying the non-refraction stars and the refraction stars according to the star point projection characteristics.
Further, the matching module comprises:
an acquisition unit, configured to acquire the camera coordinate system star vectors of all star points in the t-th frame image to form a set W_t = { w_t^i , i = 1, 2, …, N_t }, where w_t^i is a star vector, i is the star index, and N_t is the total number of stars in the t-th frame image, and to acquire the camera coordinate system star vectors of all star points in the (t-k)-th frame image to form a set W_{t-k} = { w_{t-k}^j , j = 1, 2, …, N_{t-k} }, where w_{t-k}^j is a star vector, j is the star index, and N_{t-k} is the total number of stars in the (t-k)-th frame image;
a matching unit, configured to find all matched star points according to the star vectors of the two frames;
and a calculation unit, configured to calculate the inter-frame attitude matrix according to the matched star points.
Further, the feature module includes:
a projection unit, configured to calculate the coordinates of the projected star vectors in the image coordinate system according to the pinhole camera model;
and a construction unit, configured to construct the star point projection characteristic P according to the two-dimensional coordinates of the projected star points.
Further, the classification module comprises:
a regression unit, configured to calculate a linear regression matrix M according to the star point projection characteristic P, in which x̄ and ȳ are respectively the mean values of the projected x and y coordinates, and further configured to calculate, according to the linear regression matrix M, the eigenvector corresponding to the minimum eigenvalue, denoted v, and to calculate its perpendicular vector v⊥ from v;
and a decision unit, configured to respectively calculate the linear regression residuals e1 and e2 according to the vector v and the vector v⊥ and to calculate the refraction star index γ, and further configured to classify the star points into two categories according to the refraction star index γ, defining the star points with γ < T as non-refracted stars and the star points with γ ≥ T as refracted stars, where T is the classification threshold.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium including a stored program, wherein the program when running controls an apparatus in which the non-volatile storage medium is located to perform a multi-frame association-based non-refracted star and refracted star classification method.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform a multi-frame joint based non-refracted star and refracted star classification method.
The technical scheme of the invention can realize the following beneficial technical effects:
in the embodiment of the invention, inter-frame star point matching is carried out according to inter-frame star point displacement, and an inter-frame attitude matrix is calculated; constructing multi-frame star point projection characteristics according to the interframe attitude matrix; the invention only uses the multi-frame star point position information to classify the non-refraction star and the refraction star, has high classification accuracy, does not need to carry out star map identification, and does not need any prior pose information. When the satellite in orbit loses prior information of the attitude and the position, the method can still correctly classify the non-refraction star and the refraction star in the star map, ensure the capability of the satellite in orbit to process emergency and improve the safety of the satellite in orbit.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a simulated star map for a plurality of frames according to an embodiment of the present invention;
FIG. 2 is a diagram of multi-frame star point projection features and classification results according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for classifying non-refracted stars and refracted stars based on multi-frame association according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided an embodiment of a method for classifying non-refracted stars and refracted stars based on a multi-frame association, it should be noted that the steps shown in the flowchart of the drawings may be executed in a computer system such as a set of computer executable instructions, and although a logical order is shown in the flowchart, in some cases, the steps shown or described may be executed in an order different from that shown.
Examples
Fig. 3 is a flowchart of a method for classifying non-refracted stars and refracted stars based on multi-frame association according to an embodiment of the present invention, as shown in fig. 3, the method includes the following steps:
s100, performing inter-frame star point matching according to inter-frame star point displacement, and calculating an inter-frame attitude matrix;
specifically, the inter-frame star point matching according to the inter-frame star point displacement and calculating the inter-frame attitude matrix includes the following steps:
step S110, obtain
Figure 610817DEST_PATH_IMAGE083
The star vectors of the camera coordinate system of all the star points in the frame image form a set
Figure 122701DEST_PATH_IMAGE047
In the formula
Figure 275464DEST_PATH_IMAGE084
The vector of the star vector is used as the vector of the star,
Figure 36747DEST_PATH_IMAGE049
the number of the star marks is the same as the star number,
Figure 565948DEST_PATH_IMAGE085
is as follows
Figure 194988DEST_PATH_IMAGE083
The total number of stars in the frame image; get the first
Figure 315391DEST_PATH_IMAGE052
The star vectors of the camera coordinate system of all the star points in the frame image form a set
Figure 32811DEST_PATH_IMAGE086
In the formula
Figure 631283DEST_PATH_IMAGE087
The vector of the star vector is used as the vector of the star,
Figure 383338DEST_PATH_IMAGE055
the number of the star marks is the same as the star number,
Figure 409063DEST_PATH_IMAGE056
is as follows
Figure 879359DEST_PATH_IMAGE052
The total number of stars in the frame image;
step S120, according to the second step
Figure 15942DEST_PATH_IMAGE083
Frame and second
Figure 622504DEST_PATH_IMAGE088
Finding all matched star points by using the star vectors in the frame;
Specifically, when the i-th star vector w_t^i of the t-th frame and the j-th star vector w_{t-k}^j of the (t-k)-th frame satisfy the angular matching conditions, namely that their included angle is within the matching threshold and that the match is unique with respect to the remaining star vectors of the two sets, then the i-th star vector w_t^i of the t-th frame and the j-th star vector w_{t-k}^j of the (t-k)-th frame are considered matched stars. In the conditions, the operator a(·,·) calculates the included angle between two vectors, ε is the angular threshold for star point matching, and the operator \ represents set subtraction; W_{t-k} is the set of camera coordinate system star vectors of all star points in the (t-k)-th frame image, and W_t is the set of camera coordinate system star vectors of all star points in the t-th frame image.
For any two vectors x and y, the included angle is a(x, y) = arccos( x·y / (‖x‖·‖y‖) ). For any two sets X and Y, the difference set satisfies X \ Y = { p | p ∈ X and p ∉ Y }, where p represents an element of an arbitrary set. Here x and y refer generally to any two vectors used to describe the operator a(·,·), and X and Y refer generally to any two sets used to describe the operator \.
Step S130, calculating the inter-frame attitude matrix according to the matched star points.
Specifically, suppose that after the inter-frame star point matching there are n identical (matched) star points between the t-th frame and the (t-k)-th frame, and denote them such that the m-th star point of the t-th frame matches the m-th star point of the (t-k)-th frame. Then the inter-frame attitude matrix between the t-th frame and the (t-k)-th frame, denoted R_{t-k}^t, can be calculated from the matched star points by an attitude determination algorithm, and satisfies w_t^m = R_{t-k}^t · w_{t-k}^m for m = 1, 2, …, n.
S200, constructing multi-frame star point projection characteristics according to the inter-frame attitude matrices;
Specifically, constructing the multi-frame star point projection characteristics according to the inter-frame attitude matrices includes the following steps:
according to the inter-frame star point matching and inter-frame attitude matrix calculation method of step S100, respectively calculating the inter-frame attitude matrices between the t-th frame and each of the preceding K frames, denoted R_{t-1}^t, …, R_{t-K}^t, where K is the total number of frames used when calculating the multi-frame star point projection characteristics in this step; meanwhile, suppose that there are n star points having corresponding matches in all the frames from the t-th frame to the (t-K)-th frame, and denote the star vector of the m-th such star point in the (t-k)-th frame as w_{t-k}^m;
through the calculated attitude matrices between the t-th frame and the preceding K frames, the star point position at any earlier time can be projected to the current time t; the projection of the star vector w_{t-k}^m is the projected star vector p_{t-k}^m, and the projection formula is p_{t-k}^m = R_{t-k}^t · w_{t-k}^m, where m is the star index and k is the frame offset; when k = 0, R_t^t = I, where I is the identity matrix;
calculating the coordinates of each projected star vector in the image coordinate system according to a pinhole camera model, and denoting the image coordinate system two-dimensional vector corresponding to p_{t-k}^m as (x_{t-k}^m, y_{t-k}^m), where x_{t-k}^m and y_{t-k}^m are the two coordinate values of the projected star point image coordinate; and constructing the star point projection characteristic from the two-dimensional coordinates of the projected star points as P^m = { (x_{t-k}^m, y_{t-k}^m), k = 0, 1, …, K }.
Step S300, constructing a refracted star classifier using the different projection characteristic properties of refracted and non-refracted stars, and obtaining a classification result.
Specifically, constructing the refracted star classifier using the different projection characteristic properties of refracted and non-refracted stars and obtaining the classification result includes the following steps:
according to the star point projection characteristic P^m of step S200, calculating a linear regression matrix M, in which x̄ and ȳ are respectively the mean values of the projected x and y coordinates; according to the linear regression matrix M, calculating the eigenvector corresponding to the minimum eigenvalue, denoted v, where v is a 2x1 column vector, and calculating its perpendicular vector v⊥ from v; according to the vector v and the vector v⊥, respectively calculating the linear regression residuals e1 and e2, and calculating the refraction star index γ from e1 and e2; and according to the refraction star index γ, classifying the star points into two categories, defining the star points with γ < T as non-refracted stars and the star points with γ ≥ T as refracted stars, where T is the classification threshold.
The following describes the specific implementation process of the classification method of the present invention in detail by using specific examples:
1. Perform inter-frame star point matching according to the inter-frame star point displacement, and calculate the inter-frame attitude matrix.
Define the camera coordinate system star vectors of the N_t star points of the t-th frame (the current time t) as:
W_t = { w_t^i , i = 1, 2, …, N_t }    (1)
where W_t is the set of camera coordinate system star vectors of all star points in the t-th frame, and w_t^i is the i-th camera coordinate system star vector in the t-th frame.
Define the camera coordinate system star vectors of the N_{t-k} star points of the preceding (t-k)-th frame as:
W_{t-k} = { w_{t-k}^j , j = 1, 2, …, N_{t-k} }    (2)
where W_{t-k} is the set of camera coordinate system star vectors of all star points in the (t-k)-th frame, and w_{t-k}^j is the j-th camera coordinate system star vector in the (t-k)-th frame.
When the i-th star vector w_t^i of the t-th frame and the j-th star vector w_{t-k}^j of the (t-k)-th frame satisfy:
a( w_t^i , w_{t-k}^j ) < ε    (3)
and:
a( w_t^i , w' ) ≥ ε for all w' ∈ W_{t-k} \ { w_{t-k}^j }    (4)
and:
a( w'' , w_{t-k}^j ) ≥ ε for all w'' ∈ W_t \ { w_t^i }    (5)
then the i-th star vector w_t^i of the t-th frame and the j-th star vector w_{t-k}^j of the (t-k)-th frame are considered the same star point. The operator a(·,·) in the formulas calculates the included angle between two vectors. Specifically, for any two vectors x and y, the included angle satisfies:
a(x, y) = arccos( x·y / (‖x‖·‖y‖) )    (6)
In formulas (3) to (5), ε is the angular threshold for star point matching, and the operator \ represents set subtraction. Specifically, for any two sets X and Y, the difference set satisfies:
X \ Y = { p | p ∈ X and p ∉ Y }    (7)
where p represents an element of an arbitrary set.
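As a concrete illustration of the displacement-based matching step, the following Python sketch (not part of the patent; the function and variable names are illustrative) matches unit star vectors between two frames by their included angle and accepts a pair only when it is unique within the angular threshold ε, in the spirit of the matching conditions above.

```python
import numpy as np

def included_angle(x, y):
    # Included angle a(x, y) between two vectors, as in equation (6).
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

def match_star_points(W_t, W_tk, eps):
    """Return index pairs (i, j) of matched star vectors between frame t and frame t-k.

    W_t, W_tk: arrays of shape (N, 3) holding camera-frame unit star vectors.
    eps: angular matching threshold in radians.
    A pair is accepted only if it is the unique pair within eps for both vectors.
    """
    matches = []
    for i, w_i in enumerate(W_t):
        angles = np.array([included_angle(w_i, w_j) for w_j in W_tk])
        j = int(np.argmin(angles))
        if angles[j] >= eps or np.sum(angles < eps) > 1:
            continue  # no candidate within the threshold, or the match is ambiguous
        back = np.array([included_angle(w, W_tk[j]) for w in W_t])
        if np.sum(back < eps) > 1:
            continue  # another star of frame t also falls within the threshold
        matches.append((i, j))
    return matches
```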
After the inter-frame star point matching, suppose there are n identical star points between the t-th frame and the (t-k)-th frame, and denote them such that the m-th star point of the t-th frame matches the m-th star point of the (t-k)-th frame. Then, based on the matched stars, the inter-frame attitude matrix between the t-th frame and the (t-k)-th frame, denoted R_{t-k}^t, can be calculated by an attitude determination algorithm and satisfies:
w_t^m = R_{t-k}^t · w_{t-k}^m , m = 1, 2, …, n    (8)
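The patent leaves the attitude calculation algorithm open; one common choice for computing R_{t-k}^t from the matched unit vectors of equation (8) is the SVD solution of Wahba's problem, sketched below as an assumption rather than as the patent's prescribed solver (QUEST or the q-method would serve equally well).

```python
import numpy as np

def interframe_attitude(W_t_matched, W_tk_matched):
    """Estimate R such that w_t ≈ R @ w_tk for each matched pair, as in equation (8).

    W_t_matched, W_tk_matched: arrays of shape (n, 3) of matched unit star vectors.
    Uses the SVD solution of Wahba's problem (one of several valid attitude solvers).
    """
    B = W_t_matched.T @ W_tk_matched           # 3x3 attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt     # proper rotation with det = +1
```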
2. Construct multi-frame star point projection characteristics according to the inter-frame attitude matrices.
According to the above calculation, the inter-frame attitude matrices between the t-th frame and the preceding K frames are obtained, denoted R_{t-1}^t, …, R_{t-K}^t, where K is the total number of frames used when calculating the multi-frame star point characteristics in this step. Meanwhile, suppose there are n star points that have corresponding matches in all the frames from the t-th frame to the (t-K)-th frame, and denote the star vector of the m-th such star point in the (t-k)-th frame as w_{t-k}^m. Through the inter-frame attitude matrices, the star point position at any earlier time can be projected to the current time t; the projection of the star vector w_{t-k}^m is the projected star vector p_{t-k}^m. The projection formula is:
p_{t-k}^m = R_{t-k}^t · w_{t-k}^m , k = 0, 1, …, K    (9)
where, when k = 0, R_t^t = I, and I is the identity matrix.
The coordinates of each projected star vector in the image coordinate system are then calculated according to the pinhole camera model. Denoting the image coordinate system two-dimensional vector corresponding to p_{t-k}^m as (x_{t-k}^m, y_{t-k}^m), the multi-frame projected star vectors of each star point form the projection characteristic of that star point, P^m = { (x_{t-k}^m, y_{t-k}^m), k = 0, 1, …, K }.
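A minimal sketch of the feature construction follows, assuming a simple pinhole model with focal length f (in pixels) and principal point (cx, cy); these camera parameters and the helper names are placeholders, not values given in the patent.

```python
import numpy as np

def pinhole_project(p, f, cx, cy):
    # Image coordinates of a camera-frame star vector p under an assumed pinhole model.
    return np.array([f * p[0] / p[2] + cx, f * p[1] / p[2] + cy])

def projection_feature(star_vectors, attitudes, f, cx, cy):
    """Build the projection characteristic of one star point, as in equation (9).

    star_vectors: list of length K+1; entry k is the star vector w_{t-k} of frame t-k
                  (entry 0 is the current frame t).
    attitudes:    list of length K+1; entry k is R_{t-k}^t (entry 0 is the identity).
    Returns an array of shape (K+1, 2) of projected image coordinates.
    """
    feature = []
    for w, R in zip(star_vectors, attitudes):
        p = R @ w                               # project the frame t-k position to time t
        feature.append(pinhole_project(p, f, cx, cy))
    return np.array(feature)
```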
3. Construct a refracted star classifier using the different projection characteristic properties of refracted and non-refracted stars, and obtain the classification result.
Since a non-refracted star vector differs between frames only by the inter-frame attitude rotation, its projected star vectors from different frames should coincide. A refracted star vector, in addition to the different attitudes among the frames, is also influenced by atmospheric refraction. Owing to the motion of the carrier, the observation height of the refracted ray changes from frame to frame, so the projected star vectors of different frames differ. Over a short time, the projected star vector of a refracted star can be regarded as rotating about a fixed axis. Therefore, the star point projection characteristic of a non-refracted star should consist of coincident scatter points, and the star point projection characteristic of a refracted star should consist of scatter points distributed along a straight line.
Considering the single-star positioning error and the attitude estimation error of the navigation sensor, the actual star point projection characteristic of a non-refracted star consists of scatter points randomly distributed near the true star point position, and the star point projection characteristic of a refracted star consists of scatter points randomly distributed near the straight line of the actual refracted star point projection track.
When the classifier is constructed, a linear fit is first performed on the projection characteristic P^m of a star point to be classified. Denote a scatter point of the projection characteristic as (x_k, y_k) = (x_{t-k}^m, y_{t-k}^m), its fitted estimate on the line as (x̂_k, ŷ_k), and its distance to the fitted line as d_k. Defining the residual of the fitted line as the scatter-to-line distance, the sum of squared errors can be calculated as:
E(v) = Σ_k d_k² = Σ_k ( v^T [ x_k - x̄ , y_k - ȳ ]^T )² = v^T M v    (10)
where v is the unit normal vector of the fitted line, x̄ and ȳ are respectively the mean values of x_k and y_k, and M = Σ_k [ x_k - x̄ , y_k - ȳ ]^T [ x_k - x̄ , y_k - ȳ ] is the linear regression matrix. The minimization of E(v) subject to ‖v‖ = 1 is solved by the eigen-decomposition of M. Substituting the eigenvector of M corresponding to the minimum eigenvalue into the above formula yields the residual e1; substituting its perpendicular vector v⊥ yields e2. The index measuring how well a star point projection characteristic satisfies the refraction rule, i.e. the refraction star index γ, is then defined from e1 and e2 (for a characteristic distributed along a straight line, e2 is much larger than e1). By calculating the refraction star index γ corresponding to each star, the star points are classified as follows:
γ < T: non-refracted star; γ ≥ T: refracted star    (11)
where T is the classification threshold.
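The classification step can be sketched as follows. The refraction star index is taken here as the ratio e2/e1, which is an assumption of this sketch (the patent defines the index only through its figures); any monotone combination of the two residuals that grows for line-distributed characteristics would play the same role.

```python
import numpy as np

def refraction_index(feature):
    """Refraction star index of one projection characteristic.

    feature: array of shape (K+1, 2) of projected star point image coordinates.
    Fits a line by total least squares: the 2x2 scatter matrix M of the centred
    points gives the line normal as the eigenvector with the smallest eigenvalue.
    e1 is the squared residual about that line, e2 the residual about the
    perpendicular direction; their ratio is large when the points lie along a line.
    """
    d = feature - feature.mean(axis=0)        # centred coordinates (x - x_bar, y - y_bar)
    M = d.T @ d                               # linear regression (scatter) matrix
    _, eigvecs = np.linalg.eigh(M)            # eigenvalues in ascending order
    v = eigvecs[:, 0]                         # normal of the fitted line (min eigenvalue)
    v_perp = np.array([-v[1], v[0]])          # its perpendicular vector
    e1 = np.sum((d @ v) ** 2)                 # residual about the fitted line
    e2 = np.sum((d @ v_perp) ** 2)            # residual about the perpendicular line
    return e2 / max(e1, 1e-12)

def classify(features, threshold):
    # gamma < threshold -> non-refracted star; gamma >= threshold -> refracted star.
    return ["refracted" if refraction_index(P) >= threshold else "non-refracted"
            for P in features]
```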
The method is verified through simulation experiments. The generated multi-frame simulated star maps containing both non-refracted stars and refracted stars are shown in Fig. 1, in which the dots are real non-refracted star points and the crosses are real refracted star points; the star point types are marked in the figure only for illustration, and only the position information of the star points is used when the star points are classified. The star points are tracked with the method of the invention, and the constructed star point projection characteristics are shown in Fig. 2; the number on each sub-graph is the refraction star index corresponding to that star. It can be seen that the star point projection characteristic of a non-refracted star consists of scatter points scattered randomly about its centre and its refraction star index is small, whereas the star point projection characteristic of a refracted star is distributed along a straight line and its refraction star index is relatively large. By setting the classification threshold T, accurate classification of the non-refracted stars and refracted stars in the map can be achieved.
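Putting the pieces together, an end-to-end sketch of the flow might look like the following; it assumes the illustrative helpers match_star_points, interframe_attitude, projection_feature and refraction_index from the earlier sketches, and the camera parameters and threshold are placeholders.

```python
import numpy as np

def classify_current_frame(frames, eps, K, f, cx, cy, threshold):
    """Label the star points of the newest frame in `frames` (a hedged sketch).

    frames: list of (N, 3) arrays of camera-frame unit star vectors, oldest first;
            the last entry is frame t, the K entries before it are frames t-K .. t-1.
    Only star points of frame t matched in every preceding frame are labelled.
    """
    current = frames[-1]
    tracks = {i: [current[i]] for i in range(len(current))}   # per-star vectors, k = 0 first
    rots = [np.eye(3)]                                        # R_t^t = I for k = 0
    for k in range(1, K + 1):
        prev = frames[-1 - k]
        pairs = match_star_points(current, prev, eps)         # step 1: inter-frame matching
        if not pairs:
            return {}
        idx_t = np.array([i for i, _ in pairs])
        idx_p = np.array([j for _, j in pairs])
        rots.append(interframe_attitude(current[idx_t], prev[idx_p]))  # R_{t-k}^t
        matched = dict(pairs)
        for i in list(tracks):
            if i in matched and len(tracks[i]) == k:
                tracks[i].append(prev[matched[i]])
    labels = {}
    for i, track in tracks.items():
        if len(track) < K + 1:
            continue                                          # not matched in every frame
        P = projection_feature(track, rots, f, cx, cy)        # step 2: projection feature
        gamma = refraction_index(P)                           # step 3: refraction star index
        labels[i] = "refracted" if gamma >= threshold else "non-refracted"
    return labels
```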
According to another aspect of the embodiments of the present invention, there is also provided a multi-frame combination-based classification apparatus for non-refracted stars and refracted stars, including: the matching module is used for matching star points among multiple frames and calculating an interframe attitude matrix; the characteristic module is used for constructing star point projection characteristics according to the matched star point and the inter-frame attitude matrix; and the classification module is used for classifying the non-refraction stars and the refraction stars according to the star point projection characteristics.
Optionally, the matching module includes: an acquisition unit, configured to acquire the camera coordinate system star vectors of all star points in the t-th frame image to form the set W_t = { w_t^i , i = 1, 2, …, N_t }, where w_t^i is a star vector, i is the star index, and N_t is the total number of stars in the t-th frame image, and likewise to acquire the set W_{t-k} of the (t-k)-th frame image; and a matching unit, configured to find all matched star points according to the star vectors of the two frames. When the i-th star vector w_t^i of the t-th frame and the j-th star vector w_{t-k}^j of the (t-k)-th frame satisfy the angular matching conditions, the two vectors are considered matched stars. The operator a(·,·) in the conditions calculates the included angle between two vectors; specifically, for any two vectors x and y, the included angle is a(x, y) = arccos( x·y / (‖x‖·‖y‖) ). ε is the angular threshold for star point matching, and the operator \ represents set subtraction; specifically, for any two sets X and Y, the difference set satisfies X \ Y = { p | p ∈ X and p ∉ Y }, where p represents an element of an arbitrary set. The matching module further includes a calculation unit, configured to calculate the inter-frame attitude matrix according to the matched star points. Suppose that after the inter-frame matching there are n identical star points between the t-th frame and the (t-k)-th frame, and that the m-th star point of the t-th frame matches the m-th star point of the (t-k)-th frame; then, based on the matched stars, the inter-frame attitude matrix between the t-th frame and the (t-k)-th frame, denoted R_{t-k}^t, can be calculated by an attitude determination algorithm and satisfies w_t^m = R_{t-k}^t · w_{t-k}^m, m = 1, 2, …, n.
Optionally, the feature module includes: a projection unit, configured to calculate the coordinates of the projected star vectors in the image coordinate system according to the pinhole camera model, the image coordinate system two-dimensional vector corresponding to the projected star vector p_{t-k}^m being denoted (x_{t-k}^m, y_{t-k}^m), where x_{t-k}^m and y_{t-k}^m are the two coordinate values of the projected star point image coordinate; and a construction unit, configured to construct the star point projection characteristic P^m according to the two-dimensional coordinates of the projected star points.
Optionally, the classification module includes: a regression unit, configured to calculate a linear regression matrix M according to the star point projection characteristic P^m, where x̄ and ȳ are respectively the mean values of the projected x and y coordinates, and further configured to calculate, according to the linear regression matrix M, the eigenvector corresponding to the minimum eigenvalue, denoted v, and to calculate its perpendicular vector v⊥ from v; and a decision unit, configured to respectively calculate the linear regression residuals e1 and e2 according to the vector v and the vector v⊥ and to calculate the refraction star index γ, and further configured to classify the star points into two categories according to the refraction star index γ, defining the star points with γ < T as non-refracted stars and the star points with γ ≥ T as refracted stars, where T is the classification threshold.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium including a stored program, wherein the program when running controls an apparatus in which the non-volatile storage medium is located to perform a multi-frame association-based non-refracted star and refracted star classification method.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform a multi-frame joint based non-refracted star and refracted star classification method.
Through the embodiment, the non-refraction stars and the refraction stars are classified only by using the multi-frame star point position information, the classification accuracy is high, star map identification is not needed, and any prior pose information is not needed. The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In summary, in the embodiment of the present invention, inter-frame star point matching is performed according to the inter-frame star point displacement, and an inter-frame attitude matrix is calculated; multi-frame star point projection characteristics are constructed according to the inter-frame attitude matrix; and a refracted star classifier is constructed using the different projection characteristic properties of refracted and non-refracted stars to obtain a classification result. The invention classifies non-refracted stars and refracted stars using only multi-frame star point position information, achieves high classification accuracy, and requires neither star map identification nor any prior pose information. When an in-orbit satellite loses prior attitude and position information, the method can still correctly classify the non-refracted stars and refracted stars in the star map, which guarantees the satellite's capability to handle emergencies in orbit and improves its safety.
In the embodiments provided in the present application, it should be understood that the disclosed technical content can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.

Claims (10)

1. A non-refracted star and refracted star classification method based on multi-frame combination is characterized by comprising the following steps:
s100, performing inter-frame star point matching according to inter-frame star point displacement, and calculating an inter-frame attitude matrix;
s200, constructing a multi-frame star point projection characteristic according to the inter-frame attitude matrix;
and S300, constructing a refraction star classifier by using different projection characteristic properties between the refraction star and the non-refraction star, and obtaining a classification result.
2. The method for classifying a non-refraction star and a refraction star based on multi-frame association according to claim 1, wherein the step S100 comprises:
step S110, obtaining the camera coordinate system star vectors of all star points in the t-th frame image to form a set W_t = { w_t^i , i = 1, 2, …, N_t }, where w_t^i is a star vector, i is the star index, and N_t is the total number of stars in the t-th frame image; and obtaining the camera coordinate system star vectors of all star points in the (t-k)-th frame image to form a set W_{t-k} = { w_{t-k}^j , j = 1, 2, …, N_{t-k} }, where w_{t-k}^j is a star vector, j is the star index, and N_{t-k} is the total number of stars in the (t-k)-th frame image;
step S120, finding all matched star points according to the star vectors in the t-th frame and the (t-k)-th frame;
and step S130, calculating an inter-frame attitude matrix according to the matched star points.
3. The method for classifying non-refraction stars and refraction stars based on multi-frame association as claimed in claim 2, wherein said step S200 comprises:
according to the inter-frame star point matching and inter-frame attitude matrix calculation method, respectively calculating the inter-frame attitude matrices between the t-th frame and each of the preceding K frames;
through the calculated attitude matrices between the t-th frame and the preceding K frames, projecting the star point position at any earlier time to the current time t;
and calculating the coordinates of the projected star vectors in the image coordinate system according to a pinhole camera model to obtain the two-dimensional coordinates of the projected star points, and constructing the star point projection characteristic P.
4. The method for classifying a non-refracted star and a refracted star based on multi-frame association as claimed in claim 3, wherein said step S300 includes:
according to the star point projection characteristic P, calculating a linear regression matrix M, in which x̄ and ȳ are respectively the mean values of the projected x and y coordinates;
according to the linear regression matrix M, calculating the eigenvector corresponding to the minimum eigenvalue, denoted v, where v is a 2x1 column vector, and calculating its perpendicular vector v⊥ from v;
according to the vector v and the vector v⊥, respectively calculating the linear regression residuals e1 and e2, and calculating the refraction star index γ from e1 and e2;
and according to the refraction star index γ, classifying the star points into two categories, defining the star points with γ < T as non-refracted stars and the star points with γ ≥ T as refracted stars, where T is the classification threshold.
5. A multiframe combined non-refracted star and refracted star classification device, comprising:
the matching module is used for matching star points among multiple frames and calculating an interframe attitude matrix;
the characteristic module is used for constructing star point projection characteristics according to the matched star point and the inter-frame attitude matrix;
and the classification module is used for classifying the non-refraction stars and the refraction stars according to the star point projection characteristics.
6. The multi-frame combination-based non-refracted star and refracted star classification device according to claim 5, wherein the matching module comprises:
an acquisition unit for acquiring the star vectors, expressed in the camera coordinate system, of all star points in one frame image to form a first set, wherein each element of the set is a star vector indexed by its star number and the number of elements equals the total number of stars in that frame image, and for acquiring, in the same manner, the star vectors of all star points in another frame image to form a second set;
a matching unit for searching all matching star points according to the star vectors of the two frames; and
a computing unit for computing the inter-frame attitude matrix according to the matched star points.
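The claims do not state how the inter-frame attitude matrix is computed from the matched star vectors; one standard choice is the SVD solution of Wahba's problem, sketched below under that assumption.

```python
import numpy as np

def attitude_from_matched_stars(vecs_a, vecs_b):
    """Estimate the rotation R such that vecs_b ~ R @ vecs_a for matched unit
    star vectors of shape (N, 3), using the SVD solution of Wahba's problem."""
    B = vecs_b.T @ vecs_a                        # 3x3 attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.linalg.det(U) * np.linalg.det(Vt)     # +/- 1; forces a proper rotation (no reflection)
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

With at least two well-matched, non-parallel star vectors per frame pair, this yields the inter-frame attitude matrix used by the computing unit above.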
7. The multi-frame combination-based non-refracted star and refracted star classification device according to claim 6, wherein the feature module comprises:
a projection unit for calculating the coordinates of the projected star vectors in the image coordinate system according to the pinhole camera model; and
a construction unit for constructing the star point projection features from the two-dimensional coordinates of the projected star points.
8. The multi-frame combination-based non-refracted star and refracted star classification device according to claim 7, wherein the classification module comprises:
a regression unit for calculating a linear regression matrix from the star point projection features, wherein the quantities entering the matrix are the deviations of the two projected coordinate components from their respective mean values, and for calculating the eigenvector corresponding to the minimum eigenvalue of the linear regression matrix, recording it as a column vector and calculating the vector perpendicular to that column vector; and
a decision unit for calculating a linear regression residual from the eigenvector and a second linear regression residual from its perpendicular vector, for calculating a refraction star index from the two residuals, and for classifying the star points into two categories according to the refraction star index by comparing it with a classification threshold, the star points on one side of the threshold being defined as non-refracted stars and the star points on the other side being defined as refracted stars.
9. A non-volatile storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions, when executed, perform the method of any one of claims 1 to 4.
CN202210648818.9A 2022-06-09 2022-06-09 Multi-frame combination-based non-refracted star and refracted star classification method and device Pending CN114972875A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210648818.9A CN114972875A (en) 2022-06-09 2022-06-09 Multi-frame combination-based non-refracted star and refracted star classification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210648818.9A CN114972875A (en) 2022-06-09 2022-06-09 Multi-frame combination-based non-refracted star and refracted star classification method and device

Publications (1)

Publication Number Publication Date
CN114972875A true CN114972875A (en) 2022-08-30

Family

ID=82961552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210648818.9A Pending CN114972875A (en) 2022-06-09 2022-06-09 Multi-frame combination-based non-refracted star and refracted star classification method and device

Country Status (1)

Country Link
CN (1) CN114972875A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108645401A (en) * 2018-04-03 2018-10-12 中国人民解放军国防科技大学 All-day star sensor star point extraction method based on attitude correlation image superposition
CN113074719A (en) * 2021-03-24 2021-07-06 航天科工空间工程发展有限公司 Rapid and reliable star map identification method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MA LIHENG et al.: "Performance analysis of the attitude-correlated frames approach for star sensors", 2016 IEEE METROLOGY FOR AEROSPACE (METROAEROSPACE), 22 September 2016 (2016-09-22), pages 1 - 2 *
YANG JISAN et al.: "Joint Estimation of Stellar Atmospheric Refraction and Star Tracker Attitude", IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 13 December 2021 (2021-12-13) *
NING Xiaolin et al.: "A new starlight refraction navigation method for Earth satellites based on the refraction direction vector", Flight Control & Detection, 31 March 2020 (2020-03-31), pages 1 - 2 *

Similar Documents

Publication Publication Date Title
Qin et al. Vins-mono: A robust and versatile monocular visual-inertial state estimator
CN108717531B (en) Human body posture estimation method based on Faster R-CNN
CN101489467B (en) Visual axis direction detection device and visual line direction detection method
Simo-Serra et al. Single image 3D human pose estimation from noisy observations
Corke et al. An introduction to inertial and visual sensing
Zhou et al. Human motion capture using a drone
Scaramuzza Omnidirectional vision: from calibration to robot motion estimation
CN108711166A (en) A kind of monocular camera Scale Estimation Method based on quadrotor drone
Oskiper et al. Multi-sensor navigation algorithm using monocular camera, IMU and GPS for large scale augmented reality
US20130335528A1 (en) Imaging device capable of producing three dimensional representations and methods of use
EP4383193A1 (en) Line-of-sight direction tracking method and apparatus
CN105023010A (en) Face living body detection method and system
CN112734841B (en) Method for realizing positioning by using wheel type odometer-IMU and monocular camera
CN112556719B (en) Visual inertial odometer implementation method based on CNN-EKF
CN108932477A (en) A kind of crusing robot charging house vision positioning method
US20230085384A1 (en) Characterizing and improving of image processing
CN110119768B (en) Visual information fusion system and method for vehicle positioning
CN108681699A (en) A kind of gaze estimation method and line-of-sight estimation device based on deep learning
Hertzberg et al. Experiences in building a visual SLAM system from open source components
Sun et al. When we first met: Visual-inertial person localization for co-robot rendezvous
Yu et al. Accurate and robust visual localization system in large-scale appearance-changing environments
Samadzadeh et al. Srvio: Super robust visual inertial odometry for dynamic environments and challenging loop-closure conditions
CN113345032B (en) Initialization map building method and system based on wide-angle camera large distortion map
Zhang et al. A visual-inertial dynamic object tracking SLAM tightly coupled system
Qiu et al. Moirétag: Angular measurement and tracking with a passive marker

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination