CN102135421B - Method and system for measuring three-dimension altitude angle - Google Patents

Method and system for measuring three-dimension altitude angle

Info

Publication number
CN102135421B
CN102135421B CN201010606504XA CN201010606504A
Authority
CN
China
Prior art keywords
light
light spot
dimensional attitude
parallel light
gray value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201010606504XA
Other languages
Chinese (zh)
Other versions
CN102135421A (en)
Inventor
江洁
王英雷
王昊予
张广军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201010606504XA priority Critical patent/CN102135421B/en
Publication of CN102135421A publication Critical patent/CN102135421A/en
Application granted granted Critical
Publication of CN102135421B publication Critical patent/CN102135421B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for measuring a three-dimensional attitude angle, comprising the steps of: forming light spot images on an image sensor by passing two paths of parallel light through a pinhole diaphragm; grouping and identifying all the light spots; obtaining the corresponding two-dimensional attitude angles from the centroid coordinates of the light spots corresponding to each of the two paths of parallel light; and obtaining the three-dimensional attitude angle, based on the double-vector attitude determination principle, from the two-dimensional attitude angles corresponding to the two paths of parallel light. The invention also discloses a system for measuring the three-dimensional attitude angle. With this scheme, the measurement of the three-dimensional attitude angle covers a wide angle-measurement range; furthermore, because two paths of parallel light and a pinhole diaphragm are used as the optical system, the design of the system is independent of the working distance.

Description

Method and system for measuring three-dimensional attitude angle
Technical Field
The invention relates to attitude angle measurement technology, and in particular to a method and system for measuring a three-dimensional attitude angle.
Background
In sectors such as machine building, aviation, aerospace, national defense, and construction, the attitude angle is an important physical quantity to be determined. The attitude angle describes the orientation of an object with respect to a reference object. Among the many attitude angle measurement methods, optical angle measurement methods, represented by the circular grating method and the ring laser method, offer high accuracy but impose strict hardware requirements and are limited to one-dimensional angle measurement. Existing three-dimensional attitude angle measurement methods are relatively few. Zhang et al. proposed a three-dimensional small-angle measurement system model based on the measurement principle of a double-reflecting-surface auto-collimation system: as shown in fig. 1, reflecting surface I and reflecting surface II form a set spatial included angle, and the reflected light is imaged on a CCD surface through a collimating lens. This measurement principle can distinguish tiny variations of the gamma angle and thereby realizes measurement of small spatial three-dimensional angles. However, this kind of three-dimensional attitude angle measurement has a small range and places high demands on the design precision of the system.
Disclosure of Invention
In view of the above, the main objective of the present invention is to provide a method and a system for measuring a three-dimensional attitude angle that give the three-dimensional attitude angle measurement a wide angle-measurement range.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the invention provides a method for measuring a three-dimensional attitude angle, which comprises the following steps:
forming light spot images on an image sensor by passing two paths of parallel light rays through a pinhole diaphragm;
calculating the maximum gray value of each light spot and dividing all the light spots evenly into two groups according to these maximum gray values, the parallel light with the higher illumination intensity corresponding to the group of light spots with the higher maximum gray values and the parallel light with the lower illumination intensity corresponding to the group with the lower maximum gray values; and obtaining the corresponding two-dimensional attitude angles from the centroid coordinates of the light spots corresponding to each of the two paths of parallel light;
and obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively.
In the above scheme, the pinhole diaphragm adopts a plurality of array apertures;
the two paths of parallel light rays have different illumination intensities and fixed included angles and pass through the pinhole diaphragm at different incidence angles.
In the above scheme, the dividing all the light spots into two groups according to the maximum gray value of each light spot includes: the maximum gray value of each light spot is arranged according to the sequence from large to small, and the arranged light spots are divided into two groups from the middle.
In the above scheme, the method further comprises: after all light spots are grouped and identified, according to the position relation of each small hole of the pinhole diaphragm and the mass center coordinate of each light spot corresponding to the same path of parallel light, mass center marks obtained when the mass center coordinate of the light spots is calculated and light spot marks needed when the two-dimensional attitude angle is calculated are in one-to-one correspondence.
The invention provides a system for measuring a three-dimensional attitude angle, which comprises: an imaging unit, a computer processing unit; wherein;
the imaging unit is used for forming light spot images on the image sensing device by the two paths of parallel light rays through the pinhole diaphragm and transmitting the imaging area of each light spot to the computer processing unit;
the computer processing unit is used for calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value; respectively obtaining corresponding two-dimensional attitude angles according to the centroid coordinates of the light spots corresponding to the two parallel light rays; and then obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angle.
In the above aspect, the imaging unit includes: pinhole diaphragms, image sensors; wherein,
the pinhole diaphragm is provided with a plurality of array pinholes and is used for forming a plurality of light spots on the image sensor when two paths of parallel light rays pass through;
and the image sensor is used for converting the optical signals of the imaging areas of the light spots into electric signals and transmitting the electric signals to the computer processing unit one by one.
In the foregoing solution, the computer processing unit includes: the system comprises a system modeling unit, a centroid calculating unit, a light spot grouping and identifying unit, a two-dimensional attitude angle calculating unit and a three-dimensional attitude angle calculating unit; wherein,
the system modeling unit is used for performing system modeling on the pinhole diaphragm and the imaging surface of the image sensor;
the centroid calculation unit is used for calculating the centroid coordinates of each light spot according to the imaging area of each light spot;
the light spot grouping and identifying unit is used for calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value;
the two-dimensional attitude angle calculation unit is used for calculating two-dimensional attitude angles corresponding to the two parallel light rays according to the light spot centroid coordinates corresponding to the two parallel light rays;
and the three-dimensional attitude angle calculation unit is used for obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively.
In the above scheme, the light spot grouping and identifying unit is further configured to perform one-to-one correspondence between the centroid mark obtained when the centroid coordinate of the light spot is calculated and the light spot mark required when the two-dimensional attitude angle is calculated, according to the position relationship of each small hole of the pinhole diaphragm and the centroid coordinate of each light spot corresponding to the same path of parallel light.
According to the method and the system for measuring the three-dimensional attitude angle, two paths of parallel light rays form a light spot image on an image sensor through a pinhole diaphragm; grouping and identifying all light spots, and respectively obtaining corresponding two-dimensional attitude angles according to the light spot centroid coordinates corresponding to the two parallel light rays; then obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively; therefore, the measurement of the three-dimensional attitude angle can reach a wide-range angle measurement range; in addition, because two paths of parallel light rays and the pinhole diaphragm are adopted as the optical system, the design of the system is independent of the working distance.
Drawings
FIG. 1 is a schematic diagram of a three-dimensional small angle measurement system model of the prior art;
FIG. 2 is a schematic flow chart of a method for measuring a three-dimensional attitude angle according to the present invention;
FIG. 3 is a schematic diagram of the system modeling of a pinhole diaphragm and an imaging plane of an image sensor according to the present invention;
FIG. 4 is a schematic diagram of grouping identification of six light spots formed by two parallel light beams passing through a pinhole diaphragm and one-to-one correspondence of light spot marks and centroid marks in the invention;
FIG. 5 is a schematic diagram of a principle of obtaining a three-dimensional attitude angle according to the double-vector attitude determination principle in the present invention;
fig. 6 is a schematic structural diagram of a measurement system for implementing a three-dimensional attitude angle according to the present invention.
Detailed Description
The basic idea of the invention is: forming a light spot image on the image sensor by the two paths of parallel light rays through the pinhole diaphragm; grouping and identifying all light spots, and respectively obtaining corresponding two-dimensional attitude angles according to the light spot centroid coordinates corresponding to the two parallel light rays; and then obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively.
The invention is further described in detail below with reference to the figures and the specific embodiments.
The invention realizes a method for measuring a three-dimensional attitude angle, which comprises the following steps as shown in figure 2:
step 201: forming light spot images on the imaging surface of the image sensor by passing the two paths of parallel light rays through the pinhole diaphragm;
in this step, the pinhole diaphragm is generally realized by coating and etching an array of pinholes on a silicon substrate using micro-electro-mechanical system (MEMS) technology; the pinhole diameter is 100 μm and the hole pitch is 600 μm. The pinhole diaphragm is designed with a plurality of arrayed pinholes; in the embodiment of the invention, the pinhole diaphragm uses three arrayed pinholes, whose distribution is shown in the pinhole diaphragm pattern of fig. 3;
the two paths of parallel light rays have different illumination intensities and fixed included angles and pass through the pinhole diaphragm at different incidence angles;
in fig. 3, F represents the focal length of the system, which is generally set according to the size of the image plane of the image sensor and the desired angle measurement range.
Step 202: grouping and identifying all light spots, and respectively obtaining corresponding two-dimensional attitude angles according to the centroid coordinates of the light spots corresponding to the two parallel light rays;
specifically, system modeling is performed on the pinhole diaphragm and the imaging surface of the image sensor. As shown in fig. 3, a plane rectangular coordinate system is established on the imaging surface as the image space coordinate system of the sensor, with the foot Oc of the perpendicular from the center Om of the pinhole diaphragm to the imaging surface taken as the origin. According to the imaging area of each light spot, the centroid coordinates of each light spot are calculated by a first-moment centroid algorithm, all the light spots are grouped and identified to determine which light spots correspond to each of the two paths of parallel light, and the two-dimensional attitude angles corresponding to the two paths of parallel light are then calculated by a trigonometric method;
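For illustration only, a minimal Python sketch of the first-moment (gray-weighted) centroid computation referred to above follows; the array layout and the coordinate convention are assumptions and the sketch is not part of the original disclosure:

```python
import numpy as np

def spot_centroid(region, x0, y0):
    """Gray-weighted (first-moment) centroid of one light spot.

    region : 2-D array of gray values covering the spot's imaging area
    x0, y0 : image-plane coordinates of the region's first pixel
             (assumed convention: x along columns, y along rows)
    """
    rows, cols = np.mgrid[0:region.shape[0], 0:region.shape[1]]
    total = region.sum()
    xc = x0 + (cols * region).sum() / total  # intensity-weighted mean abscissa
    yc = y0 + (rows * region).sum() / total  # intensity-weighted mean ordinate
    return xc, yc
```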
the grouping identification of all the light spots is as follows: calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value;
specifically, the illumination intensities of the two paths of parallel light differ: the illumination intensity of parallel light 1 is set to I and that of parallel light 2 to 0.5I, so the maximum gray values of the light spots that the two paths form on the imaging surface of the image sensor are different. Dividing the light spots into two groups according to the size of the maximum gray value includes:
let max_pix(i) denote the maximum gray value of the i-th light spot; the values max_pix(i) are arranged in descending order and the arranged light spots are split into two groups at the middle, with parallel light 1 corresponding to the group with the larger maximum gray values and parallel light 2 corresponding to the group with the smaller maximum gray values. Taking the six light spots formed by the pinhole diaphragm of fig. 3 as an example, the six values max_pix(i) are arranged in descending order; the light spots ranked in the first three positions, say max_pix(1), max_pix(2) and max_pix(3), form the first group and correspond to light ray 1, and the light spots ranked in the last three positions, max_pix(4), max_pix(5) and max_pix(6), form the second group and correspond to light ray 2;
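A minimal sketch of this grouping step, assuming a simple per-spot record of peak gray value and centroid (the data structure is illustrative, not taken from the patent):

```python
def group_spots(spots):
    """Split spots evenly into two groups by their maximum gray value.

    spots : list of dicts with keys 'max_pix' (peak gray value) and
            'centroid' ((x, y) tuple) -- an assumed structure.
    Returns (group1, group2): the brighter half corresponds to parallel
    light 1 (intensity I), the dimmer half to parallel light 2 (0.5 I).
    """
    ordered = sorted(spots, key=lambda s: s['max_pix'], reverse=True)
    half = len(ordered) // 2
    return ordered[:half], ordered[half:]
```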
further, after all the light spots are grouped and identified, this step further comprises: putting the light spot marks and the centroid marks into one-to-one correspondence according to the positional relationship of the holes of the pinhole diaphragm and the centroid coordinates of the light spots corresponding to the same path of parallel light. The centroid mark is the mark assigned to a spot centroid when its centroid coordinate is calculated; the mark value represents the order of the spot centroids with respect to the horizontal and vertical coordinates of the image, so that, taking six light spots as an example, the centroid coordinates with centroid marks are, in order, (x1, y1), (x2, y2), ..., (x6, y6). The light spot mark identifies which path of parallel light a spot belongs to; the six light spots formed by the two paths of parallel light are distinguished by the subscripts 1 and 2 and marked A1, B1, C1, A2, B2, C2;
According to the position relation of each small hole of the pinhole diaphragm and the mass center coordinate of each light spot corresponding to the same path of parallel light, the light spot marks and the mass center marks are in one-to-one correspondence, and the method specifically comprises the following steps:
taking the pinhole diaphragm in fig. 3 as an example, when the pinhole diaphragm is installed, the right-angle side of the diaphragm corresponds to the rectangular coordinate of the imaging surface, and for three light spots corresponding to the same parallel light, the vertical coordinates of the light spot a and the light spot B are always similar, and the horizontal coordinates of the light spot B and the light spot C are similar. Defining sets L (dx) and L (dy), which respectively represent the set of distances between the abscissas of every two centroids and the set of distances between the ordinates of every two centroids, the two coordinates with the smallest distance being most similar, as shown in formula (1):
$$ L(dx) = \{\, |x_i - x_j| \;:\; i = 1,2,3,\; j = 1,2,3,\; i \neq j \,\}, \qquad L(dy) = \{\, |y_i - y_j| \;:\; i = 1,2,3,\; j = 1,2,3,\; i \neq j \,\} \tag{1} $$
the similarity measure between spot coordinates is denoted Γ(·), and the spot-coordinate similarity metric functions Θ1(x) and Θ1(y) are defined to represent the similarity metrics of the abscissas and of the ordinates of the three spots, respectively, as shown in equation (2):
$$ \Theta_1(x) = \Gamma(x_1, x_2, x_3) = \min\bigl(L(dx)\bigr), \qquad \Theta_1(y) = \Gamma(y_1, y_2, y_3) = \min\bigl(L(dy)\bigr) \tag{2} $$
through the formulas (1) and (2), the centroid coordinate which simultaneously satisfies the similarity of the horizontal and vertical coordinates in the first group of light spots can be obtained, and the light spot mark of the centroid coordinate is marked as B1And B is1The light spot mark of the mass center coordinate with similar ordinate is marked as A1And B is1The light spot mark of the centroid coordinate with similar abscissa is marked as C1
In the same way, the centroid coordinate which simultaneously satisfies the similarity of the horizontal and vertical coordinates in the second group can be obtained, and the light spot mark of the centroid coordinate is marked as B2And B is2The light spot mark of the mass center coordinate with similar ordinate is marked as A2And B is2The light spot mark of the centroid coordinate with similar abscissa is marked as C2
The process of grouping and identifying all the light spots and of putting the light spot marks and the centroid marks into one-to-one correspondence is described in detail below, taking fig. 4 as an example:
firstly, the maximum gray value max_pix(i) of each light spot is calculated and the values are arranged in descending order, dividing the light spots into two groups: the light spots 1, 2 and 5 with the larger gray values form the first group and correspond to light ray 1, and the light spots 3, 4 and 6 with the smaller gray values form the second group and correspond to light ray 2;
then Θ1(x) and Θ1(y) are calculated for the first group of light spots: the light spot whose centroid coordinate satisfies both abscissa and ordinate similarity is light spot 2, and the light spot mark corresponding to the centroid mark of light spot 2 is B1; the light spot whose centroid ordinate is similar to that of light spot 2 is light spot 1, whose light spot mark is A1; and the light spot whose centroid abscissa is similar to that of light spot 2 is light spot 5, whose light spot mark is C1;
Θ2(x) and Θ2(y) are calculated for the second group of light spots: the light spot whose centroid coordinate satisfies both abscissa and ordinate similarity is light spot 4, and the light spot mark corresponding to the centroid mark of light spot 4 is B2; the light spot whose centroid ordinate is similar to that of light spot 4 is light spot 3, whose light spot mark is A2; and the light spot whose centroid abscissa is similar to that of light spot 4 is light spot 6, whose light spot mark is C2.
The centroid marks here are the subscripts of the centroid coordinates (x1, y1), (x2, y2), ..., (x6, y6).
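The labeling rule of equations (1) and (2) for one group of three centroids can be sketched as follows (function and variable names are illustrative assumptions, not part of the original disclosure):

```python
def label_group(centroids):
    """Assign spot marks A, B, C to the three centroids of one group.

    centroids : list of three (x, y) tuples.
    Following equations (1)-(2), the pair with the smallest abscissa
    distance shares a column (B and C) and the pair with the smallest
    ordinate distance shares a row (A and B); the spot common to both
    pairs is B.
    """
    pairs = [(0, 1), (0, 2), (1, 2)]
    dx_pair = min(pairs, key=lambda p: abs(centroids[p[0]][0] - centroids[p[1]][0]))
    dy_pair = min(pairs, key=lambda p: abs(centroids[p[0]][1] - centroids[p[1]][1]))
    b = (set(dx_pair) & set(dy_pair)).pop()   # common index -> spot B
    a = (set(dy_pair) - {b}).pop()            # ordinate similar to B -> spot A
    c = (set(dx_pair) - {b}).pop()            # abscissa similar to B -> spot C
    return {'A': centroids[a], 'B': centroids[b], 'C': centroids[c]}
```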
In this step, calculating the two-dimensional attitude angles corresponding to the two paths of parallel light by a trigonometric method is prior art: the two-dimensional attitude angle corresponding to each light spot is calculated by the trigonometric method, and, for the light spots corresponding to the same path of parallel light, these per-spot two-dimensional attitude angles are averaged according to the light spot marks to obtain the two-dimensional attitude angle of that path of parallel light; the specific steps are not repeated here.
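Because the patent treats the trigonometric method as prior art and does not spell out its formulas, the following sketch shows only one plausible reading: each centroid is referred to the projection of its own pinhole on the image plane and converted to angle components using the focal length F, then the results for one group are averaged. All names and the exact geometry here are assumptions:

```python
import math

def group_attitude_angle(group, hole_proj, F):
    """Average two-dimensional attitude angle of one parallel light path.

    group     : dict mapping marks 'A', 'B', 'C' to spot centroids (x, y)
    hole_proj : dict mapping the same marks to the projections of the
                corresponding pinholes on the image plane (assumed known
                from the diaphragm layout and the system model)
    F         : focal length of the system shown in fig. 3
    Returns the averaged (angle_x, angle_y) in radians.
    """
    angles_x, angles_y = [], []
    for mark, (x, y) in group.items():
        dx = x - hole_proj[mark][0]
        dy = y - hole_proj[mark][1]
        angles_x.append(math.atan2(dx, F))
        angles_y.append(math.atan2(dy, F))
    return sum(angles_x) / len(angles_x), sum(angles_y) / len(angles_y)
```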
Step 203: obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively;
in this step, the double-vector attitude determination (TRIAD) principle is adopted. Double-vector attitude determination is a three-dimensional attitude calculation algorithm widely used with vector attitude sensors. The three-dimensional attitude angle is defined as in astronomical navigation and is generally described by three attitude angle components: roll, pitch and yaw. As shown in fig. 5, O-XnYnZn is the inertial coordinate system, Oc-XcYcZc is the image space coordinate system of the image sensor, and the yaw angle α, pitch angle δ and roll angle κ are defined as follows:
yaw angle α: the included angle between the projection of the negative direction of the z axis on the XnYn surface and the Xn axis is measured in a counterclockwise mode from the Xn axis;
pitch angle δ: the included angle between the negative direction of the z axis and the projection of the z axis on the XnYn surface is measured clockwise from the projection;
roll angle κ: the projection of the Z axis on the XcYc plane makes an angle with the Yc axis, measured clockwise from the projection.
If the inertial coordinate system O-XnYnZn is rotated about the Zn axis by α + 3π/2, then about the rotated Xn axis by δ + π/2, and then about the new Zn axis by κ, O-XnYnZn coincides with Oc-XcYcZc. Let the transformation matrix from the image space coordinate system of the sensor to the inertial coordinate system be M_s; then M_s is given by equation (4). Because the image space coordinate system of the sensor and the inertial coordinate system are both rectangular coordinate systems, it follows from the rectangular coordinate transformation theorem that M_s is an orthogonal matrix, and therefore the transformation matrix from the inertial coordinate system to the image space coordinate system of the sensor is the transpose of M_s.

$$ M_s = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix} = \begin{bmatrix} \cos(3\pi/2+\alpha) & -\sin(3\pi/2+\alpha) & 0 \\ \sin(3\pi/2+\alpha) & \cos(3\pi/2+\alpha) & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(\pi/2+\delta) & -\sin(\pi/2+\delta) \\ 0 & \sin(\pi/2+\delta) & \cos(\pi/2+\delta) \end{bmatrix} \begin{bmatrix} \cos\kappa & -\sin\kappa & 0 \\ \sin\kappa & \cos\kappa & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{4} $$
From equation (4) we obtain:

$$ \alpha = \arctan(b_3/a_3), \qquad \delta = \arcsin(-c_3), \qquad \kappa = \arctan(c_1/c_2) \tag{5} $$
If the transformation matrix M_s from the image space coordinate system of the sensor to the inertial coordinate system can be obtained, the instantaneous three-dimensional attitude angle follows from equation (5); the essence of calculating the three-dimensional attitude angle is therefore to obtain M_s.
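The relation between equations (4) and (5) can be sketched numerically as follows; this is an illustrative check that builds M_s from given angles and applies equation (5) literally (principal arctangent values), not part of the original disclosure:

```python
import numpy as np

def rot_z(t):
    return np.array([[np.cos(t), -np.sin(t), 0.0],
                     [np.sin(t),  np.cos(t), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_x(t):
    return np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(t), -np.sin(t)],
                     [0.0, np.sin(t),  np.cos(t)]])

def attitude_matrix(alpha, delta, kappa):
    """Equation (4): M_s as a product of the three elementary rotations."""
    return rot_z(3 * np.pi / 2 + alpha) @ rot_x(np.pi / 2 + delta) @ rot_z(kappa)

def angles_from_matrix(Ms):
    """Equation (5), taken literally (principal arctangent values)."""
    a3, b3 = Ms[0, 2], Ms[1, 2]
    c1, c2, c3 = Ms[2, 0], Ms[2, 1], Ms[2, 2]
    alpha = np.arctan(b3 / a3)
    delta = np.arcsin(-c3)
    kappa = np.arctan(c1 / c2)
    return alpha, delta, kappa

# For small angles the round trip reproduces the inputs, e.g.
# angles_from_matrix(attitude_matrix(0.1, 0.05, 0.02)) ~ (0.1, 0.05, 0.02).
```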
The specific process of the step is as follows:
as shown in fig. 5, from the two-dimensional attitude angles of the two paths of parallel light in the image space coordinate system Oc-XcYcZc of the sensor, let their direction vectors in Oc-XcYcZc be W1 and W2, and let their direction vectors in the inertial coordinate system O-XnYnZn be U1 and U2, where U1 and U2 can be obtained by a camera calibration method.
A reference coordinate system O-X1Y1Z1 is established from the two paths of parallel light, and the transformation matrix M_s from the image space coordinate system of the sensor to the inertial coordinate system is solved as follows:
(a) the orthogonal coordinate basis of the reference coordinate system O-X1Y1Z1 under the image space coordinate system Oc-XcYcZc of the sensor is calculated as:
$$ a = W_1, \qquad b = \frac{W_1 \times W_2}{\left| W_1 \times W_2 \right|}, \qquad c = a \times b \tag{6} $$
the transformation matrix from the image space coordinate system of the sensor, Oc-XcYcZc, to the reference coordinate system, O-X1Y1Z1, is obtained by equation (6):
$$ C_{cl} = \begin{bmatrix} a^{T} \\ b^{T} \\ c^{T} \end{bmatrix} \tag{7} $$
(b) the orthogonal coordinate basis of the reference coordinate system O-X1Y1Z1 under the inertial coordinate system O-XnYnZn is calculated as follows:
$$ A = U_1, \qquad B = \frac{U_1 \times U_2}{\left| U_1 \times U_2 \right|}, \qquad C = A \times B \tag{8} $$
the transformation matrix from the inertial coordinate system O-XnYnZn to the reference coordinate system O-X1Y1Z1 is obtained by equation (8) as follows:
$$ C_{nl} = \begin{bmatrix} A^{T} \\ B^{T} \\ C^{T} \end{bmatrix} \tag{9} $$
(c) the transformation matrix M_s from the image space coordinate system Oc-XcYcZc of the sensor to the inertial coordinate system O-XnYnZn is obtained from equations (7) and (9), as shown in equation (10):
$$ M_s = C_{nl}^{-1} C_{cl} = C_{nl}^{T} C_{cl} \tag{10} $$
After the transformation matrix M_s from the image space coordinate system of the sensor to the inertial coordinate system is obtained by the above method, the three-dimensional attitude angle is calculated using equation (5).
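A compact sketch of the double-vector attitude determination of equations (6) to (10) follows, assuming the direction vectors are supplied as unit (or normalizable) vectors, which equation (6) implicitly requires:

```python
import numpy as np

def triad(W1, W2, U1, U2):
    """Equations (6)-(10): M_s from two vector pairs.

    W1, W2 : direction vectors of the two parallel light paths in the
             image space coordinate system Oc-XcYcZc of the sensor
    U1, U2 : the same directions in the inertial coordinate system
             O-XnYnZn (e.g. from camera calibration)
    Returns M_s, the image-space-to-inertial transformation matrix.
    """
    W1, W2, U1, U2 = (np.asarray(v, dtype=float) for v in (W1, W2, U1, U2))

    a = W1 / np.linalg.norm(W1)                       # equation (6), unit W1
    b = np.cross(W1, W2) / np.linalg.norm(np.cross(W1, W2))
    c = np.cross(a, b)
    C_cl = np.vstack((a, b, c))                       # equation (7)

    A = U1 / np.linalg.norm(U1)                       # equation (8), unit U1
    B = np.cross(U1, U2) / np.linalg.norm(np.cross(U1, U2))
    C = np.cross(A, B)
    C_nl = np.vstack((A, B, C))                       # equation (9)

    return C_nl.T @ C_cl                              # equation (10)
```

The three-dimensional attitude angle then follows by applying equation (5) to the returned M_s, for example with the angles_from_matrix sketch given above.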
In order to implement the above method, the present invention further provides a system for measuring a three-dimensional attitude angle, as shown in fig. 6, the system comprising: an imaging unit 61, a computer processing unit 62; wherein;
the imaging unit 61 is used for forming light spot images on the image sensing device by the two paths of parallel light rays through the pinhole diaphragm and transmitting the imaging area of each light spot to the computer processing unit 62;
the computer processing unit 62 is used for grouping and identifying all the light spots, and respectively obtaining corresponding two-dimensional attitude angles according to the centroid coordinates of the light spots corresponding to the two parallel light rays; then obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angle;
the imaging unit 61 includes: a pinhole diaphragm 611, an image sensor 612; wherein,
a pinhole diaphragm 611 having a plurality of array pinholes, for forming a plurality of light spots on the image sensor when the two parallel light beams pass through;
the image sensor 612 is used for converting the optical signals of each light spot imaging area into electric signals and transmitting the electric signals to the computer processing unit one by one;
the computer processing unit 62 includes: a system modeling unit 621, a centroid calculating unit 622, a light spot grouping and identifying unit 623, a two-dimensional attitude angle calculating unit 624, and a three-dimensional attitude angle calculating unit 625; wherein,
the system modeling unit 621 is used for performing system modeling on the pinhole diaphragm 611 and the imaging surface of the image sensor 612;
the centroid calculating unit 622 is used for calculating the centroid coordinates of each light spot through a first moment centroid algorithm according to the imaging area of each light spot;
the light spot grouping and identifying unit 623 is used for grouping and identifying all light spots and determining parallel light rays corresponding to the light spots respectively;
the two-dimensional attitude angle calculation unit 624 is configured to calculate, according to the light spot centroid coordinates corresponding to the two parallel light beams, two-dimensional attitude angles corresponding to the two parallel light beams by a trigonometric method;
the three-dimensional attitude angle calculation unit 625 is configured to obtain a three-dimensional attitude angle based on a dual-vector attitude determination principle according to two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively;
the two parallel light rays have different illumination intensities and fixed included angles and pass through the pinhole diaphragm at different incidence angles;
the light spot grouping and identifying unit 623 is specifically configured to calculate the maximum gray value of each light spot, and averagely divide all the light spots into two groups according to the maximum gray value of each light spot, wherein parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value;
the light spot grouping and identifying unit 623 is further configured to perform one-to-one correspondence between a centroid mark obtained when the centroid coordinate of the light spot is calculated and a light spot mark required when the two-dimensional attitude angle is calculated according to the position relationship of each small hole of the pinhole diaphragm and the centroid coordinate of each light spot corresponding to the same path of parallel light.
The imaging unit 61 further includes: an image sensor driving unit 613 for driving the image sensor 612;
and an interface unit 614 for connecting to the computer processing unit 62 for data transmission.
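Tying the units together, a minimal end-to-end sketch of the processing pipeline follows; it reuses the illustrative functions sketched earlier, and the construction of the image-space direction vectors from the two-dimensional attitude angles is an assumption rather than the patent's prescription:

```python
import math

def measure_attitude(spot_regions, spot_origins, hole_proj, F, U1, U2):
    """End-to-end sketch combining the previous illustrative functions.

    spot_regions : list of 2-D gray-value arrays, one per detected spot
    spot_origins : list of (x0, y0) image-plane origins of those regions
    hole_proj    : {'A','B','C'} -> pinhole projections on the image plane
    F            : focal length of the system (fig. 3)
    U1, U2       : direction vectors of the two light paths in the
                   inertial coordinate system (from calibration)
    """
    spots = [{'max_pix': region.max(),
              'centroid': spot_centroid(region, x0, y0)}
             for region, (x0, y0) in zip(spot_regions, spot_origins)]
    group1, group2 = group_spots(spots)                 # brighter, dimmer
    g1 = label_group([s['centroid'] for s in group1])   # marks A1, B1, C1
    g2 = label_group([s['centroid'] for s in group2])   # marks A2, B2, C2
    ax1, ay1 = group_attitude_angle(g1, hole_proj, F)
    ax2, ay2 = group_attitude_angle(g2, hole_proj, F)
    # One assumed convention for turning the two angle components into
    # image-space direction vectors for the TRIAD step:
    W1 = [math.tan(ax1), math.tan(ay1), 1.0]
    W2 = [math.tan(ax2), math.tan(ay2), 1.0]
    Ms = triad(W1, W2, U1, U2)                          # equation (10)
    return angles_from_matrix(Ms)                       # equation (5)
```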
By the scheme of the invention, the measurement of the three-dimensional attitude angle can reach the wide-range angle measurement range; in addition, because two paths of parallel light rays and the pinhole diaphragm are adopted as the optical system, the design of the system is independent of the working distance.
The above description is only exemplary of the present invention and should not be taken as limiting the scope of the present invention, and any modifications, equivalents, improvements, etc. that are within the spirit and principle of the present invention should be included in the present invention.

Claims (8)

1. A method for measuring a three-dimensional attitude angle, the method comprising:
forming a light spot image on the image sensor by the two paths of parallel light rays through the pinhole diaphragm;
calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value; respectively obtaining corresponding two-dimensional attitude angles according to the light spot centroid coordinates corresponding to the two parallel light rays;
and obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively.
2. The measurement method according to claim 1, wherein the pinhole diaphragm employs a plurality of array apertures;
the two paths of parallel light rays have different illumination intensities and fixed included angles and pass through the pinhole diaphragm at different incidence angles.
3. The measurement method according to claim 1, wherein the step of dividing all the light spots into two groups according to the size of the maximum gray value of each light spot is as follows: the maximum gray value of each light spot is arranged according to the sequence from large to small, and the arranged light spots are divided into two groups from the middle.
4. The measurement method according to claim 1, characterized in that the method further comprises: after all light spots are grouped and identified, according to the position relation of each small hole of the pinhole diaphragm and the mass center coordinate of each light spot corresponding to the same path of parallel light, mass center marks obtained when the mass center coordinate of the light spots is calculated and light spot marks needed when the two-dimensional attitude angle is calculated are in one-to-one correspondence.
5. A system for measuring a three-dimensional attitude angle, the system comprising: an imaging unit, a computer processing unit; wherein;
the imaging unit is used for forming light spot images on the image sensing device by the two paths of parallel light rays through the pinhole diaphragm and transmitting the imaging area of each light spot to the computer processing unit;
the computer processing unit is used for calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value; respectively obtaining corresponding two-dimensional attitude angles according to the centroid coordinates of the light spots corresponding to the two parallel light rays; and then obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angle.
6. The measurement system of claim 5, wherein the imaging unit comprises: pinhole diaphragms, image sensors; wherein,
the pinhole diaphragm is provided with a plurality of array pinholes and is used for forming a plurality of light spots on the image sensor when two paths of parallel light rays pass through;
and the image sensor is used for converting the optical signals of the imaging areas of the light spots into electric signals and transmitting the electric signals to the computer processing unit one by one.
7. The measurement system of claim 5 or 6, wherein the computer processing unit comprises: the system comprises a system modeling unit, a centroid calculating unit, a light spot grouping and identifying unit, a two-dimensional attitude angle calculating unit and a three-dimensional attitude angle calculating unit; wherein,
the system modeling unit is used for performing system modeling on the pinhole diaphragm and the imaging surface of the image sensor;
the centroid calculation unit is used for calculating the centroid coordinates of each light spot according to the imaging area of each light spot;
the light spot grouping and identifying unit is used for calculating the maximum gray value of each light spot, averagely dividing all the light spots into two groups according to the maximum gray value of each light spot, wherein the parallel light rays with high illumination intensity correspond to a group of light spots with high maximum gray value, and the parallel light rays with low illumination intensity correspond to a group of light spots with low maximum gray value;
the two-dimensional attitude angle calculation unit is used for calculating two-dimensional attitude angles corresponding to the two parallel light rays according to the light spot centroid coordinates corresponding to the two parallel light rays;
and the three-dimensional attitude angle calculation unit is used for obtaining a three-dimensional attitude angle based on a double-vector attitude determination principle according to the two-dimensional attitude angles corresponding to the two paths of parallel light rays respectively.
8. The measuring system according to claim 7, wherein the light spot grouping and identifying unit is further configured to perform a one-to-one correspondence between the centroid mark obtained when the centroid coordinate of the light spot is calculated and the light spot mark required when the two-dimensional attitude angle is calculated, according to the position relationship of the small holes of the pinhole diaphragm and the centroid coordinates of the light spots corresponding to the same path of parallel light.
CN201010606504XA 2010-12-24 2010-12-24 Method and system for measuring three-dimension altitude angle Expired - Fee Related CN102135421B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010606504XA CN102135421B (en) 2010-12-24 2010-12-24 Method and system for measuring three-dimension altitude angle

Publications (2)

Publication Number Publication Date
CN102135421A CN102135421A (en) 2011-07-27
CN102135421B true CN102135421B (en) 2012-11-07

Family

ID=44295272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010606504XA Expired - Fee Related CN102135421B (en) 2010-12-24 2010-12-24 Method and system for measuring three-dimension altitude angle

Country Status (1)

Country Link
CN (1) CN102135421B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107478195A (en) * 2017-09-15 2017-12-15 哈尔滨工程大学 One kind is based on optical space object status measurement apparatus and its measuring method

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102636150B (en) * 2012-04-28 2013-12-11 北京航空航天大学 Method for quickly determining attitude angles of spatial axisymmetric rigid-body target
CN102901459B (en) * 2012-10-08 2015-07-01 中国人民解放军国防科学技术大学 Device and method for measuring angular-second three-dimensional deformation based on collimation light path
CN104097793B (en) * 2014-06-24 2017-01-11 上海微小卫星工程中心 Zero momentum magnetic control sun capture device and method of satellite
CN104990533B (en) * 2015-06-22 2019-01-08 哈尔滨工业大学 Satellite ground physical simulation system superhigh precision attitude measurement method and device
CN106052631B (en) * 2016-05-10 2018-07-24 哈尔滨理工大学 A method of three-dimensional low-angle is measured based on auto-collimation principle
CN106092069B (en) * 2016-08-24 2019-05-07 湖南科天健光电技术有限公司 A kind of device of guiding and displacement monitoring in underground engineering construction
CN109540084B (en) * 2018-10-25 2021-02-09 北京航天控制仪器研究所 Method and device for measuring three-dimensional attitude of part in floating liquid
CN109579782B (en) * 2019-01-11 2021-01-08 哈尔滨工业大学 High-precision large-working-distance auto-collimation three-dimensional angle measuring device and method
CN109631827B (en) * 2019-01-11 2021-03-02 哈尔滨工业大学 Double-light-source high-precision anti-interference large-working-distance auto-collimation device and method based on absolute angle measurement
CN109579781B (en) * 2019-01-11 2021-01-12 哈尔滨工业大学 High-precision large-working-distance auto-collimation three-dimensional absolute angle measuring device and method
CN111238438B (en) * 2020-02-14 2022-03-11 天津时空经纬测控技术有限公司 Non-contact attitude measurement method and storage medium
CN111256650A (en) * 2020-02-14 2020-06-09 天津时空经纬测控技术有限公司 Non-contact attitude measurement method, non-contact attitude measurement device, and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101666640A (en) * 2009-09-27 2010-03-10 北京航空航天大学 Method and system for measuring two-dimensional attitude angle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Jiang Jie et al. "Optical *** design of an attitude angle sensor with a large field of view." 《仪器仪表学报》 (Chinese Journal of Scientific Instrument), 2010, vol. 31, no. 1, pp. 68-72. *
Jiang Jie et al. "Dynamic attitude angle measurement *** using point structured light." 《红外与激光工程》 (Infrared and Laser Engineering), 2010, vol. 39, no. 3, pp. 532-536. *

Also Published As

Publication number Publication date
CN102135421A (en) 2011-07-27

Similar Documents

Publication Publication Date Title
CN102135421B (en) Method and system for measuring three-dimension altitude angle
CN102788559B (en) Optical vision measuring system with wide-field structure and measuring method thereof
CN109084724A (en) A kind of deep learning barrier distance measuring method based on binocular vision
CN109767476A (en) A kind of calibration of auto-focusing binocular camera and depth computing method
CN101666640B (en) Method and system for measuring two-dimensional attitude angle
CN104266608B (en) Field calibration device for visual sensor and calibration method
CN110009682B (en) Target identification and positioning method based on monocular vision
CN104613871B (en) Calibration method of coupling position relationship between micro lens array and detector
CN103759669A (en) Monocular vision measuring method for large parts
CN109859272A (en) A kind of auto-focusing binocular camera scaling method and device
CN103649674A (en) Measurement device and information processing device
CN103411553A (en) Fast calibration method of multiple line structured light visual sensor
CN105157592A (en) Binocular vision-based method for measuring deformation shape and deformation rate of flexible trailing edge of adaptive wing
CN104101331A (en) Method used for measuring pose of non-cooperative target based on complete light field camera
CN103903260A (en) Target method for quickly calibrating intrinsic parameters of vidicon
CN109827521A (en) Calibration method for rapid multi-line structured optical vision measurement system
CN103697883B (en) A kind of aircraft horizontal attitude defining method based on skyline imaging
AU2019353165B2 (en) Optics based multi-dimensional target and multiple object detection and tracking method
CN108469254A (en) A kind of more visual measuring system overall calibration methods of big visual field being suitable for looking up and overlooking pose
JP6333396B2 (en) Method and apparatus for measuring displacement of mobile platform
CN110763204A (en) Planar coding target and pose measurement method thereof
CN106679616A (en) Image ranging system, light source module and image sensing module
US20190339071A1 (en) Marker, and Posture Estimation Method and Position and Posture Estimation Method Using Marker
KR101188357B1 (en) Method for three dimensional recognition by two dimensional detection of multi projected light with predetermined magnification angle
CN113028990A (en) Laser tracking attitude measurement system and method based on weighted least square

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121107

Termination date: 20211224

CF01 Termination of patent right due to non-payment of annual fee