CN113776556B - Gyroscope and camera relative position matrix calibration method based on data fusion - Google Patents
- Publication number
- CN113776556B CN113776556B CN202110596374.4A CN202110596374A CN113776556B CN 113776556 B CN113776556 B CN 113776556B CN 202110596374 A CN202110596374 A CN 202110596374A CN 113776556 B CN113776556 B CN 113776556B
- Authority
- CN
- China
- Prior art keywords
- gyroscope
- camera
- matrix
- image
- relative position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 239000011159 matrix material Substances 0.000 title claims abstract description 118
- 238000000034 method Methods 0.000 title claims abstract description 47
- 230000004927 fusion Effects 0.000 title claims abstract description 14
- 238000001514 detection method Methods 0.000 claims abstract description 35
- 238000004422 calculation algorithm Methods 0.000 claims abstract description 16
- 238000002474 experimental method Methods 0.000 claims abstract description 10
- 238000013519 translation Methods 0.000 claims description 18
- 238000005457 optimization Methods 0.000 claims description 13
- 238000001914 filtration Methods 0.000 claims description 11
- 238000003384 imaging method Methods 0.000 claims description 8
- 230000017105 transposition Effects 0.000 claims description 8
- 238000012545 processing Methods 0.000 claims description 7
- 238000009499 grossing Methods 0.000 claims description 4
- 238000013507 mapping Methods 0.000 claims description 3
- 238000006243 chemical reaction Methods 0.000 description 5
- 238000005259 measurement Methods 0.000 description 5
- 238000004364 calculation method Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 238000011160 research Methods 0.000 description 3
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 238000012897 Levenberg–Marquardt algorithm Methods 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Automation & Control Theory (AREA)
- Data Mining & Analysis (AREA)
- Computational Mathematics (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Manufacturing & Machinery (AREA)
- Computing Systems (AREA)
- Algebra (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Gyroscopes (AREA)
- Studio Devices (AREA)
Abstract
The invention discloses a gyroscope and camera relative position matrix calibration method based on data fusion. First, a gyroscope sensor and a visible light camera are fixedly connected to a rotary platform, so that the sensor, the camera and the rotary platform form, from top to bottom, a rigidly connected photoelectric detection system. Then, the horizontal and pitch angles of the rotary platform are varied, a number of checkerboard images are captured, and the angular velocity output by the gyroscope sensor at each corresponding position is recorded. Next, the captured checkerboard images are processed with the Harris corner detection algorithm to extract the pixel coordinates of the feature points on the checkerboard. Finally, the relative position matrix of the gyroscope and the camera is calibrated using the coordinates of the checkerboard feature points and the angular velocity output by the gyroscope sensor. Experiments show that the proposed method can accurately calibrate the gyroscope-camera relative position matrix of the photoelectric detection system from the gyroscope angular velocity and the checkerboard feature point coordinates, and is feasible in practical engineering applications.
Description
Technical Field
The invention belongs to the technical field of image processing in computer vision, and particularly relates to a gyroscope and camera relative position matrix calibration method based on data fusion.
Background
A visible light or infrared detector, GPS and inertial navigation sensors are strapped down together to form a photoelectric detection system for detecting the attitude and spatial coordinates of a target. Such systems are widely applied in military and civil fields such as aerospace navigation, robot control and unmanned aerial vehicle attitude estimation (Shaeffer D K. MEMS inertial sensors: A tutorial overview [J]. IEEE Communications Magazine 2013, 51, 100-109.; Park S K, Suh Y S. A Zero Velocity Detection Algorithm Using Inertial Sensors for Pedestrian Navigation Systems [J]. Sensors 2010, 10, 9163-78.; Frank K, Nadales M J V, Robertson P, et al. Reliable Real-Time Recognition of Motion Related Human Activities Using MEMS Inertial Sensors [C]//ION GNSS. DLR, 2010: 2919-2932.).
In the above photoelectric detection system, the visible light detector is usually fixedly connected with the gyroscope and placed on the rotating platform. In engineering applications, because the relative rotation and translation between the camera and the gyroscope are far smaller than the motion parameters of the whole photoelectric detection system, the relative position matrix between the gyroscope and the camera is usually ignored when target attitude detection is carried out (Sun T, Xing F, You Z. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers. Sensors 2013, 13, 4598-4623.; Cheraghi S A, Sheikh U U. Moving object detection using image registration for a moving camera platform [C]//IEEE International Conference on Control System, Computing and Engineering. IEEE, 2013: 355-359.; Yan D J, Shu-Yan X U, Han C S. Effect of aerocraft attitude on image motion compensation of space camera [J]. Optics & Precision Engineering 2008, 16(11).). In a high-precision measurement system, however, this relative position matrix has a definite influence on the estimation accuracy, so research on methods for calibrating the matrix is of real significance.
Disclosure of Invention
The invention aims to provide a gyroscope and camera relative position matrix calibration method based on data fusion.
The technical solution for realizing the purpose of the invention is as follows: a method for calibrating a gyroscope and camera relative position matrix based on data fusion comprises the following steps:
(10) Building an experiment system: the gyroscope sensor is fixedly connected with the visible light camera and fixedly connected to the rotary platform, so that the sensor, the camera and the rotary platform sequentially form a fixedly connected photoelectric detection system from top to bottom;
(20) Shooting images: changing the horizontal and pitch angles of the rotary platform, and shooting a number of images I_i (i = 1, 2, …, n) containing a checkerboard, wherein I_i denotes the i-th frame image, i the current frame number and n the total number of frames shot; and recording the angular velocity information ω output by the gyroscope sensor at the corresponding position;
(30) Extracting checkerboard corner points: processing the captured checkerboard images with the Harris corner detection algorithm, and extracting the pixel coordinates m̃_i = [ũ_i, ṽ_i]^T (i = 1, 2, …, n′) of the feature points on the checkerboard in the images, wherein m̃_i denotes the feature point pixel coordinates obtained by the Harris corner detection algorithm in the i-th frame image, n′ denotes the total number of frames shot, and the superscript T denotes matrix transposition;
(40) Gyroscope data processing: filtering and smoothing the angular velocity of the gyroscope sensor with a Kalman filtering algorithm to obtain the denoised gyroscope angular velocity [ω_x, ω_y, ω_z]^T;
(50) Quaternion representation of the rotation matrix: obtaining the rotation-translation matrix [R_{1,n}|t_{1,n}] at the corresponding position n with a quaternion-based fourth-order Runge-Kutta method, wherein R_{1,n} denotes the rotation matrix between the initial position and position n, and t_{1,n} denotes the translation vector between the initial position and position n;
(60) Calibrating the relative position matrix: using the coordinate information m̃_i of the feature points on the checkerboard images and the angular velocity information ω output by the gyroscope sensor to calibrate the relative position matrix [R_cg|t_cg] of the gyroscope and the camera, wherein R_cg denotes the rotation matrix between the gyroscope and the camera, and t_cg denotes the translation vector between the gyroscope and the camera.
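The corner-extraction step (30) can be sketched with a minimal numpy-only Harris response; the window size and the constant k below are illustrative choices, and a real pipeline would more likely use OpenCV's cv2.cornerHarris or cv2.findChessboardCorners with sub-pixel refinement:

```python
import numpy as np

def harris_response(img, k=0.04, win=2):
    """Harris corner response for a grayscale image (2-D float array).

    Minimal sketch of step (30): structure tensor summed over a square
    window, response R = det(M) - k * trace(M)^2.
    """
    # Image gradients (np.gradient returns d/d(rows), d/d(cols))
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):
        # Box-window sum via shifted copies (wraps at borders; fine for a sketch)
        out = np.zeros_like(a)
        for dy in range(-win, win + 1):
            for dx in range(-win, win + 1):
                out += np.roll(np.roll(a, dy, 0), dx, 1)
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

Peaks of the response mark checkerboard junctions; thresholding plus non-maximum suppression would then yield the corner list.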
The experimental system constructing step (10) comprises the following steps: the gyro sensor is fixed on the camera, and then the gyro sensor and the camera are integrally fixed on the rotary platform, so that the gyro sensor, the camera and the rotary platform sequentially form a fixedly connected photoelectric detection system from top to bottom.
The (20) image capturing step includes: using the photoelectric detection system built in step (10) to shoot checkerboard images at different horizontal and pitch angles of the rotating platform, and recording the corresponding angular velocity output of the gyroscope sensor at each position.
The (50) quaternion representation of the rotation matrix is given by formula (1):

[q̇_0]         [ 0     −ω_x   −ω_y   −ω_z ] [q_0]
[q̇_1] = (1/2) [ ω_x    0      ω_z   −ω_y ] [q_1]
[q̇_2]         [ ω_y   −ω_z    0      ω_x ] [q_2]
[q̇_3]         [ ω_z    ω_y   −ω_x    0   ] [q_3]   (1)

In formula (1), [ω_x, ω_y, ω_z]^T denotes the angular velocity of the gyroscope sensor after Kalman filtering, and the superscript T denotes matrix transposition; [q_0, q_1, q_2, q_3] denotes the quaternion at the previous position, and [q̇_0, q̇_1, q̇_2, q̇_3]^T its rate, which is integrated over one sampling period to obtain the quaternion at the current position.
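A sketch of the quaternion-rate matrix behind formula (1); the skew-symmetric layout is the standard strapdown convention, and the single Euler step in the usage below is only illustrative (the patent itself propagates with the fourth-order method of step (50)):

```python
import numpy as np

def omega_matrix(w):
    """4x4 quaternion-rate matrix Omega(w) for angular velocity w = (wx, wy, wz).

    The quaternion kinematics of formula (1) read q_dot = 0.5 * Omega(w) @ q.
    """
    wx, wy, wz = w
    return np.array([
        [0.0, -wx, -wy, -wz],
        [ wx, 0.0,  wz, -wy],
        [ wy, -wz, 0.0,  wx],
        [ wz,  wy, -wx, 0.0],
    ])
```

For example, one explicit Euler step from the identity quaternion with ω = (0, 0, ω_z) yields q ≈ [1, 0, 0, ω_z·T/2], the small-angle quaternion for a rotation about z.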
The step of calibrating the relative position matrix (60) comprises the following steps:
(601) Constructing a gyroscope and camera corresponding position matrix model according to the camera imaging geometric model through the mapping relation between the feature point image coordinates and the world coordinates on the checkerboard;
(602) Establishing an optimization equation for solving a relative position matrix of the gyroscope and the camera by utilizing a minimum reprojection error principle;
(603) And (3) realizing calibration of a relative position matrix of the gyroscope and the camera by utilizing fusion of gyroscope sensor data and camera image data.
The (602) relative position matrix optimization equation establishment step includes:
An optimization equation for solving the relative position matrix of the gyroscope and the camera is established with the Levenberg-Marquardt method, as shown in formula (2), and the relative position matrix [R_cg|t_cg] of the gyroscope and the camera is computed from it, thereby realizing the calibration:

min f([R_cg|t_cg], [R_n|t_n]) = min Σ_n Σ_j ‖ m̃_n^j − (1/s_n) K [R_cg|t_cg]^{−1}[R_{1,n}|t_{1,n}][R_cg|t_cg][R_1|t_1][X_w, Y_w, Z_w, 1]^T ‖^2   (2)

In formula (2), [R_cg|t_cg] is the gyroscope-camera relative position matrix to be calibrated, where R_cg denotes the rotation matrix between the gyroscope and the camera, and t_cg the translation vector between the gyroscope and the camera; s_n denotes a scale factor; K denotes the camera intrinsic matrix; [R_i|t_i] (i = 1, …, n) denotes the camera extrinsic matrix corresponding to the i-th frame image, where i is the current frame number and n the total number of frames shot; [R_{1,n}|t_{1,n}] denotes the rotation-translation matrix of the system from image I_1 to image I_n, where R_{1,n} is the rotation matrix and t_{1,n} the translation vector; [X_w, Y_w, Z_w, 1]^T denotes the homogeneous world coordinates of the feature points on the checkerboard; m̃_n^j denotes the homogeneous image coordinates of the feature points on the checkerboard obtained by the Harris corner detection algorithm; the superscript T denotes matrix transposition and the superscript −1 the matrix inverse; min f(·) denotes the minimum of the function f(·), and Σ(·) the summation.
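The Levenberg-Marquardt solver behind formula (2) can be sketched as a damped Gauss-Newton loop; this is a generic toy implementation on arbitrary residuals, not the patent's reprojection problem, and in practice one would call a library routine such as scipy.optimize.least_squares(method='lm'):

```python
import numpy as np

def levenberg_marquardt(residual, jac, p0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop: min_p ||residual(p)||^2."""
    p = np.asarray(p0, dtype=float)
    cost = np.sum(residual(p) ** 2)
    for _ in range(iters):
        r, J = residual(p), jac(p)
        # Damped normal equations: (J^T J + lam*I) step = -J^T r
        A = J.T @ J + lam * np.eye(len(p))
        step = np.linalg.solve(A, -J.T @ r)
        cost_new = np.sum(residual(p + step) ** 2)
        if cost_new < cost:          # accept step, reduce damping
            p, cost, lam = p + step, cost_new, lam * 0.5
        else:                        # reject step, increase damping
            lam *= 2.0
    return p
```

Fitting a small nonlinear model such as y = exp(a·x) + b from noisy-free samples recovers (a, b), which illustrates the accept/reject damping behavior the method relies on.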
Compared with the prior art, the invention has notable advantages: a gyroscope-camera relative position matrix model is constructed from the camera imaging geometry through the mapping between the image coordinates and the world coordinates of the checkerboard feature points, and the relative position matrix between the gyroscope sensor and the camera in the photoelectric detection system is calibrated by fusing the gyroscope sensor data with the camera image data. Compared with traditional methods for calibrating the rotation matrix of a photoelectric measurement system, the disclosed method calibrates the relative position matrix by fusing the gyroscope angular velocity with the checkerboard feature point coordinates; increasing the number of image frames in the experiment does not increase the number of unknown parameters but does improve the accuracy of the calibration result, which makes the method practical in engineering.

The method fuses the angular velocity information of the gyroscope sensor with the coordinate information of the feature points on the checkerboard images to calibrate the gyroscope-camera relative position matrix. Experiments show that the proposed method can accurately calibrate the relative position matrix of the gyroscope and the camera in the photoelectric detection system and has practical feasibility.
Drawings
FIG. 1 is a flow chart of the gyroscope and camera relative position matrix calibration method based on data fusion.
Fig. 2 is a camera imaging geometry model.
Fig. 3 is a conversion relationship between a camera coordinate system and a gyro coordinate system.
Fig. 4 is a schematic diagram of experimental system setup.
Fig. 5 is a schematic diagram of an experimentally photographed checkerboard image.
FIG. 6 is a feature point coordinate re-projection error comparison on a checkerboard.
Detailed Description
The invention provides a gyroscope and camera relative position matrix calibration method based on data fusion, which has the following basic ideas:
Step one: and (5) building an experiment system. The gyroscope sensor is fixedly connected with the visible light camera and fixedly connected to the rotary platform, so that the sensor, the camera and the rotary platform sequentially form a fixedly connected photoelectric detection system from top to bottom;
Step two: an image is photographed. Changing the horizontal and pitching angles of the rotary platform, shooting a plurality of images containing checkerboard, and recording the angular velocity information output by the gyroscope sensor at the corresponding position;
Step three: processing the checkerboard image obtained by shooting by adopting a Harris corner detection algorithm, and extracting pixel coordinate information of characteristic points on the checkerboard in the image;
Step four: filtering and smoothing the angular velocity of the gyroscope sensor by adopting a Kalman filtering algorithm to obtain filtered angular velocity information;
Step five: A quaternion-based fourth-order Runge-Kutta method is used to obtain the rotation-translation matrix at the corresponding position;
Step six: and calibrating a relative position matrix of the gyroscope and the camera by utilizing the coordinate information of the characteristic points on the checkerboard image and the angular velocity information output by the gyroscope sensor.
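Step four above can be sketched as a scalar Kalman filter applied to each gyro axis; the random-walk state model and the noise variances q and r below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=1e-2):
    """Scalar Kalman filter over one gyro axis.

    State: the true angular velocity, modeled as a slow random walk;
    q = process noise variance, r = measurement noise variance (assumed).
    """
    x, p = float(z[0]), 1.0
    out = np.empty(len(z))
    for k, zk in enumerate(z):
        p = p + q                   # predict (state assumed nearly constant)
        kgain = p / (p + r)         # Kalman gain
        x = x + kgain * (zk - x)    # update with measurement zk
        p = (1.0 - kgain) * p
        out[k] = x
    return out
```

On a noisy constant-rate signal the filtered output has visibly lower variance than the raw samples while tracking the underlying rate.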
Quaternion-based description of the gyroscope rotation matrix
The rigid-body target rotation or pose transformation is typically described by a quaternion Q(q_0, q_1, q_2, q_3); the rotation matrix R_g_quat between two positions can then be represented by the quaternion:

           [ q_0^2+q_1^2−q_2^2−q_3^2   2(q_1q_2−q_0q_3)          2(q_1q_3+q_0q_2)        ]
R_g_quat = [ 2(q_1q_2+q_0q_3)          q_0^2−q_1^2+q_2^2−q_3^2   2(q_2q_3−q_0q_1)        ]   (1)
           [ 2(q_1q_3−q_0q_2)          2(q_2q_3+q_0q_1)          q_0^2−q_1^2−q_2^2+q_3^2 ]
The quaternion in formula (1) is propagated over time: the quaternion Q_t at the current position is obtained from the quaternion Q_{t−1} at the previous position and the angular velocity information ω output by the gyroscope sensor through the quaternion differential equation, that is:

Q̇_t = (1/2) Q_{t−1} ⊗ [0, ω_x, ω_y, ω_z]^T   (2)

In formula (2), the symbol ⊗ denotes quaternion multiplication; the subscript t denotes the current position and the subscript t−1 the previous position.
Formula (2) can be represented in matrix form, namely:

[q̇_0]         [ 0     −ω_x   −ω_y   −ω_z ] [q_0]
[q̇_1] = (1/2) [ ω_x    0      ω_z   −ω_y ] [q_1]
[q̇_2]         [ ω_y   −ω_z    0      ω_x ] [q_2]
[q̇_3]         [ ω_z    ω_y   −ω_x    0   ] [q_3]   (3)

In formula (3), [ω_x, ω_y, ω_z]^T denotes the angular velocity of the gyroscope sensor after Kalman filtering, and the superscript T denotes matrix transposition; [q_0, q_1, q_2, q_3] denotes the quaternion at the previous position, and [q̇_0, q̇_1, q̇_2, q̇_3]^T its rate, which integrates to the quaternion at the current position.
In the strapdown inertial system, to improve the solution accuracy of the quaternion, the quaternion differential equation can be solved with the fourth-order Runge-Kutta method, that is:

q_{t+T} = q_t + (T/6)(K_1 + 2K_2 + 2K_3 + K_4)   (4)

In formula (4), T denotes the sampling period of the gyroscope sensor and t the current moment. Writing the right-hand side of formula (3) as f(q) = (1/2)Ω(ω)q, [K_1, K_2, K_3, K_4] can be expressed as:

K_1 = f(q_t),  K_2 = f(q_t + (T/2)K_1),  K_3 = f(q_t + (T/2)K_2),  K_4 = f(q_t + T·K_3)   (5)
Quaternion multiplication is detailed in the literature (Qin Y Y. The Relationship between Quaternion and Pose. Inertial Navigation, 2nd ed.; Science Press, Beijing, China, 2014, Volume 3, pp. 292-297.); the quaternion differential equation solution based on the fourth-order Runge-Kutta method is detailed in the literature (Baritzhack I Y. New Method for Extracting the Quaternion from a Rotation Matrix [J]. Journal of Guidance Control & Dynamics 2015, 23, 1085-1087.).
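The fourth-order Runge-Kutta propagation of formulas (4) and (5) can be sketched as follows, assuming the angular velocity is constant over one sampling period; the final renormalization is a common practical addition not spelled out in the text:

```python
import numpy as np

def rk4_quat_step(q, w, T):
    """One fourth-order Runge-Kutta step of q_dot = 0.5 * Omega(w) @ q."""
    wx, wy, wz = w
    Om = 0.5 * np.array([
        [0.0, -wx, -wy, -wz],
        [ wx, 0.0,  wz, -wy],
        [ wy, -wz, 0.0,  wx],
        [ wz,  wy, -wx, 0.0],
    ])
    f = lambda qq: Om @ qq           # right-hand side of formula (3)
    k1 = f(q)                        # formula (5)
    k2 = f(q + 0.5 * T * k1)
    k3 = f(q + 0.5 * T * k2)
    k4 = f(q + T * k3)
    q_next = q + (T / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)   # formula (4)
    return q_next / np.linalg.norm(q_next)  # keep the quaternion unit-length
```

With the gyroscope period T = 0.005 s used in the experiments, one such step is applied per angular-velocity sample; for example, integrating a constant rate of π/2 rad/s about z for one second yields the quaternion of a 90-degree rotation.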
Concept of calibrating relative position matrix of gyroscope and camera
In a photoelectric detection system in which a gyroscope and a camera are fixedly connected, the camera coordinate system and the gyroscope coordinate system do not coincide, and calibrating the relative position matrix between the gyroscope and the camera helps improve the accuracy of subsequent target detection and attitude calculation.
1. Camera imaging geometry model
The camera imaging process generally involves four coordinate systems, namely: the world coordinate system O_w-X_wY_wZ_w, the camera coordinate system O_c-X_cY_cZ_c, the image coordinate system c-xy, and the pixel coordinate system O-uv. Fig. 2 shows the conversion relationships among these four coordinate systems.
From the camera imaging geometry model, the relationship between the pixel coordinate system O-uv and the world coordinate system O_w-X_wY_wZ_w can be expressed as:

s[u, v, 1]^T = K[I|0][R|t][X_w, Y_w, Z_w, 1]^T   (6)

where

    [ f_x   0    c_x ]
K = [ 0     f_y  c_y ]
    [ 0     0    1   ]

is the camera intrinsic matrix, with (f_x, f_y) the camera focal lengths and (c_x, c_y) the principal point coordinates, and [R|t] is the camera extrinsic matrix; both can be obtained by the Zhang Zhengyou checkerboard camera parameter calibration method. In formula (6), I is a 3×3 identity matrix; s is a scale factor; (u, v, 1)^T, (X_c, Y_c, Z_c, 1)^T and (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates of a feature point in the image, camera and world coordinate systems respectively, and the superscript T denotes matrix transposition.
Formula (6) is the basic equation for target attitude calculation under ideal conditions. Before three-dimensional space coordinates can be reconstructed from two-dimensional image coordinates, the camera extrinsic matrix [R|t] must be calibrated in advance.
The Zhang Zhengyou checkerboard camera parameter calibration method is detailed in literature (Zhang Z.Flexible Camera Calibration by Viewing a Plane from Unknown Orientations[C]//IEEE International Conference on Computer Vision.IEEE,1999;pp.666-673.).
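Formula (6) can be sketched as a projection routine; the intrinsic values below are placeholders for illustration, not the patent's calibrated matrix K:

```python
import numpy as np

def project(K, R, t, Xw):
    """Project a world point via s*[u, v, 1]^T = K [R|t] [Xw, 1]^T (formula (6))."""
    Xc = R @ np.asarray(Xw, dtype=float) + t   # world -> camera frame
    m = K @ Xc                                 # homogeneous pixel coordinates
    return m[:2] / m[2]                        # divide out the scale s = Z_c

# Hypothetical intrinsics: 800-pixel focal lengths, principal point (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
```

A point on the optical axis projects to the principal point, and off-axis points shift by f·X/Z, which is a quick sanity check of the model.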
2. Matrix model of relative position of gyroscope and camera
The relative positional relationship between the gyroscope and the camera is shown in Fig. 3. Defining the rotation-translation matrix [R_cg|t_cg] between the gyroscope coordinate system and the camera coordinate system as the gyroscope-camera relative position matrix to be calibrated, the two coordinate systems are related by:

[X_g, Y_g, Z_g, 1]^T = [R_cg|t_cg][X_c, Y_c, Z_c, 1]^T   (8)
When the detection system moves from position 1 to position 2, the gyroscope coordinate transformation can be expressed as:

[X_g2, Y_g2, Z_g2]^T = R_12[X_g1, Y_g1, Z_g1]^T + t_12   (9)

In formula (9), (X_g1, Y_g1, Z_g1)^T and (X_g2, Y_g2, Z_g2)^T denote the gyroscope coordinates at positions 1 and 2 respectively; R_12 = R_g_quat2 · R_g_quat1^{−1} is the rotation matrix of the detection system moving from position 1 to position 2, where R_g_quat1 and R_g_quat2 can be obtained from formulas (1) and (3), and the superscript −1 denotes the matrix inverse; t_12 is the translation vector of the detection system from position 1 to position 2, obtainable either by double integration of the acceleration sensor data or by direct measurement.
Combining formula (8) and formula (9) yields the conversion relationship between the camera coordinate systems at positions 1 and 2, that is:

[X_c2, Y_c2, Z_c2, 1]^T = [R_cg|t_cg]^{−1}[R_12|t_12][R_cg|t_cg][X_c1, Y_c1, Z_c1, 1]^T   (10)

In formula (10), (X_c1, Y_c1, Z_c1, 1)^T and (X_c2, Y_c2, Z_c2, 1)^T denote the homogeneous coordinates of the feature point in the camera coordinate system at positions 1 and 2 respectively; the superscript T denotes matrix transposition and the superscript −1 the matrix inverse.
Combining formula (6) and formula (10) yields:

[R_2|t_2] = [R_cg|t_cg]^{−1}[R_12|t_12][R_cg|t_cg][R_1|t_1]   (11)

In formula (11), [R_1|t_1] and [R_2|t_2] denote the camera extrinsic matrices at positions 1 and 2 respectively, describing the conversion between the camera coordinate system and the world coordinate system; the superscript −1 denotes the matrix inverse.
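Formula (11) composes naturally with 4×4 homogeneous matrices; the rotations and translations below are hypothetical stand-ins for the calibrated quantities, chosen only to exercise the chain:

```python
import numpy as np

def rot(axis, angle):
    """Rotation matrix about a unit axis by angle (Rodrigues' formula)."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    Kx = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * Kx + (1 - np.cos(angle)) * (Kx @ Kx)

def homogeneous(R, t):
    """Pack [R|t] into a 4x4 homogeneous transform."""
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, t
    return H

# Hypothetical values standing in for the calibrated quantities
H_cg = homogeneous(rot([1, 2, 3], 0.05), [0.01, -0.02, 0.03])  # gyro <-> camera
H_1  = homogeneous(rot([0, 0, 1], 0.30), [0.0, 0.0, 1.0])      # extrinsics, position 1
H_12 = homogeneous(rot([0, 1, 0], 0.20), [0.05, 0.0, 0.0])     # gyro motion 1 -> 2

# Formula (11): extrinsics at position 2 from the gyro-measured motion
H_2 = np.linalg.inv(H_cg) @ H_12 @ H_cg @ H_1
```

Because the chain conjugates [R_12|t_12] by [R_cg|t_cg], the resulting [R_2|t_2] is again a rigid transform: its rotation block stays orthonormal with determinant one.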
3. Camera extrinsic matrix calculation model
According to the camera imaging geometric model, the gyroscope and camera relative position matrix shown in the formula (8) and the camera external parameter matrix shown in the formula (11) can be obtained through calibration of the conversion relation between the image coordinate system shown in the formula (6) and the world coordinate system.
In the calibration process, a black and white standard checkerboard as shown in fig. 3 can be used as a calibration object. The tessellation plane is defined as the O w-XwYw plane of the world coordinate system, where the X w axis and Y w axis are the horizontal and vertical edges of the tessellation, respectively.
At position 1, from formula (6) we obtain:

s_1[ũ_1, ṽ_1, 1]^T = K[R_1|t_1][X_w, Y_w, Z_w, 1]^T   (12)

In formula (12), [ũ_1, ṽ_1, 1]^T denotes the homogeneous image coordinates of the feature points detected by the Harris corner detection algorithm in the 1st frame image at position 1, and s_1 is a scale factor.
When the photoelectric detection system moves to position 2, from formula (11) we obtain:

s_2[ũ_2, ṽ_2, 1]^T = K[R_cg|t_cg]^{−1}[R_12|t_12][R_cg|t_cg][R_1|t_1][X_w, Y_w, Z_w, 1]^T   (13)

In formula (13), [ũ_2, ṽ_2, 1]^T denotes the homogeneous image coordinates of the feature points detected by the Harris corner detection algorithm in the 2nd frame image at position 2, and s_2 is a scale factor.
For the acquired image sequence I_i (i = 1, 2, …, n), formula (13) can be extended to:

s_n[ũ_n, ṽ_n, 1]^T = K[R_cg|t_cg]^{−1}[R_{1,n}|t_{1,n}][R_cg|t_cg][R_1|t_1][X_w, Y_w, Z_w, 1]^T   (14)

In formula (14), [ũ_n, ṽ_n, 1]^T denotes the homogeneous image coordinates of the feature points; [R_n|t_n] is the camera extrinsic matrix at position n; [R_cg|t_cg] is the gyroscope-camera relative position matrix, which is independent of the system's motion angle; [R_{1,n}|t_{1,n}] is the rotation-translation matrix corresponding to the movement from position 1 to position n, obtainable from the gyroscope angular velocity at the corresponding position; and s_n is a scale factor.
4. Gyroscope and camera relative position matrix model optimization
The gyroscope-camera relative position matrix [R_cg|t_cg] and the camera extrinsic matrix [R_n|t_n] can be solved from formula (14) using the images containing the checkerboard. Unlike the traditional Zhang Zhengyou camera parameter calibration method, in which every additional image acquired by the camera introduces additional unknown extrinsic parameters, in the proposed method increasing the number of acquired images does not increase the number of unknowns.
The gyroscope-camera relative position matrix [R_cg|t_cg] and the camera extrinsic matrix [R_n|t_n] in formula (14) can be obtained by solving a nonlinear optimization equation that minimizes the difference between the reprojected coordinates m̂ of the checkerboard feature points on the image plane o-uv and the feature point coordinates m̃ actually detected by the Harris corner detection method. The objective function of this equation can be expressed as:

f([R_cg|t_cg], [R_n|t_n]) = Σ_n Σ_j ‖ m̃_n^j − m̂_n^j ‖^2   (15)
Considering the accuracy of actual measurements, increasing the number of image frames acquired by the camera increases the number of detected feature points, which further improves the solution accuracy of formula (15). The invention therefore adopts the Levenberg-Marquardt method to establish the optimization equation for solving the gyroscope-camera relative position matrix [R_cg|t_cg], namely:

min f([R_cg|t_cg], [R_n|t_n]) = min Σ_n Σ_j ‖ m̃_n^j − (1/s_n) K[R_cg|t_cg]^{−1}[R_{1,n}|t_{1,n}][R_cg|t_cg][R_1|t_1] M_w^j ‖^2   (16)

where M_w^j = [X_w, Y_w, Z_w, 1]^T denotes the homogeneous world coordinates of the j-th checkerboard feature point.
The Levenberg-Marquardt method is described in detail in the literature (Lourakis M I A. A Brief Description of the Levenberg-Marquardt Algorithm Implemented by levmar [J]. Foundation of Research & Technology, 2005.).
One flow of the method of the invention
Step one: and (5) building an experiment system. The gyroscope sensor is fixedly connected with the visible light camera and fixedly connected to the rotary platform, so that the sensor, the camera and the rotary platform sequentially form a fixedly connected photoelectric detection system from top to bottom;
Step two: an image is photographed. Changing the horizontal and pitching angles of the rotary platform, shooting a plurality of images containing checkerboard, and recording the angular velocity information output by the gyroscope sensor at the corresponding position;
Step three: processing the checkerboard image obtained by shooting by adopting a Harris corner detection algorithm, and extracting pixel coordinate information of characteristic points on the checkerboard in the image;
Step four: filtering and smoothing the angular velocity of the gyroscope sensor by adopting a Kalman filtering algorithm;
Step five: A quaternion-based fourth-order Runge-Kutta method is used to obtain the rotation-translation matrix at the corresponding position;
Step six: and calibrating a relative position matrix of the gyroscope and the camera by utilizing the coordinate information of the characteristic points on the checkerboard image and the angular velocity information output by the gyroscope sensor.
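Step three's Harris corner extraction can be sketched in plain numpy; the window size and the constant k below are assumed typical values, not the patent's settings:

```python
import numpy as np

def harris_response(img, window=2, k=0.04):
    """Harris corner response map: R = det(M) - k*trace(M)^2 over a local window."""
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    def box(a):  # windowed sum via shifted copies (wraps at image edges)
        return sum(np.roll(np.roll(a, di, 0), dj, 1)
                   for di in range(-window, window + 1)
                   for dj in range(-window, window + 1))
    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return (Sxx * Syy - Sxy**2) - k * (Sxx + Syy)**2

# Synthetic 2x2 checkerboard: the single interior corner is near (20, 20)
img = np.zeros((40, 40))
img[:20, :20] = 1.0
img[20:, 20:] = 1.0
R = harris_response(img)
corner = np.unravel_index(np.argmax(R), R.shape)
print(corner)  # close to (20, 20)
```

The response peaks at the X-junction where gradients in both directions are strong; on pure edges the determinant term vanishes and R goes negative.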
The beneficial effects of the invention can be further illustrated by the following experiments:
1. Composition of the experimental system and initial system values
The photoelectric detection system for the experiment is formed by fixedly connecting a visible light camera and a gyroscope sensor. The experimental setup is shown in FIG. 4. The visible light camera is a Basler acA-90 gc with a resolution of 658 × 492 pixels; the CCD sensor size is 4.88 mm × 3.66 mm, the lens focal length is 12 mm, and lens distortion is negligible. The gyroscope is a GI550 three-axis inertial sensor from the remifene company, with a sampling period of 0.005 s.
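As a sanity check on these specifications, the focal length in pixels implied by the sensor size and resolution can be computed; this is illustrative arithmetic, not a substitute for the calibrated intrinsic matrix:

```python
# Focal length in pixels implied by the stated specs:
# 658 x 492 pixels on a 4.88 mm x 3.66 mm CCD, 12 mm lens.
sensor_w_mm, sensor_h_mm = 4.88, 3.66
res_w, res_h = 658, 492
f_mm = 12.0

fx = f_mm * res_w / sensor_w_mm   # horizontal focal length in pixels
fy = f_mm * res_h / sensor_h_mm   # vertical focal length in pixels
print(fx, fy)  # ≈ 1618.0, 1613.1
```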
2. Camera internal reference matrix calibration
In solving equation (16), the camera's internal reference (intrinsic) matrix K must be known. The invention adopts the Zhang Zhengyou camera calibration method to calibrate the camera internal reference matrix K; the calibrated internal reference matrix K is:
3. Experimental data measurement and collection
In the experiment, the rotary platform carrying the visible light camera and the gyroscope sensor is moved, images containing the checkerboard are captured at different positions, and the angular velocity data of the gyroscope sensor at the corresponding positions are recorded simultaneously. The images captured in the experiment are shown in FIG. 5.
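The recorded angular velocities are then Kalman-filtered (step four). A minimal scalar-state sketch for one gyroscope axis, with assumed process and measurement noise variances q and r (the patent does not state its filter parameters):

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=0.01):
    """Scalar Kalman filter for one gyroscope axis (random-walk state model).
    q: process noise variance, r: measurement noise variance (assumed values)."""
    x, P = z[0], 1.0
    out = np.empty_like(z)
    for i, zi in enumerate(z):
        P += q                      # predict: state constant, uncertainty grows
        K = P / (P + r)             # Kalman gain
        x += K * (zi - x)           # update with measurement zi
        P *= (1.0 - K)
        out[i] = x
    return out

rng = np.random.default_rng(0)
true_rate = 1.0                                  # rad/s, constant for the demo
z = true_rate + 0.1 * rng.standard_normal(500)   # noisy gyro readings
f = kalman_smooth(z)
# after the transient, the filtered signal sits much closer to the true rate
print(np.mean((z - true_rate)**2), np.mean((f[100:] - true_rate)**2))
```

The same filter run per axis yields the denoised [ω x, ω y, ω z] T used in the later steps.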
4. Gyroscope and camera relative position matrix calibration
Obtaining a relative position matrix of the gyroscope and the camera by adopting a formula (16):
FIG. 6 shows the re-projection error comparison: the re-projected coordinates of the checkerboard feature points on the image plane o-uv (circles) against the feature point coordinates actually detected with the Harris corner detection method (crosses).
Claims (2)
1. A method for calibrating a gyroscope and camera relative position matrix based on data fusion, characterized in that it comprises the following steps:
(10) Building an experiment system: the gyroscope sensor is fixedly connected with the visible light camera and fixedly connected to the rotary platform, so that the sensor, the camera and the rotary platform sequentially form a fixedly connected photoelectric detection system from top to bottom;
(20) Shooting images: changing the horizontal and pitch angles of the rotary platform, and shooting a plurality of images I i containing the checkerboard, i = 1, 2, …, n, wherein I i represents the i-th frame image, i represents the current frame number, and n represents the total number of captured frames; and recording the angular velocity information ω output by the gyroscope sensor at the corresponding position;
(30) Extracting checkerboard corner points: processing the captured checkerboard images with the Harris corner detection algorithm, and extracting the pixel coordinate information of the feature points on the checkerboard in each image, wherein the extracted coordinates are the feature point pixel coordinates obtained by the Harris corner detection algorithm in the i-th frame image, and the superscript T represents matrix transposition;
(40) Gyroscope data processing: filtering and smoothing the angular velocity of the gyroscope sensor with a Kalman filtering algorithm to obtain the gyroscope angular velocity [ ω x,ω y,ω z ] T with noise interference removed;
(50) Quaternion representation of the rotation matrix: the rotation-translation matrix [ R 1,n|t1,n ] at the corresponding position n is obtained with a quaternion-based fourth-order Runge-Kutta method, wherein R 1,n represents the rotation matrix between the initial position and position n, and t 1,n represents the translation vector between the initial position and position n;
(60) Calibrating the relative position matrix: the coordinate information of the feature points on the checkerboard images and the angular velocity information ω output by the gyroscope sensor are used to calibrate the gyroscope and camera relative position matrix [ R cg|tcg ], wherein R cg represents the rotation matrix between the gyroscope and the camera, and t cg represents the translation vector between the gyroscope and the camera;
The step of calibrating the relative position matrix (60) comprises the following steps:
(601) Constructing a gyroscope and camera corresponding position matrix model according to the camera imaging geometric model through the mapping relation between the feature point image coordinates and the world coordinates on the checkerboard;
(602) Establishing an optimization equation for solving a relative position matrix of the gyroscope and the camera by utilizing a minimum reprojection error principle;
(603) The calibration of a gyroscope and a camera relative position matrix is realized by utilizing the fusion of gyroscope sensor data and camera image data;
The (602) relative position matrix optimization equation establishment step includes:
An optimization equation for solving the relative position matrix of the gyroscope and the camera is established with the Levenberg-Marquardt method, as shown in formula (2); the gyroscope and camera relative position matrix [ R cg|tcg ] is calculated with this optimization equation, thereby realizing the calibration of the relative position matrix:
In formula (2), [ R cg|tcg ] is the gyroscope and camera relative position matrix to be calibrated, wherein R cg represents the rotation matrix between the gyroscope and the camera, and t cg represents the translation vector between the gyroscope and the camera; s n represents a scale factor; K represents the camera internal reference matrix; [ R i|ti ], i = 1, …, n, represents the camera external parameter matrix corresponding to the i-th frame image, wherein i represents the current frame number and n represents the total number of captured frames; [ R 1,n|t1,n ] represents the rotation-translation matrix of the system from image I 1 to image I n, wherein R 1,n represents the rotation matrix and t 1,n represents the translation vector; [ X w,Yw,Zw,1 ] T represents the homogeneous world coordinates of the feature points on the checkerboard; the homogeneous image coordinates of the feature points are those obtained by the Harris corner detection algorithm in the i-th frame image; the superscript T represents matrix transposition; the superscript −1 represents the inverse of the matrix;
Also in formula (2), the min f([ R cg|tcg ],[ R n|tn ]) part is the optimization equation established for solving the relative position matrix of the gyroscope and the camera, wherein min f(·) denotes the minimum value of the function f(·), and Σ(·) denotes summation.
2. The method for calibrating a gyroscope and camera relative position matrix based on data fusion according to claim 1, wherein the quaternion representation of the rotation matrix in step (50) is given by formula (1):
In formula (1), [ ω x,ωy,ωz ] T represents the Kalman-filtered angular velocity information of the gyroscope sensor, and the superscript T represents matrix transposition; [ q 0,q1,q2,q3 ] represents the quaternion at the previous position; the left-hand side of formula (1) represents the quaternion at the current position.
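Formula (1) itself survives only as an image in the source. The standard quaternion kinematics a quaternion-based fourth-order Runge-Kutta integrator uses can be sketched in numpy (scalar-first convention assumed); the constant test rotation below is illustrative, not the patent's data:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qdot(q, w):
    """Quaternion kinematics: dq/dt = 0.5 * q x (0, w), w in the body frame."""
    return 0.5 * qmul(q, np.array([0.0, *w]))

def rk4_step(q, w, dt):
    """One fourth-order Runge-Kutta step, renormalized to unit length."""
    k1 = qdot(q, w)
    k2 = qdot(q + 0.5 * dt * k1, w)
    k3 = qdot(q + 0.5 * dt * k2, w)
    k4 = qdot(q + dt * k3, w)
    q = q + dt / 6.0 * (k1 + 2*k2 + 2*k3 + k4)
    return q / np.linalg.norm(q)

# Constant rotation of pi rad/s about z for 1 s at the gyroscope's 0.005 s period
q, w, dt = np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.0, 0.0, np.pi]), 0.005
for _ in range(200):
    q = rk4_step(q, w, dt)
print(q)  # ≈ [0, 0, 0, 1]: a half-turn about z
```

Propagating the filtered gyroscope rates this way yields the quaternion at each position, from which the rotation matrix R 1,n is recovered.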
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110596374.4A CN113776556B (en) | 2021-05-30 | 2021-05-30 | Gyroscope and camera relative position matrix calibration method based on data fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113776556A CN113776556A (en) | 2021-12-10 |
CN113776556B true CN113776556B (en) | 2024-05-07 |
Family
ID=78835745
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110596374.4A Active CN113776556B (en) | 2021-05-30 | 2021-05-30 | Gyroscope and camera relative position matrix calibration method based on data fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113776556B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114910076B (en) * | 2022-05-20 | 2024-04-05 | 泉州装备制造研究所 | Outdoor camera positioning method and device based on GPS and IMU information |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107782309A (en) * | 2017-09-21 | 2018-03-09 | 天津大学 | Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods |
CN207923150U (en) * | 2017-08-04 | 2018-09-28 | 广东工业大学 | A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude |
CN109990801A (en) * | 2017-12-29 | 2019-07-09 | 南京理工大学 | Level meter rigging error scaling method based on plumb line |
CN110633015A (en) * | 2019-10-31 | 2019-12-31 | 江苏北方湖光光电有限公司 | Driver head gesture detection device |
CN110967005A (en) * | 2019-12-12 | 2020-04-07 | 中国科学院长春光学精密机械与物理研究所 | Imaging method and imaging system for on-orbit geometric calibration through star observation |
CN110987021A (en) * | 2019-12-25 | 2020-04-10 | 湖北航天技术研究院总体设计所 | Inertial vision relative attitude calibration method based on rotary table reference |
CN111156998A (en) * | 2019-12-26 | 2020-05-15 | 华南理工大学 | Mobile robot positioning method based on RGB-D camera and IMU information fusion |
CN111429532A (en) * | 2020-04-30 | 2020-07-17 | 南京大学 | Method for improving camera calibration accuracy by utilizing multi-plane calibration plate |
CN111951180A (en) * | 2020-07-09 | 2020-11-17 | 北京迈格威科技有限公司 | Image shake correction method, image shake correction apparatus, computer device, and storage medium |
CN112161586A (en) * | 2020-11-20 | 2021-01-01 | 苏州睿牛机器人技术有限公司 | Line structured light vision sensor calibration method based on coding checkerboard |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||