CN111429524A - Online initialization and calibration method and system for camera and inertial measurement unit - Google Patents

Online initialization and calibration method and system for camera and inertial measurement unit


Publication number
CN111429524A
CN111429524A (application CN202010197372.3A)
Authority
CN
China
Prior art keywords
measurement unit
inertial measurement
camera
frame
rotation angle
Prior art date
Legal status
Granted
Application number
CN202010197372.3A
Other languages
Chinese (zh)
Other versions
CN111429524B (en)
Inventor
王贺升
沈玥伶
Current Assignee
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN202010197372.3A
Publication of CN111429524A
Application granted
Publication of CN111429524B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00: Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a system for online initialization and calibration of a camera and an inertial measurement unit (IMU), comprising the following steps. Step M1: pure-vision initialization, in which the poses of the initial keyframes are obtained from the camera observations and an initial map is constructed. Step M2: joint visual-inertial initialization and calibration, in which, based on the initial keyframe poses and the initial map, the angular-velocity (gyroscope) bias of the inertial measurement unit is initialized and the relative rotation and translation between the camera and the inertial measurement unit are calibrated. Step M3: during camera-IMU fused simultaneous localization and mapping, the gyroscope-bias initialization and the camera-IMU relative rotation and translation calibration are monitored for anomalies and updated. The invention completes the calibration of the camera-IMU relative rotation and relative translation within the joint initialization process, without manual calibration tools such as a calibration board, thereby broadening the range of application.

Description

Online initialization and calibration method and system for camera and inertial measurement unit
Technical Field
The invention relates to the field of sensor fusion, in particular to a method and a system for online initialization and calibration of a camera and an inertial measurement unit, and more particularly to a method and a system for online joint initialization and calibration of a monocular camera and an inertial measurement unit based on an information-content window.
Background
The camera and the inertial measurement unit (IMU) have complementary characteristics and are therefore widely used together in fields such as navigation and three-dimensional reconstruction. The camera captures rich information about the external environment but tends to fail under fast motion, illumination changes, and similar conditions. The IMU measures the robot's own motion at high frequency and is unaffected by the surrounding environment, compensating for the weaknesses of the vision sensor. In turn, the accumulated drift error of the IMU can be corrected effectively through loop detection and loop correction based on visual matching.
Initialization is an indispensable stage of an optimization-based SLAM system. The initialization process determines the initial values of the variables to be estimated and of the parameters required for system operation, including the poses of the initial frames, the initial map points, the scale factor, the gravity vector, and the IMU bias values.
The extrinsic parameters of the camera and the IMU comprise their relative rotation and relative translation. Determining these extrinsic parameters is a precondition for any fused application of the camera and the IMU. At present, calibration of the camera-IMU extrinsic parameters relies mainly on offline manual operation: an operator must hold a calibration board and perform suitable translation and rotation motions. The whole process is time-consuming and labor-intensive and requires the operator to have relevant expertise. Several online calibration methods exist; however, under degenerate motion (small acceleration and rotation of the robot), their computational cost grows rapidly over time.
Disclosure of Invention
In view of the deficiencies of the prior art, the object of the invention is to provide a method and a system for online initialization and calibration of a camera and an inertial measurement unit.
The online initialization and calibration method for a camera and an inertial measurement unit provided by the invention comprises the following steps:
step M1: pure-vision initialization: acquiring the poses of the initial keyframes from the camera observations and constructing an initial map;
step M2: joint visual-inertial initialization and calibration: based on the initial keyframe poses and the initial map, initializing the angular-velocity (gyroscope) bias of the inertial measurement unit and calibrating the relative rotation and translation between the camera and the inertial measurement unit;
step M3: during camera-IMU fused simultaneous localization and mapping, performing anomaly detection on, and updating of, the gyroscope-bias initialization and the camera-IMU relative rotation and translation calibration.
Preferably, the step M1 includes:
step M1.1: upon arrival of a new keyframe, searching for a current frame and a historical frame whose parallax angle and number of matched point pairs satisfy preset conditions;
step M1.2: calculating the relative pose between the current frame and the historical frame using the epipolar constraint;
step M1.3: from the obtained relative pose between the current frame and the historical frame, computing by triangulation the world-coordinate positions of the map points observed jointly by the two frames, and constructing the initial map;
step M1.4: projecting the points of the initial map onto the pixel plane of each keyframe using the perspective-n-point method, and solving for the pose of each keyframe in the world coordinate system.
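Steps M1.2 to M1.4 rely on standard two-view geometry. As an illustration, the triangulation of step M1.3 can be sketched with a direct linear transform (DLT); this is a generic numpy sketch, not the patent's implementation, and the function name is hypothetical:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one map point seen in two views (direct linear transform).
    P1, P2: 3x4 camera projection matrices; x1, x2: normalized pixel coords (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous point is the null vector of A (right-singular vector
    # associated with the smallest singular value).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Running this over all matched pairs between the current and historical frames yields the world coordinates of the jointly observed map points.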
Preferably, the step M2 includes:
step M2.1: screening the camera and IMU observations on the basis of information content, retaining the preset number of observations that maximizes the total information content; using the retained camera and IMU observations to initialize the angular-velocity bias of the inertial measurement unit and to calibrate the camera-IMU relative rotation, until the camera-IMU relative rotation is judged to have converged;
step M2.2: assuming the accelerometer bias of the inertial measurement unit to be zero, screening the camera and IMU observations on the basis of information content, retaining a fixed number of observations that maximizes the total information content, and coarsely estimating the scale factor, the gravity vector, and the camera-IMU relative translation;
step M2.3: using the direction of the coarsely estimated gravity vector and introducing gravity as a new constraint, screening the camera and IMU observations on the basis of information content, retaining a fixed number of observations that maximizes the total information content, and finely estimating the scale factor, the gravity vector, the camera-IMU relative translation, and the accelerometer bias of the inertial measurement unit; when the scale factor and the camera-IMU relative translation have converged, the joint visual-inertial initialization and calibration is complete;
the information content of a camera-IMU observation is the amount of information it carries about the camera-IMU relative rotation.
Preferably, said step M2.1 comprises:
step M2.1.1: obtaining the relative rotation of the inertial measurement unit between two adjacent frames from the IMU observations;
the relative rotation of the inertial measurement unit between two adjacent keyframes is obtained by integrating the IMU angular velocity:

$R1_{i,i+1} = \Delta R_{i,i+1} \cdot \mathrm{Exp}(J^{g}_{\Delta R}\, b_g)$ (1)

wherein $\Delta R_{i,i+1}$ denotes the angular-velocity integral (preintegrated rotation) of the inertial measurement unit; $J^{g}_{\Delta R}$ denotes the Jacobian of the preintegrated rotation with respect to the angular-velocity bias of the inertial measurement unit; and $b_g$ denotes the angular-velocity (gyroscope) bias of the inertial measurement unit;
step M2.1.2: obtaining the relative rotation of the inertial measurement unit between two adjacent frames from the camera observations;
the poses $R_{wc}^{i}$ and $R_{wc}^{i+1}$ of the cameras of two adjacent keyframes in the world coordinate system are obtained by visual simultaneous localization and mapping; using the camera-IMU relative rotation, the relative rotation of the inertial measurement unit is obtained as

$R2_{i,i+1} = R_{cb}^{T}\,(R_{wc}^{i})^{T}\,R_{wc}^{i+1}\,R_{cb}$ (2)

wherein $R_{cb}$ denotes the relative rotation between the camera and the inertial measurement unit;
step M2.1.3: minimizing the error between the IMU relative rotation between two adjacent frames obtained from the IMU observations and that obtained from the camera observations, yielding the optimal estimate of the angular-velocity bias of the inertial measurement unit, i.e., initializing the angular-velocity bias;
the optimal estimate $b_g^{*}$ of the angular-velocity bias is obtained by minimizing the discrepancy between the two relative rotations:

$b_g^{*} = \arg\min_{b_g} \sum_{i=1}^{N-1} \left\| \mathrm{Log}\!\left( (R2_{i,i+1})^{T}\, R1_{i,i+1} \right) \right\|^{2}$ (3)

wherein $N$ denotes the number of keyframes; $R2$ denotes the IMU relative rotation obtained from the camera observations; $R1$ denotes the IMU relative rotation obtained by integrating the IMU angular velocity; $b_i$ and $b_{i+1}$ denote the IMU (body) frames of the $i$-th and $(i+1)$-th keyframes; and $b_g$ denotes the angular-velocity bias of the inertial measurement unit;
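The bias estimation of step M2.1.3 can be linearized and solved in closed form. The sketch below assumes the small-angle approximation $\mathrm{Log}(R2^{T} \Delta R\, \mathrm{Exp}(J b_g)) \approx \mathrm{Log}(R2^{T} \Delta R) + J b_g$, so the stacked problem becomes linear least squares; all names are hypothetical and the preintegrated rotations and Jacobians are taken as given:

```python
import numpy as np

def so3_exp(w):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def so3_log(R):
    """Rotation matrix -> rotation vector."""
    th = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    if th < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return th / (2 * np.sin(th)) * w

def estimate_gyro_bias(dR_preint, R2_visual, J_g):
    """Linearized solve: stack J^g b_g = -Log(R2^T dR) over all keyframe pairs."""
    A, b = [], []
    for dR, R2, J in zip(dR_preint, R2_visual, J_g):
        A.append(J)
        b.append(-so3_log(R2.T @ dR))
    A = np.vstack(A)
    b = np.hstack(b)
    bg, *_ = np.linalg.lstsq(A, b, rcond=None)
    return bg
```

With several keyframe pairs the stacked system is overdetermined and the least-squares solution gives the gyroscope-bias initialization.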
step M2.1.4: for any two adjacent keyframes, the amount of information about the camera-IMU relative rotation contained in the observation is

$iQ_{i,i+1} = \mathrm{trace}\!\left( \big(Q_L(q_{b_i,b_{i+1}}) - Q_R(q_{c_i,c_{i+1}})\big)^{T} \cdot \big(Q_L(q_{b_i,b_{i+1}}) - Q_R(q_{c_i,c_{i+1}})\big) \right)$ (4)

wherein $Q_L(q)$ denotes the left-multiplication matrix of the quaternion $q$; $Q_R(q)$ denotes the right-multiplication matrix of the quaternion $q$; $q_{b_i,b_{i+1}}$ denotes the quaternion of the IMU relative rotation between frames $i$ and $i+1$; and $q_{c_i,c_{i+1}}$ denotes the quaternion of the camera relative rotation between frames $i$ and $i+1$;
the information-content window size is set to $N_{win}$; if the total number of keyframes is less than $N_{win}$, the latest keyframe is added to the sequence used for initialization and calibration; if the total number of keyframes is greater than $N_{win}$, the current information metric is compared with those already in the sequence and the observation with the smallest information content is removed;
step M2.1.5: using the screened observations, constructing a system of equations in which the camera-IMU relative rotation is the unknown, and solving it by least squares to obtain the optimal estimate $q_{bc}^{*}$, in quaternion form, of the camera-IMU relative rotation, thereby calibrating the relative rotation between the camera and the inertial measurement unit;
for any two adjacent keyframes, the camera-IMU relative rotation satisfies the equation

$\big(Q_L(q_{b_i,b_{i+1}}) - Q_R(q_{c_i,c_{i+1}})\big) \cdot q_{bc} = 0$ (5)

wherein $Q_L$ denotes the left-multiplication matrix of a quaternion; $q_{b_i,b_{i+1}}$ denotes the quaternion of the IMU relative rotation between frames $i$ and $i+1$; $Q_R$ denotes the right-multiplication matrix of a quaternion; $q_{c_i,c_{i+1}}$ denotes the quaternion of the camera relative rotation between frames $i$ and $i+1$; and $q_{bc}$ denotes the quaternion of the camera-IMU relative rotation;
writing $Q_{i,i+1} = Q_L(q_{b_i,b_{i+1}}) - Q_R(q_{c_i,c_{i+1}})$ and stacking the screened $Q_{i,i+1}$ into an observation matrix $Q$ yields the equation

$Q \cdot q_{bc} = 0$ (6)

the optimal estimate $q_{bc}^{*}$ of the camera-IMU relative rotation is obtained by least squares over the unit quaternions:

$q_{bc}^{*} = \arg\min_{\|q\|=1} \|Q\,q\|^{2}$ (7)
Step M2.1.6: by judging the nearest NwincStandard deviation sigma of yaw angle, pitch angle and roll angle of Euler angle corresponding to relative rotation angle of individual camera and inertial measurement unityaw、σpitch、σrollWhether or not less than a threshold value
Figure BDA0002418099640000044
Determining whether the calibration of the visual inertia rotation angle is finished; when the calibration is not completed, that is, the relative rotation angle between the camera and the inertial measurement unit is not converged, waiting for a new key frame to repeatedly execute steps M2.1.1 to M2.1.5 until the calibration of the visual inertial rotation angle is completed;
said step M2.2 comprises:
step M2.2.1: obtaining a rotation angle and a translation amount of the camera in a world coordinate system according to the camera observed quantity by utilizing a vision synchronous positioning and mapping algorithm; obtaining the relative translation amount and the relative linear velocity of two adjacent key frames of the inertia measurement unit according to the observed amount of the inertia measurement unit by using a pre-integration algorithm;
constructing the ith frame by the scale factor s, gravity directionAmount gwAnd the relative translation amount P of the camera and the inertial measurement unitcbThe equation of (c):
Figure BDA0002418099640000051
Figure BDA0002418099640000052
Figure BDA0002418099640000053
Figure BDA0002418099640000054
Figure BDA0002418099640000055
wherein, an acceleration zero drift value b of the inertial measurement unit is assumedaα is zeroiExpressing the coefficients before the scale factor in the equation, βiCoefficient before the initialized value of the gravity vector in the equation, gammaiExpressing the coefficients of the equations before the amount of visual inertial translation, ηiConstant term coefficients in the equation are expressed, and s represents a scale factor; gwAn initialization value representing a gravity vector; pcbRepresenting the visual inertial translation amount; rwcRepresenting the rotation angle of the camera in the world coordinate system; pwcThe translation amount of the camera in the world coordinate system;
Figure BDA0002418099640000056
represents the translation amount of the i +1 th frame camera in the world coordinate system,
Figure BDA0002418099640000057
represents the translation amount, dt, of the camera of the ith frame in the world coordinate systemi+1,i+2Representing the time difference between the (i + 1) th frame and the (i + 2) th frame,
Figure BDA0002418099640000058
represents the translation amount, dt, of the i +2 th frame camera in the world coordinate systemi,i+1Representing the time difference between the ith frame and the (I + 1) th frame, I3×3The unit matrix is represented by a matrix of units,
Figure BDA0002418099640000059
represents the rotation angle of the i +2 th frame camera in the world coordinate system,
Figure BDA00024180996400000510
represents the rotation angle of the i +1 th frame camera in the world coordinate system,
Figure BDA00024180996400000511
representing the rotation angle, R, of the i-th frame camera in the world coordinate systemcbRepresenting the relative rotation angle, Δ P, between the camera and the inertial measurement uniti,i+1Represents the relative translation, Δ P, of the inertial measurement units of frame i and frame i +1i+1,i+2Represents the relative translation, Deltav, of the inertial measurement units of frame i +1 and frame i +2i,i+1Representing the relative linear speeds of the inertial measurement units of the (i + 1) th frame and the (i + 2) th frame;
step M2.2.2: letting $A_i = [\alpha_i\;\; \beta_i\;\; \gamma_i]$, the information metric of the observation formed by three adjacent keyframes is taken as $iQ_i = \mathrm{trace}(A_i^{T} A_i)$;
the information-content window size is set to $N_{win}$; while the total number of keyframes is less than $N_{win}$, the latest keyframe is added to the sequence used for initialization and calibration; once the total number of keyframes reaches $N_{win}$, the current information metric is compared with those already in the sequence and the observation with the smallest information content is removed;
step M2.2.3: letting $x = [s\;\; g_w\;\; P_{cb}]^{T}$ and $B_i = \eta_i$, every three adjacent frames yield the equation $A_i \cdot x = B_i$; the screened observations are stacked into an information matrix $A$ and a constant matrix $B$, establishing the equation

$A \cdot x = B$ (9)

the optimal estimate $s^{*}$ of the scale factor, the optimal estimate $g_w^{*}$ of the gravity vector and the optimal estimate $P_{cb}^{*}$ of the camera-IMU relative translation are obtained by least squares:

$x^{*} = (A^{T} A)^{-1} A^{T} B$ (10)
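The linear system of equation (9) can be assembled and solved directly. A sketch assuming each $\alpha_i$ is a 3-vector and each $\beta_i$, $\gamma_i$ a 3x3 block (as when $g_w$ and $P_{cb}$ are 3-vectors); names are hypothetical:

```python
import numpy as np

def solve_scale_gravity_translation(alphas, betas, gammas, etas):
    """Stack one 3-row block per keyframe triple and solve A x = B for
    x = [s, g_w (3), P_cb (3)] by linear least squares."""
    # alpha_i: shape (3,); beta_i, gamma_i: shape (3, 3); eta_i: shape (3,).
    A = np.vstack([np.hstack([a.reshape(3, 1), b, g])
                   for a, b, g in zip(alphas, betas, gammas)])
    B = np.hstack(etas)
    x, *_ = np.linalg.lstsq(A, B, rcond=None)
    return x[0], x[1:4], x[4:7]  # scale, gravity vector, camera-IMU translation
```

With at least three keyframe triples the 7 unknowns are overdetermined and the normal-equation solution of equation (10) coincides with `lstsq`.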
The step M2.3 comprises: introducing the gravity vector with its known (preset) magnitude as a constraint to obtain the initial values of the scale factor $s$ and the gravity vector $g_w$ and the calibrated value of the visual-inertial translation $P_{cb}$;
from the obtained gravity vector $g_w$, the rotation of gravity is computed:

$R_{WI} = \mathrm{Exp}(\hat{v}\,\theta)$ (11)

wherein $R_{WI}$ denotes the rotation of gravity in the world coordinate system; $\hat{v} = \dfrac{\hat{g}_I \times \hat{g}_w}{\|\hat{g}_I \times \hat{g}_w\|}$ denotes the unit vector perpendicular to both the estimated gravity direction $\hat{g}_w = g_w / \|g_w\|$ and the fixed vector $g_I$ (of direction $\hat{g}_I = g_I / \|g_I\|$); $\theta$ denotes the angle between $\hat{g}_w$ and $g_I$; and $g_I$ denotes the fixed (reference) gravity vector; the gravity vector is expressed through the rotation and the gravity magnitude and expanded to first order (Taylor):

$g_w = R_{WI}\,\mathrm{Exp}(\delta\theta)\,\hat{g}_I\,G \approx R_{WI}\,\hat{g}_I\,G - R_{WI}\,(\hat{g}_I)^{\wedge} G\,\delta\theta, \qquad \delta\theta = [\theta_x\;\; \theta_y\;\; 0]^{T}$ (12)

wherein $\delta\theta$ denotes the update of the gravity-rotation angle; $G$ denotes the gravity magnitude; $\theta_x$ and $\theta_y$ denote the components of the update about the x-axis and y-axis of the world coordinate system;
the rotation $R_{wc}$ and translation $P_{wc}$ of the camera in the world coordinate system are obtained from the camera observations by the visual simultaneous localization and mapping algorithm; the IMU relative translation $\Delta P_{i,i+1}$ and relative velocity $\Delta v_{i,i+1}$ between adjacent keyframes are obtained from the IMU observations by the preintegration algorithm; an equation in the scale factor $s$, the gravity-rotation update $\delta\theta$, the IMU accelerometer bias $b_a$ and the camera-IMU relative translation $P_{cb}$ is constructed for every three consecutive keyframes:

$\lambda_i\, s + \mu_i\, \delta\theta_{xy} + \omega_i\, b_a + \upsilon_i\, P_{cb} = \rho_i$ (13)

with

$\lambda_i = (P_{wc}^{i+2} - P_{wc}^{i+1})\, dt_{i,i+1} - (P_{wc}^{i+1} - P_{wc}^{i})\, dt_{i+1,i+2}$
$\mu_i = \tfrac{1}{2}\,(dt_{i,i+1}^{2}\, dt_{i+1,i+2} + dt_{i+1,i+2}^{2}\, dt_{i,i+1})\, G\, R_{WI}\,(\hat{g}_I)^{\wedge}\,[:,1\!:\!2]$
$\omega_i = R_{wc}^{i} R_{cb} \big(J^{a}_{\Delta P_{i,i+1}}\, dt_{i+1,i+2} - J^{a}_{\Delta v_{i,i+1}}\, dt_{i,i+1}\, dt_{i+1,i+2}\big) - R_{wc}^{i+1} R_{cb}\, J^{a}_{\Delta P_{i+1,i+2}}\, dt_{i,i+1}$
$\upsilon_i = (R_{wc}^{i+2} - R_{wc}^{i+1})\, dt_{i,i+1} - (R_{wc}^{i+1} - R_{wc}^{i})\, dt_{i+1,i+2}$
$\rho_i = R_{wc}^{i} R_{cb} \big(\Delta v_{i,i+1}\, dt_{i,i+1}\, dt_{i+1,i+2} - \Delta P_{i,i+1}\, dt_{i+1,i+2}\big) + R_{wc}^{i+1} R_{cb}\, \Delta P_{i+1,i+2}\, dt_{i,i+1} + \tfrac{1}{2}\,(dt_{i,i+1}^{2}\, dt_{i+1,i+2} + dt_{i+1,i+2}^{2}\, dt_{i,i+1})\, G\, R_{WI}\, \hat{g}_I$

wherein $\lambda_i$, $\mu_i$, $\omega_i$ and $\upsilon_i$ denote the coefficients of the scale factor, of the gravity-rotation update, of the IMU accelerometer bias and of the camera-IMU relative translation, respectively, and $\rho_i$ denotes the constant term; $\delta\theta_{xy} = [\theta_x\;\; \theta_y]^{T}$; $J^{a}_{\Delta v_{i,i+1}}$ denotes the Jacobian of the IMU relative velocity between frames $i$ and $i+1$ with respect to the IMU accelerometer bias; $J^{a}_{\Delta P_{i,i+1}}$ and $J^{a}_{\Delta P_{i+1,i+2}}$ denote the Jacobians of the corresponding IMU relative translations with respect to the IMU accelerometer bias; $dt_{12} = dt_{i,i+1}$ and $dt_{23} = dt_{i+1,i+2}$ denote the time differences between the respective frames; and $[:,1\!:\!2]$ denotes taking only the first two columns of a matrix;
letting $C_i = [\lambda_i\;\; \mu_i\;\; \omega_i\;\; \upsilon_i]$, the information metric of the observation formed by three adjacent keyframes is taken as $iQ_i = \mathrm{trace}(C_i^{T} C_i)$;
the information-content window size is set to $N_{win}$; if the total number of keyframes is less than $N_{win}$, the latest keyframe is added to the sequence used for initialization and calibration; when the total number of keyframes is greater than $N_{win}$, the current information metric is compared with those already in the sequence and the observation with the smallest information content is removed;
the equation system is then constructed and solved by the least-squares method: letting $y = [s\;\; \delta\theta_{xy}\;\; b_a\;\; P_{cb}]^{T}$ and $D_i = \rho_i$, every three adjacent frames yield the equation $C_i \cdot y = D_i$; the screened observations are stacked into an observation matrix $C$ and a constant matrix $D$, establishing the equation

$C \cdot y = D$ (14)

the optimal estimate $s^{*}$ of the scale factor, the optimal estimate $\delta\theta^{*}$ of the gravity-rotation update, the optimal estimate $b_a^{*}$ of the IMU accelerometer bias and the optimal estimate $P_{cb}^{*}$ of the camera-IMU relative translation are obtained by least squares:

$y^{*} = (C^{T} C)^{-1} C^{T} D$ (15)
whether the calibration of the visual-inertial translation is complete is determined by judging whether the standard deviations $\sigma_x$, $\sigma_y$ and $\sigma_z$ of each dimension of the last $N_{winc}$ estimates of the visual-inertial translation are below a preset threshold; whether the visual-inertial initialization is complete is determined by judging whether the standard deviation $\sigma_s$ of the last $N_{winc}$ scale factors is below the threshold $thr_s$; while the calibration of the visual-inertial translation or the visual-inertial initialization is not complete, steps M2.2 to M2.3 are repeated as each new keyframe arrives, until the joint visual-inertial initialization and calibration is complete.
Preferably, the step M3 includes: during the camera-IMU fused simultaneous localization and mapping, judging at every preset period whether the relative pose between the camera and the inertial measurement unit has changed; when it has changed, re-initializing the angular-velocity bias of the inertial measurement unit and recalibrating the camera-IMU relative rotation and translation from any two adjacent frames among the most recent frames;
step M3.1: judging whether the calibrated value of the camera-IMU relative rotation needs to be updated according to whether the consistency error between the camera and IMU observations and the initially calibrated camera-IMU relative rotation exceeds a threshold.
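The consistency check of step M3.1 can be sketched by comparing, for each recent keyframe pair, the IMU relative rotation predicted through the calibrated extrinsics with the measured one; the angular residual plays the role of the consistency error. The function names and the mean-error criterion below are illustrative assumptions, not the patent's exact test:

```python
import numpy as np

def consistency_error(R_b, R_c, R_cb):
    """Angle (rad) between the IMU relative rotation predicted from the
    camera, R_cb^T R_c R_cb, and the measured IMU relative rotation R_b."""
    R_err = (R_cb.T @ R_c @ R_cb).T @ R_b
    c = np.clip((np.trace(R_err) - 1) / 2, -1.0, 1.0)
    return float(np.arccos(c))

def needs_recalibration(rel_rot_pairs, R_cb, threshold):
    """Flag extrinsic recalibration when the mean consistency error over
    the most recent keyframe pairs exceeds the threshold."""
    errs = [consistency_error(Rb, Rc, R_cb) for Rb, Rc in rel_rot_pairs]
    return float(np.mean(errs)) > threshold
```

When the flag is raised, the gyroscope-bias initialization and the camera-IMU rotation and translation calibration of step M2 are rerun on the latest keyframes.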
The invention provides an online initialization and calibration system for a camera and an inertial measurement unit, which comprises:
module M1: pure-vision initialization: acquiring the poses of the initial keyframes from the camera observations and constructing an initial map;
module M2: joint visual-inertial initialization and calibration: based on the initial keyframe poses and the initial map, initializing the angular-velocity (gyroscope) bias of the inertial measurement unit and calibrating the relative rotation and translation between the camera and the inertial measurement unit;
module M3: during camera-IMU fused simultaneous localization and mapping, performing anomaly detection on, and updating of, the gyroscope-bias initialization and the camera-IMU relative rotation and translation calibration.
Preferably, said module M1 comprises:
module M1.1: upon arrival of a new keyframe, searching for a current frame and a historical frame whose parallax angle and number of matched point pairs satisfy preset conditions;
module M1.2: calculating the relative pose between the current frame and the historical frame using the epipolar constraint;
module M1.3: from the obtained relative pose between the current frame and the historical frame, computing by triangulation the world-coordinate positions of the map points observed jointly by the two frames, and constructing the initial map;
module M1.4: projecting the points of the initial map onto the pixel plane of each keyframe using the perspective-n-point method, and solving for the pose of each keyframe in the world coordinate system.
Preferably, said module M2 comprises:
module M2.1: screening the observed quantities of the camera and the inertia measurement unit based on the information quantity, and screening out the observed quantities of the preset quantity which maximizes the total information quantity; initializing an angular velocity null shift value of the inertial measurement unit and calibrating a relative rotation angle between the camera and the inertial measurement unit by using the observed quantities of the camera and the inertial measurement unit obtained after screening until the relative rotation angle between the camera and the inertial measurement unit is judged to be converged;
module M2.2: the method comprises the steps that the null shift value of the acceleration of an inertial measurement unit is assumed to be zero, the observed quantities of a camera and the inertial measurement unit are screened based on information quantity, the observed quantities of a fixed quantity of maximized total information quantity are reserved, and a scale factor, a gravity vector and the relative translation quantity of the camera and the inertial measurement unit are roughly estimated;
module M2.3: screening the observed quantities of the camera and the inertial measurement unit based on the information quantity by using the direction of the roughly estimated gravity vector and introducing the gravity as a new constraint, reserving a fixed quantity of observed quantities maximizing the total information quantity, and finely estimating a scale factor, the gravity vector, the relative translation quantity of the camera and the inertial measurement unit and a null shift value of an accelerometer in the inertial measurement unit; when the scale factor and the relative translation amount of the camera and the inertial measurement unit are converged, the visual and inertial joint initialization and calibration are completed;
the information quantity is the amount of information, contained in the observations of the camera and the inertial measurement unit, about the relative rotation angle between the camera and the inertial measurement unit.
Preferably, said module M2.1 comprises:
module M2.1.1: obtaining the relative rotation angle of the inertia measurement unit between two adjacent frames by using the observed quantity of the inertia measurement unit;
obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by integrating the angular velocity of the inertial measurement unit:

R1_{i,i+1} = ΔR_{i,i+1}·Exp(J^g_ΔR·b_g)

wherein ΔR_{i,i+1} represents the angular-velocity integral value (pre-integrated rotation) of the inertial measurement unit; J^g_ΔR represents the Jacobian matrix of the pre-integrated rotation of the inertial measurement unit with respect to the angular-velocity zero-drift value; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
module M2.1.2: obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by using the observed quantity of the camera;
the poses R_wci, R_wci+1 of the cameras of two adjacent frames in the world coordinate system are obtained through visual simultaneous localization and mapping, and the relative rotation angle of the inertial measurement unit is obtained using the relative rotation angle of the camera and the inertial measurement unit:

R2_{i,i+1} = R_cb^T·R_wci^T·R_wci+1·R_cb

wherein R_cb represents the relative rotation angle of the camera and the inertial measurement unit;
module M2.1.3: minimizing an error of a relative rotation angle of the inertial measurement unit between two adjacent frames obtained by the observed quantity of the inertial measurement unit and a relative rotation angle of the inertial measurement unit between two adjacent frames obtained by the observed quantity of the camera to obtain an optimal estimation of an angular velocity null shift value of the inertial measurement unit, namely initializing the angular velocity null shift value of the inertial measurement unit;
the optimal estimate of the angular-velocity zero-drift value of the inertial measurement unit is obtained by minimizing the error between the two relative rotation angles:

b_g* = argmin_{b_g} Σ_{i=1}^{N-1} ||Log((R2_{i,i+1})^T·ΔR_{i,i+1}·Exp(J^g_ΔR·b_g))||²

wherein N represents the number of key frames; R2 represents the relative rotation angle of the inertial measurement unit obtained from the camera observations; R1 represents the relative rotation angle of the inertial measurement unit obtained by integrating its angular velocity; b_i represents the inertial-measurement-unit frame of the i-th key frame; b_{i+1} represents the inertial-measurement-unit frame of the (i+1)-th key frame; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
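As an illustrative sketch (not part of the claimed method), the bias estimation of module M2.1.3 can be written as a small Gauss-Newton iteration over the SO(3) Exp/Log maps. For simplicity this sketch approximates the bias Jacobian J^g_ΔR by the identity matrix; that simplification, and all function names, are assumptions of the sketch.

```python
import numpy as np

def hat(v):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])

def so3_exp(v):
    """Rodrigues formula: axis-angle vector -> rotation matrix."""
    th = np.linalg.norm(v)
    if th < 1e-10:
        return np.eye(3) + hat(v)
    K = hat(v / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * K @ K

def so3_log(R):
    """Rotation matrix -> axis-angle vector."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    th = np.arccos(c)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    if th < 1e-10:
        return w / 2.0
    return th / (2.0 * np.sin(th)) * w

def estimate_gyro_bias(delta_Rs, R2s, iters=10):
    """Gauss-Newton on b_g minimizing sum ||Log(R2_i^T . dR_i . Exp(b_g))||^2,
    with the bias Jacobian approximated by the identity (sketch assumption)."""
    b = np.zeros(3)
    for _ in range(iters):
        r = np.array([so3_log(R2.T @ dR @ so3_exp(b))
                      for dR, R2 in zip(delta_Rs, R2s)])
        b = b - r.mean(axis=0)  # with J ~ I the normal equations reduce to the mean residual
    return b
```

With exact synthetic data the iteration recovers the injected bias after one step, since the residual at b = 0 is exactly the negated true bias.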
module M2.1.4: for any two adjacent key frames, the quantity of information about the relative rotation angle between the camera and the inertial measurement unit contained in the observed quantity is as follows:
Figure BDA0002418099640000107
wherein Q_L(q) represents the left-multiplication matrix corresponding to quaternion q; Q_R(q) represents the right-multiplication matrix corresponding to quaternion q; q_{bi,bi+1} represents the quaternion of the relative rotation angle of the inertial measurement unit between frame i and frame i+1; q_{ci,ci+1} represents the quaternion of the relative rotation angle of the camera between frame i and frame i+1;
the information window size is set to N_win; if the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; if the total number of key frames is greater than N_win, the information measure of the current observation is compared with those already in the sequence, and the observation with the smallest information quantity is removed;
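The window screening described above can be sketched as a small helper. The concrete information measure is left as a caller-supplied function, since the patent's determinant-based measure appears here only as an image; the function names are assumptions of this sketch.

```python
def update_window(window, new_obs, info_of, n_win):
    """Maintain at most n_win observations, maximizing total information.

    window: current list of observations.
    new_obs: candidate observation for the latest key frame.
    info_of: callable returning the scalar information measure of an
             observation (any positive scalar works for this sketch).
    n_win: the information window size N_win.
    """
    window = window + [new_obs]
    if len(window) > n_win:
        # evict the least informative observation, keeping the total maximal
        window.remove(min(window, key=info_of))
    return window
```

Because only the minimum-information element is ever evicted, the kept set is exactly the n_win highest-information observations seen so far.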
module M2.1.5: an equation system with the relative rotation angle of the camera and the inertial measurement unit as the unknown is constructed from the screened observations and solved by the least-squares method, yielding the optimal estimate q_bc* of the relative rotation angle of the camera and the inertial measurement unit expressed as a quaternion, i.e., the calibration value of the relative rotation angle between the camera and the inertial measurement unit;
for any two adjacent key frames, the relative rotation angle of the camera and the inertial measurement unit is an unknown quantity, and the following equation system is satisfied:
(Q_L(q_{bi,bi+1})-Q_R(q_{ci,ci+1}))·q_bc=0 (5)

wherein Q_L represents the left-multiplication matrix of a quaternion; q_{bi,bi+1} represents the quaternion of the relative rotation angle of the inertial measurement unit between frame i and frame i+1; Q_R represents the right-multiplication matrix of a quaternion; q_{ci,ci+1} represents the quaternion of the relative rotation angle of the camera between frame i and frame i+1; q_bc represents the quaternion of the relative rotation angle of the camera and the inertial measurement unit;
denoting (Q_L(q_{bi,bi+1})-Q_R(q_{ci,ci+1})) by Q_{i,i+1}, the screened observations Q_{i,i+1} form an observation matrix Q, and the equation is established:

Q·q_bc=0 (6)

the optimal estimate of the relative rotation angle of the camera and the inertial measurement unit is solved by least squares:

q_bc* = argmin_{||q_bc||=1} ||Q·q_bc||²

that is, the unit right singular vector of Q associated with its smallest singular value;
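A minimal sketch of modules M2.1.4 and M2.1.5: stack one (Q_L(q_b) − Q_R(q_c)) block per screened key-frame pair and take the right singular vector of the smallest singular value. The [w, x, y, z] quaternion convention and the function names are choices of this sketch, not of the original text.

```python
import numpy as np

def q_mul(a, b):
    """Hamilton quaternion product, [w, x, y, z] convention."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def Q_L(q):
    """Left-multiplication matrix: q ⊗ p == Q_L(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def Q_R(q):
    """Right-multiplication matrix: p ⊗ q == Q_R(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_qbc(imu_quats, cam_quats):
    """Homogeneous least squares for q_bc from screened (q_b, q_c) pairs:
    stack (Q_L(q_b) - Q_R(q_c)) and take the smallest singular vector."""
    Q = np.vstack([Q_L(qb) - Q_R(qc) for qb, qc in zip(imu_quats, cam_quats)])
    _, _, Vt = np.linalg.svd(Q)
    q = Vt[-1]
    return q if q[0] >= 0 else -q  # resolve the +/- sign ambiguity
```

A single pair leaves a one-degree-of-freedom ambiguity (rotation about the pair's axis), so at least two pairs with non-parallel rotation axes are needed for a unique solution.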
Module M2.1.6: by judging the nearest NwincStandard deviation sigma of yaw angle, pitch angle and roll angle of Euler angle corresponding to relative rotation angle of individual camera and inertial measurement unityaw、σpitch、σrollWhether or not less than a threshold value
Figure BDA0002418099640000114
Determining whether the calibration of the visual inertia rotation angle is finished; when the calibration is not completed, that is, the relative rotation angle between the camera and the inertial measurement unit is not converged, wait for a new key frame to repeatedly trigger the execution of the modules M2.1.1 to M2.1.5 until the calibration of the visual inertial rotation angle is completed;
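The convergence test of module M2.1.6 amounts to a standard-deviation check over the most recent estimates. A minimal sketch, assuming radian-valued Euler angles and caller-supplied thresholds:

```python
import numpy as np

def rotation_converged(recent_eulers, thr):
    """recent_eulers: (N_winc, 3) array of [yaw, pitch, roll] from the most
    recent calibration estimates; thr: per-angle thresholds (assumed values).
    Returns True when every per-angle standard deviation is below its threshold."""
    return bool(np.all(np.std(recent_eulers, axis=0) < np.asarray(thr)))
```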
said module M2.2 comprises:
module M2.2.1: obtaining a rotation angle and a translation amount of the camera in a world coordinate system according to the camera observed quantity by utilizing a vision synchronous positioning and mapping algorithm; obtaining the relative translation amount and the relative linear velocity of two adjacent key frames of the inertia measurement unit according to the observed amount of the inertia measurement unit by using a pre-integration algorithm;
for the i-th frame, an equation in the scale factor s, the gravity vector g_w and the relative translation P_cb of the camera and the inertial measurement unit is constructed:

α_i·s + β_i·g_w + γ_i·P_cb = η_i

wherein the acceleration zero-drift value b_a of the inertial measurement unit is assumed to be zero; α_i represents the coefficient before the scale factor in the equation; β_i represents the coefficient before the initial value of the gravity vector; γ_i represents the coefficient before the visual-inertial translation; η_i represents the constant term; s represents the scale factor; g_w represents the initial value of the gravity vector; P_cb represents the visual-inertial translation; R_wc represents the rotation angle of the camera in the world coordinate system; P_wc represents the translation of the camera in the world coordinate system;
P_wci+1 represents the translation of the camera of frame i+1 in the world coordinate system; P_wci represents the translation of the camera of frame i in the world coordinate system; dt_{i+1,i+2} represents the time difference between frame i+1 and frame i+2; P_wci+2 represents the translation of the camera of frame i+2 in the world coordinate system; dt_{i,i+1} represents the time difference between frame i and frame i+1; I_{3×3} represents the identity matrix; R_wci+2 represents the rotation angle of the camera of frame i+2 in the world coordinate system; R_wci+1 represents the rotation angle of the camera of frame i+1 in the world coordinate system; R_wci represents the rotation angle of the camera of frame i in the world coordinate system; R_cb represents the relative rotation angle between the camera and the inertial measurement unit; ΔP_{i,i+1} represents the relative translation of the inertial measurement unit between frame i and frame i+1; ΔP_{i+1,i+2} represents the relative translation of the inertial measurement unit between frame i+1 and frame i+2; Δv_{i,i+1} represents the relative linear velocity of the inertial measurement unit between frame i and frame i+1;
module M2.2.2: let A_i = [α_i β_i γ_i]; the observation formed by three adjacent key frames contains the following information measure of the unknowns:
Figure BDA00024180996400001212
The information window size is set to N_win; when the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; when the total number of key frames is greater than or equal to N_win, the current information measure is compared with those in the existing sequence and the observation with the smallest information quantity is removed;
module M2.2.3: let x = [s, g_w, P_cb]^T and B_i = η_i; for every three adjacent frames there is an equation A_i·x = B_i; the information matrix A and the constant matrix B are constructed from the screened observations, and the equation is established:

A·x=B (9)

the optimal estimate s* of the scale factor, the optimal estimate g_w* of the gravity vector and the optimal estimate P_cb* of the relative translation of the camera and the inertial measurement unit are solved by the least-squares method:

x* = argmin_x ||A·x − B||²
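Structurally, the coarse stage stacks one 3×7 block A_i = [α_i β_i γ_i] and one 3-vector B_i = η_i per key-frame triple, for the 7-dimensional unknown x = [s, g_w, P_cb]. A minimal sketch of the solve step; the construction of the coefficients from poses and pre-integrals is not reproduced here, so the test feeds synthetic coefficients (an assumption of the sketch).

```python
import numpy as np

def solve_coarse(alphas, betas, gammas, etas):
    """Least-squares solve of A.x = B for x = [s, g_w(3), P_cb(3)].

    alphas: list of (3,) coefficient vectors before the scale factor.
    betas, gammas: lists of (3, 3) coefficient blocks before g_w and P_cb.
    etas: list of (3,) constant terms.
    """
    A = np.vstack([np.hstack([a.reshape(3, 1), b, g])
                   for a, b, g in zip(alphas, betas, gammas)])
    B = np.concatenate(etas)
    x, *_ = np.linalg.lstsq(A, B, rcond=None)
    return x[0], x[1:4], x[4:7]  # scale, gravity vector, camera-IMU translation
```

At least three key-frame triples (nine equations) are needed to constrain the seven unknowns.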
Said module M2.3 comprises: the gravity vector with known (preset) magnitude is introduced as a constraint, using the initial values of the scale factor s and gravity vector g_w and the calibration value of the visual-inertial translation P_cb obtained above; from the estimated gravity vector g_w, the rotation of the gravity direction is obtained:

R_WI = Exp(v̂·θ), v̂ = (g_I × ĝ_w)/||g_I × ĝ_w||, θ = atan2(||g_I × ĝ_w||, g_I·ĝ_w)

wherein R_WI represents the rotation of the gravity direction in the world coordinate system; v̂ represents the unit vector perpendicular to both the estimated gravity direction ĝ_w and the fixed vector g_I; θ represents the angle between ĝ_w and the fixed vector g_I; g_I represents the fixed gravity direction vector; the gravity vector is then expressed by this rotation and the gravity magnitude and expanded to first order:

g_w = G·R_WI·Exp(δθ)·g_I ≈ G·R_WI·g_I − G·R_WI·[g_I]_×·δθ, δθ = [δθ_x, δθ_y, 0]^T

wherein δθ represents the update value of the gravity rotation angle; G represents the magnitude of gravity; δθ_x and δθ_y represent the components of the update value about the x-axis and y-axis of the world coordinate system;
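The rotation R_WI aligning the fixed direction with the estimated gravity direction can be computed with the Rodrigues formula. A minimal sketch, assuming g_I = (0, 0, −1) as the fixed direction (the patent leaves g_I abstract):

```python
import numpy as np

def hat(v):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])

def gravity_rotation(g_w_hat, g_I=np.array([0.0, 0.0, -1.0])):
    """R_WI = Exp(v_hat * theta) rotating the fixed direction g_I onto the
    estimated gravity direction g_w_hat; v_hat is perpendicular to both and
    theta is the angle between them."""
    g = g_w_hat / np.linalg.norm(g_w_hat)
    v = np.cross(g_I, g)
    s = np.linalg.norm(v)
    c = float(g_I @ g)
    if s < 1e-12:                  # parallel / anti-parallel edge cases
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = hat(v / s)
    theta = np.arctan2(s, c)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K
```

Only two components of the subsequent update δθ are estimated, because a rotation about the gravity axis itself leaves the gravity vector unchanged; that is why the text keeps only the first two columns of the corresponding coefficient block.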
the rotation angle R_wc and translation P_wc of the camera in the world coordinate system are obtained from the camera observations using the visual simultaneous localization and mapping algorithm; the relative translation ΔP_{i,i+1} and relative linear velocity Δv_{i,i+1} of the inertial measurement unit between two adjacent key frames are obtained from the inertial observations using the pre-integration algorithm; for the i-th frame, an equation in the scale factor s, the gravity vector g_w, the acceleration zero-drift value b_a of the inertial measurement unit and the relative translation P_cb of the camera and the inertial measurement unit is constructed:

λ_i·s + μ_i·δθ + ω_i·b_a + υ_i·P_cb = ρ_i
wherein λ_i represents the coefficient before the scale factor in the equation; μ_i represents the coefficient before the update value of the gravity rotation angle; ω_i represents the coefficient before the acceleration zero-drift value of the inertial measurement unit; υ_i represents the coefficient before the relative translation of the camera and the inertial measurement unit; ρ_i represents the constant term; J^{ba}_{Δv} represents the Jacobian matrix of the relative linear velocity of the inertial measurement unit with respect to the acceleration zero-drift value; dt_12 represents the time difference between frame i and frame i+1; dt_23 represents the time difference between frame i+1 and frame i+2; J^{ba}_{ΔP_{i+1,i+2}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i+1 and frame i+2 with respect to the acceleration zero-drift value; J^{ba}_{ΔP_{i,i+1}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i and frame i+1 with respect to the acceleration zero-drift value; the subscript [:,1:2] indicates that only the first two columns of the matrix are taken;
let C_i = [λ_i μ_i ω_i υ_i]; the observation formed by three adjacent key frames contains the following information measure of the unknowns:
Figure BDA00024180996400001410
The information window size is set to N_win; if the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; when the total number of key frames is greater than N_win, the current information measure is compared with those in the existing sequence and the observation with the smallest information quantity is removed;
an equation system is constructed and solved by the least-squares method; let y = [s, δθ, b_a, P_cb]^T and D_i = ρ_i; for every three adjacent frames there is an equation C_i·y = D_i; the observation matrix C and the constant matrix D are constructed from the screened observations, and the equation is established:

C·y=D (14)

the optimal estimate s* of the scale factor, the optimal estimate δθ* of the update value of the gravity rotation angle, the optimal estimate b_a* of the acceleration zero-drift value of the inertial measurement unit and the optimal estimate P_cb* of the relative translation of the camera and the inertial measurement unit are solved by the least-squares method:

y* = argmin_y ||C·y − D||²
Whether calibration of the visual-inertial translation is finished is determined by judging whether the standard deviations σ_x, σ_y and σ_z of each dimension of the most recent N_winc estimates of the visual-inertial translation are less than the thresholds thr_x, thr_y and thr_z; whether visual-inertial initialization is finished is determined by judging whether the standard deviation σ_s of the most recent N_winc scale factors is less than the threshold thr_s; when calibration of the visual-inertial translation or visual-inertial initialization is not completed, the system waits for a new key frame and repeatedly triggers modules M2.2 to M2.3 until the joint visual-inertial initialization and calibration are completed.
Preferably, said module M3 comprises: during simultaneous localization and mapping fusing the camera and the inertial measurement unit, judging at a preset period whether the relative pose of the camera and the inertial measurement unit has changed; when it has changed, re-initializing the angular-velocity zero-drift value of the inertial measurement unit and re-calibrating the relative rotation angle and translation of the camera and the inertial measurement unit from any two adjacent frames among the most recent frames;
module M3.1: judging whether the calibration value of the relative rotation angle of the camera and the inertial measurement unit needs to be updated according to whether the consistency error between the observations of the camera and the inertial measurement unit and the initial calibration value of the relative rotation angle exceeds a threshold.
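One way to express the consistency error of module M3.1 is the angle between the IMU relative rotation predicted from camera poses via the current calibration and the pre-integrated IMU rotation; exceeding a threshold over recent frames would trigger recalibration. A sketch under that interpretation (the exact error metric is not given in the text):

```python
import numpy as np

def rotation_consistency_error(R_cb_calib, R_wc_i, R_wc_j, dR_imu):
    """Angle in radians between the IMU relative rotation predicted from the
    camera poses through the calibration R_cb and the pre-integrated IMU
    rotation dR_imu between the same two key frames."""
    R_pred = R_cb_calib.T @ R_wc_i.T @ R_wc_j @ R_cb_calib
    E = R_pred.T @ dR_imu                       # residual rotation
    c = np.clip((np.trace(E) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.arccos(c))
```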
Compared with the prior art, the invention has the following beneficial effects:
1. according to the invention, calibration of the relative rotation angle and the relative translation amount of the camera and the inertial measurement unit can be completed in a combined initialization process without manual calibration tools such as a calibration plate, so that the application range is expanded;
2. the invention can autonomously judge whether recalibration is needed in the operation process, and can deal with the condition that the relative pose of the camera and the inertial measurement unit changes along with time;
3. the invention utilizes the information quantity as a measurement index, screens the observed quantity participating in initialization and calibration, and inhibits the continuous increase of the calculated quantity under the condition of motion degradation.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of pure visual initialization;
FIG. 2 is a flow chart of visual and inertial fusion initialization and calibration;
FIG. 3 is a flow chart of a screening algorithm based on an information content window;
FIG. 4 is a flowchart illustrating the calibration value abnormality determination and update process.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but do not limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention; all such changes and modifications fall within the scope of the present invention.
The invention provides a method for joint online initialization and calibration of a monocular camera and an inertial measurement unit, which is executed once per key-frame period and comprises three stages: pure visual initialization, joint visual-inertial calibration and initialization, and calibration-value anomaly detection and updating. In the pure visual initialization stage, a current-frame and historical-frame pair satisfying the parallax condition and the condition on the number of matching point pairs is searched for first. Then, the matching point pairs of the current and historical frames are used to initialize the pose of each key frame in the world coordinate system, obtain the coordinates of each map point in the world coordinate system, and construct an initial map. In the joint visual-inertial calibration and initialization stage, the key frames used for calibration and initialization are first screened based on the information quantity. Then, the observations of the key frames remaining after screening are used to initialize the angular-velocity zero-drift value of the inertial measurement unit and calibrate the relative rotation angle of the camera and the inertial measurement unit, until the relative rotation angle is judged to have converged. Next, assuming the acceleration zero-drift value is zero, the scale factor, the gravity vector and the relative translation of the camera and the inertial measurement unit are roughly estimated. Then, using the direction of the roughly estimated gravity vector and introducing gravity as a new constraint, the scale factor, the gravity vector, the relative translation of the camera and the inertial measurement unit, and the zero-drift value of the accelerometer in the inertial measurement unit are finely estimated.
And when the scale factor and the visual inertial translation amount are converged, the initialization and calibration are completed. And updating the calibration value in the running process of the system, judging whether the relative pose of the camera and the inertial measurement unit changes at fixed intervals, and re-calibrating the relative pose if the relative pose of the camera and the inertial measurement unit changes. Compared with other methods for initializing and calibrating the monocular camera and the inertial measurement unit, the algorithm can directly utilize information in the actual environment without an additional calibration board, can deal with the condition of calibration value change in the long-term operation process, and can deal with the problem of rapid increase of the calculation amount of initialization and calibration under the condition of insufficient body movement.
Aiming at the problem that the calculated amount rapidly increases along with the time under the condition of motion degradation in the online initialization and calibration of the existing camera and the inertial measurement unit, the invention provides a monocular camera and inertial measurement unit combined online initialization and calibration method based on an information amount window, and aims to inhibit the rapid increase of the calculated amount along with the time.
The invention provides an online initialization and calibration method for a camera and an inertial measurement unit, which comprises the following steps:
defining variables to be initialized:
θ_1 = {T_1, T_2, ..., T_N, P^w_1, ..., P^w_k, s, g_w, b_g, b_a}, wherein
T_1, T_2 ... T_N: the poses of the first N camera frames in the world coordinate system;
P^w_1 ... P^w_k: the coordinates of the k map points in the world coordinate system;
s: the monocular camera scale factor;
g_w: the gravity vector in the world coordinate system;
b_g: the angular-velocity zero-drift value of the inertial measurement unit;
b_a: the acceleration zero-drift value of the inertial measurement unit;
defining variables to be calibrated:
θ_2 = {R_bc, P_bc}, wherein
R_bc: the relative rotation angle of the camera and the inertial measurement unit;
P_bc: the relative translation of the camera and the inertial measurement unit;
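For illustration only, the initialization variables θ_1 and calibration variables θ_2 listed above can be collected in one state container; the class name, defaults and array shapes are assumptions of this sketch.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class InitState:
    """Variables to be initialized (theta_1) and calibrated (theta_2)."""
    T: list                             # poses T_1..T_N of the first N frames
    map_points: np.ndarray              # k map points in world coordinates, (k, 3)
    s: float = 1.0                      # monocular scale factor
    g_w: np.ndarray = field(default_factory=lambda: np.zeros(3))   # gravity vector
    b_g: np.ndarray = field(default_factory=lambda: np.zeros(3))   # gyro zero-drift
    b_a: np.ndarray = field(default_factory=lambda: np.zeros(3))   # accel zero-drift
    R_bc: np.ndarray = field(default_factory=lambda: np.eye(3))    # camera-IMU rotation
    P_bc: np.ndarray = field(default_factory=lambda: np.zeros(3))  # camera-IMU translation
```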
step M1: as shown in fig. 1, pure visual initialization: the poses of the initial key frames are obtained using the camera observations and an initial map is constructed; performing pure visual initialization is a precondition for purely visual simultaneous mapping and localization, and in step M2 and step M3 the pose of the camera at each key frame comes from visual simultaneous localization and mapping; each map point is a point observed in the initial key frames, the determination of each map-point coordinate depends on the determination of the initial key-frame poses, and the pose of each frame is referenced to the initial key frames.
Step M2: as shown in fig. 2, vision and inertia are calibrated and initialized in a combined manner, and initialization of an angular velocity null shift value of an inertia measurement unit and calibration of a relative rotation angle and a translation amount of a camera and the inertia measurement unit are completed based on the pose of an initial key frame and an initial map;
step M3: as shown in fig. 4, in the process of synchronous positioning and mapping by fusing the camera and the inertial measurement unit, the initialization of the null shift value of the angular velocity of the inertial measurement unit and the calibration of the relative rotation angle and translation amount of the camera and the inertial measurement unit are performed for abnormality detection and updating. The detection and updating of the anomalies is carried out at a preset frequency during the whole operation of the system, after the initialization and the completion of the initial calibration of steps M1 and M2, with the aim of correcting the calibration values which may vary over time.
Specifically, the step M1 includes:
step M1.1: according to the new key frame, searching for a current frame and a historical frame whose parallax angle and number of matching point pairs meet the preset conditions;
step M1.2: calculating the relative pose between the current frame and the historical frame by utilizing epipolar constraint;
step M1.3: according to the obtained relative pose between the current frame and the historical frame, obtaining the world coordinate system coordinates of map points observed by the current frame and the historical frame together through triangulation, and constructing an initial map;
step M1.4: and projecting points in the initial map onto the pixel plane of each key frame by using a Perspective n-Point method (PnP), and solving the pose of each key frame in a world coordinate system.
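As an illustrative sketch of step M1.3, the world coordinates of a point observed in two key frames can be recovered by linear (DLT) triangulation from the two projection matrices; the function names and the identity intrinsics in the test are assumptions of this sketch.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one map point seen in two key frames.

    P1, P2: 3x4 projection matrices K[R|t] of the two frames.
    x1, x2: pixel coordinates (u, v) of the matched observation in each frame.
    """
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)          # null vector of the stacked constraints
    X = Vt[-1]
    return X[:3] / X[3]                  # homogeneous -> world coordinates
```

Step M1.4 then projects such points back into every key frame to solve each pose, which in practice is the standard perspective-n-point problem.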
Specifically, the step M2 includes:
step M2.1: screening the observed quantities of the camera and the inertia measurement unit based on the information quantity, and screening out the observed quantities of the preset quantity which maximizes the total information quantity; initializing an angular velocity null shift value of the inertial measurement unit and calibrating a relative rotation angle between the camera and the inertial measurement unit by using the observed quantities of the camera and the inertial measurement unit obtained after screening until the relative rotation angle between the camera and the inertial measurement unit is judged to be converged;
step M2.2: the method comprises the steps that the null shift value of the acceleration of an inertial measurement unit is assumed to be zero, the observed quantities of a camera and the inertial measurement unit are screened based on information quantity, the observed quantities of a fixed quantity of maximized total information quantity are reserved, and a scale factor, a gravity vector and the relative translation quantity of the camera and the inertial measurement unit are roughly estimated;
step M2.3: utilizing the direction of the roughly estimated gravity vector and introducing the gravity as a new constraint to carry out fine estimation on the scale factor, the gravity vector, the relative translation amount of the camera and the inertial measurement unit and the null shift value of an accelerometer in the inertial measurement unit; when the scale factor and the relative translation amount of the camera and the inertial measurement unit are converged, the visual and inertial joint initialization and calibration are completed;
the information quantity is the amount of information, contained in the observations of the camera and the inertial measurement unit, about the relative rotation angle between the camera and the inertial measurement unit.
In particular, said step M2.1 comprises:
step M2.1.1: obtaining the relative rotation angle of the inertia measurement unit between two adjacent frames by using the observed quantity of the inertia measurement unit;
obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by integrating the angular velocity of the inertial measurement unit:

R1_{i,i+1} = ΔR_{i,i+1}·Exp(J^g_ΔR·b_g)

wherein ΔR_{i,i+1} represents the angular-velocity integral value (pre-integrated rotation) of the inertial measurement unit; J^g_ΔR represents the Jacobian matrix of the pre-integrated rotation of the inertial measurement unit with respect to the angular-velocity zero-drift value; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
step M2.1.2: obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by using the observed quantity of the camera;
the poses R_wci, R_wci+1 of the cameras of the two adjacent frames in the world coordinate system are obtained through visual simultaneous localization and mapping (SLAM: Simultaneous Localization and Mapping), and the relative rotation angle of the inertial measurement unit is obtained using the relative rotation angle of the camera and the inertial measurement unit:

R2_{i,i+1} = R_cb^T·R_wci^T·R_wci+1·R_cb

wherein R_cb represents the relative rotation angle of the camera and the inertial measurement unit;
step M2.1.3: minimizing an error of a relative rotation angle of the inertial measurement unit between two adjacent frames obtained by the observed quantity of the inertial measurement unit and a relative rotation angle of the inertial measurement unit between two adjacent frames obtained by the observed quantity of the camera to obtain an optimal estimation of an angular velocity null shift value of the inertial measurement unit, namely initializing the angular velocity null shift value of the inertial measurement unit;
the optimal estimate of the angular-velocity zero-drift value of the inertial measurement unit is obtained by minimizing the error between the two relative rotation angles:

b_g* = argmin_{b_g} Σ_{i=1}^{N-1} ||Log((R2_{i,i+1})^T·ΔR_{i,i+1}·Exp(J^g_ΔR·b_g))||²

wherein N represents the number of key frames; R2 represents the relative rotation angle of the inertial measurement unit obtained from the camera observations; R1 represents the relative rotation angle of the inertial measurement unit obtained by integrating its angular velocity; b_i represents the inertial-measurement-unit frame of the i-th key frame; b_{i+1} represents the inertial-measurement-unit frame of the (i+1)-th key frame; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
step M2.1.4: for any two adjacent key frames, the observations contain the following information measure of the relative rotation angle between the camera and the inertial measurement unit:
Figure BDA0002418099640000196
wherein Q_L(q): the left-multiplication matrix corresponding to quaternion q; Q_R(q): the right-multiplication matrix corresponding to quaternion q; q_{bi,bi+1}: the quaternion corresponding to the relative rotation angle of the inertial measurement unit between frame i and frame i+1; q_{ci,ci+1}: the quaternion corresponding to the relative rotation angle of the camera between frame i and frame i+1;
the information window size is set to N_win; if the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; if the total number of key frames is greater than N_win, the information measure of the current observation is compared with those already in the sequence, and the observation with the smallest information quantity is removed.
Step M2.1.5: an equation system with the relative rotation angle of the camera and the inertial measurement unit as the unknown is constructed from the screened observations and solved by the least-squares method, yielding the optimal estimate q_bc* of the relative rotation angle of the camera and the inertial measurement unit expressed as a quaternion, i.e., the calibration value of the relative rotation angle between the camera and the inertial measurement unit;
for any two adjacent key frames, the relative rotation angle of the camera and the inertial measurement unit is an unknown quantity, and the following equation system is satisfied:
(Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))·q_bc = 0 (4)
wherein Q_L denotes the left-multiplication matrix of a quaternion, q_{bi,bi+1} denotes the quaternion representation of the relative rotation angle of the inertial measurement unit between the i-th and (i+1)-th frames, Q_R denotes the right-multiplication matrix of a quaternion, q_{ci,ci+1} denotes the quaternion representation of the relative rotation angle of the camera between the i-th and (i+1)-th frames, and q_bc denotes the quaternion representation of the relative rotation angle of the camera and the inertial measurement unit;
Denote Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}) by Q_{i,i+1}; the screened observed quantities Q_{i,i+1} are stacked to form an observation matrix Q, and the equation is established:
Q·q_bc = 0 (5)
The optimal estimate of the relative rotation angle of the camera and the inertial measurement unit is solved by least squares:
q_bc^* = argmin ||Q·q_bc|| subject to ||q_bc|| = 1,
i.e. q_bc^* is the right singular vector of Q corresponding to its smallest singular value.
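The least-squares solution of the homogeneous system Q·q_bc = 0 (equation (5)) under the unit-norm constraint can be sketched with the singular value decomposition: the minimizer is the right singular vector of Q for the smallest singular value. The quaternion component order [w, x, y, z] and the helper names are assumptions.

```python
import numpy as np

def quat_left(q):
    """Left-multiplication matrix Q_L(q), q = [w, x, y, z]."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_right(q):
    """Right-multiplication matrix Q_R(q)."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_qbc(pairs):
    """pairs: list of (q_b, q_c) relative-rotation quaternions per key-frame pair.

    Returns the unit quaternion q_bc minimizing ||Q q_bc|| (smallest-singular-value
    right singular vector of the stacked observation matrix Q)."""
    Q = np.vstack([quat_left(qb) - quat_right(qc) for qb, qc in pairs])
    _, _, vt = np.linalg.svd(Q)
    q = vt[-1]
    return q / np.linalg.norm(q)
```

Because q_bc and −q_bc encode the same rotation, a sign convention (e.g. non-negative scalar part) is applied afterward.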
Step M2.1.6: whether the calibration of the visual-inertial rotation angle is finished is determined by judging whether the standard deviations σ_yaw, σ_pitch, σ_roll of the yaw, pitch and roll Euler angles corresponding to the relative rotation angle of the camera and the inertial measurement unit over the most recent N_winc estimates are less than the thresholds thr_yaw, thr_pitch, thr_roll; when the calibration is not completed, that is, the relative rotation angle of the camera and the inertial measurement unit has not converged, a new key frame is awaited and steps M2.1.1 to M2.1.5 are repeated until the calibration of the visual-inertial rotation angle is completed;
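The convergence test of step M2.1.6 can be sketched as follows, converting each recent q_bc estimate to yaw/pitch/roll and thresholding the standard deviations. The ZYX Euler convention, the [w, x, y, z] quaternion order, and the helper names are assumptions.

```python
import numpy as np

def quat_to_euler(q):
    """Unit quaternion [w, x, y, z] -> (yaw, pitch, roll) in radians (ZYX)."""
    w, x, y, z = q
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return yaw, pitch, roll

def rotation_converged(recent_qbc, thr_yaw, thr_pitch, thr_roll):
    """Std of each Euler angle over the latest N_winc estimates below threshold."""
    angles = np.array([quat_to_euler(q) for q in recent_qbc])
    stds = angles.std(axis=0)
    return bool(stds[0] < thr_yaw and stds[1] < thr_pitch and stds[2] < thr_roll)
```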
said step M2.2 comprises:
step M2.2.1: obtaining a rotation angle and a translation amount of the camera in a world coordinate system according to the camera observed quantity by utilizing a vision synchronous positioning and mapping algorithm; obtaining the relative translation amount and the relative linear velocity of two adjacent key frames of the inertia measurement unit according to the observed amount of the inertia measurement unit by using a pre-integration algorithm;
for the i-th frame, an equation in the scale factor s, the gravity vector g_w and the relative translation amount P_cb of the camera and the inertial measurement unit is constructed:
α_i·s + β_i·g_w + γ_i·P_cb = η_i (6)
wherein it is assumed that the acceleration zero-drift value b_a of the inertial measurement unit is zero; α_i denotes the coefficient before the scale factor in the equation; β_i denotes the coefficient before the initialization value of the gravity vector; γ_i denotes the coefficient before the visual-inertial translation amount; η_i denotes the constant term; s denotes the scale factor; g_w denotes the initialization value of the gravity vector; P_cb denotes the visual-inertial translation amount; R_wc denotes the rotation angle of the camera in the world coordinate system; P_wc denotes the translation amount of the camera in the world coordinate system; P_wc^i, P_wc^{i+1} and P_wc^{i+2} denote the translation amounts of the i-th, (i+1)-th and (i+2)-th frame cameras in the world coordinate system; dt_{i,i+1} denotes the time difference between the i-th and (i+1)-th frames, and dt_{i+1,i+2} the time difference between the (i+1)-th and (i+2)-th frames; I_{3×3} denotes the identity matrix; R_wc^i, R_wc^{i+1} and R_wc^{i+2} denote the rotation angles of the i-th, (i+1)-th and (i+2)-th frame cameras in the world coordinate system; R_cb denotes the relative rotation angle of the camera and the inertial measurement unit; ΔP_{i,i+1} and ΔP_{i+1,i+2} denote the relative translation amounts of the inertial measurement unit between the i-th and (i+1)-th frames and between the (i+1)-th and (i+2)-th frames; Δv_{i,i+1} denotes the relative linear velocity of the inertial measurement unit between the i-th and (i+1)-th frames;
as shown in fig. 3, step M2.2.2: suppose A_i = [α_i β_i γ_i]; the information quantity measure of the unknown quantities contained in the observed quantity formed by 3 adjacent key frames is
iA_i = trace(A_i^T · A_i) (7)
Setting the information quantity window size to N_win: when the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than or equal to N_win, the current information quantity measure is compared with the information quantity measures in the existing sequence, and the observed quantity with the least information is rejected;
step M2.2.3: suppose x = [s, g_w, P_cb]^T and B_i = η_i; for every 3 adjacent frames there is the equation A_i·x = B_i. The information matrix A and the constant matrix B are constructed from the screened information quantities, and the equation is established:
A·x=B (8)
the optimal estimate s^* of the scale factor, the optimal estimate g_w^* of the gravity vector and the optimal estimate P_cb^* of the relative translation amount of the camera and the inertial measurement unit are solved by least squares:
x^* = [s^*, g_w^*, P_cb^*]^T = argmin_x ||A·x − B||^2 (9)
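Stacking the per-triplet blocks A_i and B_i and solving equation (8) by least squares can be sketched as below; the block shapes (each A_i is 3×7 because x = [s, g_w, P_cb] has 1 + 3 + 3 unknowns) follow the unknowns listed above, and the function name is an assumption.

```python
import numpy as np

def solve_scale_gravity_translation(A_blocks, B_blocks):
    """Stack per-triplet blocks A_i (3x7) and B_i (3,) and solve A x = B.

    x = [s, g_w (3), P_cb (3)], the unknowns of equation (8)."""
    A = np.vstack(A_blocks)
    B = np.concatenate(B_blocks)
    x, *_ = np.linalg.lstsq(A, B, rcond=None)
    s = x[0]
    g_w = x[1:4]
    p_cb = x[4:7]
    return s, g_w, p_cb
```

At least three key-frame triplets (9 equations for 7 unknowns) are needed for the stacked system to be overdetermined.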
Said step M2.3 comprises: the constraint that the gravity magnitude G = 9.8 is introduced to obtain the initial values of the scale factor s and the gravity vector g_w and the calibration value of the visual-inertial translation amount P_cb;
according to the obtained gravity vector g_w, the rotation angle of gravity is obtained:
R_WI = Exp(v̂·θ), v̂ = (g_I × ĝ_w)/||g_I × ĝ_w||, θ = atan2(||g_I × ĝ_w||, g_I·ĝ_w) (10)
wherein R_WI denotes the rotation angle of gravity in the world coordinate system; ĝ_w denotes the direction (unit vector) of the estimated gravity vector g_w; v̂ denotes the unit vector perpendicular to ĝ_w and the fixed vector g_I; θ denotes the angle between ĝ_w and g_I; g_I denotes the fixed vector [0,0,-1]. The gravity vector is expressed by the rotation angle and the gravity magnitude and expanded by a first-order Taylor expansion:
g_w = R_WI·Exp(δθ)·g_I·G ≈ R_WI·g_I·G − R_WI·(g_I·G)^∧·δθ, δθ = [θ_x, θ_y, 0]^T (11)
wherein δθ denotes the update value of the gravity vector rotation angle; G denotes the gravity magnitude; θ_x denotes the component of the update value of the gravity vector rotation angle on the x-axis of the world coordinate system, and θ_y denotes the component on the y-axis of the world coordinate system;
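The gravity rotation described above, which aligns the fixed direction g_I = [0, 0, −1] with the estimated gravity direction, can be sketched with Rodrigues' formula. The function name is an assumption, and the anti-parallel degenerate case is not handled.

```python
import numpy as np

G_I = np.array([0.0, 0.0, -1.0])   # fixed gravity direction g_I

def skew(v):
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def gravity_rotation(g_w):
    """R_WI rotating g_I onto the direction of the estimated g_w (Rodrigues)."""
    g_hat = g_w / np.linalg.norm(g_w)
    axis = np.cross(G_I, g_hat)
    s = np.linalg.norm(axis)            # sin(theta)
    c = float(G_I @ g_hat)              # cos(theta)
    if s < 1e-12:
        # assumes g_w is not anti-parallel to g_I; aligned case needs no rotation
        return np.eye(3)
    K = skew(axis / s)
    return np.eye(3) + s * K + (1 - c) * (K @ K)
```

Multiplying R_WI by g_I·G then reproduces the estimated gravity vector up to the residual δθ refined in equation (11).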
the rotation angle R_wc and the translation amount P_wc of the camera in the world coordinate system are obtained from the camera observed quantity by the visual simultaneous localization and mapping algorithm; the relative translation amount ΔP_{i,i+1} and the relative linear velocity Δv_{i,i+1} of the inertial measurement unit between two adjacent key frames are obtained from the observed quantity of the inertial measurement unit by the pre-integration algorithm; for the i-th frame, an equation in the scale factor s, the gravity vector g_w, the acceleration zero-drift value b_a of the inertial measurement unit and the relative translation amount P_cb of the camera and the inertial measurement unit is constructed:
λ_i·s + μ_i·δθ + ω_i·b_a + υ_i·P_cb = ρ_i (12)
wherein λ_i denotes the coefficient before the scale factor in the equation; μ_i denotes the coefficient before the update value of the gravity vector rotation angle; ω_i denotes the coefficient before the acceleration zero-drift value of the inertial measurement unit; υ_i denotes the coefficient before the relative translation amount of the camera and the inertial measurement unit; ρ_i denotes the constant term; J^{Δv_{i,i+1}}_{ba} denotes the Jacobian matrix of the relative linear velocity of the inertial measurement unit between the i-th and (i+1)-th frames with respect to the acceleration zero-drift value of the inertial measurement unit; dt_{i,i+1} denotes the time difference between the i-th and (i+1)-th frames, and dt_{i+1,i+2} the time difference between the (i+1)-th and (i+2)-th frames; J^{ΔP_{i+1,i+2}}_{ba} denotes the Jacobian matrix of the relative translation of the inertial measurement unit between the (i+1)-th and (i+2)-th frames with respect to the acceleration zero-drift value; J^{ΔP_{i,i+1}}_{ba} denotes the Jacobian matrix of the relative translation of the inertial measurement unit between the i-th and (i+1)-th frames with respect to the acceleration zero-drift value; the subscript [:,1:2] denotes taking only the first two columns of the matrix;
suppose C_i = [λ_i μ_i ω_i υ_i]; the information quantity measure of the unknown quantities contained in the observed quantity formed by 3 adjacent key frames is
iC_i = trace(C_i^T · C_i);
setting the information quantity window size to N_win: if the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than N_win, the current information quantity measure is compared with the information quantity measures in the existing sequence, and the observed quantity with the least information is rejected;
an equation set is constructed and solved by the least square method: suppose y = [s, θ_x, θ_y, b_a, P_cb]^T and D_i = ρ_i; for every 3 adjacent frames there is the equation C_i·y = D_i. The observation matrix C and the constant matrix D are constructed from the screened observed quantities, and the equation is established:
C·y=D (13)
the optimal estimate s^* of the scale factor, the optimal estimate δθ^* of the update value of the gravity vector rotation angle, the optimal estimate b_a^* of the acceleration zero-drift value of the inertial measurement unit and the optimal estimate P_cb^* of the relative translation amount of the camera and the inertial measurement unit are solved by least squares:
y^* = [s^*, δθ^*, b_a^*, P_cb^*]^T = argmin_y ||C·y − D||^2 (14)
Whether the calibration of the visual-inertial translation amount is finished is determined by judging whether the standard deviations σ_x, σ_y, σ_z of the respective dimensions of the visual-inertial translation amount over the most recent N_winc estimates are less than the thresholds thr_x, thr_y, thr_z; whether the visual-inertial initialization is finished is determined by judging whether the standard deviation σ_s of the most recent N_winc scale factors is less than the threshold thr_s; when the calibration of the visual-inertial translation amount is not finished or the visual-inertial initialization is not finished, a new key frame is awaited and steps M2.2 to M2.3 are repeated until the joint visual-inertial initialization and calibration are completed.
Specifically, the step M3 includes: in the process of simultaneous localization and mapping with the camera and the inertial measurement unit fused, whether the relative pose of the camera and the inertial measurement unit has changed is judged at intervals of a preset period; when the relative pose of the camera and the inertial measurement unit has changed, the angular velocity zero-drift value of the inertial measurement unit is initialized and the relative rotation angle and translation amount of the camera and the inertial measurement unit are recalibrated according to any two adjacent frames among the latest frames;
and whether the calibration value of the relative rotation angle of the camera and the inertial measurement unit needs to be updated is judged according to whether the consistency error between the observed quantities of the camera and the inertial measurement unit and the initial calibration value of the relative rotation angle of the camera and the inertial measurement unit exceeds a threshold value.
Step M3.1: judging whether the calibration value of the relative rotation angle of the camera and the inertial measurement unit needs to be updated.
Taking the latest N_update frames as observed quantities, for any two adjacent frames i and i+1, calculate:
Q_{i,i+1} = Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}).
The consistency error ||Q_{i,i+1}·q_bc|| is computed, wherein q_bc denotes the relative rotation angle of the camera and the inertial measurement unit before the update. The calculation result is compared with the threshold thr_updateR: if it is greater than the threshold thr_updateR, the relative rotation angle needs to be recalibrated and step M3.2 is executed; otherwise, step M3.3 is executed;
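The update test of step M3.1 can be sketched as follows. The exact error expression in the patent's figure is not recoverable here, so the mean residual ||Q_{i,i+1}·q_bc|| over the window is used as an assumed consistency error; it is zero for a calibration that exactly satisfies the constraint Q·q_bc = 0, and grows as the true extrinsic rotation drifts away from the calibrated value.

```python
import numpy as np

def rotation_needs_update(Q_blocks, q_bc, thr_update_r):
    """Mean residual of Q_{i,i+1} @ q_bc over the latest N_update window.

    Q_blocks: list of 4x4 matrices Q_L(q_b) - Q_R(q_c); q_bc: current unit
    quaternion calibration. Returns True if recalibration should be triggered.
    """
    residuals = [np.linalg.norm(Q @ q_bc) for Q in Q_blocks]
    return float(np.mean(residuals)) > thr_update_r
```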
Step M3.2: the observed quantities participating in calibration are screened based on the information quantity, and the relative rotation angle of the camera and the inertial measurement unit is calibrated with the screened observed quantities.
For any two adjacent frames among the latest N_update frames, the information quantity measure iQ_{i,i+1} = trace(Q_{i,i+1}^T·Q_{i,i+1}) contained in the observed quantity is calculated; the N_win observed quantities with the largest information measures are selected and stacked to form an observation matrix Q, and the relative rotation angle of the camera and the inertial measurement unit is solved by the least square method:
q_bc^* = argmin ||Q·q_bc|| subject to ||q_bc|| = 1.
And M3.3, judging whether the relative translation calibration values of the camera and the inertial measurement unit need to be updated.
With the latest NupdateTaking frames as observed quantities, and performing visual comparison by using the observed quantities of any three adjacent frames i and i +1 and i +2The inertial fusion S L AM algorithm obtains the rotation angle R of each frame camera in the world coordinate systemwcCalculating the relative translation amount pre-integral delta P and the relative linear velocity pre-integral delta v of two adjacent frames:
Figure BDA0002418099640000254
Figure BDA0002418099640000255
computing
Figure BDA0002418099640000256
Wherein:
Pcbto update the relative translation of the front camera and the inertial measurement unit.
The calculation result is compared with a threshold value thrupdatePAnd (6) comparing. If it is greater than the threshold value thrupdatePThen the relative translation amount needs to be calibrated again, and the step M3.4 is carried out; otherwise, returning to the step M3.1;
step M3.4: the observed quantities participating in calibration are screened based on the information quantity, and the relative translation amount of the camera and the inertial measurement unit is calibrated with the screened observed quantities.
For any two adjacent frames among the latest N_update frames, the information quantity measure contained in the observed quantity is calculated; the N_win observed quantities with the largest information measures are selected and used to form the observation matrices V and P, and the optimal estimate P_cb^* of the relative translation amount of the camera and the inertial measurement unit is solved by the least square method.
During frame screening, the information quantity window size N_win = 12 and the convergence judgment parameter N_winc = 15; the scale-factor convergence threshold thr_s = 0.02. For judging whether recalibration is needed, the window size N_update = 50, and the threshold sizes are thr_updateR = 0.2° and thr_updateP = 0.1 m.
The invention provides an online initialization and calibration system for a camera and an inertial measurement unit, which comprises:
defining the variables to be initialized:
θ_1 = {T_1, T_2, ..., T_N, X_1, ..., X_k, s, g_w, b_g, b_a}
wherein T_1, T_2, ..., T_N denote the poses of the first N frame cameras in the world coordinate system; X_1, ..., X_k denote the coordinates of the k map points in the world coordinate system; s denotes the monocular camera scale factor; g_w denotes the gravity vector in the world coordinate system; b_g denotes the angular velocity zero-drift value of the inertial measurement unit; b_a denotes the acceleration zero-drift value of the inertial measurement unit;
defining the variables to be calibrated:
θ_2 = {R_bc, P_bc}
wherein R_bc denotes the relative rotation angle of the camera and the inertial measurement unit; P_bc denotes the relative translation amount of the camera and the inertial measurement unit;
module M1: as shown in fig. 1, pure visual initialization: the poses of the initial key frames are obtained from the camera observed quantities and an initial map is constructed; pure visual initialization is the precondition of pure visual simultaneous mapping and localization, and in module M2 and module M3 the camera pose at each key frame comes from visual simultaneous localization and mapping; each map point is a point observed in the initial key frames, the determination of each map point's coordinates depends on the determination of the initial key frame poses, and the pose of each frame is referred to the initial key frame pose.
Module M2: as shown in fig. 2, vision and inertia are calibrated and initialized in a combined manner, and initialization of an angular velocity null shift value of an inertia measurement unit and calibration of a relative rotation angle and a translation amount of a camera and the inertia measurement unit are completed based on the pose of an initial key frame and an initial map;
Module M3: as shown in fig. 4, in the process of simultaneous localization and mapping with the camera and the inertial measurement unit fused, anomaly detection and updating are performed on the initialization of the angular velocity zero-drift value of the inertial measurement unit and on the calibration of the relative rotation angle and translation amount of the camera and the inertial measurement unit. Anomaly detection and updating are carried out at a preset frequency during the whole system operation, after the initialization and initial calibration of module M1 and module M2 are completed, with the aim of correcting calibration values that may vary over time.
Specifically, the module M1 includes:
module M1.1: according to the new key frame, searching a current frame and a historical frame which accord with the parallax angle and the matching point logarithm of the preset condition;
module M1.2: calculating the relative pose between the current frame and the historical frame by utilizing epipolar constraint;
module M1.3: according to the obtained relative pose between the current frame and the historical frame, obtaining the world coordinate system coordinates of map points observed by the current frame and the historical frame together through triangulation, and constructing an initial map;
module M1.4: and projecting points in the initial map onto the pixel plane of each key frame by using a Perspective n-Point method (PnP), and solving the pose of each key frame in a world coordinate system.
Specifically, the module M2 includes:
module M2.1: screening the observed quantities of the camera and the inertia measurement unit based on the information quantity, and screening out the observed quantities of the preset quantity which maximizes the total information quantity; initializing an angular velocity null shift value of the inertial measurement unit and calibrating a relative rotation angle between the camera and the inertial measurement unit by using the observed quantities of the camera and the inertial measurement unit obtained after screening until the relative rotation angle between the camera and the inertial measurement unit is judged to be converged;
module M2.2: the method comprises the steps that the null shift value of the acceleration of an inertial measurement unit is assumed to be zero, the observed quantities of a camera and the inertial measurement unit are screened based on information quantity, the observed quantities of a fixed quantity of maximized total information quantity are reserved, and a scale factor, a gravity vector and the relative translation quantity of the camera and the inertial measurement unit are roughly estimated;
module M2.3: utilizing the direction of the roughly estimated gravity vector and introducing the gravity as a new constraint to carry out fine estimation on the scale factor, the gravity vector, the relative translation amount of the camera and the inertial measurement unit and the null shift value of an accelerometer in the inertial measurement unit; when the scale factor and the relative translation amount of the camera and the inertial measurement unit are converged, the visual and inertial joint initialization and calibration are completed;
the information quantity is the observed quantity of the camera and the inertial measurement unit and comprises the relative rotation angle information quantity of the camera and the inertial measurement unit.
In particular, said module M2.1 comprises:
module M2.1.1: obtaining the relative rotation angle of the inertia measurement unit between two adjacent frames by using the observed quantity of the inertia measurement unit;
the relative rotation angle of the inertial measurement unit between two adjacent frames is obtained by integrating the angular velocity of the inertial measurement unit:
R1_{bi,bi+1} = ΔR_{i,i+1}·Exp(J^{ΔR}_{bg}·b_g) (1)
wherein ΔR_{i,i+1} denotes the integral value of the angular velocity of the inertial measurement unit; J^{ΔR}_{bg} denotes the Jacobian matrix of the integrated rotation with respect to the angular velocity zero-drift value of the inertial measurement unit; b_g denotes the angular velocity zero-drift value of the inertial measurement unit;
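The bias-corrected pre-integrated rotation above composes per-sample gyro rotations and applies a first-order bias correction. A sketch follows; the sample-wise product and the helper names are assumptions, and a full pre-integration would also propagate the Jacobian J alongside the rotation.

```python
import numpy as np

def so3_exp(phi):
    """Exponential map from a rotation vector to SO(3) (Rodrigues)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    a = phi / theta
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def preintegrate_rotation(gyro, dts, J_bg=None, b_g=None):
    """Delta rotation between key frames as a product of per-sample exponentials.

    If the bias Jacobian J_bg and gyro bias b_g are given, apply the first-order
    correction dR_corrected = dR @ Exp(J_bg @ b_g)."""
    dR = np.eye(3)
    for w, dt in zip(gyro, dts):
        dR = dR @ so3_exp(np.asarray(w, dtype=float) * dt)
    if J_bg is not None and b_g is not None:
        dR = dR @ so3_exp(J_bg @ b_g)
    return dR
```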
module M2.1.2: obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames from the observed quantity of the camera;
the poses R_wc^i and R_wc^{i+1} of the cameras of the two adjacent frames in the world coordinate system are obtained by visual simultaneous localization and mapping (SLAM: Simultaneous Localization and Mapping), and the relative rotation angle of the inertial measurement unit is obtained using the relative rotation angle of the camera and the inertial measurement unit:
R2_{bi,bi+1} = R_cb^T·(R_wc^i)^T·R_wc^{i+1}·R_cb (2)
wherein R_cb denotes the relative rotation angle of the camera and the inertial measurement unit;
module M2.1.3: the error between the relative rotation angle of the inertial measurement unit between two adjacent frames obtained from the observed quantity of the inertial measurement unit and that obtained from the observed quantity of the camera is minimized to obtain the optimal estimate of the angular velocity zero-drift value of the inertial measurement unit, i.e. the angular velocity zero-drift value of the inertial measurement unit is initialized;
the optimal estimate of the angular velocity zero-drift value of the inertial measurement unit is obtained by minimizing the error between the two relative rotation angles:
b_g^* = argmin_{b_g} Σ_{i=1}^{N-1} ||Log(R2_{bi,bi+1}^T · R1_{bi,bi+1}(b_g))||^2 (3)
wherein N denotes the number of key frames, R2 denotes the relative rotation angle of the inertial measurement unit obtained from the camera observation, R1 denotes the relative rotation angle of the inertial measurement unit obtained by integration of the angular velocity of the inertial measurement unit, b_i denotes the inertial measurement unit at the i-th key frame, b_{i+1} denotes the inertial measurement unit at the (i+1)-th key frame, and b_g denotes the angular velocity zero-drift value of the inertial measurement unit;
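Because R1(b_g) = ΔR·Exp(J·b_g), the minimization above linearizes about b_g = 0 to the stacked system J_i·b_g ≈ Log(ΔR_i^T·R2_i), which one least-squares solve handles. This one-step sketch (function names assumed) omits the iteration a full Gauss-Newton solver would perform.

```python
import numpy as np

def so3_log(R):
    """Logarithm map SO(3) -> rotation vector."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def estimate_gyro_bias(dR_imu, R_cam, J_bg):
    """One Gauss-Newton step about b_g = 0 for the bias minimization:
    stack J_i b_g = Log(dR_i^T R2_i) over adjacent key-frame pairs.

    dR_imu: gyro-integrated relative rotations; R_cam: camera-derived relative
    rotations R2; J_bg: per-pair 3x3 bias Jacobians."""
    A = np.vstack(J_bg)
    b = np.concatenate([so3_log(dRi.T @ R2i) for dRi, R2i in zip(dR_imu, R_cam)])
    bg, *_ = np.linalg.lstsq(A, b, rcond=None)
    return bg
```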
module M2.1.4: for any two adjacent key frames, the information quantity measure of the relative rotation angle of the camera and the inertial measurement unit contained in the observed quantity is iQ_{i,i+1} = trace((Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))^T·(Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))), wherein Q_L(q) denotes the left-multiplication matrix corresponding to the quaternion q; Q_R(q) denotes the right-multiplication matrix corresponding to the quaternion q; q_{bi,bi+1} denotes the quaternion corresponding to the relative rotation angle of the inertial measurement unit between the i-th and (i+1)-th frames; q_{ci,ci+1} denotes the quaternion corresponding to the relative rotation angle of the camera between the i-th and (i+1)-th frames;
setting the information quantity window size to N_win: if the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; if the total number of key frames is greater than N_win, the current information quantity measure is compared with the information quantity measures in the existing sequence, and the observed quantity with the least information is rejected.
Module M2.1.5: constructing an equation set taking the relative rotation angle of the camera and the inertial measurement unit as the unknown quantity from the screened observed quantities, and solving it by the least square method to obtain the optimal estimate q_bc^* of the relative rotation angle of the camera and the inertial measurement unit expressed in quaternion form,
Calibrating the relative rotation angle between the camera and the inertial measurement unit;
for any two adjacent key frames, the relative rotation angle of the camera and the inertial measurement unit is an unknown quantity, and the following equation system is satisfied:
(Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))·q_bc = 0 (4)
wherein Q_L denotes the left-multiplication matrix of a quaternion, q_{bi,bi+1} denotes the quaternion representation of the relative rotation angle of the inertial measurement unit between the i-th and (i+1)-th frames, Q_R denotes the right-multiplication matrix of a quaternion, q_{ci,ci+1} denotes the quaternion representation of the relative rotation angle of the camera between the i-th and (i+1)-th frames, and q_bc denotes the quaternion representation of the relative rotation angle of the camera and the inertial measurement unit;
Denote Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}) by Q_{i,i+1}; the screened observed quantities Q_{i,i+1} are stacked to form an observation matrix Q, and the equation is established:
Q·q_bc = 0 (5)
The optimal estimate of the relative rotation angle of the camera and the inertial measurement unit is solved by least squares:
q_bc^* = argmin ||Q·q_bc|| subject to ||q_bc|| = 1,
i.e. q_bc^* is the right singular vector of Q corresponding to its smallest singular value.
Module M2.1.6: whether the calibration of the visual-inertial rotation angle is finished is determined by judging whether the standard deviations σ_yaw, σ_pitch, σ_roll of the yaw, pitch and roll Euler angles corresponding to the relative rotation angle of the camera and the inertial measurement unit over the most recent N_winc estimates are less than the thresholds thr_yaw, thr_pitch, thr_roll; when the calibration is not completed, that is, the relative rotation angle of the camera and the inertial measurement unit has not converged, a new key frame is awaited and modules M2.1.1 to M2.1.5 are triggered repeatedly until the calibration of the visual-inertial rotation angle is completed;
said module M2.2 comprises:
module M2.2.1: obtaining a rotation angle and a translation amount of the camera in a world coordinate system according to the camera observed quantity by utilizing a vision synchronous positioning and mapping algorithm; obtaining the relative translation amount and the relative linear velocity of two adjacent key frames of the inertia measurement unit according to the observed amount of the inertia measurement unit by using a pre-integration algorithm;
for the i-th frame, an equation in the scale factor s, the gravity vector g_w and the relative translation amount P_cb of the camera and the inertial measurement unit is constructed:
α_i·s + β_i·g_w + γ_i·P_cb = η_i (6)
wherein it is assumed that the acceleration zero-drift value b_a of the inertial measurement unit is zero; α_i denotes the coefficient before the scale factor in the equation; β_i denotes the coefficient before the initialization value of the gravity vector; γ_i denotes the coefficient before the visual-inertial translation amount; η_i denotes the constant term; s denotes the scale factor; g_w denotes the initialization value of the gravity vector; P_cb denotes the visual-inertial translation amount; R_wc denotes the rotation angle of the camera in the world coordinate system; P_wc denotes the translation amount of the camera in the world coordinate system; P_wc^i, P_wc^{i+1} and P_wc^{i+2} denote the translation amounts of the i-th, (i+1)-th and (i+2)-th frame cameras in the world coordinate system; dt_{i,i+1} denotes the time difference between the i-th and (i+1)-th frames, and dt_{i+1,i+2} the time difference between the (i+1)-th and (i+2)-th frames; I_{3×3} denotes the identity matrix; R_wc^i, R_wc^{i+1} and R_wc^{i+2} denote the rotation angles of the i-th, (i+1)-th and (i+2)-th frame cameras in the world coordinate system; R_cb denotes the relative rotation angle of the camera and the inertial measurement unit; ΔP_{i,i+1} and ΔP_{i+1,i+2} denote the relative translation amounts of the inertial measurement unit between the i-th and (i+1)-th frames and between the (i+1)-th and (i+2)-th frames; Δv_{i,i+1} denotes the relative linear velocity of the inertial measurement unit between the i-th and (i+1)-th frames;
as shown in fig. 3, module M2.2.2: suppose A_i = [α_i β_i γ_i]; the information quantity measure of the unknown quantities contained in the observed quantity formed by 3 adjacent key frames is
iA_i = trace(A_i^T · A_i) (7)
Setting the information quantity window size to N_win: when the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than or equal to N_win, the current information quantity measure is compared with the information quantity measures in the existing sequence, and the observed quantity with the least information is rejected;
module M2.2.3: suppose x = [s, g_w, P_cb]^T and B_i = η_i; for every 3 adjacent frames there is the equation A_i·x = B_i. The information matrix A and the constant matrix B are constructed from the screened information quantities, and the equation is established:
A·x=B (8)
solving optimal estimate s of scale factor using least squares*Optimal estimation of the gravity vector gw *And optimal estimation P of relative translation amount of camera and inertial measurement unitcb *
Figure BDA0002418099640000311
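The stacked least-squares solve of module M2.2.3 can be sketched numerically as follows; the block sizes, the synthetic data and the NumPy routine are illustrative assumptions, with the unknown vector x stacking the scale s (1 entry), the gravity vector g_w (3 entries) and the translation P_cb (3 entries).

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth unknowns for the synthetic check: x = [s, g_w, P_cb]
x_true = np.concatenate([[1.5], [0.0, 0.0, -9.8], [0.1, 0.02, -0.05]])

# Each screened keyframe triple contributes one block A_i (3x7), B_i (3,)
A_blocks = [rng.standard_normal((3, 7)) for _ in range(12)]
B_blocks = [A_i @ x_true for A_i in A_blocks]

# Stack into A.x = B (equation (8)) and solve by least squares
A = np.vstack(A_blocks)
B = np.concatenate(B_blocks)
x_star, *_ = np.linalg.lstsq(A, B, rcond=None)

s_star, g_w_star, P_cb_star = x_star[0], x_star[1:4], x_star[4:7]
```

On noise-free synthetic blocks the least-squares solution recovers the generating unknowns exactly; with real observations it minimizes the residual ‖A·x − B‖².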
Said module M2.3 comprises: introducing the constraint that the gravity magnitude is G = 9.8 m/s² to obtain the initial values of the scale factor s and the gravity vector g_w and the calibration value of the visual-inertial translation P_cb;

according to the obtained gravity vector g_w, the rotation angle of gravity is obtained:

R_WI = Exp(θ·v̂), v̂ = (g_I × ĝ_w)/‖g_I × ĝ_w‖, θ = atan2(‖g_I × ĝ_w‖, g_I·ĝ_w)

wherein R_WI represents the rotation angle of gravity in the world coordinate system; v̂ represents the unit vector perpendicular to both the gravity direction ĝ_w = g_w/‖g_w‖ and the fixed vector g_I; θ represents the angle between ĝ_w and the fixed vector g_I; g_I represents the fixed vector [0, 0, -1]. The gravity vector is expressed by the rotation angle and the gravity magnitude, and a first-order Taylor expansion is performed:

g_w = R_WI·Exp(δθ)·g_I·G ≈ R_WI·g_I·G − R_WI·[g_I]_×·G·δθ, δθ = [θ_x, θ_y, 0]^T

wherein θ represents the update value of the gravity-vector rotation angle; G represents the gravity magnitude; θ_x represents the component of the update value of the gravity-vector rotation angle on the x-axis of the world coordinate system, and θ_y the component on the y-axis;
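The computation of the gravity rotation R_WI from an estimated gravity vector can be sketched with Rodrigues' rotation formula; the function names are assumptions of this sketch, and the degenerate case of g_w exactly parallel to g_I is not handled.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]_x such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def gravity_rotation(g_w, g_I=np.array([0.0, 0.0, -1.0])):
    """Rotation R_WI aligning the fixed direction g_I with the estimated
    gravity direction g_w / |g_w|, via the axis v_hat = g_I x g_w_hat and
    angle theta = atan2(|g_I x g_w_hat|, g_I . g_w_hat)."""
    g_w_hat = g_w / np.linalg.norm(g_w)
    cross = np.cross(g_I, g_w_hat)
    axis = cross / np.linalg.norm(cross)          # unit vector v_hat
    theta = np.arctan2(np.linalg.norm(cross), float(g_I @ g_w_hat))
    K = skew(axis)
    # Rodrigues' formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```

By construction R_WI maps g_I onto the unit gravity direction, so g_w ≈ R_WI·g_I·G once the magnitude constraint G is applied.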
obtaining the rotation angle R_wc and the translation P_wc of the camera in the world coordinate system from the camera observations by a visual simultaneous localization and mapping algorithm; obtaining the relative translation ΔP_{i,i+1} and the relative linear velocity Δv_{i,i+1} of two adjacent key frames of the inertial measurement unit from the inertial measurement unit observations by a pre-integration algorithm; constructing, for the i-th frame, the equation in the scale factor s, the gravity vector g_w (through the update value θ of its rotation angle), the acceleration zero-drift value b_a of the inertial measurement unit and the relative translation P_cb of the camera and the inertial measurement unit:

λ_i·s + μ_i·θ + ω_i·b_a + υ_i·P_cb = ρ_i

wherein λ_i represents the coefficient of the scale factor in the equation; μ_i represents the coefficient of the update value of the gravity-vector rotation angle in the equation; ω_i represents the coefficient of the acceleration zero-drift value of the inertial measurement unit in the equation; υ_i represents the coefficient of the relative translation between the camera and the inertial measurement unit in the equation; ρ_i represents the constant term of the equation; J^{b_a}_{Δv_{i,i+1}} represents the Jacobian matrix of the relative linear velocity of the inertial measurement unit between frame i and frame i+1 with respect to the acceleration zero-drift value of the inertial measurement unit; dt_{12} represents the time difference between the i-th frame and the (i+1)-th frame; dt_{23} represents the time difference between the (i+1)-th frame and the (i+2)-th frame; J^{b_a}_{ΔP_{i+1,i+2}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i+1 and frame i+2 with respect to the acceleration zero-drift value; J^{b_a}_{ΔP_{i,i+1}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i and frame i+1 with respect to the acceleration zero-drift value; the subscript [:,1:2] denotes taking only the first two columns of a matrix;
suppose C_i = [λ_i μ_i ω_i υ_i]; the information metric of the unknowns contained in the observation formed by 3 adjacent key frames is i_{C_i} = trace(C_i^T·C_i);

setting the information window size to N_win: if the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than N_win, the information metric of the current observation is compared with the information metrics already in the sequence, and the observation with the smallest information metric is removed;
an equation set is constructed and solved by the least squares method:

suppose y = [s, θ, b_a, P_cb]^T and D_i = ρ_i; for every 3 adjacent frames there is an equation C_i·y = D_i. An observation matrix C and a constant matrix D are constructed from the screened observations, giving the equation:

C·y = D (13)

The optimal estimate s* of the scale factor, the optimal estimate θ* of the update value of the gravity-vector rotation angle, the optimal estimate b_a* of the acceleration zero-drift value of the inertial measurement unit and the optimal estimate P_cb* of the relative translation between the camera and the inertial measurement unit are solved by the least squares method:

y* = [s*, θ*, b_a*, P_cb*]^T = (C^T·C)^{-1}·C^T·D
Whether the calibration of the visual-inertial translation is finished is determined by judging whether the standard deviations σ_x, σ_y, σ_z of each dimension of the most recent N_winc estimates of the visual-inertial translation are all less than the threshold thr_Pcb; whether the visual-inertial initialization is finished is determined by judging whether the standard deviation σ_s of the most recent N_winc scale factors is less than the threshold thr_s. When the calibration of the visual-inertial translation or the visual-inertial initialization is not finished, modules M2.2 to M2.3 are triggered again after a new key frame arrives, until the joint visual-inertial initialization and calibration are completed.
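The sliding-window convergence test can be sketched as follows, assuming the successive estimates are collected in Python lists; the function names are illustrative, and the parameter values match those given later in the description (N_winc = 15, thr_Pcb = 0.01 m, thr_s = 0.02).

```python
import numpy as np

def translation_converged(P_cb_history, n_winc, thr_pcb):
    """Calibration of P_cb is deemed finished when, over the most recent
    n_winc estimates, the per-axis standard deviations sigma_x, sigma_y,
    sigma_z all fall below thr_pcb."""
    if len(P_cb_history) < n_winc:
        return False
    recent = np.asarray(P_cb_history[-n_winc:])   # shape (n_winc, 3)
    return bool(np.all(recent.std(axis=0) < thr_pcb))

def scale_converged(s_history, n_winc, thr_s):
    """Initialization of s is finished when the standard deviation of the
    most recent n_winc scale estimates is below thr_s."""
    if len(s_history) < n_winc:
        return False
    return float(np.std(s_history[-n_winc:])) < thr_s
```

Initialization and calibration keep running on new key frames until both tests pass.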
Specifically, the module M3 comprises: during the simultaneous localization and mapping fusing the camera and the inertial measurement unit, judging every preset period whether the relative pose of the camera and the inertial measurement unit has changed; when it has changed, re-initializing the angular-velocity zero-drift value of the inertial measurement unit and re-calibrating the relative rotation angle and translation of the camera and the inertial measurement unit from any two adjacent frames among the most recent frames;

and judging whether the calibration value of the relative rotation angle between the camera and the inertial measurement unit needs to be updated according to whether the consistency error between the observations of the camera and the inertial measurement unit and the current calibration value of the relative rotation angle exceeds a threshold.
Module M3.1: and judging whether the calibration value of the relative rotation angle between the camera and the inertia needs to be updated or not.
Taking the most recent N_update frames as observations, for any two adjacent frames i and i+1, calculate:

Q_{i,i+1} = Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1})

and compute the consistency error Σ‖Q_{i,i+1}·q_bc‖ over the adjacent-frame pairs, wherein q_bc is the relative rotation angle of the camera and the inertial measurement unit before updating.

The result is compared with the threshold thr_updateR. If it is greater than thr_updateR, the relative rotation angle needs to be re-calibrated and module M3.2 is executed; otherwise module M3.3 is executed;
and a module M3.2 for screening the observations participating in calibration based on the information metric and calibrating the relative rotation angle between the camera and the inertial measurement unit with the screened observations;

for any two adjacent frames among the most recent N_update frames, the information metric contained in the observation is calculated as i_{Q_{i,i+1}} = trace(Q_{i,i+1}^T·Q_{i,i+1}). The N_win observations with the largest information metric are selected and stacked into an observation matrix Q, and the relative rotation angle between the camera and the inertial measurement unit is solved by the least squares method:

q_bc* = argmin_{‖q_bc‖=1} ‖Q·q_bc‖², i.e. the right singular vector of Q corresponding to its smallest singular value;
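The quaternion-based recalibration of the relative rotation can be sketched as follows; the (w, x, y, z) quaternion convention and the helper names are assumptions of this sketch, and the unit-norm minimizer of ‖Q·q_bc‖ is taken as the right singular vector of Q belonging to the smallest singular value.

```python
import numpy as np

def quat_mul_left(q):
    """Left-multiplication matrix Q_L(q): quat_mul_left(q1) @ q2 equals the
    Hamilton product q1*q2 (convention (w, x, y, z) assumed)."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_mul_right(q):
    """Right-multiplication matrix Q_R(q): quat_mul_right(q2) @ q1 equals q1*q2."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_extrinsic_rotation(imu_quats, cam_quats):
    """Stack Q_{i,i+1} = Q_L(q_b) - Q_R(q_c) over the screened pairs and
    return the unit quaternion minimizing |Q q_bc| (smallest singular
    vector of the stacked matrix)."""
    Q = np.vstack([quat_mul_left(qb) - quat_mul_right(qc)
                   for qb, qc in zip(imu_quats, cam_quats)])
    _, _, vt = np.linalg.svd(Q)
    q_bc = vt[-1]
    return q_bc / np.linalg.norm(q_bc)
```

Each pair satisfies q_b * q_bc = q_bc * q_c in the noise-free case, so the stacked matrix has q_bc in its null space and the SVD recovers it up to sign.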
And a module M3.3 for judging whether the calibration value of the relative translation between the camera and the inertial measurement unit needs to be updated.

Taking the most recent N_update frames as observations, for any three adjacent frames i, i+1 and i+2, the rotation angle R_wc of each frame camera in the world coordinate system is obtained by the SLAM algorithm fusing vision and inertia, and the relative translation pre-integral ΔP and the relative linear velocity pre-integral Δv of two adjacent frames are calculated.

The consistency error between these observations and P_cb, the relative translation of the camera and the inertial measurement unit before updating, is computed and compared with the threshold thr_updateP. If it is greater than thr_updateP, the relative translation needs to be re-calibrated and module M3.4 is executed; otherwise the procedure returns to module M3.1;
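The threshold test that gates recalibration in modules M3.1 and M3.3 can be sketched generically; the function and its arguments are illustrative assumptions, with the residual computation passed in because it differs between the rotation and translation cases.

```python
def check_and_update(consistency_error, threshold, recalibrate):
    """Generic update rule: recalibrate only when the consistency error
    between the sensor observations and the current calibration value
    exceeds the threshold (thr_updateR or thr_updateP)."""
    if consistency_error > threshold:
        # threshold exceeded: run the recalibration routine
        return recalibrate(), True
    # calibration still consistent with the observations: keep it
    return None, False
```

This keeps the calibration stable under noise while still reacting when the extrinsics genuinely drift.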
module M3.4: and screening the observed quantity participating in calibration based on the information quantity. And calibrating the relative translation quantity of the camera and the inertial measurement unit by using the screened observed quantity.
For any two adjacent frames among the most recent N_update frames, the information metric contained in the observation is calculated. The N_win observations with the largest information metric are selected and used to form the observation matrices V and P, and the relative translation between the camera and the inertial measurement unit is solved by the least squares method:

P_cb* = (V^T·V)^{-1}·V^T·P
During frame screening, the information window size is N_win = 12 and the convergence judgment parameter is N_winc = 15, with thresholds thr_Pcb = 0.01 m and thr_s = 0.02. For judging whether re-calibration is needed, the window size is N_update = 50, with thresholds thr_updateR = 0.2° and thr_updateP = 0.1 m.
Those skilled in the art will appreciate that, in addition to implementing the systems, apparatus, and various modules thereof provided by the present invention in purely computer readable program code, the same procedures can be implemented entirely by logically programming method steps such that the systems, apparatus, and various modules thereof are provided in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system, the device and the modules thereof provided by the present invention can be considered as a hardware component, and the modules included in the system, the device and the modules thereof for implementing various programs can also be considered as structures in the hardware component; modules for performing various functions may also be considered to be both software programs for performing the methods and structures within hardware components.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. A camera and inertial measurement unit online initialization and calibration method is characterized by comprising the following steps:
step M1: pure vision initialization, namely acquiring the pose of an initial key frame by utilizing the observed quantity of a camera and constructing an initial map;
step M2: calibrating and initializing vision and inertia jointly, and finishing initialization of an angular velocity null shift value of an inertia measurement unit and calibration of a relative rotation angle and a translation amount of a camera and the inertia measurement unit based on the pose of an initial key frame and an initial map;
step M3: in the process of synchronous positioning and mapping fused by the camera and the inertial measurement unit, the initialization of the angular velocity null shift value of the inertial measurement unit and the calibration of the relative rotation angle and translation amount of the camera and the inertial measurement unit are subjected to abnormal detection and updating.
2. The method for initializing and calibrating a camera and an inertial measurement unit online as claimed in claim 1, wherein the step M1 comprises:
step M1.1: according to the new key frame, searching for a current frame and a historical frame whose parallax angle and number of matched point pairs meet preset conditions;
step M1.2: calculating the relative pose between the current frame and the historical frame by utilizing epipolar constraint;
step M1.3: according to the obtained relative pose between the current frame and the historical frame, obtaining the world coordinate system coordinates of map points observed by the current frame and the historical frame together through triangulation, and constructing an initial map;
step M1.4: and projecting points in the initial map onto the pixel plane of each key frame by using a perspective n-point method, and solving the pose of each key frame in a world coordinate system.
3. The method for initializing and calibrating a camera and an inertial measurement unit online as claimed in claim 1, wherein the step M2 comprises:
step M2.1: screening the observed quantities of the camera and the inertia measurement unit based on the information quantity, and screening out the observed quantities of the preset quantity which maximizes the total information quantity; initializing an angular velocity null shift value of the inertial measurement unit and calibrating a relative rotation angle between the camera and the inertial measurement unit by using the observed quantities of the camera and the inertial measurement unit obtained after screening until the relative rotation angle between the camera and the inertial measurement unit is judged to be converged;
step M2.2: the method comprises the steps that the null shift value of the acceleration of an inertial measurement unit is assumed to be zero, the observed quantities of a camera and the inertial measurement unit are screened based on information quantity, the observed quantities of a fixed quantity of maximized total information quantity are reserved, and a scale factor, a gravity vector and the relative translation quantity of the camera and the inertial measurement unit are roughly estimated;
step M2.3: screening the observed quantities of the camera and the inertial measurement unit based on the information quantity by using the direction of the roughly estimated gravity vector and introducing the gravity as a new constraint, reserving a fixed quantity of observed quantities maximizing the total information quantity, and finely estimating a scale factor, the gravity vector, the relative translation quantity of the camera and the inertial measurement unit and a null shift value of an accelerometer in the inertial measurement unit; when the scale factor and the relative translation amount of the camera and the inertial measurement unit are converged, the visual and inertial joint initialization and calibration are completed;
the information quantity is the observed quantity of the camera and the inertial measurement unit and comprises the relative rotation angle information quantity of the camera and the inertial measurement unit.
4. The method for initializing and calibrating a camera and an inertial measurement unit on line according to claim 3, wherein the step M2.1 comprises:
step M2.1.1: obtaining the relative rotation angle of the inertia measurement unit between two adjacent frames by using the observed quantity of the inertia measurement unit;
obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by integrating the angular velocity of the inertial measurement unit:

R1_{bi,bi+1} = ΔR_{i,i+1}·Exp(J_ΔR^g·b_g) (1)

wherein ΔR_{i,i+1} represents the integral value of the angular velocity of the inertial measurement unit; J_ΔR^g represents the Jacobian matrix of the angular-velocity integral with respect to the angular-velocity zero-drift value of the inertial measurement unit; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
step M2.1.2: obtaining the relative rotation angle of the inertial measurement unit between two adjacent frames by using the observations of the camera;

the poses R_wc^i and R_wc^{i+1} of two adjacent frames of the camera in the world coordinate system are obtained by visual simultaneous localization and mapping, and the relative rotation angle of the inertial measurement unit is obtained by using the relative rotation angle of the camera and the inertial measurement unit:

R2_{bi,bi+1} = R_cb^T·(R_wc^i)^T·R_wc^{i+1}·R_cb (2)

wherein R_cb represents the relative rotation angle of the camera and the inertial measurement unit;
step M2.1.3: minimizing the error between the relative rotation angle of the inertial measurement unit between two adjacent frames obtained from the observations of the inertial measurement unit and that obtained from the observations of the camera, to obtain the optimal estimate of the angular-velocity zero-drift value of the inertial measurement unit, i.e. initializing the angular-velocity zero-drift value of the inertial measurement unit;

the optimal estimate b_g* of the angular-velocity zero-drift value of the inertial measurement unit is obtained by minimizing the error of the two relative rotation angles:

b_g* = argmin_{b_g} Σ_{i=1}^{N−1} ‖Log((R2_{bi,bi+1})^T·R1_{bi,bi+1})‖² (3)

wherein N represents the number of key frames; R2 represents the relative rotation angle of the inertial measurement unit obtained from the camera observations; R1 represents the relative rotation angle of the inertial measurement unit obtained by integrating the angular velocity of the inertial measurement unit; b_i represents the inertial measurement unit at the i-th key frame and b_{i+1} that at the (i+1)-th key frame; b_g represents the angular-velocity zero-drift value of the inertial measurement unit;
step M2.1.4: for any two adjacent key frames, the quantity of information about the relative rotation angle between the camera and the inertial measurement unit contained in the observed quantity is as follows:
i_{Q_{i,i+1}} = trace((Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))^T·(Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))) (4)

wherein Q_L(q) represents the left-multiplication matrix corresponding to the quaternion q; Q_R(q) represents the right-multiplication matrix corresponding to the quaternion q; q_{bi,bi+1} represents the quaternion of the relative rotation angle of the inertial measurement unit between the i-th frame and the (i+1)-th frame; q_{ci,ci+1} represents the quaternion of the relative rotation angle of the camera between the i-th frame and the (i+1)-th frame;

setting the information window size to N_win: if the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; if the total number of key frames is greater than N_win, the information metric of the current observation is compared with the information metrics already in the sequence, and the observation with the smallest information metric is removed;
step M2.1.5: constructing, from the screened observations, an equation set taking the relative rotation angle of the camera and the inertial measurement unit as the unknown, and solving it by the least squares method to obtain the optimal estimate q_bc* of the relative rotation angle of the camera and the inertial measurement unit expressed as a quaternion, i.e. calibrating the relative rotation angle between the camera and the inertial measurement unit;

for any two adjacent key frames, the relative rotation angle of the camera and the inertial measurement unit is the unknown and satisfies the following equation set:

(Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}))·q_bc = 0 (5)

wherein Q_L represents the left-multiplication matrix of a quaternion; q_{bi,bi+1} represents the quaternion of the relative rotation angle of the inertial measurement unit between frame i and frame i+1; Q_R represents the right-multiplication matrix of a quaternion; q_{ci,ci+1} represents the quaternion of the relative rotation angle of the camera between frame i and frame i+1; q_bc represents the quaternion of the relative rotation angle of the camera and the inertial measurement unit;

denoting Q_L(q_{bi,bi+1}) − Q_R(q_{ci,ci+1}) by Q_{i,i+1}, the screened observations Q_{i,i+1} are stacked into an observation matrix Q, and the equation is established:

Q·q_bc = 0 (6)

The optimal estimate of the relative rotation angle of the camera and the inertial measurement unit is solved by the least squares method:

q_bc* = argmin_{‖q_bc‖=1} ‖Q·q_bc‖²
Step M2.1.6: whether the calibration of the visual-inertial rotation angle is finished is determined by judging whether the standard deviations σ_yaw, σ_pitch, σ_roll of the yaw, pitch and roll Euler angles corresponding to the most recent N_winc estimates of the relative rotation angle of the camera and the inertial measurement unit are all less than a threshold; when the calibration is not finished, i.e. the relative rotation angle of the camera and the inertial measurement unit has not converged, steps M2.1.1 to M2.1.5 are repeated after a new key frame arrives, until the visual-inertial rotation angle calibration is completed;
said step M2.2 comprises:

step M2.2.1: obtaining the rotation angle and translation of the camera in the world coordinate system from the camera observations by a visual simultaneous localization and mapping algorithm; obtaining the relative translation and relative linear velocity of two adjacent key frames of the inertial measurement unit from the inertial measurement unit observations by a pre-integration algorithm;

constructing, for the i-th frame, the equation in the scale factor s, the gravity vector g_w and the relative translation P_cb of the camera and the inertial measurement unit:

α_i·s + β_i·g_w + γ_i·P_cb = η_i (7)

wherein the acceleration zero-drift value b_a of the inertial measurement unit is assumed to be zero; α_i represents the coefficient of the scale factor in the equation; β_i represents the coefficient of the initialization value of the gravity vector in the equation; γ_i represents the coefficient of the visual-inertial translation in the equation; η_i represents the constant term of the equation; s represents the scale factor; g_w represents the initialization value of the gravity vector; P_cb represents the visual-inertial translation; R_wc represents the rotation angle of the camera in the world coordinate system; P_wc represents the translation of the camera in the world coordinate system; P_wc^{i+1} represents the translation of the (i+1)-th frame camera in the world coordinate system; P_wc^i represents the translation of the i-th frame camera in the world coordinate system; dt_{i+1,i+2} represents the time difference between the (i+1)-th frame and the (i+2)-th frame; P_wc^{i+2} represents the translation of the (i+2)-th frame camera in the world coordinate system; dt_{i,i+1} represents the time difference between the i-th frame and the (i+1)-th frame; I_{3×3} represents the 3×3 identity matrix; R_wc^{i+2} represents the rotation angle of the (i+2)-th frame camera in the world coordinate system; R_wc^{i+1} represents the rotation angle of the (i+1)-th frame camera in the world coordinate system; R_wc^i represents the rotation angle of the i-th frame camera in the world coordinate system; R_cb represents the relative rotation angle between the camera and the inertial measurement unit; ΔP_{i,i+1} represents the relative translation of the inertial measurement unit between frame i and frame i+1; ΔP_{i+1,i+2} represents the relative translation of the inertial measurement unit between frame i+1 and frame i+2; Δv_{i,i+1} represents the relative linear velocity of the inertial measurement unit between frame i and frame i+1;
step M2.2.2: suppose A_i = [α_i β_i γ_i]; the information metric of the unknowns contained in the observation formed by 3 adjacent key frames is i_{A_i} = trace(A_i^T·A_i);

setting the information window size to N_win: when the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than or equal to N_win, the information metric of the current observation is compared with the information metrics already in the sequence, and the observation with the smallest information metric is removed;

step M2.2.3: suppose x = [s, g_w, P_cb]^T and B_i = η_i; for every 3 adjacent frames there is an equation A_i·x = B_i. An information matrix A and a constant matrix B are constructed from the screened observations, giving the equation:

A·x = B (9)

The optimal estimate s* of the scale factor, the optimal estimate g_w* of the gravity vector and the optimal estimate P_cb* of the relative translation between the camera and the inertial measurement unit are solved by the least squares method:

x* = [s*, g_w*, P_cb*]^T = (A^T·A)^{-1}·A^T·B
Said step M2.3 comprises: introducing a gravity vector with a preset value as a constraint to obtain a scale factor s and a gravity vector gwInitial value of (1) and visual inertial translation amount PcbA calibration value of (1);
according to the obtained gravity vector
Figure FDA0002418099630000054
Obtaining the rotation angle of gravity:
Figure FDA0002418099630000055
wherein R isWIRepresenting the rotation angle of gravity in the world coordinate system,
Figure FDA0002418099630000056
representation and gravity vector
Figure FDA0002418099630000057
And a fixed vector gIVertical unit vector, theta is expressed in relation to gravity vector
Figure FDA0002418099630000058
And a fixed vector gIAngle between them, gIRepresents a fixed vector; the gravity vector is expressed by the rotation angle and the gravity magnitude and is subjected to first-order Taylor expansion:
Figure FDA0002418099630000059
wherein θ represents an updated value of the gravity vector rotation angle; g represents the magnitude of gravity, θxThe component of the updated value representing the rotation angle of the gravity vector in the x-axis of the world coordinate system, θyA component of the updated value representing the rotation angle of the gravity vector in the y-axis of the world coordinate system;
obtaining the rotation angle R_wc and the translation P_wc of the camera in the world coordinate system from the camera observations by a visual simultaneous localization and mapping algorithm; obtaining the relative translation ΔP_{i,i+1} and the relative linear velocity Δv_{i,i+1} of two adjacent key frames of the inertial measurement unit from the inertial measurement unit observations by a pre-integration algorithm; constructing, for the i-th frame, the equation in the scale factor s, the gravity vector g_w (through the update value θ of its rotation angle), the acceleration zero-drift value b_a of the inertial measurement unit and the relative translation P_cb of the camera and the inertial measurement unit:

λ_i·s + μ_i·θ + ω_i·b_a + υ_i·P_cb = ρ_i

wherein λ_i represents the coefficient of the scale factor in the equation; μ_i represents the coefficient of the update value of the gravity-vector rotation angle in the equation; ω_i represents the coefficient of the acceleration zero-drift value of the inertial measurement unit in the equation; υ_i represents the coefficient of the relative translation between the camera and the inertial measurement unit in the equation; ρ_i represents the constant term of the equation; J^{b_a}_{Δv_{i,i+1}} represents the Jacobian matrix of the relative linear velocity of the inertial measurement unit between frame i and frame i+1 with respect to the acceleration zero-drift value of the inertial measurement unit; dt_{12} represents the time difference between the i-th frame and the (i+1)-th frame; dt_{23} represents the time difference between the (i+1)-th frame and the (i+2)-th frame; J^{b_a}_{ΔP_{i+1,i+2}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i+1 and frame i+2 with respect to the acceleration zero-drift value; J^{b_a}_{ΔP_{i,i+1}} represents the Jacobian matrix of the relative translation of the inertial measurement unit between frame i and frame i+1 with respect to the acceleration zero-drift value; the subscript [:,1:2] denotes taking only the first two columns of a matrix;
suppose C_i = [λ_i μ_i ω_i υ_i]; the information metric of the unknowns contained in the observation formed by 3 adjacent key frames is i_{C_i} = trace(C_i^T·C_i);

setting the information window size to N_win: if the total number of key frames is less than N_win, the latest key frame is added to the sequence for initialization and calibration; when the total number of key frames is greater than N_win, the information metric of the current observation is compared with the information metrics already in the sequence, and the observation with the smallest information metric is removed;

an equation set is constructed and solved by the least squares method: suppose y = [s, θ, b_a, P_cb]^T and D_i = ρ_i; for every 3 adjacent frames there is an equation C_i·y = D_i. An observation matrix C and a constant matrix D are constructed from the screened observations, giving the equation:

C·y = D (14)

The optimal estimate s* of the scale factor, the optimal estimate θ* of the update value of the gravity-vector rotation angle, the optimal estimate b_a* of the acceleration zero-drift value of the inertial measurement unit and the optimal estimate P_cb* of the relative translation between the camera and the inertial measurement unit are solved by the least squares method:

y* = [s*, θ*, b_a*, P_cb*]^T = (C^T·C)^{-1}·C^T·D
Whether the calibration of the visual-inertial translation is finished is determined by judging whether the standard deviations σ_x, σ_y, σ_z of each dimension of the most recent N_winc estimates of the visual-inertial translation are all less than the threshold thr_Pcb; whether the visual-inertial initialization is finished is determined by judging whether the standard deviation σ_s of the most recent N_winc scale factors is less than the threshold thr_s; and when the calibration of the visual-inertial translation or the visual-inertial initialization is not finished, steps M2.2 to M2.3 are repeated after a new key frame arrives, until the joint visual-inertial initialization and calibration are completed.
5. The online initialization and calibration method for a camera and an inertial measurement unit according to claim 1, wherein step M3 comprises: during simultaneous localization and mapping with the fused camera and inertial measurement unit, judging at a preset period whether the relative pose of the camera and the inertial measurement unit has changed; when it has changed, re-initializing the angular-velocity null shift value of the inertial measurement unit and re-calibrating the relative rotation angle and translation between the camera and the inertial measurement unit from any two adjacent frames among the latest frames;
step M3.1: judging whether the calibration value of the relative rotation angle between the camera and the inertial measurement unit needs to be updated according to whether the consistency error between the observations of the camera and the inertial measurement unit and the initial calibration value of the relative rotation angle exceeds a threshold.
6. An online initialization and calibration system for a camera and an inertial measurement unit is characterized by comprising:
module M1: pure vision initialization, namely acquiring the pose of an initial key frame by utilizing the observed quantity of a camera and constructing an initial map;
module M2: calibrating and initializing vision and inertia jointly, and finishing initialization of an angular velocity null shift value of an inertia measurement unit and calibration of a relative rotation angle and a translation amount of a camera and the inertia measurement unit based on the pose of an initial key frame and an initial map;
module M3: in the process of synchronous positioning and mapping fused by the camera and the inertial measurement unit, the initialization of the angular velocity null shift value of the inertial measurement unit and the calibration of the relative rotation angle and translation amount of the camera and the inertial measurement unit are subjected to abnormal detection and updating.
7. The system for online initialization and calibration of a camera and an inertial measurement unit according to claim 6, wherein the module M1 comprises:
module M1.1: according to the new key frame, searching for a current frame and a historical frame whose parallax angle and number of matched point pairs meet preset conditions;
module M1.2: calculating the relative pose between the current frame and the historical frame by utilizing epipolar constraint;
module M1.3: according to the obtained relative pose between the current frame and the historical frame, obtaining the world coordinate system coordinates of map points observed by the current frame and the historical frame together through triangulation, and constructing an initial map;
module M1.4: and projecting points in the initial map onto the pixel plane of each key frame by using a perspective n-point method, and solving the pose of each key frame in a world coordinate system.
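The triangulation step of module M1.3 can be sketched with the standard linear DLT formulation (this formulation, the intrinsics K, and the poses below are illustrative assumptions, not taken verbatim from the patent):

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Recover a 3D point from two pixel observations via the linear DLT:
    each view contributes two rows of the homogeneous system A X = 0."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

# Hypothetical setup: first camera at the origin, second translated along x.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

Xw = np.array([0.3, -0.2, 4.0])               # map point in world coordinates
x1 = P1 @ np.append(Xw, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(Xw, 1.0); x2 = x2[:2] / x2[2]

X_rec = triangulate_dlt(P1, P2, x1, x2)
```

The recovered point feeds the initial map; in a real pipeline the relative pose of module M1.2 would come from essential-matrix decomposition rather than being assumed.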
8. The system for online initialization and calibration of a camera and an inertial measurement unit according to claim 6, wherein the module M2 comprises:
module M2.1: screening the observed quantities of the camera and the inertia measurement unit based on the information quantity, and screening out the observed quantities of the preset quantity which maximizes the total information quantity; initializing an angular velocity null shift value of the inertial measurement unit and calibrating a relative rotation angle between the camera and the inertial measurement unit by using the observed quantities of the camera and the inertial measurement unit obtained after screening until the relative rotation angle between the camera and the inertial measurement unit is judged to be converged;
module M2.2: the method comprises the steps that the null shift value of the acceleration of an inertial measurement unit is assumed to be zero, the observed quantities of a camera and the inertial measurement unit are screened based on information quantity, the observed quantities of a fixed quantity of maximized total information quantity are reserved, and a scale factor, a gravity vector and the relative translation quantity of the camera and the inertial measurement unit are roughly estimated;
module M2.3: screening the observed quantities of the camera and the inertial measurement unit based on the information quantity by using the direction of the roughly estimated gravity vector and introducing the gravity as a new constraint, reserving a fixed quantity of observed quantities maximizing the total information quantity, and finely estimating a scale factor, the gravity vector, the relative translation quantity of the camera and the inertial measurement unit and a null shift value of an accelerometer in the inertial measurement unit; when the scale factor and the relative translation amount of the camera and the inertial measurement unit are converged, the visual and inertial joint initialization and calibration are completed;
the information quantity refers to the amount of information about the relative rotation angle between the camera and the inertial measurement unit that is contained in the observations of the camera and the inertial measurement unit.
9. The system for on-line initialization and calibration of a camera and an inertial measurement unit according to claim 8, characterized in that said module M2.1 comprises:
module M2.1.1: obtaining the relative rotation of the inertial measurement unit between two adjacent frames from the observations of the inertial measurement unit;
the relative rotation between two adjacent frames is obtained by integrating the angular velocity of the inertial measurement unit:
R1_{b_i,b_{i+1}} = ΔR_{i,i+1}·Exp(J^g_{ΔR}·b_g)
wherein ΔR_{i,i+1} denotes the integral of the angular velocity of the inertial measurement unit; J^g_{ΔR} denotes the Jacobian of the integrated rotation with respect to the angular-velocity null shift value; b_g denotes the angular-velocity null shift value of the inertial measurement unit;
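The integral ΔR_{i,i+1} above can be sketched as a product of SO(3) exponentials over gyro samples; the Rodrigues-formula exponential map and the constant-rate sample data are illustrative assumptions:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix so that skew(a) @ b == cross(a, b)."""
    return np.array([[0.0, -w[2], w[1]], [w[2], 0.0, -w[0]], [-w[1], w[0], 0.0]])

def so3_exp(w):
    """Rodrigues formula: exponential map from a rotation vector to SO(3)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3) + skew(w)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * K @ K

def integrate_gyro(omegas, dt, b_g):
    """DeltaR = prod_k Exp((omega_k - b_g) * dt), the gyro preintegral."""
    dR = np.eye(3)
    for w in omegas:
        dR = dR @ so3_exp((w - b_g) * dt)
    return dR

# Constant rotation about z at 0.5 rad/s for 1 s with zero bias
# integrates to a 0.5 rad rotation about z.
omegas = [np.array([0.0, 0.0, 0.5])] * 100
dR = integrate_gyro(omegas, 0.01, np.zeros(3))
```

A first-order bias correction then multiplies `dR` by `so3_exp(J_g @ b_g)`, matching the formula above, so the preintegral need not be recomputed when the bias estimate changes.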
module M2.1.2: obtaining the relative rotation of the inertial measurement unit between two adjacent frames from the camera observations;
visual simultaneous localization and mapping gives the poses R_{wc_i} and R_{wc_{i+1}} of the cameras of two adjacent frames in the world coordinate system; the relative rotation of the inertial measurement unit is then obtained using the relative rotation between the camera and the inertial measurement unit:
R2_{b_i,b_{i+1}} = R_cb^T·R_{wc_i}^T·R_{wc_{i+1}}·R_cb
wherein R_cb denotes the relative rotation angle between the camera and the inertial measurement unit;
module M2.1.3: minimizing the error between the relative rotation of the inertial measurement unit between two adjacent frames obtained from its own observations and that obtained from the camera observations, so as to obtain the optimal estimate of the angular-velocity null shift value, i.e. the initialization of the angular-velocity null shift value of the inertial measurement unit;
the optimal estimate of the angular-velocity null shift value is obtained by minimizing the error between the two relative rotations:
b_g* = argmin_{b_g} Σ_{i=1}^{N−1} ‖Log((R2_{b_i,b_{i+1}})^T·R1_{b_i,b_{i+1}}(b_g))‖²
wherein N denotes the number of key frames; R2 denotes the relative rotation of the inertial measurement unit obtained from the camera observations; R1 denotes the relative rotation of the inertial measurement unit obtained by integrating its angular velocity; b_i and b_{i+1} denote the inertial measurement unit at the i-th and (i+1)-th key frames; b_g denotes the angular-velocity null shift value of the inertial measurement unit;
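Under a small-angle assumption the minimization over b_g becomes a linear least-squares problem, so one Gauss-Newton step suffices; the sketch below uses synthetic Jacobians and rotations and is an assumed simplification, not the patent's exact solver:

```python
import numpy as np

def skew(w):
    return np.array([[0.0, -w[2], w[1]], [w[2], 0.0, -w[0]], [-w[1], w[0], 0.0]])

def so3_exp(w):
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3) + skew(w)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * K @ K

def so3_log(R):
    """Logarithm map: rotation matrix -> rotation vector (angle < pi)."""
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    th = np.arccos(c)
    if th < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return th * w / (2.0 * np.sin(th))

rng = np.random.default_rng(0)
bg_true = np.array([0.02, -0.01, 0.03])      # synthetic gyro bias

Js, r0s = [], []
for _ in range(8):
    J = rng.normal(size=(3, 3)) * 0.5         # Jacobian of DeltaR w.r.t. b_g
    R2 = so3_exp(rng.normal(size=3))          # camera-derived relative rotation
    dR = R2 @ so3_exp(J @ bg_true).T          # gyro integral consistent with bg_true
    r0 = so3_log(R2.T @ dR)                   # residual evaluated at b_g = 0
    Js.append(J)
    r0s.append(r0)

# Linearized residual r(b_g) ~ r0 + J b_g; set to zero and solve in
# the least-squares sense: J b_g = -r0.
bg_hat, *_ = np.linalg.lstsq(np.vstack(Js), -np.concatenate(r0s), rcond=None)
```

Because the synthetic rotations are generated so the residual is exactly linear in b_g, the single solve recovers the bias to machine precision.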
module M2.1.4: for any two adjacent key frames, the amount of information about the relative rotation angle between the camera and the inertial measurement unit contained in the observation is a function of Q_L(q_{b_i,b_{i+1}}) and Q_R(q_{c_i,c_{i+1}}) (formula image omitted in the source),
wherein Q_L(q) denotes the left-multiplication matrix of the quaternion q; Q_R(q) denotes the right-multiplication matrix of the quaternion q; q_{b_i,b_{i+1}} denotes the quaternion of the relative rotation of the inertial measurement unit between the i-th and (i+1)-th frames; q_{c_i,c_{i+1}} denotes the quaternion of the relative rotation of the camera between the i-th and (i+1)-th frames;
the information window size is set to N_win; if the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; if the total number of key frames is greater than N_win, the current information measure is compared with those already in the sequence, and the observation with the smallest information measure is removed;
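The fixed-size information window described above can be sketched as follows; the information-measure values are abstract placeholders, since the patent's formula image is not reproduced here:

```python
def update_window(window, new_obs, new_info, n_win):
    """Keep at most n_win observations, dropping the one with the smallest
    information measure when the window overflows."""
    window = window + [(new_info, new_obs)]
    if len(window) > n_win:
        window.remove(min(window, key=lambda t: t[0]))
    return window

# With n_win = 3, the fourth arrival evicts the least-informative entry "b".
w = []
for info, obs in [(3.0, "a"), (1.0, "b"), (2.5, "c"), (4.0, "d")]:
    w = update_window(w, obs, info, n_win=3)
```

The window thus always holds the n_win observations with the largest information measures seen so far, which is what "maximizing the total information quantity" in modules M2.1 to M2.3 amounts to.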
module M2.1.5: constructing, from the screened observations, an equation system with the relative rotation angle between the camera and the inertial measurement unit as the unknown, and solving it by least squares to obtain the optimal estimate q_bc* of the relative rotation expressed as a quaternion, i.e. the calibration of the relative rotation angle between the camera and the inertial measurement unit;
for any two adjacent key frames, the unknown relative rotation between the camera and the inertial measurement unit satisfies the equation system:
(Q_L(q_{b_i,b_{i+1}}) − Q_R(q_{c_i,c_{i+1}}))·q_bc = 0 (5)
wherein Q_L denotes the left-multiplication matrix of a quaternion; q_{b_i,b_{i+1}} denotes the quaternion representation of the relative rotation of the inertial measurement unit between the i-th and (i+1)-th frames; Q_R denotes the right-multiplication matrix of a quaternion; q_{c_i,c_{i+1}} denotes the quaternion representation of the relative rotation of the camera between the i-th and (i+1)-th frames; q_bc denotes the quaternion representation of the relative rotation between the camera and the inertial measurement unit;
writing (Q_L(q_{b_i,b_{i+1}}) − Q_R(q_{c_i,c_{i+1}})) as Q_{i,i+1}, the screened Q_{i,i+1} are stacked into an observation matrix Q, giving the equation:
Q·q_bc = 0 (6)
the optimal estimate of the relative rotation between the camera and the inertial measurement unit is obtained by least squares under the unit-norm constraint:
q_bc* = argmin_{‖q_bc‖=1} ‖Q·q_bc‖²
i.e. the right singular vector of Q associated with its smallest singular value;
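A sketch of equations (5)-(6): build the stacked matrix Q from left/right quaternion multiplication matrices and take the right singular vector of the smallest singular value. The [w, x, y, z] quaternion convention and the synthetic rotations are assumptions:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product p (x) q, convention [w, x, y, z]."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def QL(q):  # left multiplication: QL(p) @ r == p (x) r
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x, w, -z, y], [y, z, w, -x], [z, -y, x, w]])

def QR(q):  # right multiplication: QR(p) @ r == r (x) p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z], [x, w, z, -y], [y, -z, w, x], [z, y, -x, w]])

rng = np.random.default_rng(1)
q_bc = rng.normal(size=4); q_bc /= np.linalg.norm(q_bc)   # true extrinsic rotation
conj = lambda q: q * np.array([1.0, -1.0, -1.0, -1.0])    # inverse of a unit quaternion

rows = []
for _ in range(6):
    q_c = rng.normal(size=4); q_c /= np.linalg.norm(q_c)  # camera relative rotation
    q_b = qmul(qmul(q_bc, q_c), conj(q_bc))               # consistent IMU relative rotation
    rows.append(QL(q_b) - QR(q_c))                        # one block of equation (5)

Q = np.vstack(rows)
_, _, Vt = np.linalg.svd(Q)
q_est = Vt[-1]   # unit-norm minimizer of ||Q q||, equation (6)
```

The estimate is defined only up to sign, which is why the comparison below checks both ±q_bc.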
Module M2.1.6: by judging the nearest NwincStandard deviation sigma of yaw angle, pitch angle and roll angle of Euler angle corresponding to relative rotation angle of individual camera and inertial measurement unityaw、σpitch、σrollWhether or not less than a threshold value
Figure FDA0002418099630000104
Determining whether the calibration of the visual inertia rotation angle is finished; when the calibration is not completed, that is, the relative rotation angle between the camera and the inertial measurement unit is not converged, wait for a new key frame to repeatedly trigger the execution of the modules M2.1.1 to M2.1.5 until the calibration of the visual inertial rotation angle is completed;
said module M2.2 comprises:
module M2.2.1: obtaining the rotation angle and translation of the camera in the world coordinate system from the camera observations by means of a visual simultaneous localization and mapping algorithm; obtaining the relative translation and relative linear velocity of the inertial measurement unit between two adjacent key frames from the observations of the inertial measurement unit by means of a pre-integration algorithm;
for the i-th frame, an equation in the scale factor s, the gravity vector g_w and the relative translation P_cb between the camera and the inertial measurement unit is constructed:
α_i·s + β_i·g_w + γ_i·P_cb = η_i (7)
wherein, assuming the acceleration null shift value b_a of the inertial measurement unit is zero,
α_i = (P_wc^{i+2} − P_wc^{i+1})·dt_{i,i+1} − (P_wc^{i+1} − P_wc^{i})·dt_{i+1,i+2}
β_i = −(1/2)·I_{3×3}·dt_{i,i+1}·dt_{i+1,i+2}·(dt_{i,i+1} + dt_{i+1,i+2})
γ_i = (R_wc^{i+2} − R_wc^{i+1})·dt_{i,i+1} − (R_wc^{i+1} − R_wc^{i})·dt_{i+1,i+2}
η_i = R_wc^{i}·R_cb·(Δv_{i,i+1}·dt_{i,i+1} − ΔP_{i,i+1})·dt_{i+1,i+2} + R_wc^{i+1}·R_cb·ΔP_{i+1,i+2}·dt_{i,i+1} (8)
wherein α_i denotes the coefficient of the scale factor in the equation; β_i denotes the coefficient of the initial value of the gravity vector; γ_i denotes the coefficient of the visual-inertial translation; η_i denotes the constant term; s denotes the scale factor; g_w denotes the initial value of the gravity vector; P_cb denotes the visual-inertial translation; R_wc^{i} and P_wc^{i} denote the rotation angle and translation of the i-th frame camera in the world coordinate system (and likewise for frames i+1 and i+2); dt_{i,i+1} denotes the time difference between the i-th and (i+1)-th frames; dt_{i+1,i+2} denotes the time difference between the (i+1)-th and (i+2)-th frames; I_{3×3} denotes the identity matrix; R_cb denotes the relative rotation angle between the camera and the inertial measurement unit; ΔP_{i,i+1} and ΔP_{i+1,i+2} denote the relative translations of the inertial measurement unit between frames i, i+1 and frames i+1, i+2; Δv_{i,i+1} denotes the relative linear velocity of the inertial measurement unit between frames i and i+1;
module M2.2.2: suppose Ai=[αiβiγi]The observation quantity formed by adjacent 3 key frames contains the information measurement quantity of unknown quantity
Figure FDA00024180996300001112
Setting the information content window size to NwinWhen the total number of key frames is less than NwinAdding the latest key frame into the sequence for initialization and calibration; when the total number of the key frames is more than or equal to NwinComparing the current information quantity measurement with the information quantity measurement in the existing sequence, and removing the observed quantity with the minimum information quantity;
module M2.2.3: suppose that
Figure FDA0002418099630000121
Bi=ηiFor every 3 adjacent frames, there is equation Ai·x=Bi(ii) a After screeningThe information quantity of the system is used for constructing an information matrix A and a constant matrix B, and an equation is constructed:
A·x=B (9)
solving optimal estimate s of scale factor using least squares*Optimal estimation of the gravity vector gw *And optimal estimation P of relative translation amount of camera and inertial measurement unitcb *
Figure FDA0002418099630000122
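The coarse solve of equation (9) can be sketched like the refined one; the 7-dimensional layout x = [s, g_w^T, P_cb^T]^T follows the symbol list above, and the coefficient blocks are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(7)
# Ground truth: scale, gravity vector (roughly 9.8 m/s^2 downward), translation.
x_true = np.concatenate([[2.0], [0.2, -0.1, -9.79], [0.1, -0.05, 0.08]])

# 10 keyframe triples, one 3x7 block A_i each, noise-free constants B_i = eta_i.
A = rng.normal(size=(3 * 10, 7))
B = A @ x_true

x_star, *_ = np.linalg.lstsq(A, B, rcond=None)
s_star, gw_star, pcb_star = x_star[0], x_star[1:4], x_star[4:7]

# The norm of the recovered gravity should be close to 9.8 m/s^2; its
# direction seeds the gravity-rotation refinement of module M2.3.
g_norm = np.linalg.norm(gw_star)
```

In the coarse stage gravity is fully free (3 unknowns); module M2.3 then fixes its magnitude and re-parametrizes it with a 2-dimensional rotation update.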
said module M2.3 comprises: introducing gravity with a preset magnitude as a constraint to obtain initial values of the scale factor s and the gravity vector g_w and a calibration value of the visual-inertial translation P_cb;
from the obtained gravity vector g_w, the rotation angle of gravity is computed:
R_WI = Exp(v̂·θ̄),  v̂ = (ĝ_I × ĝ_W)/‖ĝ_I × ĝ_W‖,  θ̄ = atan2(‖ĝ_I × ĝ_W‖, ĝ_I·ĝ_W)
wherein R_WI denotes the rotation angle of gravity in the world coordinate system; v̂ denotes the unit vector perpendicular to the gravity direction ĝ_W = g_w/‖g_w‖ and the fixed vector ĝ_I; θ̄ denotes the angle between ĝ_W and the fixed vector ĝ_I; ĝ_I denotes a fixed vector;
the gravity vector is then expressed by the rotation angle and the gravity magnitude and expanded to first order (Taylor):
g_w ≈ g·R_WI·ĝ_I − g·R_WI·(ĝ_I)^×·θ,  θ = [θ_x, θ_y, 0]^T
wherein θ denotes the update value of the rotation angle of the gravity vector; g denotes the gravity magnitude; θ_x and θ_y denote the components of the update value in the x and y axes of the world coordinate system;
the rotation angle R_wc and translation P_wc of the camera in the world coordinate system are obtained from the camera observations by means of a visual simultaneous localization and mapping algorithm; the relative translation ΔP_{i,i+1} and relative linear velocity Δv_{i,i+1} of the inertial measurement unit between two adjacent key frames are obtained from its observations by means of a pre-integration algorithm; for the i-th frame, an equation in the scale factor s, the gravity vector (via its rotation update θ), the acceleration null shift value b_a of the inertial measurement unit, and the relative translation P_cb between the camera and the inertial measurement unit is constructed:
λ_i·s + μ_i·θ + ω_i·b_a + υ_i·P_cb = ρ_i
wherein
λ_i = (P_wc^{i+2} − P_wc^{i+1})·dt_{i,i+1} − (P_wc^{i+1} − P_wc^{i})·dt_{i+1,i+2}
μ_i = (1/2)·g·dt_{i,i+1}·dt_{i+1,i+2}·(dt_{i,i+1} + dt_{i+1,i+2})·R_WI·(ĝ_I)^×_{[:,1:2]}
ω_i = −[R_wc^{i}·R_cb·(J^a_{Δv_{i,i+1}}·dt_{i,i+1} − J^a_{ΔP_{i,i+1}})·dt_{i+1,i+2} + R_wc^{i+1}·R_cb·J^a_{ΔP_{i+1,i+2}}·dt_{i,i+1}]
υ_i = (R_wc^{i+2} − R_wc^{i+1})·dt_{i,i+1} − (R_wc^{i+1} − R_wc^{i})·dt_{i+1,i+2}
ρ_i = R_wc^{i}·R_cb·(Δv_{i,i+1}·dt_{i,i+1} − ΔP_{i,i+1})·dt_{i+1,i+2} + R_wc^{i+1}·R_cb·ΔP_{i+1,i+2}·dt_{i,i+1} + (1/2)·g·dt_{i,i+1}·dt_{i+1,i+2}·(dt_{i,i+1} + dt_{i+1,i+2})·R_WI·ĝ_I
wherein λ_i denotes the coefficient of the scale factor in the equation; μ_i denotes the coefficient of the update value of the rotation angle of the gravity vector; ω_i denotes the coefficient of the acceleration null shift value of the inertial measurement unit; υ_i denotes the coefficient of the relative translation between the camera and the inertial measurement unit; ρ_i denotes the constant term; J^a_{Δv_{i,i+1}} denotes the Jacobian of the relative linear velocity of the inertial measurement unit between frames i and i+1 with respect to the acceleration null shift value; J^a_{ΔP_{i+1,i+2}} denotes the Jacobian of the relative translation of the inertial measurement unit between frames i+1 and i+2 with respect to the acceleration null shift value; J^a_{ΔP_{i,i+1}} denotes the Jacobian of the relative translation of the inertial measurement unit between frames i and i+1 with respect to the acceleration null shift value; the subscript [:,1:2] denotes taking only the first two columns of the matrix;
assume C_i = [λ_i  μ_i  ω_i  υ_i]; the information measure of the unknowns contained in the observation formed by 3 adjacent key frames is computed from C_i (formula image omitted in the source);
the information window size is set to N_win; if the total number of key frames is less than N_win, the latest key frame is added to the sequence used for initialization and calibration; when the total number of key frames is greater than N_win, the current information measure is compared with those already in the sequence, and the observation with the smallest information measure is removed;
an equation system is then constructed and solved by least squares: assume y = [s, θ_x, θ_y, b_a^T, P_cb^T]^T and D_i = ρ_i; for every 3 adjacent frames there is an equation C_i·y = D_i; the screened observations are used to build an observation matrix C and a constant matrix D, giving the equation:
C·y = D (14)
least squares yields the optimal estimate s* of the scale factor, the optimal estimate θ* of the update value of the rotation angle of the gravity vector, the optimal estimate b_a* of the acceleration null shift value of the inertial measurement unit, and the optimal estimate P_cb* of the relative translation between the camera and the inertial measurement unit:
y* = (C^T·C)^{-1}·C^T·D
whether the calibration of the visual-inertial translation is finished is determined by judging whether the standard deviations σ_x, σ_y, σ_z of each dimension of the most recent N_winc visual-inertial translation estimates are less than their respective thresholds; whether the visual-inertial initialization is finished is determined by judging whether the standard deviation σ_s of the most recent N_winc scale factors is less than the threshold thr_s; when the calibration of the visual-inertial translation or the visual-inertial initialization is not finished, modules M2.2 to M2.3 are triggered again after a new key frame arrives, until the joint visual-inertial initialization and calibration are completed.
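The standard-deviation convergence test in the last step can be sketched as below; the window size and thresholds are illustrative values, not the patent's:

```python
import numpy as np
from collections import deque

N_winc = 5
thr_xyz = np.array([0.002, 0.002, 0.002])   # per-axis thresholds for P_cb (assumed)
thr_s = 0.001                                # threshold for the scale factor (assumed)

pcb_hist = deque(maxlen=N_winc)              # most recent N_winc translation estimates
s_hist = deque(maxlen=N_winc)                # most recent N_winc scale estimates

def converged(pcb_hist, s_hist):
    """sigma_x, sigma_y, sigma_z below thr_xyz and sigma_s below thr_s."""
    if len(pcb_hist) < N_winc:
        return False
    sigma_xyz = np.std(np.array(pcb_hist), axis=0)
    sigma_s = np.std(np.array(s_hist))
    return bool(np.all(sigma_xyz < thr_xyz) and sigma_s < thr_s)

# Feed estimates that settle geometrically toward fixed values: once the
# window only contains near-constant values, the test flips to True.
for k in range(10):
    damp = 0.5 ** k
    pcb_hist.append(np.array([0.10, -0.05, 0.08]) + damp * 1e-2)
    s_hist.append(1.7 + damp * 1e-2)
done = converged(pcb_hist, s_hist)
```

Using a bounded deque means the test automatically forgets early, noisy estimates, which is exactly the role of the N_winc window in the claim.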
10. The online initialization and calibration system for a camera and an inertial measurement unit according to claim 6, wherein the module M3 comprises: during simultaneous localization and mapping with the fused camera and inertial measurement unit, judging at a preset period whether the relative pose of the camera and the inertial measurement unit has changed; when it has changed, re-initializing the angular-velocity null shift value of the inertial measurement unit and re-calibrating the relative rotation angle and translation between the camera and the inertial measurement unit from any two adjacent frames among the latest frames;
module M3.1: judging whether the calibration value of the relative rotation angle between the camera and the inertial measurement unit needs to be updated according to whether the consistency error between the observations of the camera and the inertial measurement unit and the initial calibration value of the relative rotation angle exceeds a threshold.
CN202010197372.3A 2020-03-19 2020-03-19 Online initialization and calibration method and system for camera and inertial measurement unit Active CN111429524B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010197372.3A CN111429524B (en) 2020-03-19 2020-03-19 Online initialization and calibration method and system for camera and inertial measurement unit


Publications (2)

Publication Number Publication Date
CN111429524A true CN111429524A (en) 2020-07-17
CN111429524B CN111429524B (en) 2023-04-18

Family

ID=71548171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010197372.3A Active CN111429524B (en) 2020-03-19 2020-03-19 Online initialization and calibration method and system for camera and inertial measurement unit

Country Status (1)

Country Link
CN (1) CN111429524B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108827315A (en) * 2018-08-17 2018-11-16 华南理工大学 Vision inertia odometer position and orientation estimation method and device based on manifold pre-integration
CN110044354A (en) * 2019-03-28 2019-07-23 东南大学 A kind of binocular vision indoor positioning and build drawing method and device
WO2019157925A1 (en) * 2018-02-13 2019-08-22 视辰信息科技(上海)有限公司 Visual-inertial odometry implementation method and system
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN110375738A (en) * 2019-06-21 2019-10-25 西安电子科技大学 A kind of monocular merging Inertial Measurement Unit is synchronous to be positioned and builds figure pose calculation method
CN110726406A (en) * 2019-06-03 2020-01-24 北京建筑大学 Improved nonlinear optimization monocular inertial navigation SLAM method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Yuxin et al.: "Hybrid hand-eye vision/shape control of soft robots", Robot (机器人) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114199275A (en) * 2020-09-18 2022-03-18 阿里巴巴集团控股有限公司 Parameter determination method and device for sensor
CN112229424A (en) * 2020-11-16 2021-01-15 浙江商汤科技开发有限公司 Parameter calibration method and device for visual inertial system, electronic equipment and medium
CN112229424B (en) * 2020-11-16 2022-04-22 浙江商汤科技开发有限公司 Parameter calibration method and device for visual inertial system, electronic equipment and medium
CN112629565A (en) * 2021-03-08 2021-04-09 中国人民解放军国防科技大学 Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit
CN112629565B (en) * 2021-03-08 2021-06-01 中国人民解放军国防科技大学 Method, device and equipment for calibrating rotation relation between camera and inertial measurement unit
CN113313763A (en) * 2021-05-26 2021-08-27 珠海深圳清华大学研究院创新中心 Monocular camera pose optimization method and device based on neural network
CN114018291A (en) * 2021-11-08 2022-02-08 中国科学院空天信息创新研究院 Calibration method and device for parameters of inertial measurement unit
CN115451958A (en) * 2022-11-10 2022-12-09 中国人民解放军国防科技大学 Camera absolute attitude optimization method based on relative rotation angle
CN115451958B (en) * 2022-11-10 2023-02-03 中国人民解放军国防科技大学 Camera absolute attitude optimization method based on relative rotation angle

Also Published As

Publication number Publication date
CN111429524B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN111429524B (en) Online initialization and calibration method and system for camera and inertial measurement unit
Chilian et al. Multisensor data fusion for robust pose estimation of a six-legged walking robot
CN110081881B (en) Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN110986968B (en) Method and device for real-time global optimization and error loop judgment in three-dimensional reconstruction
CN112113582A (en) Time synchronization processing method, electronic device, and storage medium
CN112183171A (en) Method and device for establishing beacon map based on visual beacon
Yang et al. Online imu intrinsic calibration: Is it necessary?
CN112184824A (en) Camera external parameter calibration method and device
CN113223161B (en) Robust panoramic SLAM system and method based on IMU and wheel speed meter tight coupling
CN111707261A (en) High-speed sensing and positioning method for micro unmanned aerial vehicle
CN111754579A (en) Method and device for determining external parameters of multi-view camera
CN112347205A (en) Method and device for updating error state of vehicle
CN112388635B (en) Method, system and device for fusing sensing and space positioning of multiple sensors of robot
CN115540860A (en) Multi-sensor fusion pose estimation algorithm
CN110986928A (en) Real-time drift correction method for triaxial gyroscope of photoelectric pod
CN114310901A (en) Coordinate system calibration method, apparatus, system and medium for robot
CN114217665A (en) Camera and laser radar time synchronization method, device and storage medium
JP7145770B2 (en) Inter-Vehicle Distance Measuring Device, Error Model Generating Device, Learning Model Generating Device, Methods and Programs Therefor
Liu et al. High altitude monocular visual-inertial state estimation: Initialization and sensor fusion
CN108827287B (en) Robust visual SLAM system in complex environment
JP2730457B2 (en) Three-dimensional position and posture recognition method based on vision and three-dimensional position and posture recognition device based on vision
CN113436267A (en) Visual inertial navigation calibration method and device, computer equipment and storage medium
CN114812601A (en) State estimation method and device of visual inertial odometer and electronic equipment
CN115797490B (en) Graph construction method and system based on laser vision fusion
CN112712107B (en) Optimization-based vision and laser SLAM fusion positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant