CN113763479A - Calibration method for catadioptric panoramic camera and IMU sensor


Info

Publication number
CN113763479A
Authority
CN
China
Prior art keywords
imu
camera
catadioptric
coordinate system
calibration
Prior art date
Legal status
Granted
Application number
CN202110811710.2A
Other languages
Chinese (zh)
Other versions
CN113763479B (en)
Inventor
张裕
李梦迪
张越
陈蔓菲
曹猛
彭新力
张恺霖
陈天楷
徐熙平
王世峰
Current Assignee
Changchun University of Science and Technology
Original Assignee
Changchun University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Changchun University of Science and Technology
Priority to CN202110811710.2A
Publication of CN113763479A
Application granted
Publication of CN113763479B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A calibration method for a catadioptric panoramic camera and an IMU sensor belongs to the technical field of camera calibration and aims to solve the problems in the prior art. The calibration method comprises the following steps: computing the extrinsic parameters with the EPnP algorithm from the intrinsic parameters of the catadioptric camera, using N three-dimensional points to establish constraints from which the coordinates of the control points in the camera coordinate system are computed, so as to obtain the extrinsic parameters of the catadioptric panoramic camera; acquiring the measurement data of the IMU, pre-integrating the measurement data, deducing the state information of the next keyframe from the measurement model and the estimates of the pre-integration with respect to the bias, and solving the extrinsic data of the IMU by singular value decomposition; and finding the relation between the coordinate systems of the catadioptric panoramic camera and the IMU from their measurement models, constructing the reprojection error and the pose estimate between the camera and the IMU, optimizing the extrinsic parameters between the catadioptric camera and the IMU, and obtaining the final extrinsic calibration, thereby achieving accurate extrinsic calibration of the catadioptric panoramic camera and the IMU.

Description

Calibration method for catadioptric panoramic camera and IMU sensor
Technical Field
The invention belongs to the technical field of camera calibration, and particularly relates to a calibration method of a catadioptric panoramic camera and an IMU sensor.
Background
Camera calibration is one of the basic problems in computer vision and a long-standing hot topic in vision measurement research. Because of differences in the internal structure and imaging mechanism of cameras, and because camera motion and sensor measurements deviate from the true values for various reasons (such as electromagnetic interference and inaccurate manual installation), the acquired images show distortion of different degrees. The sensor and the camera must therefore be modeled and calibrated, and their measurement and distortion characteristics mastered, in order to obtain accurate camera parameters and establish a camera model, thereby obtaining good image information and improving the precision of the camera. Camera calibration basically falls into two types: the first is camera self-calibration; the second is calibration that relies on a calibration reference object. In the former, the camera photographs surrounding objects, and intersection points on the camera image plane are found using constraint information from the camera motion to calibrate the camera parameters; however, this approach is based on curved surfaces and quadratic curves, has large errors and poor robustness, and is not suitable for high-precision applications. In the latter, a reference object is imaged by the camera, and the internal and external parameters of the camera are calculated through the mutual conversion of the image coordinate system, the camera coordinate system and the world coordinate system together with subsequent spatial arithmetic operations; this method has high calibration precision and is suitable for applications with high accuracy requirements. The calibration of a traditional camera is a linear problem, while the calibration of a catadioptric panoramic camera is a nonlinear problem; most existing methods are aimed at traditional cameras, and the extrinsic calibration of a catadioptric panoramic camera with an Inertial Measurement Unit (IMU) remains a challenging problem.
Compared with a traditional camera system, the catadioptric panoramic imaging system expands the field of view by adding a reflecting element on top of a conventional camera, and its maximum spatial field of view can reach a hemisphere. The catadioptric panoramic imaging system uses a convex mirror with a rotationally symmetric quadric surface as the reflector, which collects, compresses and reflects the light rays in the field of view into the camera system below, finally achieving omnidirectional panoramic imaging. The IMU can acquire information such as its own acceleration and angular velocity, and is a sensor for detecting and measuring acceleration and rotational motion. The IMU can provide relative positioning information by measuring the path of movement of the object relative to a starting point, and, combined with a catadioptric panoramic camera, relatively accurate positioning can be achieved. Therefore, an extrinsic calibration method for the catadioptric panoramic camera and the IMU sensor has a significant effect on the achievable precision.
Chinese patent publication CN105678783A, entitled "Catadioptric panoramic camera and laser radar data fusion calibration method", mainly performs effective calibration of the internal parameters of the panoramic camera, but lacks extrinsic calibration and data optimization.
Chinese patent publication CN111207774A, entitled "Method and system for laser-IMU external parameter calibration", mainly calibrates the laser-IMU extrinsic parameters by tightly coupling the laser and the IMU, but the field of view of that system is limited.
Disclosure of Invention
The invention aims to solve the problems that the catadioptric panoramic camera and laser radar combination lacks extrinsic calibration and data optimization, leading to measurement errors, and that the field of view of the laser is limited, and provides an extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor.
The technical scheme adopted by the invention for solving the technical problems is as follows:
an extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor comprises the following steps:
step 1, obtaining the extrinsic parameters of the catadioptric panoramic camera: calculating the extrinsic parameters with the EPnP algorithm from the intrinsic parameters of the catadioptric camera, establishing constraints with N three-dimensional points to compute the coordinates of the control points in the camera coordinate system, thereby reducing the influence of the real feature points on the camera pose estimation error, and obtaining the extrinsic parameters of the catadioptric panoramic camera;
step 2, obtaining the extrinsic parameters of the IMU sensor: acquiring the measurement data of the IMU and pre-integrating it, deducing the state information of the next keyframe from the measurement model and the estimates of the pre-integration with respect to the bias, and solving the extrinsic data of the IMU by singular value decomposition;
step 3, catadioptric camera-IMU extrinsic calibration: finding the relation between the catadioptric panoramic camera coordinate system and the IMU coordinate system from the measurement models of the catadioptric panoramic camera and the IMU sensor, constructing the reprojection error and the pose estimate between the camera and the IMU, optimizing the extrinsic parameters between the catadioptric camera and the IMU, and obtaining the final extrinsic calibration, so that the extrinsic parameters of the catadioptric panoramic camera and the IMU are calibrated accurately.
The invention has the following beneficial effects. First, the invention fuses the catadioptric panoramic camera with the sensor used in the positioning and mapping process, which expands the field of view while ensuring the accuracy of the raw data. Second, by using IMU pre-integration together with the pose estimates at successive instants, the bias can be corrected and the result of the extrinsic calibration optimization is improved. Meanwhile, error optimization is carried out between the catadioptric panoramic camera and the IMU, the residuals are used as constraints of the extrinsic calibration, and a more accurate extrinsic estimate is obtained through continuous error optimization. Finally, the relative pose of the camera and the IMU is needed in the positioning process, and data fusion between them, i.e. joint calibration, is required, which strongly affects the quality of subsequent processing.
Drawings
FIG. 1 is a flow chart of the extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor according to the invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
As shown in FIG. 1, an extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor comprises the following steps:
step 1, obtaining the extrinsic parameters of the catadioptric panoramic camera: calculating the extrinsic parameters with the EPnP algorithm from the intrinsic parameters of the catadioptric camera, establishing constraints with N three-dimensional points to compute the coordinates of the control points in the camera coordinate system, thereby reducing the influence of the real feature points on the camera pose estimation error, and obtaining the extrinsic parameters of the catadioptric panoramic camera;
step 2, obtaining the extrinsic parameters of the IMU sensor: acquiring the measurement data of the IMU and pre-integrating it, deducing the state information of the next keyframe from the measurement model and the estimates of the pre-integration with respect to the bias, and solving the extrinsic data of the IMU by singular value decomposition;
step 3, catadioptric camera-IMU extrinsic calibration: finding the relation between the catadioptric panoramic camera coordinate system and the IMU coordinate system from the measurement models of the catadioptric panoramic camera and the IMU sensor, constructing the reprojection error and the pose estimate between the camera and the IMU, optimizing the extrinsic parameters between the catadioptric camera and the IMU, and obtaining the final extrinsic calibration, so that the extrinsic parameters of the catadioptric panoramic camera and the IMU are calibrated accurately.
In step 1, the calibration of the catadioptric panoramic camera is obtained; the specific steps are as follows:
1-1, preparing a checkerboard calibration board of known size;
1-2, fixing the calibration board, and moving the catadioptric panoramic camera to collect images of the board at different positions and angles in the image;
1-3, extracting the checkerboard corner information in each image with a calibration algorithm, and computing the extrinsic parameters of the catadioptric camera with the EPnP algorithm from the intrinsic parameters known as prior information (a code sketch follows the EPnP steps below);
1) establishing the world coordinate system and the camera coordinate system, wherein the reference points in the two coordinate systems are denoted p_i^w and p_i^c (i = 1, ..., N) and the control points under the two coordinate systems are denoted c_j^w and c_j^c (j = 1, ..., 4), respectively;
2) expressing, with the EPnP algorithm, the coordinates of each reference point as a weighted sum of the control-point coordinates:

    p_i^w = Σ_{j=1..4} α_ij c_j^w,  with  Σ_{j=1..4} α_ij = 1        (1)

and the same relationship still holds under the camera coordinate system:

    p_i^c = Σ_{j=1..4} α_ij c_j^c        (2)

selecting the center of gravity of the three-dimensional reference points as the first control point:

    c_1^w = (1/N) Σ_{i=1..N} p_i^w        (3)

3) computing the center of gravity of the reference points in one coordinate system and the matrix A (the defining equations are given as images in the original);
4) computing the center of gravity of the reference points in the other coordinate system and the matrix B (equation (4) is given as an image in the original);
5) computing H = B^(-1) A and performing singular value decomposition on H:

    H = U Σ V^T        (5)

6) computing the rotation matrix and the translation amount from the decomposition (equation (6) is given as an image in the original).
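As an illustration of steps 1-1 to 1-3, the following sketch detects checkerboard corners and recovers the camera pose with OpenCV's EPnP solver. It is a minimal sketch under stated assumptions, not the patented implementation: the board geometry, the intrinsic matrix K, the distortion vector and the image file name are placeholder values, and a real catadioptric image would have to go through the catadioptric projection model rather than the plain pinhole model assumed here.

```python
# Minimal EPnP extrinsic sketch (assumptions: pinhole-style intrinsics K,
# placeholder board size and image path; not the catadioptric projection model).
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per row/column (assumed)
SQUARE = 0.025          # square size in metres (assumed)

# 3D checkerboard points in the world (board) coordinate system.
obj_pts = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

K = np.array([[800.0, 0, 640.0], [0, 800.0, 360.0], [0, 0, 1.0]])  # assumed intrinsics
dist = np.zeros(5)                                                 # assumed no distortion

img = cv2.imread("board.png")                       # hypothetical calibration image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, BOARD)
if found:
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # EPnP: express the N reference points through 4 control points and solve for R, t.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist, flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)                      # rotation matrix of the extrinsics
    print("R =", R, "\nt =", tvec.ravel())
```

Run over the image set of step 1-2, this yields one rotation matrix and translation vector per view, i.e. the extrinsic parameters referred to in step 1.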
In step 2, the extrinsic calibration of the IMU sensor is obtained; the specific steps are as follows:
The measurement model of the IMU sensor is:

    a_B(t) = R_WB(t)^T (a_w(t) - g_w) + b_a(t) + n_a(t)
    ω_B(t) = ω_WB(t) + b_g(t) + n_g(t)        (7)

where a_B(t) is the acceleration measured by the accelerometer in the inertial (body) coordinate system and ω_B(t) is the rotational angular velocity measured by the gyroscope; b_a(t) and b_g(t) are the accelerometer and gyroscope biases varying with time; n_a(t) and n_g(t) are the accelerometer and gyroscope white noise varying with time; g_w is the gravity vector in world coordinates; a_w(t) is the acceleration in the world coordinate system over time; ω_WB(t) is the angular velocity in the carrier coordinate system; R_WB is the transformation from the IMU coordinate system B to the world coordinate system W; and B and W are the carrier coordinate system and the world coordinate system, respectively.
Assume that the IMU sensor bias at the previous instant is known. Under the influence of a small disturbance δb, the new values are converted into the pre-integrals of the IMU measurements between instant i and instant j, ΔR_ij, Δv_ij and Δp_ij, i.e. the rotation information R, velocity information v and position information p under the IMU, with bias b_i. According to the measurement model, the estimates of the pre-integration with respect to the bias are updated with a first-order approximation (equation (8) is given as an image in the original), and the state information of the next keyframe is deduced from this formula (equation (9) is given as an image in the original).
Here ∂ΔR_ij/∂b_g is the Jacobian matrix of the rotation increment with respect to the gyroscope bias; ∂Δv_ij/∂b_g and ∂Δv_ij/∂b_a are the Jacobian matrices of the velocity increment with respect to the gyroscope and accelerometer biases; ∂Δp_ij/∂b_g and ∂Δp_ij/∂b_a are the Jacobian matrices of the displacement increment with respect to the gyroscope and accelerometer biases. The gyroscope bias is estimated first (equations (10) and (11) are given as images in the original), and the gravity vector and the accelerometer bias are then solved by singular value decomposition. Each pose comprises the rotation R, velocity v and position p. In this way the actual IMU measurement of the IMU motion trajectory from t to t + Δt over the time Δt is obtained in the IMU initial pose coordinate system.
Step 3, catadioptric camera-IMU extrinsic calibration, specifically comprises the following steps:
Starting from the measurement models of the catadioptric panoramic camera and the IMU sensor obtained in the previous two steps, the poses of the two sensors are analyzed. The catadioptric panoramic camera coordinate system and the IMU coordinate system satisfy the following relation:

    T_wb = T_wc · T_cb        (12)

(Equation (13), the expanded form of (12), is given as an image in the original.) Expanding the above formula yields the relation between the rotation and the translation of the camera coordinate system and the IMU coordinate system, i.e. the pose relation between the catadioptric panoramic camera and the IMU sensor:

    R_wb = R_wc · R_cb        (14)
    P_wb = R_wc · P_cb + s · P_wc        (15)
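As a concrete reading of relation (12), the following sketch recovers the camera-IMU transform T_cb from a single pair of synchronized poses; the 4x4 pose matrices are placeholder values, and in practice many pose pairs are combined, for example by the least-squares iteration described next.

```python
# Sketch of T_cb = inv(T_wc) @ T_wb from relation (12); the poses are placeholders.
import numpy as np

def make_T(R, p):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

# Hypothetical world poses of the camera (from EPnP) and of the IMU (from pre-integration).
T_wc = make_T(np.eye(3), np.array([0.10, 0.00, 0.30]))
T_wb = make_T(np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]), np.array([0.12, 0.05, 0.30]))

T_cb = np.linalg.inv(T_wc) @ T_wb     # transform taking IMU-frame coordinates into the camera frame
R_cb, p_cb = T_cb[:3, :3], T_cb[:3, 3]
print(R_cb, p_cb)
```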
Then, for the constructed reprojection error and the pose estimate between the camera and the IMU, an associated set S is built and the extrinsic parameters between the camera and the IMU are iterated with the least-squares method to obtain the final extrinsic calibration.
The algebraic reprojection-error term E_visual(k, j) is used to optimize the estimate of the camera state variable X_c:

    X_c = {R_0, p_0, R_1, p_1, ..., R_n, p_n, ^w X_0, ^w X_1, ..., ^w X_m}        (16)

(Equations (17) and (18), defining the reprojection error term, are given as images in the original.) Here ^w X_k is a three-dimensional space point in the world coordinate system and x_k is its coordinate in the pixel plane. At the same time, the cost function E_IMU(k, j) of the IMU sensor is constructed, and the state X_i is estimated by nonlinear optimization (equations (19), (20) and (21) are given as images in the original).
and (4) performing combined optimization according to the two sensor error equations to finally obtain the final external parameter.
Extrinsic calibration experiment of the catadioptric camera and the IMU sensor:
The Kalibr toolbox is used with a checkerboard calibration board; the specific operation is as follows:
1. With the calibration board parameters known, capture panoramic images, convert them to grayscale, and identify the calibration board in each frame;
2. Extract corner points with the Harris operator in the feature region where the selected calibration board lies, compute the intrinsic parameters, and compute the extrinsic parameters with the EPnP algorithm;
3. Treat the catadioptric camera and the IMU sensor as a rigid whole, move the camera, and record the images during the motion into a bag file;
4. Using the Kalibr toolbox, run the Kalibr calibrate command and import the camera intrinsic parameters and the IMU noise parameters (known) to obtain the rotation matrix between the camera and the IMU and the intrinsic parameters of the gyroscope;
5. After the rotation matrix is determined, continue to optimize the camera positions and the IMU pre-integrated positions to obtain the displacement part of the extrinsic parameters, which yields the extrinsic calibration results of the catadioptric camera and the IMU sensor.
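Step 5 can be read as a linear least-squares problem built directly from relation (15): with the rotation already determined in step 4, each frame's R_wc, P_wc and P_wb contribute three linear equations in the unknown displacement P_cb and scale s. The sketch below uses synthetic placeholder poses and is not the Kalibr implementation.

```python
# Sketch: solve the displacement P_cb and scale s from relation (15),
#   P_wb = R_wc · P_cb + s · P_wc,
# with the rotation part already fixed. All pose data below are placeholders.
import numpy as np

rng = np.random.default_rng(1)
true_P_cb, true_s = np.array([0.05, -0.02, 0.10]), 1.0

A_rows, b_rows = [], []
for _ in range(10):                                    # ten hypothetical frames
    R_wc = np.linalg.qr(rng.normal(size=(3, 3)))[0]    # random orthonormal rotation
    P_wc = rng.normal(size=3)
    P_wb = R_wc @ true_P_cb + true_s * P_wc            # synthesize equation (15)
    A_rows.append(np.hstack([R_wc, P_wc[:, None]]))    # 3x4 block: [R_wc | P_wc]
    b_rows.append(P_wb)

A = np.vstack(A_rows)          # (3N, 4) stacked coefficients for the unknowns [P_cb; s]
b = np.concatenate(b_rows)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
P_cb, s = x[:3], x[3]
print("P_cb =", P_cb, " s =", s)
```

With real data the same stacking would be applied to the optimized camera positions and the IMU pre-integrated positions mentioned in step 5.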

Claims (3)

1. An extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor, comprising the following steps:
step 1, obtaining the extrinsic parameters of the catadioptric panoramic camera: calculating the extrinsic parameters with the EPnP algorithm from the intrinsic parameters of the catadioptric camera, establishing constraints with N three-dimensional points to compute the coordinates of the control points in the camera coordinate system, thereby reducing the influence of the real feature points on the camera pose estimation error, and obtaining the extrinsic parameters of the catadioptric panoramic camera;
step 2, obtaining the extrinsic parameters of the IMU sensor: acquiring the measurement data of the IMU and pre-integrating it, deducing the state information of the next keyframe from the measurement model and the estimates of the pre-integration with respect to the bias, and solving the extrinsic data of the IMU by singular value decomposition;
step 3, catadioptric camera-IMU extrinsic calibration: finding the relation between the catadioptric panoramic camera coordinate system and the IMU coordinate system from the measurement models of the catadioptric panoramic camera and the IMU sensor, constructing the reprojection error and the pose estimate between the camera and the IMU, optimizing the extrinsic parameters between the catadioptric camera and the IMU, and obtaining the final extrinsic calibration, so that the extrinsic parameters of the catadioptric panoramic camera and the IMU are calibrated accurately.
2. The extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor according to claim 1, wherein in step 1 the calibration of the catadioptric panoramic camera comprises the following specific steps:
1-1, preparing a checkerboard calibration board of known size;
1-2, fixing the calibration board, and moving the catadioptric panoramic camera to collect images of the board at different positions and angles in the image;
1-3, extracting the checkerboard corner information in each image with a calibration algorithm, and computing the extrinsic parameters of the catadioptric camera with the EPnP algorithm from the intrinsic parameters known as prior information;
the EPnP algorithm comprises the following steps:
1) establishing the world coordinate system and the camera coordinate system, wherein the reference points in the two coordinate systems are denoted p_i^w and p_i^c (i = 1, ..., N) and the control points under the two coordinate systems are denoted c_j^w and c_j^c (j = 1, ..., 4), respectively;
2) expressing, with the EPnP algorithm, the coordinates of each reference point as a weighted sum of the control-point coordinates:

    p_i^w = Σ_{j=1..4} α_ij c_j^w,  with  Σ_{j=1..4} α_ij = 1        (1)

and the same relationship still holds under the camera coordinate system:

    p_i^c = Σ_{j=1..4} α_ij c_j^c        (2)

selecting the center of gravity of the three-dimensional reference points as the first control point:

    c_1^w = (1/N) Σ_{i=1..N} p_i^w        (3)

3) computing the center of gravity of the reference points in one coordinate system and the matrix A (the defining equations are given as images in the original);
4) computing the center of gravity of the reference points in the other coordinate system and the matrix B (equation (4) is given as an image in the original);
5) computing H = B^(-1) A and performing singular value decomposition on H:

    H = U Σ V^T        (5)

6) computing the rotation matrix and the translation amount from the decomposition (equation (6) is given as an image in the original).
In step 2, the extrinsic calibration of the IMU sensor is obtained; the specific steps are as follows:
the measurement model of the IMU sensor is:

    a_B(t) = R_WB(t)^T (a_w(t) - g_w) + b_a(t) + n_a(t)
    ω_B(t) = ω_WB(t) + b_g(t) + n_g(t)        (7)

where a_B(t) is the acceleration measured by the accelerometer in the inertial (body) coordinate system and ω_B(t) is the rotational angular velocity measured by the gyroscope; b_a(t) and b_g(t) are the accelerometer and gyroscope biases varying with time; n_a(t) and n_g(t) are the accelerometer and gyroscope white noise varying with time; g_w is the gravity vector in world coordinates; a_w(t) is the acceleration in the world coordinate system over time; ω_WB(t) is the angular velocity in the carrier coordinate system; R_WB is the transformation from the IMU coordinate system B to the world coordinate system W; and B and W are the carrier coordinate system and the world coordinate system, respectively;
assume that the IMU sensor bias at the previous instant is known; under the influence of a small disturbance δb, the new values are converted into the pre-integrals of the IMU measurements between instant i and instant j, ΔR_ij, Δv_ij and Δp_ij, i.e. the rotation information R, velocity information v and position information p under the IMU, with bias b_i; according to the measurement model, the estimates of the pre-integration with respect to the bias are updated with a first-order approximation (equation (8) is given as an image in the original), and the state information of the next keyframe is deduced from this formula (equation (9) is given as an image in the original);
here ∂ΔR_ij/∂b_g is the Jacobian matrix of the rotation increment with respect to the gyroscope bias; ∂Δv_ij/∂b_g and ∂Δv_ij/∂b_a are the Jacobian matrices of the velocity increment with respect to the gyroscope and accelerometer biases; ∂Δp_ij/∂b_g and ∂Δp_ij/∂b_a are the Jacobian matrices of the displacement increment with respect to the gyroscope and accelerometer biases; the gyroscope bias is estimated first (equations (10) and (11) are given as images in the original), and the gravity vector and the accelerometer bias are then solved by singular value decomposition; each pose comprises the rotation R, velocity v and position p; in this way the actual IMU measurement of the IMU motion trajectory from t to t + Δt over the time Δt is obtained in the IMU initial pose coordinate system.
3. The extrinsic calibration method for a catadioptric panoramic camera and an IMU sensor according to claim 2, wherein in step 3 the catadioptric camera-IMU extrinsic calibration specifically comprises the following steps:
starting from the measurement models of the catadioptric panoramic camera and the IMU sensor according to step 1 and step 2, the poses of the two sensors are analyzed; the catadioptric panoramic camera coordinate system and the IMU coordinate system satisfy the following relation:

    T_wb = T_wc · T_cb        (12)

(equation (13), the expanded form of (12), is given as an image in the original); expanding the above formula yields the relation between the rotation and the translation of the camera coordinate system and the IMU coordinate system, i.e. the pose relation between the catadioptric panoramic camera and the IMU sensor:

    R_wb = R_wc · R_cb        (14)
    P_wb = R_wc · P_cb + s · P_wc        (15)

then, for the constructed reprojection error and the pose estimate between the camera and the IMU, an associated set S is built and the extrinsic parameters between the camera and the IMU are iterated with the least-squares method to obtain the final extrinsic calibration;
the algebraic reprojection-error term E_visual(k, j) is used to optimize the estimate of the camera state variable X_c:

    X_c = {R_0, p_0, R_1, p_1, ..., R_n, p_n, ^w X_0, ^w X_1, ..., ^w X_m}        (16)

(equations (17) and (18), defining the reprojection error term, are given as images in the original); here ^w X_k is a three-dimensional space point in the world coordinate system and x_k is its coordinate in the pixel plane; at the same time, the cost function E_IMU(k, j) of the IMU sensor is constructed, and the state X_i is estimated by nonlinear optimization (equations (19), (20) and (21) are given as images in the original);
joint optimization according to the two sensor error equations finally yields the final extrinsic parameters.
CN202110811710.2A, filed 2021-07-19 (priority date 2021-07-19): Calibration method of refraction and reflection panoramic camera and IMU sensor. Active; granted as CN113763479B.

Priority Applications (1)

Application number: CN202110811710.2A (granted as CN113763479B); priority date: 2021-07-19; filing date: 2021-07-19; title: Calibration method of refraction and reflection panoramic camera and IMU sensor

Applications Claiming Priority (1)

Application number: CN202110811710.2A (granted as CN113763479B); priority date: 2021-07-19; filing date: 2021-07-19; title: Calibration method of refraction and reflection panoramic camera and IMU sensor

Publications (2)

Publication Number Publication Date
CN113763479A true CN113763479A (en) 2021-12-07
CN113763479B CN113763479B (en) 2024-04-12

Family

ID=78787716

Family Applications (1)

Application number: CN202110811710.2A (Active; granted as CN113763479B); priority date: 2021-07-19; filing date: 2021-07-19; title: Calibration method of refraction and reflection panoramic camera and IMU sensor

Country Status (1)

Country Link
CN (1) CN113763479B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549656A (en) * 2022-02-14 2022-05-27 希姆通信息技术(上海)有限公司 Calibration method for AR (augmented reality) glasses camera and IMU (inertial measurement Unit)
CN115855117A (en) * 2023-02-16 2023-03-28 深圳佑驾创新科技有限公司 Combined calibration method for installation postures of camera and inertia measurement unit relative to vehicle body
EP4354397A1 (en) * 2022-10-11 2024-04-17 Continental Autonomous Mobility Germany GmbH Method for a camera-based pose estimation, data processing device, computer program and computer-readable medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180231385A1 (en) * 2016-10-25 2018-08-16 Massachusetts Institute Of Technology Inertial Odometry With Retroactive Sensor Calibration
CN109029433A (en) * 2018-06-28 2018-12-18 东南大学 Join outside the calibration of view-based access control model and inertial navigation fusion SLAM on a kind of mobile platform and the method for timing
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration
WO2020259106A1 (en) * 2019-06-24 2020-12-30 深圳奥比中光科技有限公司 Calibration method and device for relative attitudes of camera and inertial measurement unit

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180231385A1 (en) * 2016-10-25 2018-08-16 Massachusetts Institute Of Technology Inertial Odometry With Retroactive Sensor Calibration
CN109029433A (en) * 2018-06-28 2018-12-18 东南大学 Join outside the calibration of view-based access control model and inertial navigation fusion SLAM on a kind of mobile platform and the method for timing
WO2020259106A1 (en) * 2019-06-24 2020-12-30 深圳奥比中光科技有限公司 Calibration method and device for relative attitudes of camera and inertial measurement unit
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐从营; 蔡成涛; 朱齐丹: "Measurement method for rotor coning based on panoramic vision" (基于全景视觉的旋翼共锥度测量方法), ***工程与电子技术, no. 02

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549656A (en) * 2022-02-14 2022-05-27 希姆通信息技术(上海)有限公司 Calibration method for AR (augmented reality) glasses camera and IMU (inertial measurement Unit)
EP4354397A1 (en) * 2022-10-11 2024-04-17 Continental Autonomous Mobility Germany GmbH Method for a camera-based pose estimation, data processing device, computer program and computer-readable medium
CN115855117A (en) * 2023-02-16 2023-03-28 深圳佑驾创新科技有限公司 Combined calibration method for installation postures of camera and inertia measurement unit relative to vehicle body
CN115855117B (en) * 2023-02-16 2023-06-02 深圳佑驾创新科技有限公司 Combined calibration method for mounting posture of camera and inertial measurement unit relative to vehicle body

Also Published As

Publication number Publication date
CN113763479B (en) 2024-04-12

Similar Documents

Publication Publication Date Title
CN113763479A (en) Calibration method for catadioptric panoramic camera and IMU sensor
CN112598757B (en) Multi-sensor time-space calibration method and device
KR100728377B1 (en) Method for real-time updating gis of changed region vis laser scanning and mobile internet
CN112629431B (en) Civil structure deformation monitoring method and related equipment
CN107014399B (en) Combined calibration method for satellite-borne optical camera-laser range finder combined system
CN100520297C (en) Zero deflection band based star sensor ground surface calibration method
US20110007948A1 (en) System and method for automatic stereo measurement of a point of interest in a scene
CN109708649B (en) Attitude determination method and system for remote sensing satellite
Zhang et al. A universal and flexible theodolite-camera system for making accurate measurements over large volumes
CN102622747B (en) Camera parameter optimization method for vision measurement
CN109003309B (en) High-precision camera calibration and target attitude estimation method
CN106457562A (en) Method for calibrating a robot and a robot system
CN110969665B (en) External parameter calibration method, device, system and robot
CN111486864B (en) Multi-source sensor combined calibration method based on three-dimensional regular octagon structure
CN112378396A (en) Hybrid high-precision indoor positioning method based on robust LM visual inertial odometer and UWB
CN114526745A (en) Drawing establishing method and system for tightly-coupled laser radar and inertial odometer
CN112461224B (en) Magnetometer calibration method based on known attitude angle
CN104422425A (en) Irregular-outline object space attitude dynamic measuring method
CN113267794B (en) Antenna phase center correction method and device with base line length constraint
CN112129263B (en) Distance measurement method of separated mobile stereo distance measurement camera
CN111220120A (en) Moving platform binocular ranging self-calibration method and device
CN112792814A (en) Mechanical arm zero calibration method based on visual marks
CN114964276A (en) Dynamic vision SLAM method fusing inertial navigation
CN111627100A (en) Numerical simulation method for evaluating photogrammetry precision
CN112113564B (en) Positioning method and system based on image sensor and inertial sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant