CN112819898A - Thermal imager-RGB camera-IMU space combined calibration method, system and storage medium - Google Patents


Info

Publication number
CN112819898A
Authority
CN
China
Prior art keywords
thermal imager, IMU, RGB, RGB camera, thermal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110076591.0A
Other languages
Chinese (zh)
Inventor
张吟龙 (Zhang Yinlong)
苑明哲 (Yuan Mingzhe)
梁炜 (Liang Wei)
赵文涛 (Zhao Wentao)
肖金超 (Xiao Jinchao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Institute of Automation of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Institute of Automation of CAS
Priority: CN202110076591.0A
Publication: CN112819898A
Legal status: Withdrawn


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a thermal imager-RGB camera-IMU spatial joint calibration method, system and storage medium. The method comprises two stages: thermal-imager image feature localization and thermal imager-RGB camera-IMU joint calibration. First, a Gaussian-mixture model is used to locate the heat-source centers of an LED array; then an objective function is constructed from the reprojection errors of target points in the thermal images, the reprojection errors of RGB-image corner points, and the acceleration and angular-velocity errors of the IMU, and the spatial relations among the three sensor coordinate systems are computed by minimizing this objective function. The method is suitable for joint calibration of the three sensors (thermal imager, RGB camera and IMU) and is both stable and accurate.

Description

Thermal imager-RGB camera-IMU space combined calibration method, system and storage medium
Technical Field
The invention belongs to the technical fields of image processing and multi-source information fusion, and particularly relates to a thermal imager-RGB camera-IMU spatial joint calibration method, system and storage medium.
Background
A multi-modal sensing platform integrating a thermal imager, an RGB camera and an inertial measurement unit (IMU) gives a system the ability to perceive its environment under complex and severe conditions, for example detecting vehicles and pedestrians driving at night, or perceiving ego-motion against a violently moving background.
A thermal imager converts the temperature distribution of an object into a visible image using thermal-imaging technology; it offers good concealment, high safety, strong detection capability and long operating range. An RGB camera captures rich scene information and is widely used for positioning and navigation. An IMU measures the three-axis angular velocity and acceleration of an object, providing inertial self-information analogous to the human vestibular system, with important application value in navigation. The thermal imager, RGB camera and IMU are complementary in image modality, perception accuracy and response speed, and their effective fusion has drawn wide attention from scholars in the fields of robotics and automation. Effective and accurate calibration among the three sensors is the prerequisite for their multi-modal perception and fusion, and the invention therefore proposes a thermal imager-RGB camera-IMU joint calibration method.
Disclosure of Invention
The invention mainly aims to overcome the defects in the prior art, and provides a thermal imager-RGB camera-IMU space combined calibration method, system and storage medium to realize effective fusion of three sensors.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention discloses a thermal imager-RGB camera-IMU space combined calibration method, which comprises the following steps:
positioning the image characteristics of the thermal imager, and searching the position of a target point of a calibration plate of the thermal imager;
thermal imager-RGB camera-IMU joint calibration: integrating the thermal images, the RGB images and the IMU measurements in a unified framework and jointly estimating the spatial relations among the RGB-camera, IMU and thermal-imager coordinate systems, covering thermal imager-RGB camera spatial calibration, thermal imager-IMU spatial calibration and RGB camera-IMU spatial calibration; the parameters to be solved are then computed by minimizing the objective function:

{T_rt, T_ri, T_ti} = arg min ( U_repro,rgb,thermal + U_repro,thermal,rgb + U_repro,thermal,imu + U_repro,rgb,imu + U_α,ω,rgb + U_α,ω,thermal )

wherein T_rt represents the transformation matrix of the thermal-imager coordinate system relative to the RGB-camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB-camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal-imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal-image heat-source centers resolved from the RGB-camera motion; U_repro,thermal,rgb represents the reprojection error of the RGB-image heat-source centers resolved from the thermal-imager motion; U_repro,thermal,imu represents the reprojection error of the thermal-image heat-source centers resolved from the IMU motion; U_repro,rgb,imu represents the reprojection error of the RGB-image heat-source centers resolved from the IMU motion; U_α,ω,rgb represents the sum of the inertial-acceleration and angular-velocity error terms of the IMU resolved from the RGB-camera motion; and U_α,ω,thermal represents the sum of those error terms resolved from the thermal-imager motion;

T_rt, T_ri and T_ti satisfy the constraint:

T_ti · T_ri^(-1) · T_rt = T_tr · T_rt = I

wherein I is the identity matrix and (T_ti · T_ri^(-1)) describes the transformation from the RGB-camera coordinate system to the thermal-imager coordinate system. Adding the constraint T_ti · T_ri^(-1) · T_rt = I to the objective function and solving for {T_rt, T_ri, T_ti} yields the spatial relations among the thermal-imager, RGB-camera and IMU coordinate systems.
As a preferred technical scheme, in the thermal-imager image feature localization, a Gaussian-mixture model is used to find the target-point positions of the thermal-imager calibration plate, modeling the illuminated areas as a plurality of Gaussian distributions, described as follows:

G(x, y) = Σ_{i=1..M} A_i · exp( -(1/2) · (λ_i - μ_i)^T · Σ_i^(-1) · (λ_i - μ_i) )

wherein A_i is the amplitude of the ith heat-source center; λ_i and μ_i respectively denote a pixel position in the ith feature region and the centroid of the corresponding illuminated area; and Σ_i is the covariance matrix of the ith region's Gaussian distribution. The unknown parameter set {A_i, μ_i, Σ_i} is estimated by optimizing the objective function:

{A_i, μ_i, Σ_i} = arg min Σ_{i=1..M} Σ_{(x,y)∈Ω_i} ( I(x, y) - G(x, y) )²

wherein I(x, y) represents the pixel value at (x, y) and Ω_i represents the candidate pixel set of the ith illuminated region. Even when part of a bulb's area is missing, the center of the illuminated area can still be located accurately by optimizing this objective function.
As a preferred technical scheme, in the thermal imager-RGB camera-IMU joint calibration, the objective function comprises: the reprojection errors of the thermal-image target points and of the RGB-image corner points; the reprojection errors of the thermal-image target points together with the IMU acceleration and angular-velocity errors; and the reprojection errors of the RGB-image corner points together with the IMU acceleration and angular-velocity errors.
As a preferred technical scheme, the thermal imager-RGB camera spatial calibration describes the objective-function parts U_repro,rgb,thermal and U_repro,thermal,rgb with reprojection errors. U_repro,rgb,thermal is expressed as:

U_repro,rgb,thermal = Σ_{t=1..K} Σ_{i=1..M} s_rt · || p̂^t_(t,i) - C_t( T_tr · P^r_(t,i) ) ||²

wherein p̂^r_(t,i) and p̂^t_(t,i) are the ith (1 ≤ i ≤ M) pixel positions in the RGB image and the thermal image at time t (1 ≤ t ≤ K), and P^r_(t,i) is the corresponding heat-source center expressed in the RGB-camera frame as resolved from the RGB-camera motion; C_t and C_r are the thermal-imager and RGB-camera projection models, respectively; T_tr is the transformation of the RGB-camera coordinate system relative to the thermal-imager coordinate system; s_rt is the weight coefficient of the thermal-image heat-source-center reprojection error.

U_repro,thermal,rgb is expressed symmetrically as:

U_repro,thermal,rgb = Σ_{t=1..K} Σ_{i=1..M} s_tr · || p̂^r_(t,i) - C_r( T_rt · P^t_(t,i) ) ||²

wherein s_tr is the weight coefficient of the RGB-image heat-source-center reprojection error and P^t_(t,i) is the heat-source center expressed in the thermal-imager frame as resolved from the thermal-imager motion.
As a preferred technical scheme, the thermal imager-IMU spatial calibration specifically comprises:

the transformation from the IMU coordinate system to the world coordinate system at time t is defined as:

T_wi(t) = [ R_wi(t)  t_wi(t) ; 0^T  1 ]

wherein R_wi(t) and t_wi(t) represent the rotation and translation, respectively, from the inertial frame to the world frame at time t;

the error terms of inertial acceleration and angular velocity are:

U_α = Σ_{t=1..K} s_α · || α̂(t) - α(t) ||²
U_ω = Σ_{t=1..K} s_ω · || ω̂(t) - ω(t) ||²
U_α,ω,thermal = U_α + U_ω

wherein α̂(t) and ω̂(t) are the acceleration and angular velocity measured by the IMU, α(t) and ω(t) are the corresponding values predicted from the thermal-imager motion, s_α is the weight coefficient corresponding to the acceleration error, and s_ω is the weight coefficient corresponding to the angular-velocity error;

the ith heat-source center projected onto the thermal image at time t is represented by the pixel position p̂^t_(t,i); U_repro,thermal,imu is given by:

U_repro,thermal,imu = Σ_{t=1..K} Σ_{i=1..M} s_ti · || p̂^t_(t,i) - C_t( T_ti · P^i_(t,i) ) ||²

wherein s_ti is the weight coefficient of the ith heat-source center on the thermal image at time t, P^i_(t,i) is that center expressed in the inertial frame as resolved from the IMU motion, and C_t is the thermal-imager projection model.
As a preferred technical scheme, the RGB camera-IMU spatial calibration specifically comprises:

the error terms of acceleration and angular velocity are given analogously by:

U_α = Σ_{t=1..K} s_α · || α̂(t) - α(t) ||²
U_ω = Σ_{t=1..K} s_ω · || ω̂(t) - ω(t) ||²
U_α,ω,rgb = U_α + U_ω

with α(t) and ω(t) now predicted from the RGB-camera motion; the ith heat-source center projected onto the RGB image at time t is represented by the pixel position p̂^r_(t,i), and U_repro,rgb,imu is given by:

U_repro,rgb,imu = Σ_{t=1..K} Σ_{i=1..M} s_ti · || p̂^r_(t,i) - C_r( T_ri · P^i_(t,i) ) ||²

wherein s_ti is the weight coefficient of the reprojection error of the ith heat-source center on the RGB image at time t and C_r is the RGB-camera projection model.
The invention also discloses a thermal imager-RGB camera-IMU space combined calibration system, which is applied to the thermal imager-RGB camera-IMU space combined calibration method and comprises a thermal imager image feature positioning module and a thermal imager-RGB camera-IMU combined calibration module;
the thermal imager image feature positioning module is used for positioning the thermal imager image features and searching the target point position of the thermal imager calibration plate;
the thermal imager-RGB camera-IMU joint calibration module is used for thermal imager-RGB camera-IMU joint calibration: the thermal images, the RGB images and the IMU measurements are integrated in a unified framework, and the spatial relations among the RGB-camera, IMU and thermal-imager coordinate systems are estimated jointly, covering thermal imager-RGB camera spatial calibration, thermal imager-IMU spatial calibration and RGB camera-IMU spatial calibration; the parameters to be solved are then computed by minimizing the objective function:

{T_rt, T_ri, T_ti} = arg min ( U_repro,rgb,thermal + U_repro,thermal,rgb + U_repro,thermal,imu + U_repro,rgb,imu + U_α,ω,rgb + U_α,ω,thermal )

wherein T_rt represents the transformation matrix of the thermal-imager coordinate system relative to the RGB-camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB-camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal-imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal-image heat-source centers resolved from the RGB-camera motion; U_repro,thermal,rgb represents the reprojection error of the RGB-image heat-source centers resolved from the thermal-imager motion; U_repro,thermal,imu represents the reprojection error of the thermal-image heat-source centers resolved from the IMU motion; U_repro,rgb,imu represents the reprojection error of the RGB-image heat-source centers resolved from the IMU motion; U_α,ω,rgb represents the sum of the inertial-acceleration and angular-velocity error terms of the IMU resolved from the RGB-camera motion; and U_α,ω,thermal represents the sum of those error terms resolved from the thermal-imager motion;

T_rt, T_ri and T_ti satisfy the constraint:

T_ti · T_ri^(-1) · T_rt = T_tr · T_rt = I

wherein I is the identity matrix and (T_ti · T_ri^(-1)) describes the transformation from the RGB-camera coordinate system to the thermal-imager coordinate system. Adding the constraint T_ti · T_ri^(-1) · T_rt = I to the objective function and solving for {T_rt, T_ri, T_ti} yields the spatial relations among the thermal-imager, RGB-camera and IMU coordinate systems.
The invention further provides a storage medium which stores a program, and when the program is executed by a processor, the thermal imager-RGB camera-IMU space joint calibration method is realized.
Compared with the prior art, the invention has the following advantages and beneficial effects:
the invention provides a thermal imager-RGB camera-IMU space combined calibration method. Firstly, the center position of the heat source of the LED array is positioned by adopting the mixed Gaussian distribution model, the technical problem that the thermal imager cannot perform characteristic positioning due to uneven distribution of the light source is solved, and the technical effect that the position estimation is still accurately performed under the condition that the LED light source is partially lost is achieved. Then, the invention adopts the technical scheme of establishing a target optimization function and optimizing the heat source center re-projection error extracted from the thermal imager image, the chessboard angular point re-projection error extracted from the RGB image and the re-projection error of the target point in the thermal imager image with the acceleration and angular speed errors, solves the technical problem of inconsistent joint calibration of the thermal imager, the RGB camera and the IMU, and achieves the technical effect of accurately calibrating the spatial position relation among the three sensor coordinate systems.
Drawings
FIG. 1 is a flow chart of a thermal imager-RGB camera-IMU space joint calibration method according to an embodiment of the invention;
FIG. 2 is a schematic structural diagram of a thermal imager-RGB camera-IMU spatial joint calibration system according to an embodiment of the invention;
fig. 3 is a schematic structural diagram of a storage medium according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Examples
Thermal imagers, RGB cameras, and Inertial Measurement Units (IMU) have heterogeneous and complementary characteristics in terms of image modality, perception accuracy, and response speed, and the effective integration of these three elements is receiving increasing attention in the fields of robotics and computer vision. Effective and accurate calibration among the thermal imager, the RGB camera and the IMU is the premise for realizing multi-modal perception and fusion of the thermal imager, the RGB camera and the IMU. Therefore, the invention provides a spatial joint calibration method of the thermal imager, the RGB camera and the IMU, which is used for calibrating the relative positions and postures of the three sensors.
As shown in fig. 1, the thermal imager-RGB camera-IMU spatial joint calibration method of the present embodiment specifically includes the following steps:
s1, positioning the image characteristics of the thermal imager;
more specifically, the invention proposes to use a Gaussian-mixture model to find the target-point positions of the thermal-imager calibration plate, modeling the illuminated areas as a plurality of Gaussian distributions, mathematically described as:

G(x, y) = Σ_{i=1..M} A_i · exp( -(1/2) · (λ_i - μ_i)^T · Σ_i^(-1) · (λ_i - μ_i) )

wherein A_i is the amplitude of the ith heat-source center; λ_i and μ_i respectively denote a pixel position in the ith feature region and the centroid of the corresponding illuminated area; and Σ_i is the covariance matrix of the ith region's Gaussian distribution. The unknown parameter set {A_i, μ_i, Σ_i} is estimated by optimizing the objective function:

{A_i, μ_i, Σ_i} = arg min Σ_{i=1..M} Σ_{(x,y)∈Ω_i} ( I(x, y) - G(x, y) )²

wherein I(x, y) represents the pixel value at (x, y) and Ω_i represents the candidate pixel set of the ith illuminated region. Even when part of a bulb's area is lost, the proposed model can pinpoint the illuminated-area center by optimizing this objective function.
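To make the feature-localization step concrete, the following minimal sketch (not the patent's implementation; the synthetic image, region boxes and function names are all assumptions) estimates each heat-source center as the intensity-weighted centroid over its candidate pixel set Ω_i, which coincides with the Gaussian mean when the illuminated region is intact:

```python
import numpy as np

def gaussian_blob(shape, center, amplitude, sigma):
    """Render one isotropic 2-D Gaussian A * exp(-||p - mu||^2 / (2 sigma^2))."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
    return amplitude * np.exp(-d2 / (2.0 * sigma ** 2))

def locate_centers(image, regions):
    """Estimate each heat-source center mu_i as the intensity-weighted
    centroid of its candidate pixel set Omega_i (a boolean mask)."""
    centers = []
    for mask in regions:
        ys, xs = np.nonzero(mask)
        w = image[ys, xs]
        w = w / w.sum()
        centers.append((float(np.dot(w, xs)), float(np.dot(w, ys))))
    return centers

# Synthetic thermal image with two LED heat sources.
img = gaussian_blob((64, 64), (20.0, 30.0), 1.0, 3.0)
img += gaussian_blob((64, 64), (45.0, 15.0), 0.8, 3.0)

# Candidate regions Omega_i: rough boxes around each source.
ys, xs = np.mgrid[0:64, 0:64]
regions = [(abs(xs - 20) < 10) & (abs(ys - 30) < 10),
           (abs(xs - 45) < 10) & (abs(ys - 15) < 10)]

centers = locate_centers(img, regions)  # close to (20, 30) and (45, 15)
```

In practice the centroid would only initialize the nonlinear fit of {A_i, μ_i, Σ_i}; it is the least-squares refinement of the Gaussian model that supplies the robustness to partially missing bulb areas described above.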
S2, jointly calibrating the thermal imager, the RGB camera and the IMU: an objective function is constructed from the reprojection errors of target points in the thermal images, the reprojection errors of RGB-image corner points, and the acceleration and angular-velocity errors of the IMU, and the spatial relations among the three sensor coordinate systems are computed by minimizing this objective function.
Further, in this embodiment, the thermal images, the RGB images and the inertial measurements are integrated in a unified framework, the spatial relations among the RGB-camera, IMU and thermal-imager coordinate systems are estimated jointly, and the parameters to be solved are computed by minimizing the objective function:

{T_rt, T_ri, T_ti} = arg min ( U_repro,rgb,thermal + U_repro,thermal,rgb + U_repro,thermal,imu + U_repro,rgb,imu + U_α,ω,rgb + U_α,ω,thermal )

wherein T_rt represents the transformation matrix (rotation and translation) of the thermal-imager coordinate system relative to the RGB-camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB-camera coordinate system; and T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal-imager coordinate system. T_rt, T_ri and T_ti satisfy the constraint:

T_ti · T_ri^(-1) · T_rt = T_tr · T_rt = I

wherein I is the identity matrix, and (T_ti · T_ri^(-1)) describes the transformation from the RGB-camera coordinate system to the thermal-imager coordinate system. Adding the constraint T_ti · T_ri^(-1) · T_rt = I to the objective function and solving for {T_rt, T_ri, T_ti} yields the spatial relations among the thermal imager-RGB camera-IMU coordinate systems.
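The loop-closure constraint can be checked numerically. The sketch below (illustrative extrinsic values, not calibrated ones) builds hypothetical homogeneous transforms T_ri and T_rt, derives a consistent T_ti, and verifies that T_ti · T_ri^(-1) · T_rt recovers the identity:

```python
import numpy as np

def se3(rot_vec, trans):
    """Build a 4x4 homogeneous transform from a rotation vector (Rodrigues
    formula) and a translation."""
    rot_vec = np.asarray(rot_vec, dtype=float)
    theta = np.linalg.norm(rot_vec)
    if theta < 1e-12:
        R = np.eye(3)
    else:
        k = rot_vec / theta
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = trans
    return T

# Hypothetical extrinsics: T_ri (inertial -> RGB) and T_rt (thermal -> RGB).
T_ri = se3([0.02, -0.01, 0.03], [0.10, 0.00, -0.05])
T_rt = se3([0.00, 0.05, -0.02], [-0.08, 0.02, 0.01])

# Consistent chain: T_ti = T_tr * T_ri with T_tr = inv(T_rt),
# so the loop T_ti * inv(T_ri) * T_rt must be the identity.
T_ti = np.linalg.inv(T_rt) @ T_ri
loop = T_ti @ np.linalg.inv(T_ri) @ T_rt
```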
It can be understood that the objective function mainly comprises three groups of terms, obtained through the following three steps: (i) the reprojection errors of the thermal-image target points and of the RGB-image corner points; (ii) the reprojection errors of the thermal-image target points together with the IMU acceleration and angular-velocity errors; and (iii) the reprojection errors of the RGB-image corner points together with the IMU acceleration and angular-velocity errors.
s2.1, calibrating a thermal imager-RGB camera space;
the embodiment adopts the reprojection error to describe the objective function part Urepro,rgb,thermalAnd Urepro,thermal,rgb。Urepro,rgb,thermalRepresenting the reprojection error of the thermal imager image heat source center resolved by the RGB camera motion, which is expressed as:
Figure BDA0002907746010000092
wherein,
Figure BDA0002907746010000093
and
Figure BDA0002907746010000094
the pixel positions of the RGB image and the thermal imager image at the ith (i is more than or equal to 1 and less than or equal to M) pixel position at the moment t (t is more than or equal to 1 and less than or equal to K) are respectively; ctAnd CrThermal imager and RGB camera models, respectively; t istrIs a coordinate system of RGB cameraA transformation relationship with respect to a thermal imager coordinate system; srtAnd the weight coefficient is the center re-projection error of the thermal imager image heat source.
Similarly, Urepro,thermal,rgbRepresenting the reprojection error of the RGB image heat source center resolved by the thermal imager motion, which is expressed as:
Figure BDA0002907746010000101
wherein s istrIs the weight coefficient of the center reprojection error of the RGB image heat source.
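As an illustration of these reprojection terms under simplifying assumptions (a plain pinhole model for C(·), made-up intrinsics, and noise-free correspondences, none of which are specified by the patent), the residual drops to zero exactly when the extrinsics used for projection match those that generated the measurements:

```python
import numpy as np

def project(K, T, points):
    """Pinhole projection C(.): map 3-D points (N, 3), given in a source
    frame, through extrinsic T (4x4, source -> camera) and intrinsics K."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    cam = (T @ pts_h.T).T[:, :3]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

def reprojection_error(K_t, T_tr, pts_rgb_frame, pix_thermal, s_rt=1.0):
    """U_repro: weighted sum of squared distances between measured thermal
    pixels and the projection of points expressed in the RGB-camera frame."""
    pred = project(K_t, T_tr, pts_rgb_frame)
    return s_rt * float(np.sum((pred - pix_thermal) ** 2))

K_t = np.array([[400.0, 0, 320.0], [0, 400.0, 240.0], [0, 0, 1.0]])
T_tr = np.eye(4)
T_tr[:3, 3] = [0.05, 0.0, 0.0]  # hypothetical 5 cm thermal/RGB baseline

pts = np.array([[0.0, 0.0, 2.0], [0.2, -0.1, 2.5]])  # targets in RGB frame
pix_meas = project(K_t, T_tr, pts)   # noise-free "measurements"
err = reprojection_error(K_t, T_tr, pts, pix_meas)
```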
S2.2, calibrating a thermal imager and an IMU space;
the transformation from the IMU coordinate system to the world coordinate system is defined as:
Figure BDA0002907746010000102
wherein R iswi(t) and twi(t) represents rotation and translation, respectively, from the inertial frame to the world frame at time t.
The error terms of inertial acceleration and angular velocity are:
Figure BDA0002907746010000103
Figure BDA0002907746010000104
Figure BDA0002907746010000105
wherein s isαIs a weight coefficient corresponding to the acceleration error; sωIs a weight coefficient corresponding to the angular velocity error.
The ith projected onto the thermal imager image at time t
Figure BDA0002907746010000106
For pixel location
Figure BDA0002907746010000107
Represents, Urepro,thermal,imuRepresenting the reprojection error of the thermal imager image heat source center resolved by the IMU motion, given by:
Figure BDA0002907746010000108
wherein s istiThe weight coefficient of the ith heat source center on the thermal imager image at the time t; ctIs that
And (5) projecting the model by the thermal imager.
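The inertial terms U_α and U_ω above reduce to weighted sums of squared residuals between measured and trajectory-predicted IMU signals. A sketch with synthetic signals (signal shapes, noise levels and names are all assumptions):

```python
import numpy as np

def imu_error(acc_meas, gyro_meas, acc_pred, gyro_pred, s_a=1.0, s_w=1.0):
    """U_alpha,omega: weighted sum of squared residuals between IMU-measured
    acceleration / angular velocity and the values predicted from the
    camera (thermal or RGB) trajectory."""
    u_a = s_a * float(np.sum((acc_meas - acc_pred) ** 2))
    u_w = s_w * float(np.sum((gyro_meas - gyro_pred) ** 2))
    return u_a + u_w

# Synthetic trajectory-predicted signals over 1 s at 100 Hz.
t = np.linspace(0.0, 1.0, 100)
acc_pred = np.stack([np.sin(t), np.cos(t), np.zeros_like(t)], axis=1)
gyro_pred = np.stack([np.zeros_like(t), np.zeros_like(t), 0.1 * t], axis=1)

# "Measurements" = predictions plus small sensor noise.
rng = np.random.default_rng(0)
acc_meas = acc_pred + 0.01 * rng.standard_normal(acc_pred.shape)
gyro_meas = gyro_pred + 0.001 * rng.standard_normal(gyro_pred.shape)

u = imu_error(acc_meas, gyro_meas, acc_pred, gyro_pred)  # small but nonzero
```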
S2.3, calibrating the RGB camera and the IMU space;
Similar to the mutual calibration of the thermal-imager and inertial coordinate systems, the error terms of acceleration and angular velocity are given by:

U_α = Σ_{t=1..K} s_α · || α̂(t) - α(t) ||²
U_ω = Σ_{t=1..K} s_ω · || ω̂(t) - ω(t) ||²
U_α,ω,rgb = U_α + U_ω

with α(t) and ω(t) now predicted from the RGB-camera motion. The ith heat-source center projected onto the RGB image at time t is represented by the pixel position p̂^r_(t,i). U_repro,rgb,imu represents the reprojection error of the RGB-image heat-source centers resolved from the IMU motion, given by:

U_repro,rgb,imu = Σ_{t=1..K} Σ_{i=1..M} s_ti · || p̂^r_(t,i) - C_r( T_ri · P^i_(t,i) ) ||²

wherein s_ti is the weight coefficient of the reprojection error of the ith heat-source center on the RGB image at time t and C_r is the RGB-camera projection model.
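With all six terms in hand, assembling the joint objective is bookkeeping: the calibration solves for {T_rt, T_ri, T_ti} by driving their (optionally weighted) sum down with a nonlinear solver. The values and names below are placeholders, not results from the method:

```python
def total_objective(terms, weights=None):
    """Assemble the joint-calibration objective as the (optionally weighted)
    sum of the error terms; a nonlinear solver would minimize this value
    over the stacked pose parameters {T_rt, T_ri, T_ti}."""
    if weights is None:
        weights = [1.0] * len(terms)
    return sum(w * u for w, u in zip(weights, terms))

# Named per the patent's terms; the numeric values are placeholders.
terms = dict(
    U_repro_rgb_thermal=0.8, U_repro_thermal_rgb=0.7,
    U_repro_thermal_imu=1.2, U_repro_rgb_imu=1.1,
    U_aw_rgb=0.3, U_aw_thermal=0.4,
)
U = total_objective(list(terms.values()))  # plain unweighted sum
```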
As shown in fig. 2, in another embodiment, a thermal imager-RGB camera-IMU spatial joint calibration system is provided, which includes a thermal imager image feature positioning module and a thermal imager-RGB camera-IMU joint calibration module;
the thermal imager image feature positioning module is used for positioning the thermal imager image features and searching the target point position of the thermal imager calibration plate;
the thermal imager-RGB camera-IMU joint calibration module is used for thermal imager-RGB camera-IMU joint calibration: the thermal images, the RGB images and the IMU measurements are integrated in a unified framework, and the spatial relations among the RGB-camera, IMU and thermal-imager coordinate systems are estimated jointly, covering thermal imager-RGB camera spatial calibration, thermal imager-IMU spatial calibration and RGB camera-IMU spatial calibration; the parameters to be solved are then computed by minimizing the objective function:

{T_rt, T_ri, T_ti} = arg min ( U_repro,rgb,thermal + U_repro,thermal,rgb + U_repro,thermal,imu + U_repro,rgb,imu + U_α,ω,rgb + U_α,ω,thermal )

wherein T_rt represents the transformation matrix of the thermal-imager coordinate system relative to the RGB-camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB-camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal-imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal-image heat-source centers resolved from the RGB-camera motion; U_repro,thermal,rgb represents the reprojection error of the RGB-image heat-source centers resolved from the thermal-imager motion; U_repro,thermal,imu represents the reprojection error of the thermal-image heat-source centers resolved from the IMU motion; U_repro,rgb,imu represents the reprojection error of the RGB-image heat-source centers resolved from the IMU motion; U_α,ω,rgb represents the sum of the inertial-acceleration and angular-velocity error terms of the IMU resolved from the RGB-camera motion; and U_α,ω,thermal represents the sum of those error terms resolved from the thermal-imager motion.

T_rt, T_ri and T_ti satisfy the constraint:

T_ti · T_ri^(-1) · T_rt = T_tr · T_rt = I

wherein I is the identity matrix and (T_ti · T_ri^(-1)) describes the transformation from the RGB-camera coordinate system to the thermal-imager coordinate system. Adding the constraint T_ti · T_ri^(-1) · T_rt = I to the objective function and solving for {T_rt, T_ri, T_ti} yields the spatial relations among the thermal-imager, RGB-camera and IMU coordinate systems.
It should be noted that the system provided in the above embodiment is only illustrated by the division of the functional modules, and in practical applications, the function allocation may be completed by different functional modules according to needs, that is, the internal structure is divided into different functional modules to complete all or part of the functions described above.
As shown in fig. 3, in another embodiment of the present application, a storage medium is further provided. The storage medium stores a program which, when executed by a processor, implements the thermal imager-RGB camera-IMU spatial joint calibration method, specifically:
positioning the image characteristics of the thermal imager, and searching the position of a target point of a calibration plate of the thermal imager;
thermal imager-RGB camera-IMU joint calibration: integrating the thermal images, the RGB images and the IMU measurements in a unified framework and jointly estimating the spatial relations among the RGB-camera, IMU and thermal-imager coordinate systems, covering thermal imager-RGB camera spatial calibration, thermal imager-IMU spatial calibration and RGB camera-IMU spatial calibration; the parameters to be solved are then computed by minimizing the objective function:

{T_rt, T_ri, T_ti} = arg min ( U_repro,rgb,thermal + U_repro,thermal,rgb + U_repro,thermal,imu + U_repro,rgb,imu + U_α,ω,rgb + U_α,ω,thermal )

wherein T_rt represents the transformation matrix of the thermal-imager coordinate system relative to the RGB-camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB-camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal-imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal-image heat-source centers resolved from the RGB-camera motion; U_repro,thermal,rgb represents the reprojection error of the RGB-image heat-source centers resolved from the thermal-imager motion; U_repro,thermal,imu represents the reprojection error of the thermal-image heat-source centers resolved from the IMU motion; U_repro,rgb,imu represents the reprojection error of the RGB-image heat-source centers resolved from the IMU motion; U_α,ω,rgb represents the sum of the inertial-acceleration and angular-velocity error terms of the IMU resolved from the RGB-camera motion; and U_α,ω,thermal represents the sum of those error terms resolved from the thermal-imager motion.

T_rt, T_ri and T_ti satisfy the constraint:

T_ti · T_ri^(-1) · T_rt = T_tr · T_rt = I

wherein I is the identity matrix and (T_ti · T_ri^(-1)) describes the transformation from the RGB-camera coordinate system to the thermal-imager coordinate system. Adding the constraint T_ti · T_ri^(-1) · T_rt = I to the objective function and solving for {T_rt, T_ri, T_ti} yields the spatial relations among the thermal-imager, RGB-camera and IMU coordinate systems.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (8)

1. The thermal imager-RGB camera-IMU space combined calibration method is characterized by comprising the following steps of:
positioning the image characteristics of the thermal imager, and searching the position of a target point of a calibration plate of the thermal imager;
the method comprises the following steps of thermal imager-RGB camera-IMU combined calibration, integrating thermal imager images, RGB images and IMU information in a unified framework, jointly estimating spatial relations among three coordinate systems of the RGB camera, the IMU and the thermal imager, including thermal imager and RGB camera spatial calibration, thermal imager and IMU spatial calibration and RGB camera and IMU spatial calibration, and then calculating parameters to be solved through a minimized objective function:
Figure FDA0002907745000000011
wherein T_rt represents the transformation matrix of the thermal imager coordinate system relative to the RGB camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal imager image heat-source centers obtained from the RGB camera motion solution; U_repro,thermal,rgb represents the reprojection error of the RGB image heat-source centers obtained from the thermal imager motion solution; U_repro,thermal,imu represents the reprojection error of the thermal imager image heat-source centers obtained from the IMU motion solution; U_α,ω,rgb represents the sum of the inertial acceleration and angular velocity error terms of the IMU obtained from the RGB camera motion solution; U_α,ω,thermal represents the sum of the inertial acceleration and angular velocity error terms of the IMU obtained from the thermal imager motion solution;
T_rt, T_ri and T_ti satisfy the following constraint:
T_ti · T_ri^{-1} · T_rt = T_tr · T_rt = I
wherein I is the identity matrix, and (T_ti · T_ri^{-1}) describes the transformation from the RGB camera coordinate system to the thermal imager coordinate system; the constraint T_ti · T_ri^{-1} · T_rt = T_tr · T_rt = I is added to the objective function, and T_rt, T_ri and T_ti are solved jointly, yielding the spatial relations among the thermal imager, RGB camera and IMU coordinate systems.
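As an illustration, the closed-loop constraint of claim 1 can be checked numerically: given any two of the three extrinsics, the third is determined, and the product T_ti · T_ri^{-1} · T_rt must return to the identity. The NumPy sketch below uses hypothetical extrinsic values; `se3`, `loop_closure_residual` and the sample rotations/translations are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def se3(rotvec, t):
    """Build a 4x4 homogeneous transform from an axis-angle rotation and a translation."""
    theta = np.linalg.norm(rotvec)
    if theta < 1e-12:
        R = np.eye(3)
    else:
        k = rotvec / theta
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)  # Rodrigues
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def loop_closure_residual(T_rt, T_ri, T_ti):
    """Frobenius norm of T_ti . T_ri^{-1} . T_rt - I (the claim-1 constraint)."""
    return np.linalg.norm(T_ti @ np.linalg.inv(T_ri) @ T_rt - np.eye(4))

# Hypothetical extrinsics: thermal->RGB and IMU->RGB transforms.
T_rt = se3(np.array([0.0, 0.02, 0.01]), np.array([0.05, 0.0, 0.01]))
T_ri = se3(np.array([0.01, -0.01, 0.0]), np.array([-0.02, 0.10, 0.0]))

# The IMU->thermal transform implied by the constraint: T_ti = T_rt^{-1} . T_ri,
# so the loop residual is numerically zero.
T_ti = np.linalg.inv(T_rt) @ T_ri
print(loop_closure_residual(T_rt, T_ri, T_ti))
```

In a real calibration the three transforms are estimated independently from data, so the residual does not vanish; adding the constraint to the objective function penalizes exactly this inconsistency.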
2. The thermal imager-RGB camera-IMU spatial joint calibration method of claim 1, wherein the thermal imager image feature location uses a Gaussian mixture model to find the target point positions of the thermal imager calibration plate, modeling the illumination areas as a plurality of Gaussian distributions, described as follows:
Figure FDA0002907745000000021
wherein A_i is the amplitude of the i-th heat-source center; λ_i and μ_i represent the position in the i-th characteristic region and the centroid of the corresponding illumination region, respectively; Σ_i represents the covariance matrix of the Gaussian distribution of the i-th region; the unknown parameter set {A_i, μ_i, Σ_i} is estimated by optimizing the objective function:
Figure FDA0002907745000000022
wherein I(x, y) represents the pixel value at (x, y); Ω_i represents the candidate pixel set in the i-th illumination region; even when part of the bulb area is missing, the illumination-region center can still be accurately located by optimizing the objective function.
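A minimal sketch of this centroid estimation (NumPy): it builds a synthetic illumination blob, occludes part of it to mimic the partial bulb-area loss mentioned in claim 2, and estimates the center with an intensity-weighted mean over the candidate set Ω_i. This is a common initializer for the full Gaussian-parameter optimization, not the patent's exact solver; all names and values below are hypothetical.

```python
import numpy as np

def heat_source_center(image, region_pixels):
    """Estimate the heat-source centroid mu_i over a candidate set Omega_i
    by an intensity-weighted mean of the pixel coordinates."""
    coords = np.array(region_pixels, dtype=float)              # (N, 2) of (x, y)
    weights = np.array([image[int(y), int(x)] for x, y in coords])
    return (coords * weights[:, None]).sum(axis=0) / weights.sum()

# Synthetic 2D Gaussian blob centered at (15, 12), sigma = 4 pixels.
h, w = 32, 32
ys, xs = np.mgrid[0:h, 0:w]
img = np.exp(-(((xs - 15.0) ** 2 + (ys - 12.0) ** 2) / (2 * 4.0 ** 2)))
img[:, 20:] = 0.0  # simulate partial bulb-area loss on the right side

# Candidate set Omega_i: pixels above a detection threshold.
omega = [(x, y) for y in range(h) for x in range(w) if img[y, x] > 0.05]
print(heat_source_center(img, omega))  # x slightly below 15 due to the occlusion, y near 12
```

The weighted mean is biased toward the unoccluded side; optimizing the full Gaussian objective over {A_i, μ_i, Σ_i}, as the claim describes, corrects for exactly this kind of truncation.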
3. The thermal imager-RGB camera-IMU spatial joint calibration method of claim 1, wherein in the thermal imager-RGB camera-IMU spatial joint calibration, the objective function includes: the reprojection errors of the thermal imager image target points and of the corner points in the RGB images; the reprojection errors of the thermal imager image target points together with the acceleration and angular velocity errors of the IMU; and the reprojection errors of the corner points in the RGB images together with the acceleration and angular velocity errors of the IMU.
4. The thermal imager-RGB camera-IMU spatial joint calibration method as claimed in claim 1, wherein the thermal imager-RGB camera spatial calibration uses reprojection errors to describe the objective-function terms U_repro,rgb,thermal and U_repro,thermal,rgb; U_repro,rgb,thermal is expressed as:
Figure FDA0002907745000000023
wherein,
Figure FDA0002907745000000024
and
Figure FDA0002907745000000025
are the i-th pixel positions in the RGB image and the thermal imager image at time t, respectively, with 1 ≤ t ≤ K and 1 ≤ i ≤ M; C_t and C_r are the thermal imager and RGB camera models, respectively; T_tr is the transformation of the RGB camera coordinate system relative to the thermal imager coordinate system; s_rt is the weight coefficient of the thermal imager image heat-source-center reprojection error;
U_repro,thermal,rgb is expressed as:
Figure FDA0002907745000000031
wherein s_tr is the weight coefficient of the RGB image heat-source-center reprojection error.
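The structure of such a reprojection term can be sketched as follows. The NumPy code below is an illustrative assumption (the function names, pinhole intrinsics and sample points are not from the patent); only the roles of T_tr and the weight s_rt follow the notation of claim 4.

```python
import numpy as np

def project(K, T, X):
    """Pinhole projection: map homogeneous 3D points X (4xN), expressed in the
    source camera frame, through extrinsic T, then apply intrinsics K."""
    Xc = (T @ X)[:3]
    uv = K @ (Xc / Xc[2])   # normalize by depth, column-wise
    return uv[:2]

def u_repro_rgb_thermal(K_thermal, T_tr, pts_rgb, obs_thermal, s_rt=1.0):
    """Weighted squared reprojection error of heat-source centers, in the spirit
    of U_repro,rgb,thermal: points recovered in the RGB camera frame are mapped
    into the thermal camera via T_tr and compared to thermal-image detections."""
    pred = project(K_thermal, T_tr, pts_rgb)
    return s_rt * np.sum((pred - obs_thermal) ** 2)

# Hypothetical thermal intrinsics and RGB->thermal extrinsic (5 cm baseline).
K_t = np.array([[400.0, 0, 160], [0, 400.0, 120], [0, 0, 1]])
T_tr = np.eye(4)
T_tr[:3, 3] = [0.05, 0.0, 0.0]

# Three homogeneous points in the RGB camera frame (one column per point).
pts = np.array([[0.1, -0.1, 0.0], [0.0, 0.1, 0.05], [1.0, 1.2, 1.1], [1.0, 1.0, 1.0]])
obs = project(K_t, T_tr, pts)                     # noise-free detections
print(u_repro_rgb_thermal(K_t, T_tr, pts, obs))   # 0.0 for a consistent model
```

The symmetric term U_repro,thermal,rgb swaps the roles of the two cameras; during calibration both terms are evaluated for every frame t and target i and summed into the joint objective.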
5. The thermal imager-RGB camera-IMU spatial joint calibration method according to claim 1, wherein the thermal imager and IMU spatial calibration specifically is:
the transformation from the IMU coordinate system to the world coordinate system is defined as:
Figure FDA0002907745000000032
wherein R_wi(t) and t_wi(t) represent the rotation and translation from the inertial coordinate system to the world coordinate system at time t, respectively;
the error terms of inertial acceleration and angular velocity are:
Figure FDA0002907745000000033
Figure FDA0002907745000000034
Figure FDA0002907745000000035
wherein s_α is the weight coefficient corresponding to the acceleration error; s_ω is the weight coefficient corresponding to the angular velocity error;
the i-th heat-source center
Figure FDA0002907745000000036
projected onto the thermal imager image at time t is represented by the pixel position
Figure FDA0002907745000000037
; U_repro,thermal,imu is given by:
Figure FDA0002907745000000038
wherein s_ti is the weight coefficient of the i-th heat-source center on the thermal imager image at time t; C_t is the thermal imager projection model.
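The projection chain in claim 5 can be illustrated with a short sketch: a world-frame target is carried through the IMU pose (R_wi, t_wi) and then through the IMU-to-thermal extrinsic T_ti before projection. Everything below (function name, intrinsics, poses) is a hypothetical illustration of that chain, not the patent's implementation.

```python
import numpy as np

def thermal_pixel_via_imu(K_t, T_ti, R_wi, t_wi, X_w):
    """Project a world-frame target X_w into the thermal image through the IMU
    pose: world -> IMU frame via (R_wi, t_wi), then IMU -> thermal via T_ti."""
    X_i = R_wi.T @ (X_w - t_wi)             # world point expressed in the IMU frame
    X_t = (T_ti @ np.append(X_i, 1.0))[:3]  # into the thermal imager frame
    uv = K_t @ (X_t / X_t[2])               # pinhole projection
    return uv[:2]

# Hypothetical values: identity IMU attitude, 2 cm IMU->thermal offset.
K_t = np.array([[400.0, 0, 160], [0, 400.0, 120], [0, 0, 1]])
T_ti = np.eye(4)
T_ti[:3, 3] = [0.02, 0.0, 0.0]
R_wi, t_wi = np.eye(3), np.zeros(3)

uv = thermal_pixel_via_imu(K_t, T_ti, R_wi, t_wi, np.array([0.0, 0.0, 2.0]))
# squared residual against a detected heat-source center, as in U_repro,thermal,imu
residual = np.sum((uv - np.array([164.0, 120.0])) ** 2)
print(uv, residual)
```

In the full method the IMU pose at each image timestamp comes from integrating the acceleration and angular-velocity measurements, and s_ti-weighted residuals of this form are summed over all times t and targets i.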
6. The thermal imager-RGB camera-IMU spatial joint calibration method according to claim 1, wherein the RGB camera and IMU spatial calibration specifically is:
the error terms for acceleration and angular velocity are given by:
Figure FDA0002907745000000041
Figure FDA0002907745000000042
Figure FDA0002907745000000043
the i-th heat-source center
Figure FDA0002907745000000044
projected onto the RGB image at time t is represented by the pixel position
Figure FDA0002907745000000045
; U_repro,rgb,imu is given by:
Figure FDA0002907745000000046
wherein s_ti is the weight coefficient of the reprojection error of the i-th heat-source center on the RGB image at time t; C_r is the RGB camera projection model.
7. The thermal imager-RGB camera-IMU space combined calibration system is characterized by being applied to the thermal imager-RGB camera-IMU space combined calibration method of any one of claims 1-6, and comprising a thermal imager image feature positioning module and a thermal imager-RGB camera-IMU combined calibration module;
the thermal imager image feature positioning module is used for positioning the thermal imager image features and searching the target point position of the thermal imager calibration plate;
the thermal imager-RGB camera-IMU joint calibration module is used for thermal imager-RGB camera-IMU joint calibration: it integrates the thermal imager images, the RGB images and the IMU information in a unified framework and jointly estimates the spatial relations among the three coordinate systems of the RGB camera, the IMU and the thermal imager, including thermal imager-RGB camera spatial calibration, thermal imager-IMU spatial calibration and RGB camera-IMU spatial calibration; the parameters to be solved are then computed by minimizing the objective function:
Figure FDA0002907745000000047
wherein T_rt represents the transformation matrix of the thermal imager coordinate system relative to the RGB camera coordinate system; T_ri represents the transformation matrix of the inertial coordinate system relative to the RGB camera coordinate system; T_ti represents the transformation matrix of the inertial coordinate system relative to the thermal imager coordinate system; U_repro,rgb,thermal represents the reprojection error of the thermal imager image heat-source centers obtained from the RGB camera motion solution; U_repro,thermal,rgb represents the reprojection error of the RGB image heat-source centers obtained from the thermal imager motion solution; U_repro,thermal,imu represents the reprojection error of the thermal imager image heat-source centers obtained from the IMU motion solution; U_α,ω,rgb represents the sum of the inertial acceleration and angular velocity error terms of the IMU obtained from the RGB camera motion solution; U_α,ω,thermal represents the sum of the inertial acceleration and angular velocity error terms of the IMU obtained from the thermal imager motion solution;
T_rt, T_ri and T_ti satisfy the following constraint:
T_ti · T_ri^{-1} · T_rt = T_tr · T_rt = I
wherein I is the identity matrix, and (T_ti · T_ri^{-1}) describes the transformation from the RGB camera coordinate system to the thermal imager coordinate system; the constraint T_ti · T_ri^{-1} · T_rt = T_tr · T_rt = I is added to the objective function, and T_rt, T_ri and T_ti are solved jointly, yielding the spatial relations among the thermal imager, RGB camera and IMU coordinate systems.
8. A storage medium storing a program, wherein the program, when executed by a processor, implements the thermal imager-RGB camera-IMU spatial joint calibration method of any one of claims 1-6.
CN202110076591.0A 2021-01-20 2021-01-20 Thermal imager-RGB camera-IMU space combined calibration method, system and storage medium Withdrawn CN112819898A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110076591.0A CN112819898A (en) 2021-01-20 2021-01-20 Thermal imager-RGB camera-IMU space combined calibration method, system and storage medium


Publications (1)

Publication Number Publication Date
CN112819898A true CN112819898A (en) 2021-05-18

Family

ID=75858801


Country Status (1)

Country Link
CN (1) CN112819898A (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108090921A (en) * 2016-11-23 2018-05-29 中国科学院沈阳自动化研究所 Monocular vision and the adaptive indoor orientation method of IMU fusions

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Xin et al.: "热像仪-RGB相机-IMU传感器的空间联合标定方法" (Spatial joint calibration method for thermal imager-RGB camera-IMU sensors), 《仪器仪表学报》 (Chinese Journal of Scientific Instrument), 2020 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114004901A (en) * 2022-01-04 2022-02-01 南昌虚拟现实研究院股份有限公司 Multi-camera calibration method and device, terminal equipment and readable storage medium
CN114004901B (en) * 2022-01-04 2022-03-18 南昌虚拟现实研究院股份有限公司 Multi-camera calibration method and device, terminal equipment and readable storage medium
CN116736227A (en) * 2023-08-15 2023-09-12 无锡聚诚智能科技有限公司 Method for jointly calibrating sound source position by microphone array and camera
CN116736227B (en) * 2023-08-15 2023-10-27 无锡聚诚智能科技有限公司 Method for jointly calibrating sound source position by microphone array and camera

Similar Documents

Publication Publication Date Title
CN110009681B (en) IMU (inertial measurement unit) assistance-based monocular vision odometer pose processing method
JP7054803B2 (en) Camera parameter set calculation device, camera parameter set calculation method and program
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
JP6830140B2 (en) Motion vector field determination method, motion vector field determination device, equipment, computer readable storage medium and vehicle
US20190371003A1 (en) Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
CN111210478B (en) Common-view-free multi-camera system external parameter calibration method, medium and system
US20180075618A1 (en) Measurement system and method for measuring multi-dimensions
WO2020140431A1 (en) Camera pose determination method and apparatus, electronic device and storage medium
CN111476106B (en) Monocular camera-based straight road relative gradient real-time prediction method, system and device
US20180075614A1 (en) Method of Depth Estimation Using a Camera and Inertial Sensor
KR20220025028A (en) Method and device for building beacon map based on visual beacon
CN112819898A (en) Thermal imager-RGB camera-IMU space combined calibration method, system and storage medium
CN110675455B (en) Natural scene-based self-calibration method and system for vehicle body looking-around camera
JP2018190402A (en) Camera parameter set calculation device, camera parameter set calculation method, and program
CN112734765A (en) Mobile robot positioning method, system and medium based on example segmentation and multi-sensor fusion
CN113888639B (en) Visual odometer positioning method and system based on event camera and depth camera
CN116205947A (en) Binocular-inertial fusion pose estimation method based on camera motion state, electronic equipment and storage medium
CN111899276A (en) SLAM method and system based on binocular event camera
CN112631431B (en) Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium
CN111798521A (en) Calibration method, calibration device, storage medium and electronic equipment
CN114897988B (en) Multi-camera positioning method, device and equipment in hinge type vehicle
CN114663463A (en) Method, system, device, electronic device and storage medium for measuring joint mobility
Qian et al. Survey on fish-eye cameras and their applications in intelligent vehicles
CN113701750A (en) Fusion positioning system of underground multi-sensor
CN116442707B (en) System and method for estimating vertical and pitching motion information of vehicle body based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210518