CN116823968A - Combined calibration method and device for visible light and infrared camera - Google Patents


Publication number
CN116823968A
Authority
CN
China
Prior art keywords: camera, visible light, marker, infrared camera, calibration
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Application number: CN202310870111.7A
Other languages: Chinese (zh)
Inventors: 龙知洲, 马天, 李伟萍, 唐荣富
Current Assignee (the listed assignees may be inaccurate): Weihai Zhonghe Electromechanical Technology Co ltd; Institute of Systems Engineering of PLA Academy of Military Sciences
Original Assignee: Weihai Zhonghe Electromechanical Technology Co ltd; Institute of Systems Engineering of PLA Academy of Military Sciences
Application filed by Weihai Zhonghe Electromechanical Technology Co ltd and Institute of Systems Engineering of PLA Academy of Military Sciences
Priority to CN202310870111.7A
Publication of CN116823968A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85: Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method and a device for jointly calibrating a visible light camera and an infrared camera. The method comprises the following steps: installing the visible light cameras and the infrared camera, the two visible light cameras serving as a left camera and a right camera; capturing patterns of an ordinary calibration plate with the visible light cameras and performing stereo calibration of the visible light cameras; placing a marker in the viewing-angle space common to the visible light cameras and the infrared camera and acquiring data of the marker to obtain N images; performing three-dimensional resolving on the position information of the marker in the N images with the visible light cameras to obtain the three-dimensional coordinate information of the marker; and processing the three-dimensional coordinate information of the marker together with the corresponding imaging pixel points of the infrared camera to obtain the visible light and infrared camera combined calibration information. The method is simple to operate: once the visible light cameras are calibrated, the infrared camera calibration can be completed in two steps. It is also inexpensive, since no special calibration plate needs to be custom-made for calibrating the infrared camera.

Description

Combined calibration method and device for visible light and infrared camera
Technical Field
The invention relates to the technical fields of computer vision, mixed reality, automatic driving, artificial intelligence and security protection, in particular to a method and a device for jointly calibrating a visible light camera and an infrared camera.
Background
Mixed reality technology (including augmented reality and virtual reality) is a novel human-computer interaction technology that has developed rapidly and attracted wide attention in recent years. It fuses the virtual world and the real world to create a visual environment in which virtual and real objects coexist and interact in real time. Mixed reality essentially provides a brand-new computer interaction interface and means, can be widely applied in military settings (such as training, combat and maintenance), industry (such as remote expert support, inspection and information visualization), entertainment and education, and may well become the next-generation computing platform after the mobile phone.
In mixed reality systems, a visible light camera works only under sufficient illumination, whereas an infrared thermal imaging camera images by collecting the infrared radiation of objects. In principle, any object above absolute zero radiates infrared rays, so an infrared thermal camera can work around the clock; however, an infrared image carries only gray values, and its imaging detail is far poorer than that of visible light. Combining a visible light camera with an infrared camera is therefore a common scheme in many fields.
In some scenarios, the positional relationship between the infrared camera and the visible light camera, as well as the parameters of each camera, must be known; this process is called camera calibration. A visible light camera can be calibrated directly with a checkerboard, a circle-grid board or a similar target, but an infrared camera requires a specially manufactured calibration board that must be heated or cooled during use, which makes infrared camera calibration very expensive.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method and a device for jointly calibrating a visible light camera and an infrared camera.
In order to solve the technical problems, a first aspect of the embodiment of the invention discloses a method for jointly calibrating a visible light camera and an infrared camera, which comprises the following steps:
s1, installing a visible light camera and an infrared camera, wherein the number of the visible light cameras is two, namely a left camera and a right camera;
s2, collecting patterns of a common calibration plate by using the visible light camera, and performing three-dimensional calibration of the visible light camera;
s3, placing a marker in a common visual angle space of the visible light camera and the infrared camera, and collecting data of the marker to obtain N images, wherein N is more than or equal to 6;
s4, carrying out three-dimensional calculation on the position information of the marker in the N images by using the visible light camera to obtain three-dimensional coordinate information of the marker;
s5, processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain visible light and infrared camera combined calibration information;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
As an alternative implementation manner, in the first aspect of the embodiment of the present invention, the visible light camera and the infrared camera need to satisfy the following conditions:
the relative positions of the visible light cameras and the infrared camera are fixed, and the cameras must share a common viewing angle (an overlapping field of view);
the visible light cameras and the infrared camera cannot move relative to one another, but may be moved as a whole.
In a first aspect of the embodiment of the present invention, the capturing the pattern of the common calibration board with the visible light camera and performing the stereoscopic calibration of the visible light camera includes:
s21, collecting patterns of a checkerboard or other calibration plates by using the visible light camera;
s22, carrying out three-dimensional calibration on the visible light camera by using a Zhang calibration method or other camera calibration methods.
In a first aspect of the embodiment of the present invention, the placing a marker in a viewing angle space common to the visible light camera and the infrared camera, and collecting position data of the marker, to obtain N images, includes:
s31, a marker is placed in a common visual angle space of the visible light camera and the infrared camera, so that the visible light camera and the infrared camera can capture the marker;
s32, moving the position of the marker or moving the whole formed by the cameras, and collecting data of the marker in a common visual angle space of the visible light camera and the infrared camera to obtain N images.
In a first aspect of the embodiment of the present invention, the three-dimensional resolving, by using the visible light camera, the position information of the marker in the N images to obtain three-dimensional coordinate information of the marker includes:
s41, calculating the pixel coordinates of a special point on the marker in the N images;
s42, three-dimensional calculation is carried out on pixel coordinates of a special point on the marker in the N images by utilizing parameters of the visible light camera, and three-dimensional coordinate information of the marker is obtained.
In a first aspect of the embodiment of the present invention, the three-dimensional resolving of the pixel coordinates of a specific point on the marker in the N images by using the parameters of the visible light camera to obtain three-dimensional coordinate information of the marker includes:
s421, obtaining the coordinate p of a special point on the marker in the N images in the left camera l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]];
S422, using a three-dimensional coordinate calculation model, the coordinates p in the left camera are calculated l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]]Processing to obtain three-dimensional coordinate information of the marker;
the three-dimensional coordinate calculation model is as follows:
the three-dimensional coordinate information of the marker is p 3d =[X,Y,Z],Is a projection matrix of the actual coordinate system to the camera coordinate system.
In a first aspect of the embodiment of the present invention, the processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain the combined calibration information of the visible light and the infrared camera includes:
S51, processing the three-dimensional coordinate information of the marker by using a projection matrix calculation model to obtain a projection matrix;
the projection matrix calculation model is z · p_c = G · [X, Y, Z, 1]^T;
wherein the three-dimensional coordinate information of the marker is p_3d = [X, Y, Z], p_c is the homogeneous representation of the camera coordinates, i.e. [u, v, 1], · represents the product, and G is the projection matrix from the actual coordinate system to the camera coordinate system;
S52, performing orthogonal-triangular (RQ) decomposition on the projection matrix from the actual coordinate system to the camera coordinate system to obtain the internal parameters of the infrared camera and its positional relation R_ir and T_ir with respect to the visible light cameras;
the internal parameters of the infrared camera together with the positional relation R_ir and T_ir constitute the visible light and infrared camera combined calibration information.
The second aspect of the embodiment of the invention discloses a visible light and infrared camera combined calibration device, which comprises:
the camera mounting module is used for mounting a visible light camera and an infrared camera, wherein the number of the visible light cameras is two, namely a left camera and a right camera;
the calibration module is used for acquiring patterns of a common calibration plate by using the visible light camera and carrying out three-dimensional calibration of the visible light camera;
the data acquisition module is used for placing a marker in a common visual angle space of the visible light camera and the infrared camera and acquiring data of the marker to obtain N images, wherein N is more than or equal to 6;
the three-dimensional coordinate calculation module is used for carrying out three-dimensional calculation on the position information of the marker in the N images by utilizing the visible light camera to obtain three-dimensional coordinate information of the marker;
the combined calibration module is used for processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain visible light and infrared camera combined calibration information;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
As an alternative implementation manner, in the second aspect of the embodiment of the present invention, the visible light camera and the infrared camera need to satisfy the following conditions:
the relative positions of the visible light cameras and the infrared camera are fixed, and the cameras must share a common viewing angle (an overlapping field of view);
the visible light cameras and the infrared camera cannot move relative to one another, but may be moved as a whole.
In a second aspect of the embodiment of the present invention, the capturing the pattern of the common calibration board with the visible light camera and performing the stereoscopic calibration of the visible light camera includes:
s21, collecting patterns of a checkerboard or other calibration plates by using the visible light camera;
s22, carrying out three-dimensional calibration on the visible light camera by using a Zhang calibration method or other camera calibration methods.
In a second aspect of the embodiment of the present invention, the placing a marker in a viewing angle space common to the visible light camera and the infrared camera, and collecting position data of the marker, to obtain N images, includes:
s31, a marker is placed in a common visual angle space of the visible light camera and the infrared camera, so that the visible light camera and the infrared camera can capture the marker;
s32, moving the position of the marker or moving the whole formed by the cameras, and collecting data of the marker in a common visual angle space of the visible light camera and the infrared camera to obtain N images.
In a second aspect of the embodiment of the present invention, the three-dimensional resolving, by using the visible light camera, the position information of the marker in the N images to obtain three-dimensional coordinate information of the marker includes:
s41, calculating the pixel coordinates of a special point on the marker in the N images;
s42, three-dimensional calculation is carried out on pixel coordinates of a special point on the marker in the N images by utilizing parameters of the visible light camera, and three-dimensional coordinate information of the marker is obtained.
In a second aspect of the embodiment of the present invention, the three-dimensional resolving of the pixel coordinates of a specific point on the marker in the N images by using the parameters of the visible light camera to obtain three-dimensional coordinate information of the marker includes:
s421, obtaining the coordinate p of a special point on the marker in the N images in the left camera l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]];
S422, using a three-dimensional coordinate calculation model, the coordinates p in the left camera are calculated l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]]Processing to obtain three-dimensional coordinate information of the marker;
the three-dimensional coordinate calculation model is as follows:
the three-dimensional coordinate information of the marker is p 3d =[X,Y,Z],Is a projection matrix of the actual coordinate system to the camera coordinate system.
In a second aspect of the embodiment of the present invention, the processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain the combined calibration information of the visible light and the infrared camera includes:
S51, processing the three-dimensional coordinate information of the marker by using a projection matrix calculation model to obtain a projection matrix;
the projection matrix calculation model is z · p_c = G · [X, Y, Z, 1]^T;
wherein the three-dimensional coordinate information of the marker is p_3d = [X, Y, Z], p_c is the homogeneous representation of the camera coordinates, i.e. [u, v, 1], · represents the product, and G is the projection matrix from the actual coordinate system to the camera coordinate system;
S52, performing orthogonal-triangular (RQ) decomposition on the projection matrix from the actual coordinate system to the camera coordinate system to obtain the internal parameters of the infrared camera and its positional relation R_ir and T_ir with respect to the visible light cameras;
the internal parameters of the infrared camera together with the positional relation R_ir and T_ir constitute the visible light and infrared camera combined calibration information.
The third aspect of the invention discloses another visible light and infrared camera combined calibration device, which comprises:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program codes stored in the memory to execute part or all of the steps in the method for calibrating the combination of the visible light and the infrared camera disclosed in the first aspect of the embodiment of the invention.
In a fourth aspect, the invention discloses a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions that, when invoked, perform part or all of the steps of the visible light and infrared camera combined calibration method disclosed in the first aspect of the embodiment of the invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
(1) In order to solve the problem of jointly calibrating an infrared camera and a visible light camera, the invention realizes a method for the combined calibration of visible light and infrared cameras; the method is simple to operate, and after the visible light cameras are calibrated, the infrared camera calibration can be completed in two steps.
(2) The method has low cost: no special calibration plate needs to be custom-made for calibrating the infrared camera, which saves cost.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a method for calibrating a combination of a visible light camera and an infrared camera according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the positional relationship of the visible and infrared cameras disclosed in an embodiment of the present invention;
FIG. 3 is an image of a marker captured by a visible light camera and an infrared camera as disclosed in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a combined calibration device for visible and infrared cameras according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another combined calibration device for visible and infrared cameras according to an embodiment of the present invention.
Detailed Description
In order to make the present invention better understood by those skilled in the art, the following description will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, article, or device that comprises a list of steps or elements is not limited to the list of steps or elements but may, in the alternative, include other steps or elements not expressly listed or inherent to such process, method, article, or device.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The invention discloses a method and a device for jointly calibrating visible light and infrared cameras. The visible light cameras and the infrared camera are installed, the two visible light cameras serving as a left camera and a right camera; patterns of an ordinary calibration plate are captured with the visible light cameras, and stereo calibration of the visible light cameras is performed; a marker is placed in the viewing-angle space common to the visible light cameras and the infrared camera, and data of the marker are acquired to obtain N images; three-dimensional resolving is performed on the position information of the marker in the N images by using the visible light cameras, yielding the three-dimensional coordinate information of the marker; and the three-dimensional coordinate information of the marker and the corresponding imaging pixel points of the infrared camera are processed to obtain the visible light and infrared camera combined calibration information. The method is simple to operate: after the visible light cameras are calibrated, the infrared camera calibration can be completed in two steps. It is also inexpensive, since no special calibration plate needs to be custom-made for calibrating the infrared camera. The following will describe this in detail.
Example 1
Referring to fig. 1, fig. 1 is a flow chart of a method for calibrating a combination of a visible light camera and an infrared camera according to an embodiment of the invention. The method for calibrating the visible light and infrared camera in a combined manner described in fig. 1 is applied to the fields of computer vision, mixed reality (including augmented reality and virtual reality), automatic driving, artificial intelligence and security, and the embodiment of the invention is not limited. As shown in fig. 1, the method for calibrating the visible light and infrared camera in a combined way can comprise the following operations:
s1, installing a visible light camera and an infrared camera, wherein the number of the visible light cameras is two, namely a left camera and a right camera;
s2, collecting patterns of a common calibration plate by using the visible light camera, and performing three-dimensional calibration of the visible light camera;
s3, placing a marker in a common visual angle space of the visible light camera and the infrared camera, and collecting data of the marker to obtain N images, wherein N is more than or equal to 6;
s4, carrying out three-dimensional calculation on the position information of the marker in the N images by using the visible light camera to obtain three-dimensional coordinate information of the marker;
s5, processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain visible light and infrared camera combined calibration information;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
Optionally, the visible light camera and the infrared camera need to satisfy the following conditions:
the relative positions of the visible light cameras and the infrared camera are fixed, and the cameras must share a common viewing angle (an overlapping field of view);
the visible light cameras and the infrared camera cannot move relative to one another, but may be moved as a whole.
Optionally, the collecting the pattern of the common calibration board by using the visible light camera and performing the stereoscopic calibration of the visible light camera includes:
s21, collecting patterns of a checkerboard or other calibration plates by using the visible light camera;
s22, carrying out three-dimensional calibration on the visible light camera by using a Zhang calibration method or other camera calibration methods.
Optionally, the placing a marker in a viewing angle space common to the visible light camera and the infrared camera, and collecting position data of the marker, to obtain N images, includes:
s31, a marker is placed in a common visual angle space of the visible light camera and the infrared camera, so that the visible light camera and the infrared camera can capture the marker;
s32, moving the position of the marker or moving the whole formed by the cameras, and collecting data of the marker in a common visual angle space of the visible light camera and the infrared camera to obtain N images.
Optionally, the three-dimensional resolving of the position information of the marker in the N images by using the visible light camera to obtain three-dimensional coordinate information of the marker includes:
s41, calculating the pixel coordinates of a special point on the marker in the N images;
s42, three-dimensional calculation is carried out on pixel coordinates of a special point on the marker in the N images by utilizing parameters of the visible light camera, and three-dimensional coordinate information of the marker is obtained.
Optionally, the three-dimensional resolving of the pixel coordinates of a specific point on the marker in the N images by using the parameters of the visible light camera to obtain three-dimensional coordinate information of the marker includes:
s421, obtaining the coordinate p of a special point on the marker in the N images in the left camera l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]];
S422, using a three-dimensional coordinate calculation model, the coordinates p in the left camera are calculated l =[[u 11 ,v 11 ],[u 12 ,v 12 ]…,[u 1N ,v 1N ]And the coordinate p in the right camera r =[[u 21 ,v 21 ],[u 22 ,v 22 ],…,[u 2N ,v 2N ]]Processing to obtain three-dimensional coordinate information of the marker;
the three-dimensional coordinate calculation model is as follows:
the three-dimensional coordinate information of the marker is p 3d =[X,Y,Z],Is a projection matrix of the actual coordinate system to the camera coordinate system.
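The left/right projection relations can be solved numerically by linear (DLT) triangulation. The numpy sketch below is not from the patent; the rig (identical intrinsics for both cameras, a 100 mm baseline) and the marker point are illustrative assumptions used only to show that the linear system recovers the spatial point.

```python
import numpy as np

# Linear (DLT) triangulation from the relation z*[u,v,1]^T = G*[X,Y,Z,1]^T.
# G_l, G_r and the marker point below are illustrative, not patent values.

def triangulate(G_l, G_r, uv_l, uv_r):
    """Solve for p_3d = [X, Y, Z] from one left/right pixel pair."""
    u1, v1 = uv_l
    u2, v2 = uv_r
    # Each view contributes two homogeneous linear equations in [X, Y, Z, 1].
    A = np.stack([
        u1 * G_l[2] - G_l[0],
        v1 * G_l[2] - G_l[1],
        u2 * G_r[2] - G_r[0],
        v2 * G_r[2] - G_r[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                  # null vector of A, up to scale
    return X[:3] / X[3]

# Illustrative rig: identical intrinsics, right camera shifted 100 mm in x.
K = np.array([[1380.0, 0.0, 502.0], [0.0, 1385.0, 424.0], [0.0, 0.0, 1.0]])
G_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
G_r = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

def project(G, p):
    """Project a 3-D point with projection matrix G to pixel coordinates."""
    w = G @ np.append(p, 1.0)
    return w[:2] / w[2]

p_true = np.array([120.0, -40.0, 1500.0])
p_est = triangulate(G_l, G_r, project(G_l, p_true), project(G_r, p_true))
print(np.round(p_est, 3))       # recovers [120, -40, 1500] up to numeric noise
```

With pixel measurements from all N marker images, the same solve is repeated per image to collect the N spatial points used in step S5.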
Optionally, the processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain the combined calibration information of the visible light and the infrared camera includes:
S51, processing the three-dimensional coordinate information of the marker by using a projection matrix calculation model to obtain a projection matrix;
the projection matrix calculation model is z · p_c = G · [X, Y, Z, 1]^T;
wherein the three-dimensional coordinate information of the marker is p_3d = [X, Y, Z], p_c is the homogeneous representation of the camera coordinates, i.e. [u, v, 1], · represents the product, and G is the projection matrix from the actual coordinate system to the camera coordinate system;
S52, performing orthogonal-triangular (RQ) decomposition on the projection matrix from the actual coordinate system to the camera coordinate system to obtain the internal parameters of the infrared camera and its positional relation R_ir and T_ir with respect to the visible light cameras;
the internal parameters of the infrared camera together with the positional relation R_ir and T_ir constitute the visible light and infrared camera combined calibration information.
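Steps S51 and S52 can be sketched as a DLT estimate of the infrared camera's projection matrix followed by an orthogonal-triangular (RQ) decomposition. The ground-truth K_ir, R_ir and T_ir below are illustrative assumptions, not values from the patent; they only synthesise correspondences so the sketch can check that the decomposition recovers them.

```python
import numpy as np

def estimate_projection(p3d, uv):
    """DLT: solve z * [u, v, 1]^T = G * [X, Y, Z, 1]^T for the 3x4 matrix G."""
    rows = []
    for (X, Y, Z), (u, v) in zip(p3d, uv):
        P = [X, Y, Z, 1.0]
        rows.append(np.r_[P, np.zeros(4), [-u * c for c in P]])
        rows.append(np.r_[np.zeros(4), P, [-v * c for c in P]])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)          # null vector, up to scale

def rq(M):
    """RQ decomposition of a 3x3 matrix via the flipped-QR trick."""
    Q, R = np.linalg.qr(np.flipud(M).T)
    R = np.flipud(R.T)[:, ::-1]
    Q = Q.T[::-1, :]
    S = np.diag(np.sign(np.diag(R)))     # force a positive diagonal (intrinsics)
    return R @ S, S @ Q

# Illustrative ground-truth infrared camera used to synthesise the data.
th = 0.1
K_ir = np.array([[800.0, 0.0, 320.0], [0.0, 805.0, 240.0], [0.0, 0.0, 1.0]])
R_ir = np.array([[np.cos(th), -np.sin(th), 0.0],
                 [np.sin(th), np.cos(th), 0.0],
                 [0.0, 0.0, 1.0]])
T_ir = np.array([[50.0], [10.0], [5.0]])
G_true = K_ir @ np.hstack([R_ir, T_ir])

# N = 8 marker positions (>= 6 as required) and their infrared pixels.
rng = np.random.default_rng(1)
p3d = rng.uniform([-200, -200, 800], [200, 200, 2000], size=(8, 3))
uvw = (G_true @ np.vstack([p3d.T, np.ones(8)])).T
uv = uvw[:, :2] / uvw[:, 2:]

G = estimate_projection(p3d, uv)
G *= np.sign(G[2, 3]) / np.linalg.norm(G[2, :3])   # fix DLT scale and sign
K_est, R_est = rq(G[:, :3])                        # S52: intrinsics and R_ir
T_est = np.linalg.inv(K_est) @ G[:, 3]             # S52: T_ir
```

The scale fix uses the fact that the third row of K·R has unit norm for a calibrated camera; the sign of G[2, 3] assumes the scene lies in front of the camera (positive depth).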
Optionally, after calibration is completed, three-dimensional correction is performed, and the method comprises the following steps:
I L and I R Imaging planes of left and right cameras respectively, x 1 And x 2 Is the space point p at I L And I R Projection points on the plane. X is x 1 、x 2 P and two camera optical centers o 1 ,o 2 All lying in the same plane (polar plane pi). The projections of pi plane on two imaging planes are respectively l 1 And l 2 Optical center connecting line o of two cameras 1 o 2 Referred to as the baseline, the projection point e at which the baseline intersects the imaging plane 1 And e 2 Called pole, then called l 1 Is I L Corresponds to x 2 Polar lines of (l) 2 Is I R Corresponds to x 1 The polar lines of the imaging plane all intersect at the poles thereof to form a epipolar constraint relationship.
Assume that the rotation matrix for horizontally aligning the left pole line is R rect Principal point (C) x ,C y ) Is the origin of the left view, then the origin points to the left pole e 1 Is the direction of the translation vector between the centers of the two camera projections:
thereby calculating the vector e 1 Setting a vector e orthogonal to the main optical axis direction 2 By combining the vectors e 1 Cross product is carried out on the main optical axis direction vector, and normalized representation is carried out, so that the following steps are obtained:
similarly, a vector e is known to exist 3 Orthogonal to vector e 1 ,e 2 Push e 3 =e 1 ×e 2 . Through the operation, the left polar line is aligned with the horizontal direction, and the horizontal alignment matrix R rect The method comprises the following steps:
Based on the horizontal alignment matrix R_rect, the left-eye image can be rotated about its projection centre, moving the left epipole to infinity and the epipolar lines into a state horizontal and parallel to the baseline. To further bring the two camera imaging planes into row alignment, the left and right images are rotated by

R_l = R_rect, R_r = R_rect · R^T

where R is the rotation between the two cameras obtained from the stereo calibration.
After correction, the following relation holds between the intrinsic matrices of the two cameras and their projection matrices:

P_l = M_l [I | 0], P_r = M_r [I | t], with M = [[f_x, α, c_x], [0, f_y, c_y], [0, 0, 1]] and t = [T_x, 0, 0]^T

In the above formula, α_l and α_r are the pixel skew (distortion) ratios of the two cameras; both are set to 0 to simplify the calculation flow. The coordinates of a space point are converted to pixel coordinates in the imaging plane through the projection matrix:

P [X, Y, Z, 1]^T = [x, y, w]^T, with pixel coordinates (x/w, y/w)
Conversely, the reprojection matrix Q maps a two-dimensional pixel together with its disparity back into the three-dimensional coordinate system:

Q [x, y, d, 1]^T = [X, Y, Z, W]^T

Q = [[1, 0, 0, -c_x],
[0, 1, 0, -c_y],
[0, 0, 0, f],
[0, 0, -1/T_x, (c_x - c_x')/T_x]]

Here d denotes the disparity, and the three-dimensional coordinates are (X/W, Y/W, Z/W), where c_x' is the principal point of the right image; when the correction is exact, c_x' = c_x. Expanding after three-dimensional correction gives:

X/W = (x - c_x) T_x / w', Y/W = (y - c_y) T_x / w', Z/W = f T_x / w', with w' = -d + c_x - c_x'
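The disparity-to-depth mapping via Q can be sketched as follows. The layout and the sign convention for T_x (the signed baseline, negative as in OpenCV's convention) are assumptions of this illustration; `make_Q` and `reproject` are hypothetical helper names.

```python
import numpy as np

def make_Q(f, cx, cy, Tx, cx_r=None):
    """Reprojection matrix Q mapping (x, y, d, 1) to homogeneous 3D.
    f: focal length, (cx, cy): left principal point, Tx: signed baseline
    (negative, OpenCV-style), cx_r: right principal point (cx when exact)."""
    if cx_r is None:
        cx_r = cx
    return np.array([
        [1.0, 0.0, 0.0, -cx],
        [0.0, 1.0, 0.0, -cy],
        [0.0, 0.0, 0.0, f],
        [0.0, 0.0, -1.0 / Tx, (cx - cx_r) / Tx],
    ])

def reproject(Q, x, y, d):
    """Map one pixel (x, y) with disparity d to 3D coordinates (X/W, Y/W, Z/W)."""
    X, Y, Z, W = Q @ np.array([x, y, d, 1.0])
    return X / W, Y / W, Z / W
```

With f = 900, baseline 0.1 and disparity 9, the recovered depth is f·|T_x|/d = 10, the familiar stereo depth formula.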
example two
(1) The device shown in fig. 2 was constructed, and the visible light cameras were calibrated using the Zhang camera calibration method to obtain the parameters K1, K2, R, T. Fig. 3 shows images of the marker captured by the visible light camera and the infrared camera of the present invention.
In this embodiment, the calibration yields:
K1=[[1.38241129e+03,0.00000000e+00,5.02113539e+02],[0.00000000e+00,1.38765347e+03,4.23706762e+02],[0.00000000e+00,0.00000000e+00,1.00000000e+00]]
K2=[[1.39748811e+03,0.00000000e+00,4.51084764e+02],[0.00000000e+00,1.40174003e+03,3.95981758e+02],[0.00000000e+00,0.00000000e+00,1.00000000e+00]]
[R;T]=[[1.00046103e+00,3.35276056e-05,-1.19961898e-02,2.42697871e+02],[-1.71027399e-03,1.00011587e+00,-2.06932312e-03,5.14891559e+00],[-1.08667825e-02,2.04476690e-02,1.12054883e+00,2.30785148e+01]]
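The pinhole relation behind parameters of this form can be checked with a short projection sketch. The numeric values below are illustrative roundings, not the embodiment's exact matrices, and `project` is a hypothetical helper name.

```python
import numpy as np

def project(K, RT, p3d):
    """Project a 3D point (reference-camera coordinates) to pixel
    coordinates with intrinsics K and extrinsics RT = [R | T] (3x4)."""
    p = RT @ np.append(p3d, 1.0)   # transform into the camera frame
    p = K @ p                      # apply the intrinsic matrix
    return p[:2] / p[2]            # perspective division

# Illustrative values only (rounded, not the embodiment's exact matrices):
K = np.array([[1382.4, 0.0, 502.1], [0.0, 1387.7, 423.7], [0.0, 0.0, 1.0]])
RT_left = np.hstack([np.eye(3), np.zeros((3, 1))])   # left camera as reference
uv = project(K, RT_left, [0.0, 0.0, 7000.0])         # on-axis point at Z = 7000
```

A point on the optical axis projects to the principal point (502.1, 423.7), a quick sanity check for any calibrated K.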
(2) Marker images are acquired with the cameras at different positions, and the pixel coordinates of the marker points in the images are calculated.
In this embodiment, six images of the marker at different positions are acquired, and the coordinates of the marker points in the left camera image are as follows:
p_l=[[120,120],[125,703],[540,330],[865,122],[860,684]]
The coordinates of the marker points in the right camera image are as follows:
p_r=[[135,121],[142,644],[508,310],[798,120],[791,627]]
The coordinates of the marker points in the infrared camera image are as follows:
p_ir=[[118,88],[123,429],[364,211],[556,87],[551,418]]
(3) The 3D coordinates of the marker points are calculated using the calibrated visible light cameras.
The three-dimensional coordinates of each marker point, calculated by the least squares method, are as follows:
p_3d=[[-1934.87,-1532.04,7000],[-1909.55,1408.89,7000],[219.24,-540.23,8000],[2362.52,-1956.80,9000],[2329.97,1688.20,9000]]
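The least-squares triangulation of step (3) can be sketched with the standard DLT formulation (an SVD-based least-squares solve); `triangulate` is a hypothetical helper name, not the embodiment's exact code.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Least-squares (DLT) triangulation of one marker point from its
    pixel coordinates uv1, uv2 in the left and right images, given the
    3x4 projection matrices P1, P2 of the two calibrated cameras."""
    A = np.array([
        uv1[0] * P1[2] - P1[0],    # u1 * (row 3 of P1) - (row 1 of P1)
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)    # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]
```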
(4) The projection matrix g_ir is calculated using the computed 3D coordinates p_3d and the pixel positions p_ir of the infrared-image marker points.
The projection matrix g_ir from the 3D coordinates to the infrared camera is calculated by the least squares method:
g_ir=[[9.111520990e+02,6.96287471e+00,3.68921595e+02,1.2393539e+05],
[-4.46993615e+00,9.20438368e+02,2.97688771e+02,1.08806726e+04],
[-1.08667825e-02,2.04476690e-02,1.12054883e+00,2.30785148e+01]]
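The least-squares estimate of g_ir in step (4) can be sketched with the direct linear transform (DLT); `estimate_projection` is a hypothetical helper name used only for this illustration.

```python
import numpy as np

def estimate_projection(p3d, p2d):
    """Direct linear transform (DLT): least-squares estimate of the 3x4
    projection matrix G mapping 3D marker points to image pixels.
    The scale of G is arbitrary; at least six well-spread points are needed."""
    A = []
    for (X, Y, Z), (u, v) in zip(p3d, p2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector with the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

Because the scale of G is arbitrary, a common convention is to normalise the result, e.g. by its bottom-right entry, before comparing it with other matrices.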
(5) The infrared camera internal parameter matrix K_ir is calculated by QR decomposition of G_ir, i.e. by solving G_ir = K_ir [R_ir | T_ir]:
K_ir=[[914.41460397,0.,279.02233936],
[0.,914.86561956,267.35285008],
[0.,0.,1.]]
Example Three
Referring to fig. 4, fig. 4 is a schematic structural diagram of a combined visible light and infrared camera calibration device according to an embodiment of the invention. The device described in fig. 4 is applicable to the fields of computer vision, mixed reality (including augmented reality and virtual reality), automatic driving, artificial intelligence and security, and the embodiments of the invention are not limited thereto. As shown in fig. 4, the combined visible light and infrared camera calibration device may include the following modules:
s301, a camera mounting module, for mounting a visible light camera and an infrared camera, the visible light cameras numbering two, namely a left camera and a right camera;
s302, a calibration module, for acquiring patterns of a common calibration plate with the visible light cameras and performing stereoscopic calibration of the visible light cameras;
s303, a data acquisition module, for placing a marker in the common viewing-angle space of the visible light cameras and the infrared camera and acquiring data of the marker to obtain N images, wherein N is greater than or equal to 6;
s304, a three-dimensional coordinate calculation module, for performing three-dimensional calculation on the position information of the marker in the N images using the visible light cameras to obtain the three-dimensional coordinate information of the marker;
s305, a joint calibration module, for processing the three-dimensional coordinate information of the marker and the imaging pixel points of the infrared camera to obtain the combined calibration information of the visible light and infrared cameras;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
Example Four
Referring to fig. 5, fig. 5 is a schematic structural diagram of another combined visible light and infrared camera calibration device according to an embodiment of the invention. The device described in fig. 5 is applicable to the fields of computer vision, mixed reality (including augmented reality and virtual reality), automatic driving, artificial intelligence and security, and the embodiments of the invention are not limited thereto. As shown in fig. 5, the combined visible light and infrared camera calibration device may include:
a memory 401 storing executable program codes;
a processor 402 coupled with the memory 401;
the processor 402 invokes executable program codes stored in the memory 401 for performing the steps in the combined calibration method of visible light and infrared camera described in the first and second embodiments.
Example Five
The embodiment of the invention discloses a computer readable storage medium which stores a computer program for electronic data exchange, wherein the computer program enables a computer to execute the steps in the visible light and infrared camera combined calibration method described in the first embodiment and the second embodiment.
The apparatus embodiments described above are merely illustrative: modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without undue burden.
From the above detailed description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course by means of hardware. Based on such understanding, the foregoing technical solutions may be embodied essentially or in part in the form of a software product that may be stored in a computer-readable storage medium including Read-Only Memory (ROM), random-access Memory (Random Access Memory, RAM), programmable Read-Only Memory (Programmable Read-Only Memory, PROM), erasable programmable Read-Only Memory (Erasable Programmable Read Only Memory, EPROM), one-time programmable Read-Only Memory (OTPROM), electrically erasable programmable Read-Only Memory (EEPROM), compact disc Read-Only Memory (Compact Disc Read-Only Memory, CD-ROM) or other optical disc Memory, magnetic disc Memory, tape Memory, or any other medium that can be used for computer-readable carrying or storing data.
Finally, it should be noted that the method and device for combined calibration of a visible light camera and an infrared camera disclosed in the embodiments are preferred embodiments of the invention, used only to illustrate the technical solution of the invention rather than to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions recorded in the various embodiments can still be modified, or some of their technical features can be replaced equivalently; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. A method for jointly calibrating a visible light and an infrared camera, characterized by comprising the following steps:
s1, installing a visible light camera and an infrared camera, wherein the number of the visible light cameras is two, namely a left camera and a right camera;
s2, collecting patterns of a common calibration plate by using the visible light camera, and performing three-dimensional calibration of the visible light camera;
s3, placing a marker in a common visual angle space of the visible light camera and the infrared camera, and collecting data of the marker to obtain N images, wherein N is more than or equal to 6;
s4, carrying out three-dimensional calculation on the position information of the marker in the N images by using the visible light camera to obtain three-dimensional coordinate information of the marker;
s5, processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain visible light and infrared camera combined calibration information;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
2. The method for combined calibration of a visible light and an infrared camera according to claim 1, wherein the visible light cameras and the infrared camera are required to satisfy the following:
the positions of the visible light cameras and the infrared camera are fixed relative to one another, and the cameras must share a common viewing angle;
the visible light cameras and the infrared camera cannot move relative to one another but can move as a whole.
3. The method for combined calibration of visible light and infrared camera according to claim 1, wherein the step of collecting the pattern of the normal calibration plate by using the visible light camera and performing the stereoscopic calibration of the visible light camera comprises the steps of:
s21, collecting patterns of a checkerboard or other calibration plates by using the visible light camera;
s22, carrying out three-dimensional calibration on the visible light camera by using a Zhang calibration method or other camera calibration methods.
4. The method for calibrating a combination of a visible light camera and an infrared camera according to claim 1, wherein the steps of placing a marker in a viewing angle space common to the visible light camera and the infrared camera, and collecting position data of the marker to obtain N images include:
s31, a marker is placed in a common visual angle space of the visible light camera and the infrared camera, so that the visible light camera and the infrared camera can capture the marker;
s32, moving the position of the marker or moving the whole formed by the cameras, and collecting data of the marker in a common visual angle space of the visible light camera and the infrared camera to obtain N images.
5. The method for calibrating a combination of visible light and infrared camera according to claim 1, wherein the three-dimensional calculation of the position information of the marker in the N images by using the visible light camera to obtain three-dimensional coordinate information of the marker comprises:
s41, calculating the pixel coordinates of a feature point on the marker in the N images;
s42, performing three-dimensional resolving on the pixel coordinates of the feature point on the marker in the N images using the parameters of the visible light cameras to obtain the three-dimensional coordinate information of the marker.
6. The method for combined calibration of a visible light camera and an infrared camera according to claim 5, wherein the three-dimensional resolving of the pixel coordinates of a feature point on the marker in the N images using the parameters of the visible light cameras to obtain three-dimensional coordinate information of the marker comprises:
s421, obtaining the coordinates p_l = [[u_11, v_11], [u_12, v_12], …, [u_1N, v_1N]] of a feature point on the marker in the N images in the left camera, and its coordinates p_r = [[u_21, v_21], [u_22, v_22], …, [u_2N, v_2N]] in the right camera;
s422, processing the coordinates p_l in the left camera and p_r in the right camera with a three-dimensional coordinate calculation model to obtain the three-dimensional coordinate information of the marker;
the three-dimensional coordinate calculation model is:

z · p_c = G · [X, Y, Z, 1]^T

wherein the three-dimensional coordinate information of the marker is p_3d = [X, Y, Z], p_c is the homogeneous pixel coordinate [u, v, 1]^T, and G is the projection matrix from the actual coordinate system to the camera coordinate system.
7. The method for combined calibration of visible light and infrared camera according to claim 1, wherein the processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain the combined calibration information of visible light and infrared camera comprises:
s51, processing three-dimensional coordinate information of the marker by using a projection matrix calculation model to obtain a projection matrix;
the projection matrix calculation model is:

z · p_c = G · [X, Y, Z, 1]^T

wherein the three-dimensional coordinate information of the marker is p_3d = [X, Y, Z], · represents the matrix product, p_c is the homogeneous representation of the camera coordinates, i.e. [u, v, 1]^T, and G is the projection matrix from the actual coordinate system to the camera coordinate system;
s52, performing orthogonal triangular decomposition on the projection matrix from the actual coordinate system to the camera coordinate system to obtain the internal parameters of the infrared camera and its positional relation R_ir and T_ir to the visible light camera;
the internal parameters of the infrared camera, together with the positional relation R_ir and T_ir to the visible light camera, constitute the combined calibration information of the visible light and infrared cameras.
8. A combined calibration device for a visible light camera and an infrared camera, the device comprising:
the camera mounting module is used for mounting a visible light camera and an infrared camera, wherein the number of the visible light cameras is two, namely a left camera and a right camera;
the calibration module is used for acquiring patterns of a common calibration plate by using the visible light camera and carrying out three-dimensional calibration of the visible light camera;
the data acquisition module is used for placing a marker in a common visual angle space of the visible light camera and the infrared camera and acquiring data of the marker to obtain N images, wherein N is more than or equal to 6;
the three-dimensional coordinate calculation module is used for carrying out three-dimensional calculation on the position information of the marker in the N images by utilizing the visible light camera to obtain three-dimensional coordinate information of the marker;
the combined calibration module is used for processing the three-dimensional coordinate information of the marker and the imaging pixel point of the infrared camera to obtain visible light and infrared camera combined calibration information;
the visible light and infrared camera combined calibration information comprises internal parameter information of the infrared camera and position relation information between the infrared camera and the visible light camera.
9. A combined calibration device for a visible light camera and an infrared camera, the device comprising:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to perform the combined calibration method of visible and infrared cameras as claimed in any one of claims 1-7.
10. A computer-readable storage medium storing computer instructions that, when invoked, are operable to perform the combined calibration method of a visible light and infrared camera of any one of claims 1-7.
CN202310870111.7A 2023-07-14 2023-07-14 Combined calibration method and device for visible light and infrared camera Pending CN116823968A (en)

Publication: CN116823968A, 2023-09-29




Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination