CN113870364B - Self-adaptive binocular camera calibration method - Google Patents

Self-adaptive binocular camera calibration method

Info

Publication number
CN113870364B
CN113870364B
Authority
CN
China
Prior art keywords
camera
coordinate system
calibration
image
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111159656.4A
Other languages
Chinese (zh)
Other versions
CN113870364A (en)
Inventor
张晋东
冯天琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Research Institute Of Jilin University
Original Assignee
Chongqing Research Institute Of Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Research Institute Of Jilin University
Priority to CN202111159656.4A
Publication of CN113870364A
Application granted
Publication of CN113870364B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a self-adaptive binocular camera calibration method in which monocular calibration is performed on the left and right cameras respectively and stereo calibration is then performed on the binocular camera. The resulting matrices are: the intrinsic parameter matrices cameraMatrixL and cameraMatrixR of the left and right cameras; the distortion matrices distCoeffeL and distCoeffeR of the left and right cameras; and the rotation matrix R and translation matrix T between the two lenses of the binocular camera. The calibration results are finally saved. Compared with existing camera calibration tools, the method allows the number and size of the calibration pictures to be selected freely, and saves both the individual calibration results of the left and right cameras and the calibration result of the binocular camera when the results are output.

Description

Self-adaptive binocular camera calibration method
Technical Field
The invention relates to the technical field of image processing, in particular to a self-adaptive binocular camera calibration method.
Background
In image measurement and computer vision applications, the camera must be calibrated in order to establish the correspondence between the coordinates of three-dimensional points in space and their image coordinates, and to support stereo imaging with a binocular camera and the matching of image points between the left and right cameras; that is, the parameters needed to convert between pixel coordinates and real-world coordinates must be computed. Normally, binocular camera calibration is carried out by first performing monocular calibration on the left and right cameras respectively, and then calibrating the binocular camera using the data computed from those calibration parameters. However, existing camera calibration tools do not allow the number and size of the calibration pictures to be selected, and the way the calibration results are stored cannot be customized.
Disclosure of Invention
In view of the deficiencies of the prior art, the invention aims to provide a self-adaptive binocular camera calibration method.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
a self-adaptive binocular camera calibration method comprises the following specific processes:
S1, determining the number of corner points of the calibration plate in the transverse and longitudinal directions and the spacing between corner points on the calibration plate; determining the save paths of the photographs taken by the left camera and the right camera; determining the number of pictures used to calibrate the left camera and the right camera, and defining the size of the calibration pictures and the storage path of the calibration result; shooting a set number of checkerboard pictures at different angles with the binocular camera, wherein the checkerboard serving as the calibration plate must appear complete in each picture; and storing the checkerboard pictures obtained by the left camera and the right camera separately;
S2, corner point search:
uniformly adjusting the checkerboard pictures to a set resolution and converting them into grayscale pictures; searching for and determining the corner positions in each checkerboard picture with the Harris algorithm; and refining the detected corner positions with a sub-pixel refinement method to obtain the image coordinates of the corner points;
S3, performing monocular calibration on the left camera and the right camera respectively:
Solving a projection matrix M of corner points converted from a world coordinate system to a pixel coordinate system by using pixel coordinates of images of a calibration plate shot by a left camera and a right camera and world coordinates of the calibration plate; setting a pixel coordinate system of a camera as a coordinate system Mp (x, y) taking the upper left corner of an image as an origin, and setting a world coordinate system Mw (x, y, z) taking the coordinate of a corner point of the upper left corner of a calibration plate as the origin; thus, the process of converting from world coordinate system to pixel coordinate system is expressed as:
Mp=MMw
The conversion of the image from the world coordinate system to the pixel coordinate system comprises three processes: converting the world coordinate system to the camera coordinate system Mc, converting the camera coordinate system to the image coordinate system Mxy, and converting the image coordinate system to the pixel coordinate system Mp;
(1) Conversion from world coordinate system to camera coordinate system:
Setting a rotation matrix as R and a translation matrix as T;
The projection matrix M1 for converting the image from the world coordinate system to the camera coordinate system is composed of R and T, i.e., in homogeneous coordinates,
M1 = [R T; 0 1]
That is, the process of converting the image from the world coordinate system to the camera coordinate system is Mc = M1Mw, where Mc represents the image in the camera coordinate system and Mw represents the image in the world coordinate system;
(2) Converting from the camera coordinate system to the image coordinate system:
Considering the camera coordinate system to lie in the same plane as the image coordinate system, and assuming that the coordinates in the image coordinate system are Mxy(x, y), they can be expressed by similar triangles with respect to the focal length f as
x = f·Xc/Zc, y = f·Yc/Zc
where Xc, Yc, Zc are the coordinates of the corner point in the camera coordinate system;
In matrix form (homogeneous coordinates) this can be written as
Zc·[x, y, 1]^T = [f 0 0 0; 0 f 0 0; 0 0 1 0]·[Xc, Yc, Zc, 1]^T
Assuming that the matrix used for this transformation is M2, the process of converting from the camera coordinate system to the image coordinate system is expressed as:
Mxy=M2Mc
(3) From image coordinate system to pixel coordinate system:
Since the origin of the image coordinate system is at the center point of the image, while the pixel coordinate system takes the upper-left corner of the image as its origin with rightward and downward as the positive directions of the x and y axes, let cx, cy be the coordinates of the origin of the image coordinate system in the pixel coordinate system, let fx, fy be the number of pixels per millimetre along the x and y axes respectively, and let (u, v) be the coordinates of the corner point in the pixel coordinate system; the conversion from the image coordinate system to the pixel coordinate system is then
[u, v, 1]^T = [fx 0 cx; 0 fy cy; 0 0 1]·[x, y, 1]^T
i.e. Mp = M3Mxy;
Thus, the projection matrix of the image from the world coordinate system to the pixel coordinate system is M = M3M2M1, where the unknowns include the rotation matrix R, the translation matrix T, fx, fy, cx, cy and the focal length f; these unknowns can be solved through camera calibration;
(4) Distortion of camera
The camera distortion model is as follows:
x_distorted = x(1 + k1·r² + k2·r⁴ + k3·r⁶), y_distorted = y(1 + k1·r² + k2·r⁴ + k3·r⁶)
i.e., a Taylor expansion between the real coordinates and the ideal coordinates of the camera; x_distorted and y_distorted are the coordinates of the pixel points in the image, x and y are the coordinates under ideal conditions, and r² = x² + y²;
Camera calibration consists in solving for the unknowns R, T, fx, fy, cx, cy and the radial distortion coefficients k, using the pixel coordinates of the corner points obtained by the corner search function together with the world coordinates of the corner points, so as to obtain the intrinsic parameter matrix, extrinsic parameter matrix and distortion matrix of the camera;
S4, calculating the world coordinates of the corner points:
According to Zhang Zhengyou's calibration method, the first corner point at the upper-left corner of the checkerboard is set as the origin of coordinates, the edges of the checkerboard as the positive directions of the x and y axes, and the direction perpendicular to the checkerboard as the z axis; the checkerboard is placed on the plane z = 0 and 1 mm is set as the coordinate unit, so the coordinates of each corner point are expressed as:
(xn1,yn2,zn3)=(n1×N,n2×N,n3×N)
Wherein N is the interval between two angular points;
S5, calibrating a binocular camera:
Setting the extrinsic parameter matrices of the left camera and the right camera obtained from monocular calibration as Rl, Tl and Rr, Tr respectively; the relation matrices R and T between the left camera and the right camera can then be expressed as
R = RrRl^(-1), T = Tr - RTl;
S6, data storage:
After the monocular and binocular calibration of the camera is completed, the obtained calibration results are stored respectively.
Further, in step S2, the resolution is set to 640×400.
Further, in step S6, after the calibration is completed, the results to be saved include: the intrinsic parameter matrices of the left and right cameras, the distortion matrices of the left and right cameras, and the rotation matrix and translation matrix between the left and right cameras; the calibration results of the monocular cameras are therefore stored in text files under the file paths of the checkerboard pictures taken by the left and right cameras respectively, and the calibration result of the binocular camera is stored under the image folder.
The invention has the following beneficial effects: compared with existing camera calibration tools, the method allows the number and size of the calibration pictures to be selected freely, and saves both the individual calibration results of the left and right cameras and the calibration result of the binocular camera when the results are output.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings, and it should be noted that, while the present embodiment provides a detailed implementation and a specific operation process on the premise of the present technical solution, the protection scope of the present invention is not limited to the present embodiment.
The embodiment provides a self-adaptive binocular camera calibration method, which is divided into two processes: monocular calibration of the left and right cameras, and stereo calibration of the binocular camera. First, the result matrices to be obtained over the whole calibration process are determined; they are: the intrinsic parameter matrices cameraMatrixL and cameraMatrixR of the left and right cameras; the distortion matrices distCoeffeL and distCoeffeR of the left and right cameras; and the rotation matrix R and translation matrix T between the left and right cameras. The intrinsic matrices are stored as 3×3 matrices and the distortion matrices as 5×1 vectors. The rotation matrix R between the left and right cameras is a 3×3 matrix and the translation matrix T is a 3×1 vector. To calibrate the binocular camera, two queue containers are needed to store the corner coordinates of the checkerboard pictures taken by the left and right cameras respectively. The specific process is shown in fig. 1.
1. Front-end preparation for camera calibration
The camera to be calibrated in this embodiment may be a binocular camera such as an IR camera. The number of corner points of the calibration plate in the transverse and longitudinal directions and the spacing between corner points on the plate (which must be measured manually) are determined and defined as initial global variables. In addition, the variables that need to be defined in advance for calibration in this embodiment also include the save paths of the photographs taken by the left and right cameras.
The number of pictures used to calibrate the left and right cameras is determined, and the size of the calibration pictures and the storage path of the calibration result are defined. The size and type of the pictures are not limited in this embodiment.
Specifically, in this embodiment a binocular camera is calibrated, a checkerboard with 16×11 corner points and 15 mm corner spacing is used as the calibration board, and six checkerboard pictures are taken at different angles with each camera of the binocular pair before calibration; the checkerboard serving as the calibration board must appear complete in each picture. The resulting checkerboard pictures are then stored separately for the left and right cameras and used for calibration. A minimal configuration sketch for this preparation is given below.
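The front-end preparation can be captured in a handful of configuration variables. The sketch below is only an illustration in Python: the board geometry (16×11 corners, 15 mm spacing), the six pictures per camera and the 640×400 working resolution come from this embodiment, while the directory names and variable names are assumptions, not values prescribed by the method.

```python
# Minimal configuration sketch for the front-end preparation step.
# Board geometry and picture count follow the embodiment; the paths are
# hypothetical placeholders.
BOARD_CORNERS = (16, 11)   # corner points in the transverse / longitudinal directions
SQUARE_SIZE_MM = 15.0      # measured spacing between corner points on the plate
NUM_IMAGES = 6             # checkerboard pictures taken per camera
LEFT_DIR = "calib/left"    # save path for pictures from the left camera (assumed)
RIGHT_DIR = "calib/right"  # save path for pictures from the right camera (assumed)
RESULT_DIR = "calib"       # storage path for the calibration results (assumed)
IMAGE_SIZE = (640, 400)    # working resolution used in the corner search step
```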
2. Corner search
The checkerboard pictures taken by the binocular camera are uniformly resized to a resolution of 640×400 and converted into grayscale pictures. The corner positions in each checkerboard picture are then searched for and determined with the Harris algorithm.
In this embodiment, the Harris algorithm locates the corner points, and a sub-pixel refinement method then refines the detected corners to obtain their image coordinates. The image coordinates of the checkerboard corners obtained from the left and right cameras are stored in two two-dimensional arrays respectively, and the corner search is carried out on each checkerboard picture in turn, following the order in which the pictures are stored, as illustrated below.
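As an illustration of this step, the following sketch resizes one picture to the working resolution, converts it to grayscale and extracts sub-pixel corner coordinates. It uses OpenCV's chessboard corner detector followed by cornerSubPix refinement; the embodiment names the Harris algorithm for the initial detection, so the detector choice here is a stand-in, and find_corners is a hypothetical helper name.

```python
import cv2

def find_corners(image_path, board_corners=(16, 11), size=(640, 400)):
    """Locate and refine the checkerboard corners in one calibration picture."""
    img = cv2.imread(image_path)
    if img is None:
        return None
    img = cv2.resize(img, size)                   # unify to the working resolution
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # convert to a grayscale picture
    # Initial corner search (the embodiment uses the Harris algorithm here;
    # OpenCV's chessboard detector is used in this sketch as a stand-in).
    found, corners = cv2.findChessboardCorners(gray, board_corners)
    if not found:
        return None
    # Sub-pixel refinement of the detected corner positions.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    return corners  # image coordinates of the corners, one (x, y) per corner
```

In practice one such corner array would be appended, per picture and per camera, to the two containers mentioned above.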
3. Monocular calibration
Calibration of a monocular camera is the process of solving the projection matrix M that converts the corner points from the world coordinate system to the pixel coordinate system, using the pixel coordinates of the calibration-plate images taken by the camera together with the world coordinates of the calibration plate. Let the pixel coordinate system of the camera be the coordinate system Mp(x, y) with the upper-left corner of the image as the origin, and let the world coordinate system be the coordinate system Mw(x, y, z) with the corner point at the upper-left corner of the calibration plate as the origin. The conversion from the world coordinate system to the pixel coordinate system can then be seen as:
Mp=MMw
The conversion of the image from the world coordinate system to the pixel coordinate system includes three steps: converting the world coordinate system Mw to the camera coordinate system Mc, converting the camera coordinate system to the image coordinate system Mxy, and converting the image coordinate system to the pixel coordinate system Mp.
(1) Conversion from world to camera coordinate system
The coordinate transformation between two different coordinate systems can be regarded as the combination of a rotation and a translation; let the rotation matrix of a single camera be R and the translation matrix be T;
The projection matrix M1 for converting the world coordinate system into the camera coordinate system is formed from R and T, i.e., in homogeneous coordinates,
M1 = [R T; 0 1]
That is, the process of converting an image from the world coordinate system to the camera coordinate system is Mc = M1Mw, where Mc denotes the image in the camera coordinate system and Mw denotes the image in the world coordinate system.
(2) Conversion from camera coordinate system to image coordinate system
The camera coordinate system is considered to lie in the same plane as the image coordinate system; the transformation between the camera coordinate system and the image coordinate system can therefore be regarded as a similar-triangle transformation with respect to the camera focal length f. Letting the coordinates in the image coordinate system be Mxy(x, y), they can be expressed as
x = f·Xc/Zc, y = f·Yc/Zc
where Xc, Yc, Zc are the coordinates of the corner point in the camera coordinate system.
In matrix form (homogeneous coordinates) this can be written as
Zc·[x, y, 1]^T = [f 0 0 0; 0 f 0 0; 0 0 1 0]·[Xc, Yc, Zc, 1]^T
Denoting the transformation matrix from the camera coordinate system to the image coordinate system by M2, the process of converting from the camera coordinate system to the image coordinate system can be expressed as:
Mxy=M2Mc
(3) From image coordinate system to pixel coordinate system
Since the origin of the image coordinate system is at the center point of the image, while for convenience of calculation the pixel coordinate system takes the upper-left corner of the image as its origin with rightward and downward as the positive x and y directions, let cx, cy be the coordinates of the origin of the image coordinate system in the pixel coordinate system, let fx, fy be the number of pixels per millimetre along the x and y axes respectively, and let (u, v) be the coordinates of the corner point in the pixel coordinate system; the conversion from the image coordinate system to the pixel coordinate system is then
[u, v, 1]^T = [fx 0 cx; 0 fy cy; 0 0 1]·[x, y, 1]^T
i.e., it can be represented as Mp = M3Mxy.
Combining the above equations, the projection matrix of the image from the world coordinate system to the pixel coordinate system is M = M3M2M1, where the unknowns include the rotation matrix R, the translation matrix T, fx, fy, cx, cy and the focal length f. These unknowns can be solved through camera calibration.
(4) Distortion of camera
The distortion model of the camera relates the real coordinates of the camera to the ideal coordinates, where x_distorted and y_distorted are the coordinates of the pixel points in the image, x and y are the coordinates under ideal conditions, and r² = x² + y². In the Zhang Zhengyou calibration method adopted in this embodiment, only the radial distortion of the camera is considered, i.e. the camera distortion model is taken as
x_distorted = x(1 + k1·r² + k2·r⁴ + k3·r⁶), y_distorted = y(1 + k1·r² + k2·r⁴ + k3·r⁶)
i.e. a Taylor expansion between the real and ideal coordinates of the camera.
Thus, camera calibration can be regarded as solving for the unknowns R, T, fx, fy, cx, cy and the radial distortion coefficients k, using the pixel coordinates of the corner points obtained by the corner search function together with the world coordinates of the corner points.
This embodiment uses Zhang Zhengyou's calibration method, i.e. the world coordinate system is chosen so that the plane of the calibration plate is the plane z = 0.
The homogeneous world coordinate Mw can thus be reduced from (x, y, z, 1) to (x, y, 1). Let MA = M3M2; after dropping the column that multiplies z, MA can be written as
MA = [α 0 u0; 0 β v0; 0 0 1]
There are a total of 8 unknown components. MA itself consists of 4 unknowns: α and β are the focal-length parameters of the camera (α = f·fx, β = f·fy), and u0 and v0 are the offsets of the principal point relative to the upper-left-corner origin.
According to Zhang Zhengyou's calibration method, the calibration plate lies in the plane z = 0 of the world coordinate system, so the mapping from the plate to the image reduces to a homography
H = MA·[r1 r2 t]
where r1 = [a11 a21 a31]^T and r2 = [a12 a22 a32]^T are the first two columns of the rotation matrix and t is the translation; H, which corresponds to MAM1 restricted to the z = 0 plane, is the matrix to be solved, and at least three homography matrices are needed for the solution. To keep the error of the result within an acceptable range, this embodiment calibrates the left and right cameras with six pictures each, and solves the intrinsic parameter matrix, distortion matrix and extrinsic parameter matrix of the left and right cameras respectively; a sketch of this monocular solution using a standard library routine is given below.
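One compact way to carry out the monocular solution in practice is OpenCV's calibrateCamera, which jointly estimates the intrinsic matrix, the distortion coefficients and the per-image extrinsics from the matched world and pixel coordinates. The sketch below is only an illustration of the step, not the patented procedure itself; calibrate_monocular is a hypothetical helper name.

```python
import cv2

def calibrate_monocular(object_points, image_points, image_size=(640, 400)):
    """Solve intrinsics, distortion and per-image extrinsics for one camera.

    object_points: list (one entry per picture) of (N, 3) float32 world coordinates, z = 0
    image_points:  list (one entry per picture) of (N, 1, 2) float32 refined pixel corners
    """
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    # camera_matrix holds fx, fy, cx, cy; dist_coeffs holds the distortion
    # coefficients; rvecs/tvecs are the rotation and translation (R, T) of the
    # calibration plate for each of the six pictures.
    return camera_matrix, dist_coeffs, rvecs, tvecs
```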
4. Calculation of corner world coordinates
According to Zhang Zhengyou's calibration method, for convenience of calculation, the first corner point at the upper-left corner of the checkerboard is taken as the origin of coordinates, the edges of the checkerboard as the positive directions of the x and y axes, and the direction perpendicular to the checkerboard as the z axis; the checkerboard is placed on the plane z = 0 and 1 mm is taken as the coordinate unit, so the coordinates of each corner point can be expressed as:
(xn1,yn2,zn3)=(n1×N,n2×N,n3×N)
where N is the spacing between two adjacent corner points.
The world coordinates of the corner points on the calibration board can therefore be computed; since this embodiment uses six 16×11 pictures for calibration, a two-dimensional dynamic array container is used to store the computed world coordinates of the checkerboard, which can be generated as in the sketch below.
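The corner world coordinates described above follow directly from the board geometry. The sketch below builds the (x, y, 0) grid with the top-left corner as origin and 15 mm spacing; board_world_coordinates is a hypothetical helper, and the same array would be reused for every picture.

```python
import numpy as np

def board_world_coordinates(board_corners=(16, 11), square_size_mm=15.0):
    """World coordinates of the checkerboard corners on the z = 0 plane."""
    cols, rows = board_corners
    objp = np.zeros((rows * cols, 3), np.float32)
    # Top-left corner is the origin; the board edges are the x and y axes.
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size_mm
    return objp  # one (x, y, 0) row per corner, coordinates in millimetres
```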
5. Binocular camera calibration
A binocular camera is a camera structure that imitates human binocular imaging. Binocular stereo calibration computes the geometric relation between the left and right cameras of the binocular pair, namely the rotation matrix Rlr and the translation matrix Tlr between them; Rlr and Tlr can then be used to rectify the binocular images, bringing the two pictures taken by the left and right cameras onto the same horizontal plane through a rotation and translation transformation.
First, let the extrinsic parameter matrices of the left and right cameras obtained from monocular calibration be Rl, Tl and Rr, Tr respectively. The relation matrices Rlr and Tlr between the left and right cameras can then be expressed as
Rlr = RrRl^(-1), Tlr = Tr - RlrTl
The extrinsic matrices Rlr and Tlr between the two cameras of the binocular pair can therefore also be solved using the pixel coordinates and world coordinates of the corner points; a sketch of this step with a standard stereo calibration routine is given below.
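As an illustration of this step, OpenCV's stereoCalibrate solves the rotation and translation between the two cameras from the same matched corner coordinates. This is a sketch under the assumption that the monocular intrinsics are reused unchanged; calibrate_stereo is a hypothetical helper name, and this is not the only way to realise the step.

```python
import cv2

def calibrate_stereo(object_points, left_points, right_points,
                     camera_matrix_l, dist_l, camera_matrix_r, dist_r,
                     image_size=(640, 400)):
    """Solve the rotation R and translation T from the left to the right camera."""
    flags = cv2.CALIB_FIX_INTRINSIC  # keep the monocular intrinsics fixed
    (rms, _, _, _, _, R, T, E, F) = cv2.stereoCalibrate(
        object_points, left_points, right_points,
        camera_matrix_l, dist_l, camera_matrix_r, dist_r,
        image_size, flags=flags)
    # R, T relate the two camera frames; E and F are the essential and
    # fundamental matrices mentioned in the data-saving step.
    return R, T, E, F
```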
6. Data preservation
After the monocular and binocular calibration of the camera is completed, the obtained calibration results are stored separately. After calibration, the results to be stored include: the intrinsic parameter matrices of the left and right cameras, the distortion matrices of the left and right cameras, and the rotation matrix and translation matrix between the left and right cameras. In this embodiment, the monocular calibration results are therefore stored in text files under the file paths where the checkerboard pictures taken by the left and right cameras are located, and the binocular calibration results are stored under the image folder. For ease of use and reference, this embodiment also stores, for each calibration sample image, the rotation matrix and translation matrix of the monocular camera obtained from the calibration computation, and saves the essential matrix, fundamental matrix and back-projection matrix obtained from binocular calibration into the calibration result file. A sketch of one possible way to write such results out follows.
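One possible way to persist these results is OpenCV's FileStorage, which writes the matrices to a YAML or XML file. The key names and single-file layout below are illustrative assumptions; the embodiment itself stores the monocular results as text files next to the pictures and the binocular results under the image folder.

```python
import cv2

def save_results(path, camera_matrix_l, dist_l, camera_matrix_r, dist_r, R, T):
    """Write the monocular and binocular calibration results to one file."""
    fs = cv2.FileStorage(path, cv2.FILE_STORAGE_WRITE)
    fs.write("cameraMatrixL", camera_matrix_l)  # intrinsic matrix, left camera
    fs.write("distCoeffsL", dist_l)             # distortion coefficients, left camera
    fs.write("cameraMatrixR", camera_matrix_r)  # intrinsic matrix, right camera
    fs.write("distCoeffsR", dist_r)             # distortion coefficients, right camera
    fs.write("R", R)                            # rotation between the two cameras
    fs.write("T", T)                            # translation between the two cameras
    fs.release()

# Example usage (assumed file name):
# save_results("calib/stereo_result.yml", ML, dL, MR, dR, R, T)
```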
Various modifications and variations of the present invention will be apparent to those skilled in the art in light of the foregoing teachings and are intended to be included within the scope of the following claims.

Claims (3)

1. A self-adaptive binocular camera calibration method, characterized by comprising the following specific steps:
S1, determining the number of corner points of the calibration plate in the transverse and longitudinal directions and the spacing between corner points on the calibration plate; determining the save paths of the photographs taken by the left camera and the right camera; determining the number of pictures used to calibrate the left camera and the right camera, and defining the size of the calibration pictures and the storage path of the calibration result; shooting a set number of checkerboard pictures at different angles with the binocular camera, wherein the checkerboard serving as the calibration plate must appear complete in each picture; and storing the checkerboard pictures obtained by the left camera and the right camera separately;
S2, corner point search:
uniformly adjusting the checkerboard pictures to a set resolution and converting them into grayscale pictures; searching for and determining the corner positions in each checkerboard picture with the Harris algorithm; and refining the detected corner positions with a sub-pixel refinement method to obtain the image coordinates of the corner points;
S3, performing monocular calibration on the left camera and the right camera respectively:
Solving a projection matrix M of corner points converted from a world coordinate system to a pixel coordinate system by using pixel coordinates of images of a calibration plate shot by a left camera and a right camera and world coordinates of the calibration plate; setting a pixel coordinate system of a camera as a coordinate system Mp (x, y) taking the upper left corner of an image as an origin, and setting a world coordinate system Mw (x, y, z) taking the coordinate of a corner point of the upper left corner of a calibration plate as the origin; thus, the process of converting from world coordinate system to pixel coordinate system is expressed as:
Mp=MMw
The conversion of the image from the world coordinate system to the pixel coordinate system comprises three processes: converting the world coordinate system to the camera coordinate system Mc, converting the camera coordinate system to the image coordinate system Mxy, and converting the image coordinate system to the pixel coordinate system Mp;
(1) Conversion from world coordinate system to camera coordinate system:
Setting a rotation matrix as R and a translation matrix as T;
The projection matrix M1 for converting the image from the world coordinate system to the camera coordinate system is composed of R and T, i.e., in homogeneous coordinates,
M1 = [R T; 0 1]
That is, the process of converting the image from the world coordinate system to the camera coordinate system is Mc = M1Mw, where Mc represents the image in the camera coordinate system and Mw represents the image in the world coordinate system;
(2) Converting from the camera coordinate system to the image coordinate system:
Considering the camera coordinate system to lie in the same plane as the image coordinate system, and assuming that the coordinates in the image coordinate system are Mxy(x, y), they can be expressed by similar triangles with respect to the focal length f as
x = f·Xc/Zc, y = f·Yc/Zc
where Xc, Yc, Zc are the coordinates of the corner point in the camera coordinate system;
In matrix form (homogeneous coordinates) this can be written as
Zc·[x, y, 1]^T = [f 0 0 0; 0 f 0 0; 0 0 1 0]·[Xc, Yc, Zc, 1]^T
Assuming that the matrix used for this transformation is M2, the process of converting from the camera coordinate system to the image coordinate system is expressed as:
Mxy=M2Mc
(3) From image coordinate system to pixel coordinate system:
Since the origin of the image coordinate system is at the center point of the image, while the pixel coordinate system takes the upper-left corner of the image as its origin with rightward and downward as the positive directions of the x and y axes, let cx, cy be the coordinates of the origin of the image coordinate system in the pixel coordinate system, let fx, fy be the number of pixels per millimetre along the x and y axes respectively, and let (u, v) be the coordinates of the corner point in the pixel coordinate system; the conversion from the image coordinate system to the pixel coordinate system is then
[u, v, 1]^T = [fx 0 cx; 0 fy cy; 0 0 1]·[x, y, 1]^T
i.e. Mp = M3Mxy;
Thus, the projection matrix of the image from the world coordinate system to the pixel coordinate system is M = M3M2M1, where the unknowns include the rotation matrix R, the translation matrix T, fx, fy, cx, cy and the focal length f; these unknowns can be solved through camera calibration;
(4) Distortion of camera
The camera distortion model is as follows:
x_distorted = x(1 + k1·r² + k2·r⁴ + k3·r⁶), y_distorted = y(1 + k1·r² + k2·r⁴ + k3·r⁶)
i.e., a Taylor expansion between the real coordinates and the ideal coordinates of the camera; x_distorted and y_distorted are the coordinates of the pixel points in the image, x and y are the coordinates under ideal conditions, and r² = x² + y²;
Camera calibration consists in solving for the unknowns R, T, fx, fy, cx, cy and the radial distortion coefficients k, using the pixel coordinates of the corner points obtained by the corner search function together with the world coordinates of the corner points, so as to obtain the intrinsic parameter matrix, extrinsic parameter matrix and distortion matrix of the camera;
S4, calculating the world coordinates of the corner points:
According to Zhang Zhengyou's calibration method, the first corner point at the upper-left corner of the checkerboard is set as the origin of coordinates, the edges of the checkerboard as the positive directions of the x and y axes, and the direction perpendicular to the checkerboard as the z axis; the checkerboard is placed on the plane z = 0 and 1 mm is set as the coordinate unit, so the coordinates of each corner point are expressed as:
(xn1,yn2,zn3)=(n1×N,n2×N,n3×N)
Wherein N is the interval between two angular points;
S5, calibrating a binocular camera:
Setting the extrinsic parameter matrices of the left camera and the right camera obtained from monocular calibration as Rl, Tl and Rr, Tr respectively; the relation matrices R and T between the left camera and the right camera can then be expressed as
R = RrRl^(-1), T = Tr - RTl;
S6, data storage:
After the monocular and binocular calibration of the camera is completed, the obtained calibration results are stored respectively.
2. The method according to claim 1, wherein in step S2, the resolution is set to 640×400.
3. The method according to claim 1, wherein in step S6, after the calibration is completed, the results to be saved include: the intrinsic parameter matrices of the left and right cameras, the distortion matrices of the left and right cameras, and the rotation matrix and translation matrix between the left and right cameras; the calibration results of the monocular cameras are therefore stored in text files under the file paths of the checkerboard pictures taken by the left and right cameras respectively, and the calibration result of the binocular camera is stored under the image folder.
CN202111159656.4A 2021-09-30 2021-09-30 Self-adaptive binocular camera calibration method Active CN113870364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111159656.4A CN113870364B (en) 2021-09-30 2021-09-30 Self-adaptive binocular camera calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111159656.4A CN113870364B (en) 2021-09-30 2021-09-30 Self-adaptive binocular camera calibration method

Publications (2)

Publication Number Publication Date
CN113870364A CN113870364A (en) 2021-12-31
CN113870364B (en) 2024-05-24

Family

ID=79001086

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111159656.4A Active CN113870364B (en) 2021-09-30 2021-09-30 Self-adaptive binocular camera calibration method

Country Status (1)

Country Link
CN (1) CN113870364B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117788604A (en) * 2023-12-29 2024-03-29 珠海广浩捷科技股份有限公司 Method for calibrating camera through calibrated robot

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846415A (en) * 2017-01-24 2017-06-13 长沙全度影像科技有限公司 A kind of multichannel fisheye camera binocular calibration device and method
CN108629810A (en) * 2017-03-23 2018-10-09 展讯通信(上海)有限公司 Scaling method, device and the terminal of binocular camera
WO2019039996A1 (en) * 2017-08-25 2019-02-28 Maker Trading Pte Ltd Machine vision system and method for identifying locations of target elements
CN110889875A (en) * 2019-12-04 2020-03-17 南京美基森信息技术有限公司 Binocular camera calibration method based on partial corner points
CN111080714A (en) * 2019-12-13 2020-04-28 太原理工大学 Parallel binocular camera calibration method based on three-dimensional reconstruction
CN111243033A (en) * 2020-01-10 2020-06-05 大连理工大学 Method for optimizing external parameters of binocular camera
CN112308925A (en) * 2019-08-02 2021-02-02 上海肇观电子科技有限公司 Binocular calibration method and device of wearable device and storage medium
WO2021139176A1 (en) * 2020-07-30 2021-07-15 平安科技(深圳)有限公司 Pedestrian trajectory tracking method and apparatus based on binocular camera calibration, computer device, and storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846415A (en) * 2017-01-24 2017-06-13 长沙全度影像科技有限公司 A kind of multichannel fisheye camera binocular calibration device and method
CN108629810A (en) * 2017-03-23 2018-10-09 展讯通信(上海)有限公司 Scaling method, device and the terminal of binocular camera
WO2019039996A1 (en) * 2017-08-25 2019-02-28 Maker Trading Pte Ltd Machine vision system and method for identifying locations of target elements
CN112308925A (en) * 2019-08-02 2021-02-02 上海肇观电子科技有限公司 Binocular calibration method and device of wearable device and storage medium
CN110889875A (en) * 2019-12-04 2020-03-17 南京美基森信息技术有限公司 Binocular camera calibration method based on partial corner points
CN111080714A (en) * 2019-12-13 2020-04-28 太原理工大学 Parallel binocular camera calibration method based on three-dimensional reconstruction
CN111243033A (en) * 2020-01-10 2020-06-05 大连理工大学 Method for optimizing external parameters of binocular camera
WO2021139176A1 (en) * 2020-07-30 2021-07-15 平安科技(深圳)有限公司 Pedestrian trajectory tracking method and apparatus based on binocular camera calibration, computer device, and storage medium

Also Published As

Publication number Publication date
CN113870364A (en) 2021-12-31

Similar Documents

Publication Publication Date Title
CN107767422B (en) Fisheye lens correction method and device and portable terminal
CN109767474B (en) Multi-view camera calibration method and device and storage medium
US20170127045A1 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN107492127B (en) Light field camera parameter calibration method and device, storage medium and computer equipment
CN111210468B (en) Image depth information acquisition method and device
US20170134713A1 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN108416812B (en) Calibration method of single-camera mirror image binocular vision system
CN112229323B (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
CN111340737B (en) Image correction method, device and electronic system
CN115861445B (en) Hand-eye calibration method based on three-dimensional point cloud of calibration plate
CN112233184B (en) Laser radar and camera calibration parameter correction method and device based on image registration
CN114299156A (en) Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area
CN113870364B (en) Self-adaptive binocular camera calibration method
CN116188591A (en) Multi-camera global calibration method and device and electronic equipment
CN111915681B (en) External parameter calibration method, device, storage medium and equipment for multi-group 3D camera group
CN113436267B (en) Visual inertial navigation calibration method, device, computer equipment and storage medium
CN112258581B (en) On-site calibration method for panoramic camera with multiple fish glasses heads
CN110750094A (en) Method, device and system for determining pose change information of movable equipment
CN112598751A (en) Calibration method and device, terminal and storage medium
CN117173254A (en) Camera calibration method, system, device and electronic equipment
CN115457142B (en) Calibration method and system of MR hybrid photographic camera
CN114693807B (en) Method and system for reconstructing mapping data of power transmission line image and point cloud
CN107067441B (en) Camera calibration method and device
CN111292380A (en) Image processing method and device
CN112465914B (en) Camera array calibration method based on non-common view field

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant