CN110689580A - Multi-camera calibration method and device

Info

Publication number: CN110689580A
Authority: CN (China)
Prior art keywords: camera, calibration, image, cameras, group
Legal status: Granted
Application number: CN201810732946.5A
Other languages: Chinese (zh)
Other versions: CN110689580B (en)
Inventors: 孙元栋, 张小峰
Current Assignee: Hangzhou Hikrobot Co Ltd
Original Assignee: Hangzhou Haikang Robot Technology Co Ltd
Application filed by Hangzhou Haikang Robot Technology Co Ltd
Priority to CN201810732946.5A
Publication of CN110689580A
Application granted
Publication of CN110689580B
Current legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30244: Camera pose


Abstract

The application discloses a multi-camera calibration method and device, belonging to the technical field of machine vision. The method comprises: selecting any two cameras from the multiple cameras as an initial camera group, and calibrating the two cameras in the initial camera group by using images acquired by the initial camera group to obtain calibration parameters; then taking a camera to be calibrated and a first calibration camera as a first camera group, taking the camera to be calibrated and a second calibration camera as a second camera group, and calibrating the camera to be calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the cameras that have already been calibrated. In other words, the application does not require all cameras to capture an image containing all the marker points simultaneously; as long as some two cameras simultaneously capture images containing all the marker points, the cameras can be calibrated, which reduces the difficulty of camera calibration.

Description

Multi-camera calibration method and device
Technical Field
The application relates to the technical field of machine vision, in particular to a multi-camera calibration method and device.
Background
Currently, multi-camera systems are widely used in fields such as three-dimensional reconstruction, motion capture, and multi-view video. Before a multi-camera system is used, the multiple cameras in the system need to be calibrated. Calibrating the multiple cameras means determining the intrinsic parameters and the extrinsic parameters of each of the cameras.
In the related art, calibrating multiple cameras requires a calibration plate that carries at least three collinear marker points. During calibration, the calibration plate is moved rigidly through the working area of the multi-camera system. While the calibration plate is moving, the multiple cameras synchronously capture images of it at each of a plurality of moments, and the images captured at the same moment are sent to the terminal as one group. After receiving the groups of images, the terminal can calibrate the multiple cameras from them. For any one of the groups, every image in that group must contain all of the marker points on the calibration plate.
Therefore, when cameras are calibrated by this method, every image captured at a given moment must contain all the marker points on the calibration plate before the terminal can use that group of images for calibration. In other words, when the cameras capture images of the calibration plate, all of the marker points must lie within the field of view of every camera at the same time for the captured images to be usable. However, the working area of a multi-camera system is generally large and the cameras are far apart, so it is difficult to ensure that all the marker points on the calibration plate are simultaneously within the field of view of every camera, and the calibration difficulty is therefore high.
Disclosure of Invention
The embodiments of the present application provide a multi-camera calibration method and device, which can solve the problem in the related art that calibrating a multi-camera system is difficult. The technical solution is as follows:
in a first aspect, a multi-camera calibration method is provided, the method comprising:
selecting any two cameras from a plurality of cameras as an initial camera group, and calibrating the two cameras included in the initial camera group by using images acquired by the initial camera group to obtain calibration parameters;
taking a camera to be calibrated and a first calibration camera as a first camera group and taking the camera to be calibrated and a second calibration camera as a second camera group, and calibrating the camera to be calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the cameras that have already been calibrated.
Optionally, the calibrating the two cameras included in the initial camera group by using the image acquired by the initial camera group to obtain calibration parameters includes:
determining a fundamental matrix between the two cameras included in the initial camera group based on the images which are acquired by the initial camera group and meet the calibration condition;
determining a projection matrix of a first camera in the initial camera group as a preset projection matrix, and determining calibration parameters of the first camera based on the preset projection matrix;
and determining a projection matrix of a second camera in the initial camera group based on the fundamental matrix between the two cameras in the initial camera group, and determining calibration parameters of the second camera based on the projection matrix of the second camera.
Optionally, the calibrating the camera to be calibrated based on the image acquired by the first camera group and the image acquired by the second camera group to obtain calibration parameters includes:
determining a fundamental matrix between the two cameras in the corresponding camera group based on the images which are acquired by each of the first camera group and the second camera group and meet the calibration condition;
determining a projection matrix of the camera to be calibrated based on the projection matrix of the first calibration camera, the projection matrix of the second calibration camera, and the fundamental matrix between the two cameras in each of the first camera group and the second camera group;
and determining the calibration parameters of the camera to be calibrated based on the projection matrix of the camera to be calibrated.
Optionally, the images meeting the calibration condition are at least four groups of images obtained by the corresponding camera group capturing the calibration rod at different positions, each of the at least four groups of images includes two images captured by the two cameras in the corresponding camera group of the calibration rod at the same position, and both images contain all the marker points on the calibration rod.
Optionally, the determining, based on the images acquired by each of the first camera group and the second camera group and meeting the calibration condition, a fundamental matrix between the two cameras in the corresponding camera group includes:
acquiring at least two image point pairs from each of at least four groups of images acquired by each camera group in the first camera group and the second camera group to obtain at least eight image point pairs;
and acquiring the image coordinates of the two image points included in each of the at least eight image point pairs, and determining the fundamental matrix between the two cameras in the corresponding camera group based on the image coordinates of the two image points included in each of the at least eight image point pairs.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
the acquiring at least two image point pairs from each of at least four sets of images acquired by each of the first camera set and the second camera set includes:
detecting at least four marker points in a first image in each group of images, and determining the marker point located in the middle of three collinear marker points in the first image;
detecting at least four marker points in a second image in each group of images, and determining the marker point located in the middle of three collinear marker points in the second image;
determining at least one image point pair from the at least one marker point remaining in the first image other than the three collinear marker points and the at least one marker point remaining in the second image other than the three collinear marker points, and determining the marker point located in the middle of the three collinear marker points in the first image and the marker point located in the middle of the three collinear marker points in the second image as one image point pair.
Optionally, the method further comprises:
acquiring at least eight image point pairs from the images which are acquired by each camera group and meet the calibration conditions;
determining coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group and the projection matrices of the two cameras in the corresponding camera group;
determining a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera of the plurality of cameras;
under the condition that the image coordinates of each image point in at least eight image point pairs corresponding to each camera group are unchanged, adjusting the coordinates of at least eight three-dimensional points corresponding to each camera group and the projection matrix of each camera in the plurality of cameras so as to minimize the reprojection error;
and updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
after updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras, the method further includes:
determining at least four estimated distances between the marker point located in the middle of the three collinear marker points on the calibration rod and any one of the remaining at least one marker point, based on the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group;
acquiring the measured distance between the marker point located in the middle of the three collinear marker points on the calibration rod and that same remaining marker point;
generating a cost function based on the at least four estimated distances, the measured distance, the updated calibration parameters of each camera in the plurality of cameras, the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, and the distortion coefficient of each camera in the plurality of cameras, where the value of the cost function indicates the sum of the pixel errors and the scale errors of the plurality of cameras;
adjusting the calibration parameters of each camera in the plurality of cameras, the distortion coefficients of each camera in the plurality of cameras, and the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, so as to minimize the value of the cost function;
and updating the distortion coefficient of each camera in the plurality of cameras to the adjusted distortion coefficient of the corresponding camera, and updating the calibration parameters of each camera in the plurality of cameras to the adjusted calibration parameters of the corresponding camera.
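The claims do not fix a concrete form for this cost function. The following is a minimal Python sketch of how such a joint pixel-plus-scale objective could be assembled, assuming a pinhole model with two radial distortion coefficients; the container layouts (`params_per_cam`, `obs`, `bar_pairs`) are illustrative assumptions, not structures named in the patent:

```python
import numpy as np

def project(K, R, t, dist, X):
    """Project 3D points X (N, 3) through intrinsics K, pose (R, t) and
    radial distortion dist = (k1, k2); returns pixel coordinates (N, 2)."""
    Xc = X @ R.T + t                         # world -> camera coordinates
    xn = Xc[:, :2] / Xc[:, 2:3]              # normalized image plane
    r2 = np.sum(xn ** 2, axis=1, keepdims=True)
    xn = xn * (1 + dist[0] * r2 + dist[1] * r2 ** 2)   # radial distortion
    return xn @ K[:2, :2].T + K[:2, 2]       # apply intrinsics

def cost(params_per_cam, points3d, obs, bar_pairs, measured_len):
    """Sum of squared pixel reprojection errors plus squared scale errors.

    params_per_cam: list of (K, R, t, dist) per camera
    obs:            list of (cam_idx, point_idx, observed_uv)
    bar_pairs:      index pairs (middle point, other point) of reconstructed
                    marker points whose physical separation is measured_len
    """
    pixel_err = 0.0
    for cam_idx, pt_idx, uv in obs:
        K, R, t, dist = params_per_cam[cam_idx]
        uv_hat = project(K, R, t, dist, points3d[pt_idx][None])[0]
        pixel_err += np.sum((np.asarray(uv) - uv_hat) ** 2)
    scale_err = sum(
        (np.linalg.norm(points3d[i] - points3d[j]) - measured_len) ** 2
        for i, j in bar_pairs)
    return pixel_err + scale_err
```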
In a second aspect, a multi-camera calibration apparatus is provided, the apparatus comprising:
a first calibration module, configured to select any two cameras from a plurality of cameras as an initial camera group, and calibrate the two cameras included in the initial camera group by using images acquired by the initial camera group to obtain calibration parameters;
a second calibration module, configured to take a camera to be calibrated and a first calibration camera as a first camera group, take the camera to be calibrated and a second calibration camera as a second camera group, and calibrate the camera to be calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the cameras that have already been calibrated.
Optionally, the first calibration module includes:
a first determining unit, configured to determine a fundamental matrix between the two cameras included in the initial camera group based on the images which are acquired by the initial camera group and meet the calibration condition;
a first calibration unit, configured to determine a projection matrix of a first camera in the initial camera group as a preset projection matrix, and determine calibration parameters of the first camera based on the preset projection matrix;
the first calibration unit is further configured to determine a projection matrix of a second camera in the initial camera group based on the fundamental matrix between the two cameras in the initial camera group, and determine calibration parameters of the second camera based on the projection matrix of the second camera.
Optionally, the second calibration module includes:
a second determining unit, configured to determine a fundamental matrix between the two cameras in the corresponding camera group based on images that satisfy the calibration condition and are acquired by each of the first camera group and the second camera group;
a third determining unit, configured to determine a projection matrix of the camera to be calibrated based on the projection matrix of the first calibration camera, the projection matrix of the second calibration camera, and the fundamental matrix between the two cameras in each of the first camera group and the second camera group;
and the second calibration unit is used for determining calibration parameters of the camera to be calibrated based on the projection matrix of the camera to be calibrated.
Optionally, the images meeting the calibration condition are at least four groups of images obtained by the corresponding camera group capturing the calibration rod at different positions, each of the at least four groups of images includes two images captured by the two cameras in the corresponding camera group of the calibration rod at the same position, and both images contain all the marker points on the calibration rod.
Optionally, the second determining unit includes:
the first obtaining subunit is configured to obtain at least two image point pairs from each of at least four groups of images acquired by each of the first camera group and the second camera group, so as to obtain at least eight image point pairs;
a first determining subunit, configured to acquire the image coordinates of the two image points included in each of the at least eight image point pairs, and determine the fundamental matrix between the two cameras in the corresponding camera group based on the image coordinates of the two image points included in each of the at least eight image point pairs.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
the first obtaining subunit is specifically configured to:
detecting at least four marker points in a first image in each group of images, and determining the marker point located in the middle of three collinear marker points in the first image;
detecting at least four marker points in a second image in each group of images, and determining the marker point located in the middle of three collinear marker points in the second image;
determining at least one image point pair from the at least one marker point remaining in the first image other than the three collinear marker points and the at least one marker point remaining in the second image other than the three collinear marker points, and determining the marker point located in the middle of the three collinear marker points in the first image and the marker point located in the middle of the three collinear marker points in the second image as one image point pair.
Optionally, the apparatus further comprises:
the first acquisition module is used for acquiring at least eight image point pairs from the images which are acquired by each camera group and meet the calibration conditions;
a first determining module, configured to determine coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group and the projection matrices of the two cameras in the corresponding camera group;
the first determining module is further configured to determine a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera in the plurality of cameras;
a first adjusting module, configured to adjust coordinates of at least eight three-dimensional points corresponding to each camera group and a projection matrix of each camera of the multiple cameras under a condition that an image coordinate of each image point in at least eight image point pairs corresponding to each camera group is unchanged, so as to minimize the reprojection error;
and the first updating module is used for updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
the device further comprises:
a second determining module, configured to determine at least four estimated distances between the marker point located in the middle of the three collinear marker points on the calibration rod and any one of the remaining at least one marker point, based on the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group;
a second acquisition module, configured to acquire the measured distance between the marker point located in the middle of the three collinear marker points on the calibration rod and that same remaining marker point;
a second adjusting module, configured to generate a cost function based on the at least four estimated distances, the measured distance, the updated calibration parameters of each camera in the plurality of cameras, the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, and the distortion coefficient of each camera in the plurality of cameras, where the value of the cost function indicates the sum of the pixel errors and the scale errors of the plurality of cameras;
the second adjusting module is further configured to adjust the calibration parameters of each of the plurality of cameras, the distortion coefficients of each of the plurality of cameras, and the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, so as to minimize the value of the cost function;
and a second updating module, configured to update the distortion coefficient of each camera in the plurality of cameras to the adjusted distortion coefficient of the corresponding camera, and update the calibration parameters of each camera in the plurality of cameras to the adjusted calibration parameters of the corresponding camera.
In a third aspect, a multi-camera calibration apparatus is provided, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor executes executable instructions in the memory to perform any of the methods of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, in which a computer program is stored, which computer program, when being executed by a processor, carries out any of the methods of the first aspect.
The beneficial effects of the technical solutions provided in the embodiments of the present application at least include the following: any two cameras are selected from the multiple cameras as an initial camera group, and the two cameras included in the initial camera group are calibrated by using images acquired by the initial camera group to obtain calibration parameters; a camera to be calibrated and a first calibration camera are taken as a first camera group, the camera to be calibrated and a second calibration camera are taken as a second camera group, and the camera to be calibrated is calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the cameras that have already been calibrated. Therefore, in the embodiments of the present application, as long as some two cameras can acquire images meeting the calibration condition, they can serve as the initial camera group and be calibrated; each remaining camera can then be calibrated as long as it acquires, together with any two already-calibrated cameras, images meeting the calibration condition. This reduces the difficulty of camera calibration.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a system architecture diagram of a multi-camera calibration method provided in an embodiment of the present application;
fig. 2 is a flowchart of a multi-camera calibration method provided in an embodiment of the present application;
fig. 3 is a flowchart of a multi-camera calibration method provided in an embodiment of the present application;
fig. 4 is a schematic diagram of extracting marker points from an image according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a multi-camera calibration device provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal for multi-camera calibration according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before explaining the embodiments of the present application in detail, a system architecture related to the embodiments of the present application will be described.
Fig. 1 is a system architecture diagram for a multi-camera calibration method according to an embodiment of the present application. As shown in fig. 1, the system may include a plurality of cameras 101 to 104 and a terminal 105, where the shared working area of cameras 101, 102, 103 and 104 is shown as the shaded part in fig. 1. The terminal establishes a communication connection with each camera.
In the embodiment of the present application, during camera calibration a worker can hold the calibration rod by hand and swing it freely within the working area. Camera 101, camera 102, camera 103 and camera 104 synchronously capture images of the calibration rod at a preset period. After capturing the images, the cameras send the images captured at the same moment to the terminal, and the terminal can screen and process the images captured by the multiple cameras and calibrate the multiple cameras based on the screened and processed images.
It should be noted that the above description uses a system of four cameras only as an example. In practical applications, the system may include at least three cameras or more; that is, the embodiments of the present application do not limit the number of cameras in the system architecture.
Next, a multi-camera calibration method provided in the embodiment of the present application is described.
Fig. 2 is a flowchart of a multi-camera calibration method provided in an embodiment of the present application. The method can be applied to the terminal in the system shown in fig. 1. As shown in fig. 2, the method includes the following steps:
Step 201: any two cameras are selected from the multiple cameras as an initial camera group, and the two cameras included in the initial camera group are calibrated by using images acquired by the initial camera group to obtain calibration parameters.
The calibration parameters include the intrinsic parameters and the extrinsic parameters of the camera. The images acquired by the initial camera group include at least four groups of images, obtained by the two cameras in the initial camera group capturing the calibration rod at at least four different positions; each group of images includes two images captured by the two cameras in the initial camera group of the calibration rod at the same position, and both images contain all the marker points on the calibration rod.
In addition, the calibration rod may include at least four marker points thereon, three of the at least four marker points being collinear. For example, the calibration rod may include four marker points, three marker points are collinear, and a line between the fourth marker point and a middle marker point of the three marker points is perpendicular to a straight line of the three collinear marker points.
It should be noted that, in order to better identify the mark point on the calibration rod in the image, in the embodiment of the present application, the surface of the mark point may be coated with a reflective material, or the color of the surface of the mark point may be different from the color of other parts on the calibration rod.
Step 202: the camera to be calibrated and the first calibration camera are used as a first camera set, the camera to be calibrated and the second calibration camera are used as a second camera set, and the camera to be calibrated is calibrated based on the image acquired by the first camera set and the image acquired by the second camera set to obtain calibration parameters.
The cameras to be calibrated refer to cameras which are not calibrated in the plurality of cameras, and the first calibration camera and the second calibration camera refer to any two cameras in the plurality of cameras which are calibrated.
It should be noted that, if the camera to be calibrated is the camera calibrated third after calibrating two cameras in the initial camera group, the first calibration camera and the second calibration camera are the two cameras in the calibrated initial camera group. If the camera to be calibrated is not the third calibrated camera, the first calibrated camera and the second calibrated camera may refer to two cameras in the initial camera group, or may refer to other calibrated cameras except the two cameras in the initial camera group.
In the embodiment of the present application, any two cameras are selected from the multiple cameras as an initial camera group, and the two cameras included in the initial camera group are calibrated by using images acquired by the initial camera group to obtain calibration parameters; a camera to be calibrated and a first calibration camera are taken as a first camera group, the camera to be calibrated and a second calibration camera are taken as a second camera group, and the camera to be calibrated is calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the cameras that have already been calibrated. Therefore, in the embodiment of the present application, as long as some two cameras can acquire images meeting the calibration condition, they can serve as the initial camera group and be calibrated; each remaining camera can then be calibrated as long as it acquires, together with any two already-calibrated cameras, images meeting the calibration condition.
Fig. 3 is a flowchart of a multi-camera calibration method according to an embodiment of the present application. The method can be applied to the terminal in the system shown in fig. 1. As shown in fig. 3, the method includes the following steps:
Step 301: any two cameras are selected from the multiple cameras as an initial camera group, and images which are acquired by the initial camera group and meet the calibration condition are acquired.
The terminal may select any two cameras from the multiple cameras as the initial camera group. For example, if the three cameras to be calibrated in the multi-camera system are camera A, camera B and camera C, the terminal may take camera A and camera B as the initial camera group, may take camera B and camera C as the initial camera group, or, of course, may take camera A and camera C as the initial camera group.
After the initial camera group is determined, the terminal can acquire the images which are acquired by the initial camera group and meet the calibration condition. These are at least four groups of images obtained by the initial camera group capturing the calibration rod at different positions; each of the at least four groups includes two images captured by the two cameras in the group of the calibration rod at the same position, and both images contain all the marker points on the calibration rod.
For example, in the present embodiment, a worker may hold the calibration rod and swing it randomly within the working area of the multiple cameras. The cameras synchronously capture images at regular intervals and send the images captured at each moment to the terminal. When the terminal receives the multiple images sent by the multiple cameras for a certain moment, it can take from them the two images captured by the two cameras of the initial camera group; if both of these images contain all the marker points on the calibration rod, they can be used as one group of images, acquired by the initial camera group, that meets the calibration condition. If either of the two images does not contain all the marker points, the two images do not meet the calibration condition and cannot be used for calibrating the initial camera group. In this way, the terminal can find, among the images received at multiple moments, at least four groups of images captured by the two cameras of the initial camera group that meet the calibration condition.
Because the worker keeps swinging the calibration rod, its position differs from moment to moment, while at any single moment, whenever the multiple cameras synchronously capture images, the rod's position is fixed; the calibration rod appearing in the multiple images captured at that moment is therefore at one and the same position. Consequently, each of the at least four groups of images meeting the calibration condition consists of images captured at the same moment, so the calibration rods in the two images of each group are at the same position, while the groups come from images captured at different moments, so the rod positions corresponding to the at least four groups are different.
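A brief sketch of this screening, assuming a hypothetical `frames` structure keyed by capture moment (marker detection per image is assumed to have been done already):

```python
def collect_calibration_groups(frames, cam_a, cam_b, n_markers=4, min_groups=4):
    """Keep synchronized image pairs in which BOTH cameras of the initial
    camera group see every marker point on the calibration rod.

    frames: {moment: {camera_id: list_of_detected_marker_points}}
    """
    groups = []
    for moment in sorted(frames):
        shot = frames[moment]
        if cam_a in shot and cam_b in shot \
                and len(shot[cam_a]) == n_markers \
                and len(shot[cam_b]) == n_markers:
            groups.append((shot[cam_a], shot[cam_b]))   # one qualifying group
    if len(groups) < min_groups:
        raise ValueError(f"only {len(groups)} usable groups, need {min_groups}")
    return groups
```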
After acquiring the images which are acquired by the initial camera group and meet the calibration condition, the terminal may calibrate the two cameras included in the initial camera group through steps 302 to 304.
Step 302: a fundamental matrix between the two cameras included in the initial camera group is determined based on the images which are acquired by the initial camera group and meet the calibration condition.
After acquiring the at least four groups of images which are acquired by the initial camera group and meet the calibration condition, the terminal can determine the fundamental matrix between the two cameras in the initial camera group through steps 3021 and 3022.
3021: and acquiring at least two image point pairs from each group of images in at least four groups of images corresponding to the initial camera group to obtain at least eight image point pairs.
The images which are acquired by the terminal and meet the calibration condition and acquired by the initial camera set comprise at least four groups of images, at least two image point pairs are acquired from each group of images, and at least eight image point pairs can be acquired from the at least four groups of image pairs.
For example, in the embodiment of the present application, the calibration rod for camera calibration may include at least four calibration points thereon, wherein three calibration points are collinear. Based on the above, for any one of the at least four groups of images, which is marked as a, the terminal can detect at least four marker points in a first image in the group a of images, and determine a marker point located at a middle position among three collinear marker points in the first image; detecting at least four marking points in a second image in the group A images, and determining a marking point positioned in the middle of three collinear marking points in the second image; and determining at least one image point pair according to the at least one mark point left in the first image except the three collinear mark points and the at least one mark point left in the second image except the three collinear mark points, and determining a mark point positioned in the middle of the three collinear mark points in the first image and a mark point positioned in the middle of the three collinear mark points in the second image as one image point pair.
The terminal can detect the at least four marking points in the first image of the group A images because the at least four marking points are included on the marking rod and three marking points of the at least four marking points are collinear. After detecting the at least four marker points, the terminal may determine three marker points that are collinear from the at least four marker points by a linear fit. Then, the terminal can acquire the image coordinates of the collinear three marking points and sequence the x coordinates or the y coordinates in the image coordinates of the three marking points so as to determine the marking point located at the middle position in the three marking points. After the collinear three marking points are determined, for the marking points except the three marking points in the at least four marking points, the terminal can distinguish the remaining marking points according to the position relation between each marking point in the remaining marking points and the marking point located at the middle position. For the second image in the group a images, the terminal may refer to the method for processing the first image, and process the second image, so as to extract the mark point located at the middle position and the remaining mark points from the second image.
After the middle-located mark point and the remaining mark points are determined, the terminal may take the middle-located mark point in the first image and the middle-located mark point in the second image as one image point pair. For the remaining mark points, the terminal can determine which mark point in the first image and which mark point in the second image correspond to the same mark point on the mark rod according to the position relationship between each mark point in the remaining mark points and the mark point located at the middle position, and further take the pair of determined mark points as an image point pair.
Fig. 4 is a schematic view of a calibration rod according to an embodiment of the present application. As shown in the upper diagram of fig. 4, the calibration rod includes four marker points a, b, c and d. Marker points a, b and c are the three collinear marker points, and marker point b lies between marker point a and marker point c, i.e., marker point b is the marker point located in the middle. The first image in the group A images is shown in the lower left diagram of fig. 4 and the second image is shown in the lower right diagram of fig. 4. The terminal can detect four marker points in each of the first and second images. Because it is not yet known which marker point in the first image and which in the second image correspond to the same marker point on the calibration rod, the terminal can first perform a straight-line fit in the first image to determine the three collinear marker points a1, b1 and c1, acquire their image coordinates, and compare the x coordinates or the y coordinates of the three points: the x coordinate of marker point b1 lies between the x coordinates of marker points a1 and c1, or the y coordinate of b1 lies between the y coordinates of a1 and c1. The terminal can thereby determine that marker point b1 is the marker point located in the middle, i.e., b1 is the image point in the first image corresponding to marker point b on the calibration rod. Since there are four marker points in total, once the three collinear marker points are determined, the remaining marker point d1 is the image point in the first image corresponding to marker point d on the calibration rod. For the second image, the terminal can extract marker points b2 and d2 in the same way, where b2 is the image point in the second image corresponding to marker point b on the calibration rod, and d2 is the image point corresponding to marker point d. Then, because b1 and b2 are both image points corresponding to marker point b on the calibration rod, b1 and b2 can be taken as one image point pair; likewise, because d1 and d2 are both image points corresponding to marker point d, d1 and d2 can be taken as another image point pair.
For each of at least four groups of images corresponding to the initial camera group, the terminal may determine at least two image point pairs corresponding to each group of images by referring to the method described above, so as to obtain at least eight image point pairs.
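A sketch of the collinear-triple detection and middle-point selection described above, for the four-marker rod of this example; the collinearity tolerance is an illustrative choice, not a value from the patent:

```python
import numpy as np
from itertools import combinations

def split_markers(pts, tol=1e-3):
    """Among four detected marker image points (4, 2), find the collinear
    triple, the middle point of that triple, and the remaining off-line point.
    Returns (middle_index, off_index, triple_indices)."""
    pts = np.asarray(pts, dtype=float)
    for trio in combinations(range(4), 3):
        a, b, c = pts[list(trio)]
        area = abs(np.cross(b - a, c - a)) / 2.0       # triangle area test
        if area < tol * np.linalg.norm(c - a) ** 2:    # effectively collinear
            # order the three points along the line; the median is the middle
            s = [np.dot(pts[i] - a, c - a) for i in trio]
            mid = trio[int(np.argsort(s)[1])]
            off = ({0, 1, 2, 3} - set(trio)).pop()
            return mid, off, trio
    raise ValueError("no collinear triple found")
```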
3022: acquiring the image coordinates of two image points included in each image point pair of at least eight image point pairs, and determining a basic matrix between two cameras included in the initial camera group based on the image coordinates of the two image points included in each image point pair of at least eight image point pairs
Wherein the basis matrix between two cameras can be represented as:
$$F = \begin{bmatrix} f_{11} & f_{12} & f_{13} \\ f_{21} & f_{22} & f_{23} \\ f_{31} & f_{32} & f_{33} \end{bmatrix}$$
The fundamental matrix satisfies $x'^{T} F x = 0$, where $x'$ is the homogeneous form of the image coordinates of one image point in an image point pair, and $x$ is the homogeneous form of the image coordinates of the other image point in the pair.
For example, if the image coordinates of one image point in the pair of image points is (x, y) and the image coordinates of the other image point is (x ', y '), then x ' and x are respectively:
$$x' = \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}, \qquad x = \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
according to the acquired image coordinates of two image points included in each of the at least eight image point pairs, passing through x'TIf Fx is 0, it may be determined to obtain a basis matrix between two cameras included in the initial camera group.
Step 303: determining a projection matrix of a first camera in the initial camera set as a preset projection matrix, and determining calibration parameters of the first camera based on the preset projection matrix.
After determining a base matrix between two cameras included in the initial camera group, the terminal may determine a projection matrix of a first camera in the initial camera group as a preset projection matrix. Wherein, the first camera may be any one of two cameras included in the initial camera group, and the preset projection matrix may be represented as P1=[I|0]Wherein, I is an identity matrix.
In general, a projection matrix can be expressed as $P = K[R \mid t]$, where $K$ is the camera's intrinsic parameter matrix, $R$ is the camera's rotation matrix, and $t$ is the camera's translation vector; $R$ and $t$ are collectively referred to as the camera's extrinsic parameters. On this basis, after determining the projection matrix of the first camera, the terminal may perform an RQ decomposition on the left 3 × 3 block of that projection matrix, determine the non-singular upper triangular factor as the intrinsic matrix $K$ of the corresponding camera, and determine the orthogonal factor as the rotation matrix $R$ of the corresponding camera. The terminal may then compute the translation vector $t$ of the first camera from the projection matrix of the first camera, the determined intrinsic matrix, and the rotation matrix.
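A sketch of this decomposition using SciPy; the sign fixing is an implementation detail the patent leaves implicit:

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    """Split P = K [R | t] into intrinsic matrix K, rotation R, translation t.
    The left 3x3 block is factored by an RQ decomposition into an
    upper-triangular K times an orthogonal R."""
    K, R = rq(P[:, :3])
    D = np.diag(np.sign(np.diag(K)))     # make the diagonal of K positive
    K, R = K @ D, D @ R
    t = np.linalg.solve(K, P[:, 3])
    if np.linalg.det(R) < 0:             # P is homogeneous: flipping its
        R, t = -R, -t                    # overall sign is the same camera
    return K / K[2, 2], R, t
```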
Step 304: and determining a projection matrix of a second camera in the initial camera set based on a basic matrix between two cameras in the initial camera set, and determining calibration parameters of the second camera based on the projection matrix of the second camera.
After the projection matrix of one camera in the initial camera group is set as the preset projection matrix, the projection matrix of another camera in the initial camera group can be determined by the basis matrix between the two cameras, that is, in the case that the projection matrix of the first camera is the preset projection matrix, the projection matrix of the second camera can be determined by the following formulas (1) and (2) according to the basis matrix between the two cameras, wherein the second camera is another camera in the initial camera group except the first camera.
$$P' = [\,[e']_{\times} F \mid e'\,] \tag{1}$$
$$F^{T} e' = 0 \tag{2}$$
where $P'$ is the projection matrix of the second camera, $F$ is the fundamental matrix between the first camera and the second camera, $e'$ is determined by formula (2), and $[e']_{\times}$ is the skew-symmetric matrix of $e'$.
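A sketch of formulas (1) and (2): e' is recovered as the null vector of F^T, and the second projection matrix is assembled from it (a standard construction; the numerical route through SVD is an implementation choice):

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]x with [v]x @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def second_projection_from_F(F):
    """Canonical camera pair: P1 = [I | 0] and P' = [[e']x F | e'],
    where e' spans the null space of F^T (formula (2))."""
    _, _, Vt = np.linalg.svd(F.T)
    e2 = Vt[-1]                          # F.T @ e2 is (numerically) zero
    return np.hstack([skew(e2) @ F, e2.reshape(3, 1)])
```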
After determining the projection matrix of the second camera, the terminal may determine the calibration parameters of the second camera with reference to the method for determining the calibration parameters based on the projection matrix described in step 303.
Step 305: the camera to be calibrated and the first calibration camera are used as a first camera set, the camera to be calibrated and the second calibration camera are used as a second camera set, and the camera to be calibrated is calibrated based on the image acquired by the first camera set and the image acquired by the second camera set to obtain calibration parameters.
After calibrating the two cameras included in the initial camera group, the terminal may arbitrarily select one camera from the remaining cameras in the multiple cameras as a camera to be calibrated, and combine the camera to be calibrated and each camera in the two cameras in the initial camera group to obtain the first camera group and the second camera group. At this time, the first calibration camera and the second calibration camera are two calibrated cameras in the initial camera set. Then, the terminal can acquire the images which are acquired by the first camera group and meet the calibration conditions, so that a basic matrix between two cameras in the first camera group is determined based on the images which are acquired by the first camera group and meet the calibration conditions.
For the implementation of acquiring the images which are acquired by the first camera group and meet the calibration condition, and of acquiring the images which are acquired by the second camera group and meet the calibration condition, reference may be made to the related implementation in step 301, which is not repeated here.
Likewise, for the implementation of determining the fundamental matrix between the two cameras in the first camera group based on the images which are acquired by the first camera group and meet the calibration condition, and of determining the fundamental matrix between the two cameras in the second camera group based on the images which are acquired by the second camera group and meet the calibration condition, reference may be made to the related implementation in step 302, which is not repeated here.
After determining the fundamental matrix between the two cameras in the first camera group and the fundamental matrix between the two cameras in the second camera group, the terminal may determine the projection matrix of the camera to be calibrated from the projection matrix of the first calibration camera included in the first camera group, the projection matrix of the second calibration camera included in the second camera group, and the fundamental matrix between the two cameras in each of the first camera group and the second camera group.
For example, since the first calibration camera in the first camera group has already been calibrated, its projection matrix can be obtained directly, and similarly the projection matrix of the second calibration camera can be obtained directly.
After acquiring the projection matrices of the first calibration camera and the second calibration camera, the terminal can randomly generate at least six three-dimensional points. The terminal can then project the at least six three-dimensional points into the image coordinate system of the first calibration camera through the projection matrix of the first calibration camera, obtaining at least six two-dimensional points corresponding to the first calibration camera, and project the same three-dimensional points into the image coordinate system of the second calibration camera through the projection matrix of the second calibration camera, obtaining at least six two-dimensional points corresponding to the second calibration camera. Next, for each of the at least six two-dimensional points corresponding to the first calibration camera, the terminal can determine, through the following formula (3), the corresponding epipolar line in the camera to be calibrated from that two-dimensional point and the fundamental matrix between the first calibration camera and the camera to be calibrated; likewise, for each of the at least six two-dimensional points corresponding to the second calibration camera, the terminal can determine through formula (3) the corresponding epipolar line in the camera to be calibrated from that two-dimensional point and the fundamental matrix between the second calibration camera and the camera to be calibrated.
$$e_{ik} = F_{iA}\, x_{ik} \tag{3}$$
Here $i$ indicates the first calibration camera or the second calibration camera, $k$ indicates the $k$th three-dimensional point among the at least six three-dimensional points, and $A$ indicates the camera to be calibrated.
When $i$ indicates the first calibration camera, $e_{ik}$ is the epipolar line in the camera to be calibrated corresponding to the $k$th of the at least six two-dimensional points of the first calibration camera, $F_{iA}$ is the fundamental matrix between the first calibration camera and the camera to be calibrated, and $x_{ik}$ is the homogeneous coordinate of the $k$th two-dimensional point obtained by projecting the $k$th of the at least six three-dimensional points into the image coordinate system of the first calibration camera.
When $i$ indicates the second calibration camera, $e_{ik}$ is the epipolar line in the camera to be calibrated corresponding to the $k$th of the at least six two-dimensional points of the second calibration camera, $F_{iA}$ is the fundamental matrix between the second calibration camera and the camera to be calibrated, and $x_{ik}$ is the homogeneous coordinate of the $k$th two-dimensional point obtained by projecting the $k$th of the at least six three-dimensional points into the image coordinate system of the second calibration camera.
After determining, for each of the at least six two-dimensional points corresponding to the first calibration camera, its epipolar line in the camera to be calibrated, and for each of the at least six two-dimensional points corresponding to the second calibration camera, its epipolar line in the camera to be calibrated, the terminal may determine, in camera A, the intersection of the two epipolar lines corresponding to the two-dimensional points obtained by projecting the same three-dimensional point into the two calibration cameras, and take that intersection as the two-dimensional point obtained by projecting the three-dimensional point into camera A.
For example, assume that three-dimensional point a is one of the at least six three-dimensional points. The terminal may project point a into the image coordinate system of the first calibration camera to obtain two-dimensional point a1, and project it into the image coordinate system of the second calibration camera to obtain two-dimensional point a2. Then, from the fundamental matrix between the first calibration camera and the camera to be calibrated, the terminal may determine through formula (3) the epipolar line $e_{a1}$ in the camera to be calibrated corresponding to a1, and likewise determine through formula (3) the epipolar line $e_{a2}$ corresponding to a2. The terminal then determines the intersection point a3 of $e_{a1}$ and $e_{a2}$; a3 is the two-dimensional point obtained by projecting three-dimensional point a into the image coordinate system of the camera to be calibrated.
For each of the at least six three-dimensional points, the corresponding two-dimensional point obtained by projecting it into the image coordinate system of the camera to be calibrated can be determined by the above method; thus, from the at least six three-dimensional points, at least six corresponding two-dimensional points in the image coordinate system of the camera to be calibrated are obtained.
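A sketch of this transfer step: in homogeneous coordinates, two epipolar lines from formula (3) intersect at their cross product, which is the sought two-dimensional point in the camera to be calibrated:

```python
import numpy as np

def transfer_point(F1A, F2A, x1, x2):
    """Locate the image, in camera A, of the 3D point whose projections in
    the two calibration cameras are pixel coordinates x1 and x2."""
    l1 = F1A @ np.array([x1[0], x1[1], 1.0])   # epipolar line from camera 1
    l2 = F2A @ np.array([x2[0], x2[1], 1.0])   # epipolar line from camera 2
    p = np.cross(l1, l2)                       # homogeneous intersection
    return p[:2] / p[2]
```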
After determining the at least six two-dimensional points corresponding to the at least six three-dimensional points in the image coordinate system of the camera to be calibrated, the terminal may determine the projection matrix of the camera to be calibrated from the coordinates of the at least six three-dimensional points and the homogeneous coordinates of the corresponding at least six two-dimensional points through the following formula (4).
$$x_k \times (P_A X_k) = 0 \tag{4}$$
where $x_k$ is the homogeneous coordinate of the $k$th two-dimensional point, corresponding to the $k$th three-dimensional point, in the image coordinate system of the camera to be calibrated, $P_A$ is the projection matrix of camera A to be calibrated, and $X_k$ is the homogeneous coordinate of the $k$th of the at least six three-dimensional points.
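Formula (4) can be solved by stacking its linear constraints, which is the direct linear transform (DLT); the patent does not name the solver, so the following is one standard realization. Each 3D-2D correspondence contributes two independent rows:

```python
import numpy as np

def projection_from_correspondences(X, x):
    """Solve x_k x (P X_k) = 0 (formula (4)) for the 3x4 projection matrix P.
    X: (N, 3) three-dimensional points, x: (N, 2) image points, N >= 6
    (the points must be in a non-degenerate configuration)."""
    rows = []
    for Xw, (u, v) in zip(X, x):
        Xh = np.append(Xw, 1.0)
        rows.append(np.concatenate([np.zeros(4), -Xh, v * Xh]))
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)          # smallest singular vector, as P
```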
After determining the projection matrix of the camera to be calibrated, the terminal may determine the calibration parameters of the camera to be calibrated by referring to the implementation of determining calibration parameters from a projection matrix described in step 303, which is not repeated here.
After the calibration of the camera to be calibrated is completed, the camera to be calibrated becomes a calibrated camera. The terminal may then select one camera from the remaining uncalibrated cameras in the plurality of cameras as the new camera to be calibrated, combine this camera with any two of the calibrated cameras to obtain a new first camera group and a new second camera group, and calibrate it to obtain calibration parameters according to the method described above. At this time, of the first calibration camera and the second calibration camera that form the first camera group and the second camera group with the camera to be calibrated, one may be either camera of the initial camera group and the other may be a calibrated camera other than the two cameras of the initial camera group; alternatively, the first calibration camera and the second calibration camera may still be the two cameras of the initial camera group. By analogy, each camera to be calibrated can be calibrated, using the images acquired by the two camera groups it forms with any two of the previously calibrated cameras, to obtain its calibration parameters, so that the calibration of the plurality of cameras is completed.
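The incremental flow described in this paragraph can be summarized by the following sketch; calibrate_pair and calibrate_against are hypothetical helper functions standing for the pairwise calibration procedure of steps 301-305, not functions defined in this application.

```python
def calibrate_all(cameras):
    # cameras: list of camera objects; the first two form the initial camera group.
    calibrated = list(calibrate_pair(cameras[0], cameras[1]))  # hypothetical helper
    for cam in cameras[2:]:
        # any two already calibrated cameras form the first and second camera groups
        c1, c2 = calibrated[0], calibrated[1]
        calibrate_against(cam, c1, c2)                         # hypothetical helper
        calibrated.append(cam)
    return calibrated
```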
After the calibration parameters are obtained by calibrating the plurality of cameras through the above steps 301-305, in order to further improve the accuracy of the calibration parameters, the terminal may further optimize the calibration parameters of each of the plurality of cameras through the following manner.
For example, the terminal may determine coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point of the at least eight image point pairs corresponding to each camera group and the projection matrix of the two cameras included in the corresponding camera group; determining a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera in the plurality of cameras; under the condition that the image coordinates of each image point in at least eight image point pairs corresponding to each camera group are unchanged, the coordinates of at least eight three-dimensional points corresponding to each camera group and the projection matrix of each camera in the plurality of cameras are adjusted to minimize the reprojection error; and updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
Based on the foregoing description, each camera to be calibrated corresponds to two camera groups, and these camera groups, together with the initial camera group, constitute a plurality of camera groups. Each of the plurality of camera groups corresponds to at least four groups of images, and at least two image point pairs can be obtained from each of the at least four groups of images, that is, each camera group corresponds to at least eight image point pairs, where each image point pair includes two image points. Based on this, for any camera group A, the terminal may determine the coordinates of one three-dimensional point from the image coordinates of the two image points included in any one of the at least eight image point pairs corresponding to the camera group A and the projection matrices of the two cameras in the camera group A; this three-dimensional point is the point in the world coordinate system corresponding to the two image points of that image point pair. Because the camera group A corresponds to at least eight image point pairs, the coordinates of at least eight three-dimensional points can be obtained correspondingly.
Specifically, in this embodiment of the application, the terminal may determine the coordinates of the three-dimensional point corresponding to an image point pair, according to the image coordinates of its two image points and the projection matrices of the two cameras, through the following formula (5).
x_i × (P_i X) = 0    (5)
where x_i refers to the homogeneous coordinates of the image point in the image point pair acquired by camera i of the camera group, P_i refers to the projection matrix of camera i, and X refers to the homogeneous coordinates of the three-dimensional point corresponding to the image point pair.
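A linear triangulation solving formula (5) may be sketched as follows; it assumes x1 and x2 are the pixel coordinates (u, v) of the two image points of a pair and P1 and P2 are the 3x4 projection matrices of the two cameras of the camera group.

```python
import numpy as np

def triangulate_point(x1, P1, x2, P2):
    # stack the two independent rows of x_i x (P_i X) = 0 for each camera
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]        # homogeneous 3D point minimizing ||A X||
    return X / X[3]   # normalize so the last coordinate is 1
```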
In the above manner, the terminal can determine the coordinates of the at least eight three-dimensional points corresponding to each of the plurality of camera groups. Then, the terminal can determine a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera. At this time, the reprojection error can be expressed by the following equation (6).
E = Σ_i Σ_k d(x_ik, P_i X_k)^2    (6)

where E is the reprojection error, i indexes the cameras, x_ik is the homogeneous coordinate corresponding to the image coordinates of the k-th image point acquired by camera i, X_k is the homogeneous coordinate of the three-dimensional point corresponding to the k-th image point, P_i is the projection matrix of camera i, and d(·, ·) denotes the image-plane distance between the observed point and the reprojected point.
After determining the reprojection error, the terminal may keep the image coordinates of each image point in equation (6) unchanged and minimize the reprojection error by continuously adjusting the coordinates of the three-dimensional points and the projection matrix of each camera. Specifically, the terminal can continuously adjust X_k and P_i through the Levenberg-Marquardt algorithm to minimize the reprojection error. For an implementation of minimizing the reprojection error through the Levenberg-Marquardt algorithm, reference may be made to the Levenberg-Marquardt algorithm in the related art, which is not described herein again in this embodiment of the present application.
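As an illustrative sketch of this step, the residuals behind equation (6) can be handed to a Levenberg-Marquardt solver such as scipy.optimize.least_squares; the flat parameter packing and the observation layout below are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, observations, n_cameras, n_points):
    # params packs all 3x4 projection matrices followed by all 3D points
    Ps = params[:n_cameras * 12].reshape(n_cameras, 3, 4)
    Xs = params[n_cameras * 12:].reshape(n_points, 3)
    res = []
    for i, k, u, v in observations:  # (camera index, point index, pixel coords)
        proj = Ps[i] @ np.append(Xs[k], 1.0)
        res.extend([u - proj[0] / proj[2], v - proj[1] / proj[2]])
    return np.asarray(res)

# keeping the observed pixel coordinates fixed, adjust X_k and P_i:
# result = least_squares(residuals, x0, method='lm',
#                        args=(observations, n_cameras, n_points))
```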
After minimizing the reprojection error, the terminal may determine the internal parameters and the external parameters of each camera from its adjusted projection matrix. The following explanation takes any camera A of the plurality of cameras as an example.
Specifically, since the projection matrices optimized by minimizing the reprojection error generally do not yet lie in a Euclidean frame, the terminal may determine an invertible matrix H so as to convert the adjusted projection matrix of camera A into Euclidean space.
Here, the internal reference matrix of a camera can generally be represented as follows:

    K = | f_x   s   u_0 |
        |  0   f_y  v_0 |
        |  0    0    1  |

In the embodiment of the present application, the terminal may assume f_x = f_y, s = 0 and u_0 = v_0 = 0, and the internal reference matrix and the projection matrix of each camera satisfy the following geometric relationship:

    K_i K_i^T ∝ P_i Ω* P_i^T    (7)

where P_i is the projection matrix of camera i and Ω* is the absolute dual quadric. According to this formula, Ω* is a 4 × 4 symmetric matrix and therefore includes 10 unknowns; in the case of f_x = f_y, s = 0 and u_0 = v_0 = 0, four equations can be obtained from the internal reference matrix and the projection matrix of one camera, so that if Ω* is to be calculated, the internal reference matrices and projection matrices of at least three cameras are needed. The terminal can therefore calculate Ω* based on the projection matrices of three cameras and the internal reference matrices under the assumption f_x = f_y, s = 0, u_0 = v_0 = 0.

In addition, the projection matrix of any one of the plurality of cameras and the invertible matrix H satisfy P_i H = K_i [R_i | t_i]. For the first camera in the initial camera group, since P_1 = [I | 0] and under the assumption f_x = f_y = f, s = 0, u_0 = v_0 = 0, this yields:

    H = | diag(f, f, 1)   0 |    (8)
        |      p^T        1 |

where p is an unknown three-dimensional vector. In addition, Ω* satisfies the following geometric relationship with the invertible matrix H:

    Ω* = H · diag(1, 1, 1, 0) · H^T    (9)

Since Ω* has already been calculated as described above, p and f can be determined according to the above equations (7) to (9), and after determining p and f, the invertible matrix H can be determined according to the above equation (8).
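Under the reconstruction above, once f and p are known, the invertible matrix H of equation (8) can be assembled and applied; a minimal sketch, assuming column-vector homogeneous points:

```python
import numpy as np

def euclidean_upgrade(f, p, P_list, X_list):
    # H as in equation (8): upper-left block diag(f, f, 1), last row (p^T, 1)
    H = np.eye(4)
    H[:3, :3] = np.diag([f, f, 1.0])
    H[3, :3] = p
    P_euc = [P @ H for P in P_list]                  # projection matrices transform by H
    X_euc = [np.linalg.solve(H, X) for X in X_list]  # homogeneous points transform by H^-1
    return P_euc, X_euc                              # (P H)(H^-1 X) = P X is preserved
```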
After the invertible matrix is determined, the terminal can convert the projection matrix of camera A and the coordinates of the at least eight three-dimensional points corresponding to the camera group in which camera A is located into Euclidean space through the invertible matrix. Then, the terminal may perform QR decomposition on the converted projection matrix of camera A to obtain the internal reference matrix of camera A, that is, the internal parameters of camera A, and the rotation matrix of camera A. Then, the terminal may determine the translation matrix of camera A according to the projection matrix of camera A, the internal reference matrix of camera A, and the rotation matrix of camera A. The rotation matrix and the translation matrix of camera A are the external parameters of camera A.
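The decomposition of a Euclidean projection matrix P = K[R | t] is commonly implemented with an RQ factorization of the left 3x3 block of P (the QR decomposition step mentioned above); the following sketch uses scipy.linalg.rq and is an illustration, not the application's own implementation.

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    K, R = rq(P[:, :3])               # upper-triangular K, orthogonal R
    S = np.diag(np.sign(np.diag(K)))  # fix signs so K has a positive diagonal
    K, R = K @ S, S @ R               # S @ S = I, so the product K R is unchanged
    t = np.linalg.solve(K, P[:, 3])   # translation from P = K [R | t]
    return K / K[2, 2], R, t          # normalize K so K[2, 2] = 1
```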
The foregoing mainly describes how to determine the internal parameters and the external parameters of the camera a based on the adjusted projection matrix of the camera a by taking any one of the plurality of cameras as an example, and for other cameras in the plurality of cameras, the internal parameters and the external parameters of each camera can be determined and obtained by referring to the foregoing method, which is not described herein again in this embodiment of the present application.
After the calibration parameters of each of the plurality of cameras are optimized by the method, the terminal may further determine distortion coefficients of each of the plurality of cameras by the method described below, and further optimize the calibration parameters of each of the plurality of cameras.
For example, after determining the internal parameters and the external parameters of each of the plurality of cameras, if the calibration rod includes at least four marker points of which three are collinear, the terminal may obtain the actually measured distance between the marker point located in the middle of the three collinear marker points on the calibration rod and any one of the remaining at least one marker point, and may then determine at least four estimated distances between that middle marker point and that marker point according to the adjusted coordinates, converted into Euclidean space, of the at least eight three-dimensional points corresponding to each camera group. The terminal may then generate a cost function based on the at least four estimated distances, the actually measured distance, the internal parameters and the external parameters of each of the plurality of cameras, the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, and the distortion coefficients of each of the plurality of cameras, where the function value of the cost function is used to indicate the sum of the pixel error and the scale error of the plurality of cameras; adjust the internal parameters and the external parameters of each of the plurality of cameras, the distortion coefficients of each of the plurality of cameras, and the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group so as to minimize the function value of the cost function; and update the distortion coefficient of each camera to the adjusted distortion coefficient of the corresponding camera, update the internal parameters of each camera to the adjusted internal parameters of the corresponding camera, and update the external parameters of each camera to the adjusted external parameters of the corresponding camera.
As shown in fig. 4, if the calibration rod includes four marker points of which three are collinear, the terminal may obtain the actually measured distance between the marker point located in the middle of the three collinear marker points and any remaining marker point, where the actually measured distance is measured by a worker. As can be seen from the foregoing description, for any camera group A, the at least eight image point pairs corresponding to the camera group A are extracted from at least four groups of images, that is, each group of images includes at least two image point pairs; each image point pair corresponds to one three-dimensional point, so each group of images corresponds to at least two three-dimensional points. Based on this, for each group of images, the terminal can determine the distance between the three-dimensional point corresponding to the middle marker point and the three-dimensional point corresponding to the other marker point, so as to obtain at least four estimated distances. Then, the terminal may generate a cost function according to the actually measured distance, the at least four estimated distances, the internal parameters and the external parameters of each of the plurality of cameras, the adjusted coordinates, converted into Euclidean space, of the at least eight three-dimensional points corresponding to each camera group, and the distortion coefficients of each of the plurality of cameras. The cost function may be represented as follows:
    e = Σ_k [ (u_k - u'_k)^2 + (v_k - v'_k)^2 ] + w Σ_l (d_l - d_m)^2    (10)

where e is the cost function, w is a predetermined weight, d_l is the l-th of the at least four estimated distances, and d_m is the actually measured distance.
It should be noted that (u'_k, v'_k) in the cost function can be represented by the following formulas (11) to (14), and (x_k, y_k) can be represented by the following formulas (15) to (18).

    x'_k = x_k (1 + n_1 r_k^2 + n_2 r_k^4 + n_3 r_k^6) + 2 p_1 x_k y_k + p_2 (r_k^2 + 2 x_k^2)    (11)

    y'_k = y_k (1 + n_1 r_k^2 + n_2 r_k^4 + n_3 r_k^6) + p_1 (r_k^2 + 2 y_k^2) + 2 p_2 x_k y_k    (12)

    u'_k = f_x x'_k + u_0    (13)

    v'_k = f_y y'_k + v_0    (14)

where (u_k, v_k) are the image coordinates of the k-th image point, K (with focal lengths f_x, f_y and principal point (u_0, v_0)) is the internal reference matrix of the camera acquiring the k-th image point, n_1, n_2 and n_3 are the radial distortion coefficients of the camera acquiring the k-th image point, and p_1 and p_2 are the tangential distortion coefficients of the camera acquiring the k-th image point.

    (x_c, y_c, z_c)^T = R (X_k, Y_k, Z_k)^T + t    (15)

    x_k = x_c / z_c    (16)

    y_k = y_c / z_c    (17)

    r_k^2 = x_k^2 + y_k^2    (18)

where (X_k, Y_k, Z_k) are the coordinates of the three-dimensional point corresponding to the k-th image point, R is the rotation matrix of the camera acquiring the k-th image point, and t is the translation matrix of the camera acquiring the k-th image point.
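Formulas (11) to (18), as reconstructed above, amount to transforming a three-dimensional point into the camera frame, applying radial and tangential distortion, and mapping through the internal reference matrix. The following sketch of this assumed Brown-style model mirrors those formulas; names and layout are illustrative assumptions.

```python
import numpy as np

def project_with_distortion(X, K, R, t, n, p):
    # X: 3D point (X_k, Y_k, Z_k); n = (n1, n2, n3); p = (p1, p2)
    xc = R @ np.asarray(X) + t                 # formula (15)
    x, y = xc[0] / xc[2], xc[1] / xc[2]        # formulas (16), (17)
    r2 = x * x + y * y                         # formula (18)
    radial = 1 + n[0] * r2 + n[1] * r2**2 + n[2] * r2**3
    xd = x * radial + 2 * p[0] * x * y + p[1] * (r2 + 2 * x * x)  # formula (11)
    yd = y * radial + p[0] * (r2 + 2 * y * y) + 2 * p[1] * x * y  # formula (12)
    u, v, w = K @ np.array([xd, yd, 1.0])      # formulas (13), (14)
    return u / w, v / w                        # estimated (u'_k, v'_k)
```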
After the cost function is constructed, the terminal can set the initial value of each distortion coefficient to 0, and then continuously adjust the internal parameters, the external parameters, the coordinates of the three-dimensional points and the distortion coefficients by using the Levenberg-Marquardt algorithm so as to minimize the function value of the cost function. When the function value of the cost function reaches its minimum, the terminal may take the internal parameters of each camera at this time as the internal parameters of the corresponding camera, the external parameters of each camera at this time as the external parameters of the corresponding camera, and the distortion coefficient of each camera at this time as the distortion coefficient of the corresponding camera.
In the embodiment of the application, the terminal can select any two cameras from the plurality of cameras as the initial camera group, and calibrate the two cameras included in the initial camera group by using the images acquired by the initial camera group to obtain calibration parameters; then, taking a camera to be calibrated and a first calibration camera as a first camera group and the camera to be calibrated and a second calibration camera as a second camera group, the terminal calibrates the camera to be calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the calibrated cameras. Therefore, in the embodiment of the application, as long as two cameras can simultaneously acquire images meeting the calibration conditions, those two cameras can serve as the initial camera group and be calibrated, and each of the remaining cameras can be calibrated as long as it acquires images meeting the calibration conditions together with any two of the calibrated cameras. Moreover, because the number of images that must contain all the marker points on the calibration rod at any one time is reduced, the worker has a greater freedom of movement and a wider movement range when swinging the calibration rod, and under these conditions the calibration result obtained when calibrating the cameras according to the acquired images is more accurate.
In addition, in the embodiment of the application, the terminal can determine the internal parameters and the external parameters of all the cameras, and can determine the distortion coefficients of all the cameras by generating the cost function, so that the calibration accuracy of the cameras is further improved.
Next, a multi-camera calibration apparatus provided in an embodiment of the present application is described.
Referring to fig. 5, an embodiment of the present application provides a multi-camera calibration apparatus 500, where the apparatus 500 includes:
the first calibration module 501 is configured to select any two cameras from the multiple cameras as an initial camera group, and calibrate the two cameras included in the initial camera group by using images acquired by the initial camera group to obtain calibration parameters;
the second calibration module 502 is configured to use the camera to be calibrated and the first calibration camera as a first camera set, use the camera to be calibrated and the second calibration camera as a second camera set, and calibrate the camera to be calibrated based on an image acquired by the first camera set and an image acquired by the second camera set to obtain calibration parameters, where the first calibration camera and the second calibration camera refer to any two cameras among the calibrated cameras.
Optionally, the first calibration module 501 includes:
the first determining unit is used for determining a basic matrix between two cameras included in the initial camera group based on the images which are collected by the initial camera group and meet the calibration condition;
the first calibration unit is used for determining a projection matrix of a first camera in the initial camera set as a preset projection matrix and determining calibration parameters of the first camera based on the preset projection matrix;
the first calibration unit is further configured to determine a projection matrix of a second camera in the initial camera group based on a base matrix between two cameras in the initial camera group, and determine calibration parameters of the second camera based on the projection matrix of the second camera.
Optionally, the second calibration module 502 includes:
the second determining unit is used for determining a basic matrix between two cameras in the corresponding camera group based on the images which are acquired by each camera group in the first camera group and the second camera group and meet the calibration condition;
the third determining unit is used for determining the projection matrix of the camera to be calibrated based on the projection matrix of the first calibration camera, the projection matrix of the second calibration camera and a basic matrix between two cameras in each camera group in the first camera group and the second camera group;
and the second calibration unit is used for determining calibration parameters of the camera to be calibrated based on the projection matrix of the camera to be calibrated.
Optionally, the image meeting the calibration condition refers to at least four groups of images obtained by image acquisition of the calibration rod at different positions by the corresponding camera group, each group of the at least four groups of images includes two images obtained by image acquisition of the calibration rod at the same position by the two cameras in the corresponding camera group, and the two images both include all the marker points on the calibration rod.
Optionally, the second determination unit includes:
the first obtaining subunit is configured to obtain at least two image point pairs from each of at least four groups of images acquired by each of the first camera group and the second camera group, so as to obtain at least eight image point pairs;
the first determining subunit is configured to acquire image coordinates of two image points included in each of the at least eight image point pairs, and determine a basis matrix between two cameras in the corresponding camera group based on the image coordinates of the two image points included in each of the at least eight image point pairs.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
the first obtaining subunit is specifically configured to:
detecting at least four marking points in a first image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the first image;
detecting at least four marking points in a second image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the second image;
and determining at least one image point pair according to the at least one mark point left in the first image except the three collinear mark points and the at least one mark point left in the second image except the three collinear mark points, and determining a mark point positioned in the middle of the three collinear mark points in the first image and a mark point positioned in the middle of the three collinear mark points in the second image as one image point pair.
Optionally, the apparatus 500 further comprises:
the first acquisition module is used for acquiring at least eight image point pairs from the images which are acquired by each camera group and meet the calibration conditions;
the first determining module is used for determining the coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point in at least eight image point pairs corresponding to each camera group and the projection matrixes of two cameras in the corresponding camera group;
the first determining module is further used for determining a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of at least eight three-dimensional points corresponding to each camera group and a projection matrix of each camera in the plurality of cameras;
the first adjusting module is used for adjusting the coordinates of at least eight three-dimensional points corresponding to each camera group and the projection matrix of each camera in the plurality of cameras under the condition that the image coordinates of each image point in at least eight image point pairs corresponding to each camera group are not changed, so that the reprojection error is minimized;
and the first updating module is used for updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
Optionally, the calibration rod includes at least four marker points, and three of the at least four marker points are collinear;
the apparatus 500 further comprises:
the second determining module is used for determining at least four estimated distances between a mark point positioned at the middle position in the three collinear mark points on the calibration rod and any mark point in the rest at least one mark point based on the adjusted coordinates of at least eight three-dimensional points corresponding to each camera set;
the second acquisition module is used for acquiring the actual measurement distance between the mark point positioned in the middle of the three collinear mark points on the calibration rod and any one of the rest at least one mark point;
the second adjusting module is used for generating a cost function based on at least four estimated distances, actually measured distances, calibration parameters of each camera in the plurality of cameras, coordinates of at least eight three-dimensional points corresponding to each adjusted camera group and distortion coefficients of each camera in the plurality of cameras, and the function value of the cost function is used for indicating the sum of pixel errors and scale errors of the plurality of cameras;
the second adjusting module is further used for adjusting the calibration parameters of each camera in the plurality of cameras, the distortion coefficient of each camera in the plurality of cameras and the adjusted coordinates of at least eight three-dimensional points corresponding to each camera group so as to minimize a function value of the cost function;
and the second updating module is used for updating the distortion coefficient of each camera in the plurality of cameras into the adjusted distortion coefficient of the corresponding camera, and updating the calibration parameter of each camera in the plurality of cameras into the adjusted calibration parameter of the corresponding camera.
To sum up, in the embodiment of the present application, any two cameras can be selected from the plurality of cameras as the initial camera group, and the two cameras included in the initial camera group are calibrated by using the images acquired by the initial camera group to obtain calibration parameters; a camera to be calibrated and a first calibration camera are taken as a first camera group, the camera to be calibrated and a second calibration camera are taken as a second camera group, and the camera to be calibrated is calibrated based on the images acquired by the first camera group and the images acquired by the second camera group to obtain calibration parameters, where the first calibration camera and the second calibration camera are any two of the calibrated cameras. Therefore, at any time during image acquisition by the plurality of cameras, as long as two cameras can simultaneously acquire images meeting the calibration conditions, they can be used as the initial camera group and calibrated, and each of the remaining cameras can be calibrated as long as it acquires images meeting the calibration conditions together with any two of the calibrated cameras. That is, when the plurality of cameras are calibrated, all the marker points of the calibration rod at a certain position at a certain time do not need to be simultaneously within the field of view of every camera, which reduces the difficulty of calibrating the cameras of a multi-camera system with a large working area. Moreover, because the number of images that must contain all the marker points on the calibration rod at any one time is reduced, the worker has a greater freedom of movement and a wider movement range when swinging the calibration rod, and under these conditions the calibration result obtained when calibrating the cameras according to the acquired images is more accurate.
It should be noted that: when calibrating a plurality of cameras, the multi-camera calibration apparatus provided in the above embodiment is only illustrated by dividing the functional modules, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above. In addition, the multi-camera calibration device and the multi-camera calibration method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
Fig. 6 shows a block diagram of a terminal 600 for multi-camera calibration according to an exemplary embodiment of the present application. The terminal 600 may be: a notebook computer or a desktop computer. The terminal 600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, the terminal 600 includes: a processor 601 and a memory 602.
The processor 601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 601 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 602 is used to store at least one instruction for execution by the processor 601 to implement the multi-camera calibration method provided by the method embodiments herein.
In some embodiments, the terminal 600 may further optionally include: a peripheral interface 603 and at least one peripheral. The processor 601, memory 602, and peripheral interface 603 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 603 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 604, a touch screen display 605, a camera 606, an audio circuit 607, a positioning component 608, and a power supply 609.
The peripheral interface 603 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 601 and the memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral interface 603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 604 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 604 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 605 is a touch display screen, the display screen 605 also has the ability to capture touch signals on or over the surface of the display screen 605. The touch signal may be input to the processor 601 as a control signal for processing. At this point, the display 605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 605 may be one, providing the front panel of the terminal 600; in other embodiments, the display 605 may be at least two, respectively disposed on different surfaces of the terminal 600 or in a folded design; in still other embodiments, the display 605 may be a flexible display disposed on a curved surface or on a folded surface of the terminal 600. Even more, the display 605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 605 may be made of LCD (liquid crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 606 is used to capture images or video. Optionally, camera assembly 606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 606 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuitry 607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 601 for processing or inputting the electric signals to the radio frequency circuit 604 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 601 or the radio frequency circuit 604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 607 may also include a headphone jack.
The positioning component 608 is used to locate the current geographic location of the terminal 600 to implement navigation or LBS (Location Based Service). The positioning component 608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 609 is used to provide power to the various components in terminal 600. The power supply 609 may be ac, dc, disposable or rechargeable. When the power supply 609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
Those skilled in the art will appreciate that the configuration shown in fig. 6 is not intended to be limiting of terminal 600 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The memory further includes one or more programs, and the one or more programs are stored in the memory and configured to be executed by the CPU. The one or more programs include instructions for performing the multi-camera calibration method provided by the embodiments of the present application.
The embodiment of the present application further provides a non-transitory computer-readable storage medium, and when instructions in the storage medium are executed by a processor of a terminal, the terminal is enabled to execute the multi-camera calibration method provided in the embodiment shown in fig. 2 or 3.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a computer, cause the computer to execute the multi-camera calibration method provided in the embodiments shown in fig. 2 or 3.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (16)

1. A multi-camera calibration method, the method comprising:
selecting any two cameras from a plurality of cameras as an initial camera set, and calibrating the two cameras included in the initial camera set by using images acquired by the initial camera set to obtain calibration parameters;
the method comprises the steps of taking a camera to be calibrated and a first calibration camera as a first camera set, taking the camera to be calibrated and a second calibration camera as a second camera set, calibrating the camera to be calibrated based on an image acquired by the first camera set and an image acquired by the second camera set to obtain calibration parameters, wherein the first calibration camera and the second calibration camera refer to any two cameras in the calibrated cameras.
2. The method according to claim 1, wherein the calibrating the two cameras included in the initial camera group by using the image acquired by the initial camera group to obtain calibration parameters comprises:
determining a basic matrix between two cameras included in the initial camera group based on the images which are acquired by the initial camera group and meet the calibration condition;
determining a projection matrix of a first camera in the initial camera set as a preset projection matrix, and determining calibration parameters of the first camera based on the preset projection matrix;
and determining a projection matrix of a second camera in the initial camera group based on a basic matrix between two cameras in the initial camera group, and determining calibration parameters of the second camera based on the projection matrix of the second camera.
3. The method according to claim 1, wherein calibrating the camera to be calibrated based on the image acquired by the first camera group and the image acquired by the second camera group to obtain calibration parameters comprises:
determining a basis matrix between two cameras in the corresponding camera group based on the images which are acquired by each camera group in the first camera group and the second camera group and meet the calibration condition;
determining a projection matrix of the camera to be calibrated based on the projection matrix of the first calibration camera, the projection matrix of the second calibration camera and a basic matrix between two cameras in each of the first camera group and the second camera group;
and determining the calibration parameters of the camera to be calibrated based on the projection matrix of the camera to be calibrated.
4. The method according to claim 2 or 3, wherein the images meeting the calibration condition refer to at least four groups of images obtained by image acquisition of the calibration rod at different positions by the corresponding camera group, each group of the at least four groups of images comprises two images obtained by image acquisition of the calibration rod at the same position by two cameras in the corresponding camera group, and the two images both comprise all the marker points on the calibration rod.
5. The method of claim 4, wherein determining a basis matrix between two cameras in the respective camera group based on the images acquired by each of the first camera group and the second camera group that satisfy calibration conditions comprises:
acquiring at least two image point pairs from each of at least four groups of images acquired by each camera group in the first camera group and the second camera group to obtain at least eight image point pairs;
and acquiring the image coordinates of the two image points included in each image point pair of the at least eight image point pairs, and determining a basic matrix between the two cameras in the corresponding camera group based on the image coordinates of the two image points included in each image point pair of the at least eight image point pairs.
6. The method of claim 5, wherein the calibration bar comprises at least four marker points, and three of the at least four marker points are collinear;
the acquiring at least two image point pairs from each of at least four sets of images acquired by each of the first camera set and the second camera set includes:
detecting at least four marking points in a first image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the first image;
detecting at least four marking points in a second image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the second image;
determining at least one image point pair according to at least one mark point left in the first image except the three collinear mark points and at least one mark point left in the second image except the three collinear mark points, and determining a mark point located in the middle of the three collinear mark points in the first image and a mark point located in the middle of the three collinear mark points in the second image as one image point pair.
7. The method according to any one of claims 2-4, further comprising:
acquiring at least eight image point pairs from the images which are acquired by each camera group and meet the calibration conditions;
determining coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point in at least eight image point pairs corresponding to each camera group and the projection matrixes of the two cameras in the corresponding camera group;
determining a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera of the plurality of cameras;
under the condition that the image coordinates of each image point in at least eight image point pairs corresponding to each camera group are unchanged, adjusting the coordinates of at least eight three-dimensional points corresponding to each camera group and the projection matrix of each camera in the plurality of cameras so as to minimize the reprojection error;
and updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
8. The method of claim 7, wherein the calibration bar comprises at least four marker points, and three of the at least four marker points are collinear;
after updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras, the method further includes:
determining at least four estimated distances between a mark point positioned at the middle position in the three collinear mark points on the calibration rod and any mark point in the rest at least one mark point based on the adjusted coordinates of at least eight three-dimensional points corresponding to each camera set;
acquiring an actual measurement distance between a mark point positioned in the middle of the three collinear mark points on the calibration rod and any one of the rest at least one mark point;
generating a cost function based on the at least four estimated distances, the measured distance, the updated calibration parameter of each camera in the plurality of cameras, the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, and the distortion coefficient of each camera in the plurality of cameras, wherein a function value of the cost function is used for indicating the sum of the pixel error and the scale error of the plurality of cameras;
adjusting a calibration parameter of each camera in the plurality of cameras, a distortion coefficient of each camera in the plurality of cameras and coordinates of at least eight three-dimensional points corresponding to each adjusted camera group so as to minimize a function value of the cost function;
and updating the distortion coefficient of each camera in the plurality of cameras to the adjusted distortion coefficient of the corresponding camera, and updating the calibration parameter of each camera in the plurality of cameras to the adjusted calibration parameter of the corresponding camera.
9. A multi-camera calibration apparatus, characterized in that the apparatus comprises:
the system comprises a first calibration module, a second calibration module and a third calibration module, wherein the first calibration module is used for selecting any two cameras from a plurality of cameras as an initial camera set, and calibrating the two cameras included in the initial camera set by using images acquired by the initial camera set to obtain calibration parameters;
the second calibration module is configured to use a camera to be calibrated and a first calibration camera as a first camera set, use the camera to be calibrated and a second calibration camera as a second camera set, calibrate the camera to be calibrated based on an image acquired by the first camera set and an image acquired by the second camera set to obtain calibration parameters, where the first calibration camera and the second calibration camera refer to any two cameras among the calibrated cameras.
10. The apparatus of claim 9, wherein the first calibration module comprises:
the first determining unit is used for determining a basic matrix between two cameras included in the initial camera group based on the images which are collected by the initial camera group and meet the calibration condition;
the first calibration unit is used for determining a projection matrix of a first camera in the initial camera group as a preset projection matrix and determining calibration parameters of the first camera based on the preset projection matrix;
the first calibration unit is further configured to determine a projection matrix of a second camera in the initial camera group based on a base matrix between two cameras in the initial camera group, and determine calibration parameters of the second camera based on the projection matrix of the second camera.
11. The apparatus of claim 9, wherein the second calibration module comprises:
a second determining unit, configured to determine a basis matrix between two cameras in a corresponding camera group based on images that satisfy a calibration condition and are acquired by each of the first camera group and the second camera group;
a third determining unit, configured to determine a projection matrix of the camera to be calibrated based on the projection matrix of the first calibration camera, the projection matrix of the second calibration camera, and a basis matrix between two cameras in each of the first camera group and the second camera group;
and the second calibration unit is used for determining calibration parameters of the camera to be calibrated based on the projection matrix of the camera to be calibrated.
12. The device according to claim 10 or 11, wherein the images meeting the calibration condition refer to at least four groups of images obtained by image-capturing the calibration bar at different positions by the corresponding camera group, each group of the at least four groups of images includes two images obtained by image-capturing the calibration bar at the same position by two cameras in the corresponding camera group, and the two images each include all the marker points on the calibration bar.
13. The apparatus of claim 12, wherein the second determining unit comprises:
the first obtaining subunit is configured to obtain at least two image point pairs from each of at least four groups of images acquired by each of the first camera group and the second camera group, so as to obtain at least eight image point pairs;
a first determining subunit, configured to obtain image coordinates of two image points included in each of the at least eight image point pairs, and determine a basis matrix between two cameras in the corresponding camera group based on the image coordinates of the two image points included in each of the at least eight image point pairs.
14. The device of claim 13, wherein the calibration rod comprises at least four marker points, and three of the at least four marker points are collinear;
the first obtaining subunit is specifically configured to:
detecting at least four marking points in a first image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the first image;
detecting at least four marking points in a second image in each group of images, and determining a marking point positioned in the middle of three collinear marking points in the second image;
determining at least one image point pair according to at least one mark point left in the first image except the three collinear mark points and at least one mark point left in the second image except the three collinear mark points, and determining a mark point located in the middle of the three collinear mark points in the first image and a mark point located in the middle of the three collinear mark points in the second image as one image point pair.
15. The apparatus according to any one of claims 10-12, further comprising:
the first acquisition module is used for acquiring at least eight image point pairs from the images which are acquired by each camera group and meet the calibration conditions;
the first determining module is used for determining the coordinates of at least eight three-dimensional points corresponding to each camera group based on the image coordinates of each image point in at least eight image point pairs corresponding to each camera group and the projection matrixes of two cameras in the corresponding camera group;
the first determining module is further configured to determine a reprojection error based on the image coordinates of each image point in the at least eight image point pairs corresponding to each camera group, the coordinates of the at least eight three-dimensional points corresponding to each camera group, and the projection matrix of each camera in the plurality of cameras;
a first adjusting module, configured to adjust coordinates of at least eight three-dimensional points corresponding to each camera group and a projection matrix of each camera of the multiple cameras under a condition that an image coordinate of each image point in at least eight image point pairs corresponding to each camera group is unchanged, so as to minimize the reprojection error;
and the first updating module is used for updating the calibration parameters of the corresponding camera based on the adjusted projection matrix of each camera in the plurality of cameras.
16. The device of claim 15, wherein the calibration rod comprises at least four marker points, and three of the at least four marker points are collinear;
the device further comprises:
the second determining module is used for determining at least four estimated distances between a mark point positioned at the middle position in the three collinear mark points on the calibration rod and any mark point in the rest at least one mark point based on the adjusted coordinates of at least eight three-dimensional points corresponding to each camera set;
the second acquisition module is used for acquiring the actual measurement distance between the mark point positioned in the middle of the three collinear mark points on the calibration rod and any one of the rest at least one mark point;
a second adjusting module, configured to generate a cost function based on the at least four estimated distances, the measured distance, the updated calibration parameter of each of the plurality of cameras, the adjusted coordinates of the at least eight three-dimensional points corresponding to each camera group, and a distortion coefficient of each of the plurality of cameras, where a function value of the cost function is used to indicate a sum of a pixel error and a scale error of the plurality of cameras;
the second adjusting module is further configured to adjust a calibration parameter of each of the plurality of cameras, a distortion coefficient of each of the plurality of cameras, and coordinates of at least eight three-dimensional points corresponding to each adjusted camera group, so as to minimize a function value of the cost function;
and the second updating module is used for updating the distortion coefficient of each camera in the plurality of cameras into the adjusted distortion coefficient of the corresponding camera, and updating the calibration parameter of each camera in the plurality of cameras into the adjusted calibration parameter of the corresponding camera.
CN201810732946.5A 2018-07-05 2018-07-05 Multi-camera calibration method and device Active CN110689580B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810732946.5A CN110689580B (en) 2018-07-05 2018-07-05 Multi-camera calibration method and device

Publications (2)

Publication Number Publication Date
CN110689580A (en) 2020-01-14
CN110689580B (en) 2022-04-15

Family

ID=69106865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810732946.5A Active CN110689580B (en) 2018-07-05 2018-07-05 Multi-camera calibration method and device

Country Status (1)

Country Link
CN (1) CN110689580B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145072A1 (en) * 2012-03-26 2013-10-03 三菱電機株式会社 Camera calibration method, camera calibration program and camera calibration device
CN104616348A (en) * 2015-01-15 2015-05-13 东华大学 Method for reconstructing fabric appearance based on multi-view stereo vision
CN104680535A (en) * 2015-03-06 2015-06-03 南京大学 Calibration target, calibration system and calibration method for binocular direct-vision camera
CN104794728A (en) * 2015-05-05 2015-07-22 成都元天益三维科技有限公司 Method for reconstructing real-time three-dimensional face data with multiple images
CN106127722A (en) * 2016-05-03 2016-11-16 深圳视觉龙智能传感器有限公司 The demarcation of polyphaser and para-position applying method
CN106875451A (en) * 2017-02-27 2017-06-20 安徽华米信息科技有限公司 Camera calibration method, device and electronic equipment
CN106982370A (en) * 2017-05-03 2017-07-25 武汉科技大学 A kind of camera high-precision calibration scaling board of many line-scan digital camera detecting systems and the method for realizing calibration

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MATTEO MUNARO et al.: "OpenPTrack: Open source multi-camera calibration and people tracking for RGB-D camera networks", Robotics and Autonomous Systems *
方华猛 et al.: "Fast calibration of the camera response function in high dynamic range image synthesis", Acta Photonica Sinica (《光子学报》) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111566701A (en) * 2020-04-02 2020-08-21 深圳市瑞立视多媒体科技有限公司 Method, device and equipment for calibrating scanning field edge under large-space environment and storage medium
CN111566701B (en) * 2020-04-02 2021-10-15 深圳市瑞立视多媒体科技有限公司 Method, device and equipment for calibrating scanning field edge under large-space environment and storage medium
CN111836036A (en) * 2020-06-15 2020-10-27 南京澳讯人工智能研究院有限公司 Video image acquisition and processing device and system
WO2022007886A1 (en) * 2020-07-08 2022-01-13 深圳市瑞立视多媒体科技有限公司 Automatic camera calibration optimization method and related system and device
CN112212977A (en) * 2020-09-22 2021-01-12 北京理工大学 High-speed high-resolution high-precision ultrahigh-temperature molten pool temperature field online monitoring device and method
CN112212977B (en) * 2020-09-22 2022-02-08 北京理工大学 High-speed high-resolution high-precision ultrahigh-temperature molten pool temperature field online monitoring device and method
CN114205483A (en) * 2022-02-17 2022-03-18 杭州思看科技有限公司 Scanner precision calibration method and device and computer equipment
CN114205483B (en) * 2022-02-17 2022-07-29 杭州思看科技有限公司 Scanner precision calibration method and device and computer equipment

Also Published As

Publication number Publication date
CN110689580B (en) 2022-04-15

Similar Documents

Publication Title
CN110689580B (en) Multi-camera calibration method and device
CN112270718B (en) Camera calibration method, device, system and storage medium
CN109886208B (en) Object detection method and device, computer equipment and storage medium
CN113280752B (en) Groove depth measurement method, device and system and laser measurement equipment
CN109522863B (en) Ear key point detection method and device and storage medium
CN111256676B (en) Mobile robot positioning method, device and computer readable storage medium
CN111784841B (en) Method, device, electronic equipment and medium for reconstructing three-dimensional image
CN112150560A (en) Method and device for determining vanishing point and computer storage medium
CN113627413A (en) Data labeling method, image comparison method and device
CN111126276A (en) Lane line detection method, lane line detection device, computer equipment and storage medium
CN111932604A (en) Method and device for measuring human ear characteristic distance
CN112113665A (en) Temperature measuring method, device, storage medium and terminal
CN109754439B (en) Calibration method, calibration device, electronic equipment and medium
CN110505510B (en) Video picture display method and device in large-screen system and storage medium
CN111898535A (en) Target identification method, device and storage medium
CN111261174B (en) Audio classification method and device, terminal and computer readable storage medium
CN109714585B (en) Image transmission method and device, display method and device, and storage medium
CN112882094B (en) First-arrival wave acquisition method and device, computer equipment and storage medium
CN111899615B (en) Scoring method, device, equipment and storage medium for experiment
CN112243083B (en) Snapshot method and device and computer storage medium
CN110443841B (en) Method, device and system for measuring ground depth
CN111583339A (en) Method, device, electronic equipment and medium for acquiring target position
CN113689484B (en) Method and device for determining depth information, terminal and storage medium
CN112150554B (en) Picture display method, device, terminal and storage medium
CN112329355B (en) Method and device for determining single-well control area, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310051 5th floor, building 1, building 2, no.700 Dongliu Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.
