CN115994950A - Calibration method and device for binocular camera, computer readable medium and equipment - Google Patents

Calibration method and device for binocular camera, computer readable medium and equipment

Publication number: CN115994950A
Application number: CN202211623863.5A
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventors: 艾绍华, 梅海峰, 詹东晖, 黄春辉
Current/Original Assignee: Xiamen Ruiwei Information Technology Co., Ltd.
Application filed by Xiamen Ruiwei Information Technology Co., Ltd., priority to CN202211623863.5A

Abstract

The embodiment of the application provides a calibration method and device for a binocular camera, a computer readable medium and equipment. The method comprises the following steps: acquiring a first shooting image and a second shooting image respectively shot by the two cameras for the same shooting content; determining the feature point information corresponding to each calibration image in the first shooting image and the second shooting image respectively, so as to determine the target calibration information corresponding to the binocular camera, and determining a first conversion matrix and a second conversion matrix from each shooting image to the coplanar line alignment plane respectively; performing depth recognition on the face image to be detected contained in each shooting image and determining the corresponding depth information; and comparing the depth information with the actual distance to determine whether the calibration result is valid. The technical scheme of the embodiment of the application can improve the calibration efficiency of the binocular camera while ensuring the accuracy of the calibration result.

Description

Calibration method and device for binocular camera, computer readable medium and equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for calibrating a binocular camera, a computer readable medium, and a device.
Background
In the field of face recognition, a binocular camera can construct three-dimensional face depth information and effectively defend against various liveness attacks such as printed photos and videos, so binocular cameras are widely used. In the prior art, either the binocular camera is moved around a calibration plate and photographs are taken at different positions for calibration, or a plurality of calibration plates arranged at different angles are combined into a whole and the binocular camera takes a group of photographs for calibration. However, in the above methods the binocular camera needs to take a plurality of pictures, the distance between the calibration plate and the binocular camera is fixed, and the depth information is insufficient, so the accuracy of the calibration suffers a certain loss. Therefore, how to improve the calibration efficiency of the binocular camera while ensuring the accuracy of the calibration result has become a technical problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a calibration method, a device, a computer readable medium and equipment for a binocular camera, so that the calibration efficiency of the binocular camera can be improved at least to a certain extent, and the accuracy of a calibration result is ensured.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned in part by the practice of the application.
According to an aspect of an embodiment of the present application, there is provided a calibration method for a binocular camera, including:
acquiring a first shooting image and a second shooting image respectively shot by a first camera and a second camera in the binocular camera for the same shooting content, wherein the shooting content comprises a plurality of calibration images arranged at different distances and a face image to be checked, the calibration images comprise a plurality of characteristic points equidistant in the transverse and longitudinal directions, and the face image to be checked and the calibration images, as well as the calibration images themselves, do not occlude one another;
performing image recognition on the first shooting image and the second shooting image, and respectively determining characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image;
determining target calibration information corresponding to the binocular camera according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, wherein the target calibration information comprises monocular calibration information and binocular three-dimensional calibration information;
according to the target calibration information, based on a Bouguet algorithm, respectively determining a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to a coplanar line alignment plane;
Carrying out depth recognition on face images to be detected contained in the first shooting image and the second shooting image according to the first conversion matrix and the second conversion matrix, and determining depth information corresponding to the face images to be detected;
and comparing the depth information with the actual distance between the face image to be detected and the binocular camera, if the comparison result meets a preset rule, determining that the calibration result is effective, and carrying out association storage on the target calibration information, the first conversion matrix and the second conversion matrix.
In one aspect of the embodiments of the present application, there is provided a calibration device for a binocular camera, the device including:
an acquisition module, configured to acquire a first shooting image and a second shooting image respectively shot by a first camera and a second camera in the binocular camera for the same shooting content, wherein the shooting content comprises a face image to be checked and a plurality of calibration images arranged at different distances, the calibration images comprise a plurality of characteristic points equidistant in the transverse and longitudinal directions, and the face image to be checked and the calibration images do not occlude one another;
The first determining module is used for carrying out image recognition on the first shooting image and the second shooting image, and determining characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image respectively;
the second determining module is used for determining target calibration information corresponding to the binocular camera according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, wherein the target calibration information comprises monocular calibration information and binocular three-dimensional calibration information;
the third determining module is used for respectively determining a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to a coplanar line alignment plane based on a Bouguet algorithm according to the target calibration information;
a fourth determining module, configured to perform depth recognition on face images to be detected included in the first captured image and the second captured image according to the first conversion matrix and the second conversion matrix, and determine depth information corresponding to the face images to be detected;
and the processing module is used for comparing the depth information with the actual distance between the face image to be detected and the binocular camera, determining that the calibration result is effective if the comparison result meets a preset rule, and carrying out association storage on the target calibration information, the first conversion matrix and the second conversion matrix.
According to an aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method for calibrating a binocular camera as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; and the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors are enabled to realize the calibration method of the binocular camera.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the calibration method of the binocular camera provided in the above embodiment.
In the technical schemes provided by some embodiments of the present application, a first shooting image and a second shooting image respectively shot by a first camera and a second camera in a binocular camera for the same shooting content are acquired, wherein the shooting content includes a plurality of calibration images arranged at different distances and a face image to be checked, each calibration image includes a plurality of feature points equidistant in the transverse and longitudinal directions, and the face image to be checked and the calibration images do not occlude one another; image recognition is performed on the first shooting image and the second shooting image, and the feature point information corresponding to each calibration image in each shooting image is determined respectively; the target calibration information corresponding to the binocular camera is determined, and a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to the coplanar line alignment plane are determined respectively based on the calibration information and the Bouguet algorithm; depth recognition is then performed on the face images to be detected in the two shooting images according to the first conversion matrix and the second conversion matrix, the depth information corresponding to the face image to be detected is determined, and the depth information is compared with the actual distance between the face image to be detected and the binocular camera to determine whether the calibration result is valid. In this way, the depth information required for calibration is satisfied by the plurality of calibration images arranged at different distances, and the binocular camera can complete both calibration and verification by taking a single photograph, which improves calibration efficiency while also ensuring the accuracy of the calibration result.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a flow diagram of a method of calibrating a binocular camera in accordance with one embodiment of the present application;
FIG. 2 shows a schematic view of a first captured image, a second captured image, according to one embodiment of the present application;
FIG. 3 shows a block diagram of a calibration device for a binocular camera in accordance with one embodiment of the present application;
fig. 4 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present application. One skilled in the relevant art will recognize, however, that the aspects of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 shows a flow diagram of a calibration method of a binocular camera according to one embodiment of the present application. The method may be applied to a terminal device, which may include, but is not limited to, one or more of a smart phone, a tablet computer, a laptop computer, a desktop computer; it may also be applied to a server, such as a physical server or a cloud server, which is not particularly limited in this application.
Referring to fig. 1, the calibration method of the binocular camera at least includes steps S110 to S160, which are described in detail as follows:
in step S110, a first shot image and a second shot image which are shot by a first camera and a second camera in the binocular camera respectively aiming at the same shot content are acquired, the shot content comprises a plurality of calibration images with different distance settings and a face image to be checked, the calibration images comprise a plurality of feature points with equal distances in the transverse and longitudinal directions, and the face image to be checked, the calibration images and the calibration images are not shielded from each other.
The calibration image can be a checkerboard image or a dot matrix image with a plurality of characteristic points, wherein the transverse and longitudinal distances among the characteristic points are equal.
In an embodiment, the application further provides calibration equipment, which comprises a module placement table, a plurality of calibration images of different specifications, a face image to be detected and a bracket. When calibrating, the face image to be detected and the plurality of calibration images can be placed at different distances on the module placement table. It should be understood that the different distances described in this application are different distances to the plane in which the binocular camera lies. In an example, the number of calibration images may be 4; since the focal length of a face recognition module is generally about 60 cm and the effective recognition distance is 30 cm to 150 cm, the placement distances of the 4 calibration images may be 30 cm, 70 cm, 110 cm and 150 cm respectively, and the face image to be detected may be placed at 90 cm. The face image to be checked and the calibration images do not occlude one another; for example, the four calibration images may be placed in a 2x2 grid with the face image to be detected in the middle of the four calibration images, and so on.
The first camera and the second camera in the binocular camera can respectively shoot the face image to be detected and the calibration image so as to obtain a corresponding first shooting image and a corresponding second shooting image, and it is understood that the first shooting image and the second shooting image both comprise the face image to be detected and a plurality of calibration images.
In step S120, image recognition is performed on the first captured image and the second captured image, and feature point information corresponding to each calibration image in the first captured image and the second captured image is determined.
In this embodiment, the first shot image and the second shot image may be subjected to image recognition, so as to determine the feature point information corresponding to each calibration image in the first shot image and the second shot image.
In one embodiment of the present application, the calibration image may be a checkerboard image, and the corner points of the checkerboard image are feature points.
Performing image recognition on the first shot image and the second shot image and determining the feature point information corresponding to each calibration image in the first shot image and the second shot image respectively comprises:
and respectively carrying out image recognition on the areas where the calibration images in the first shooting image and the second shooting image are located, and determining the quantity of angular points and the coordinates of the angular points corresponding to the calibration images in the first shooting image and the second shooting image to serve as characteristic point information.
Taking the case where the plurality of calibration images are placed in a 2x2 grid as an example (as shown in fig. 2), the shot image can be divided into 4 cells about its center point, with each of the four calibration images displayed completely within a different cell and the face image to be detected placed in the middle position so as not to occlude the corner points of the calibration images.
Image recognition is then performed on each cell, the number and coordinates of the corner points in each cell (i.e. corresponding to each calibration image) are recognized, and the corner points can be numbered and sorted row by row and column by column to obtain a corresponding corner point sequence. Thus, 4 corner point sequences can be obtained for one shot image, and both the first shot image and the second shot image are processed in the manner described above.
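The per-cell corner detection just described can be sketched in Python. The patent names no library; OpenCV's `findChessboardCorners` and the helper names below are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def split_into_quadrants(img):
    """Divide the captured image into 4 cells about its center point,
    one cell per calibration board (2x2 layout described above)."""
    h, w = img.shape[:2]
    cy, cx = h // 2, w // 2
    return [img[:cy, :cx], img[:cy, cx:], img[cy:, :cx], img[cy:, cx:]]

def corners_per_quadrant(img, pattern=(9, 6)):
    """Detect checkerboard corners in each cell; returns a list of
    (corner_count, corner_coords) pairs, one per board, with coordinates
    shifted back into full-image pixel coordinates."""
    import cv2  # OpenCV; imported lazily so the split helper stays dependency-free
    h, w = img.shape[:2]
    offsets = [(0, 0), (0, w // 2), (h // 2, 0), (h // 2, w // 2)]
    results = []
    for cell, (oy, ox) in zip(split_into_quadrants(img), offsets):
        found, corners = cv2.findChessboardCorners(cell, pattern)
        if found:
            corners = corners.reshape(-1, 2) + np.array([ox, oy], dtype=np.float32)
            results.append((len(corners), corners))
        else:
            results.append((0, None))
    return results
```

Sorting each returned corner array row by row and column by column then yields the numbered corner sequence the text describes.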
In an example, after performing image recognition on the areas where the calibration images in the first captured image and the second captured image are located, and determining the number of corner points and the coordinates of the corner points corresponding to the calibration images in the first captured image and the second captured image as the feature point information, the method further includes:
comparing the numbers of the angular points corresponding to the same calibration image in the first shooting image and the second shooting image, and if the numbers of the angular points corresponding to the same calibration image are different, carrying out image capture again through the first camera and the second camera.
In this embodiment, to ensure the accuracy of the subsequent calibration result, if the numbers of corner points corresponding to the same calibration image differ between the two captured images, shooting can be performed again after adjustment, for example after checking whether the face image to be detected occludes a calibration image or whether the camera focal length needs adjusting.
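This consistency check is a one-liner; the sketch below assumes each image yields a list of (corner_count, corner_coords) pairs, one per board (an assumed data shape, not specified by the patent).

```python
def counts_match(corners_left, corners_right):
    """Compare per-board corner counts between the two captured images.
    A mismatch (or a board with zero detected corners) indicates occlusion
    or focus problems, so the shot should be retaken after adjustment."""
    return all(cl[0] == cr[0] and cl[0] > 0
               for cl, cr in zip(corners_left, corners_right))
```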
In step S130, target calibration information corresponding to the binocular camera is determined according to the feature point information corresponding to each calibration image in the first shot image and the second shot image, where the target calibration information includes monocular calibration information and binocular stereo calibration information.
In an embodiment, the target calibration information may be obtained by calibrating the binocular camera according to the feature point information corresponding to each calibration image in the first shot image and the second shot image, where the target calibration information may include monocular calibration information and binocular stereo calibration information.
Specifically, monocular calibration can be performed on the first camera and the second camera based on the Zhang Zhengyou calibration method, according to the feature point information corresponding to each calibration image in the first shot image and the second shot image respectively, to determine the intrinsic parameter matrix, extrinsic parameter matrix and radial distortion parameters corresponding to each of the first camera and the second camera, thereby obtaining the monocular calibration information.
Binocular stereo calibration is then performed on the first camera and the second camera based on the intrinsic parameter matrix, extrinsic parameter matrix and radial distortion parameters corresponding to each camera, to determine the relative relationship between the coordinate systems of the first camera and the second camera, including the rotation matrix and translation vector between the two, thereby obtaining the binocular stereo calibration information.
Specifically, the relative relationship between the coordinate systems of the left and right cameras (the first camera and the second camera) can be described by the rotation matrix R and the translation vector T, as follows:
A world coordinate system is established at the left camera. Suppose there is a point P in space whose coordinates in the world coordinate system are P_w; its coordinates in the left and right camera coordinate systems can be expressed as:
P_l = R_l·P_w + T_l    (1)
P_r = R_r·P_w + T_r    (2)
From (1) it follows that
P_w = R_l^(-1)·(P_l - T_l)
and substituting into (2) gives:
P_r = R_r·R_l^(-1)·P_l + (T_r - R_r·R_l^(-1)·T_l)    (3)
From equation (3):
R = R_r·R_l^(-1)
T = T_r - R·T_l
where R_l, T_l are the rotation matrix and translation vector relative to the calibration object obtained by monocular calibration of the left camera, and R_r, T_r are the rotation matrix and translation vector relative to the calibration object obtained by monocular calibration of the right camera. The left and right cameras each perform monocular calibration to obtain R_l, T_l, R_r, T_r; substituting these into the formulas above yields the rotation matrix R and translation T between the left and right cameras.
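A minimal numeric sketch of these two formulas (numpy; the function name is illustrative). Since rotation matrices are orthogonal, R_l^(-1) equals R_l transposed.

```python
import numpy as np

def stereo_extrinsics(R_l, T_l, R_r, T_r):
    """Relative pose of the right camera w.r.t. the left, from each camera's
    monocular extrinsics for the same calibration object:
        R = R_r · R_l^(-1),  T = T_r - R · T_l
    (R_l^(-1) = R_l.T because rotation matrices are orthogonal)."""
    R = R_r @ R_l.T
    T = T_r - R @ T_l
    return R, T
```

A quick self-check: any world point mapped through both monocular extrinsics must satisfy P_r = R·P_l + T.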
In step S140, according to the target calibration information, based on a Bouguet algorithm, a first transformation matrix and a second transformation matrix from the first shot image and the second shot image to the coplanar line alignment plane are respectively determined.
In this embodiment, binocular stereo correction is performed on the first camera and the second camera based on a Bouguet algorithm according to the obtained target calibration information, and a first conversion matrix and a second conversion matrix of the first shot image and the second shot image to the coplanar line alignment plane are calculated.
The Bouguet method decomposes the rotation matrix R into two half-rotations r_l and r_r, one for each of the left and right cameras. The principle of the decomposition is to minimize the distortion caused by re-projecting the left and right images while maximizing the common viewing area of the left and right views.
Specifically, the rotation matrix of the right image plane relative to the left image plane is decomposed into the two matrices r_l and r_r, called the synthesized rotation matrices of the left and right cameras:
r_l = R^(1/2)
r_r = R^(-1/2)
where R^(-1/2) is the inverse matrix of R^(1/2).
The left and right cameras each rotate by half so that their optical axes become parallel. The imaging planes of the left and right cameras are then parallel, but the baseline is not yet parallel to the imaging plane.
A transformation matrix R_rect is therefore constructed such that the baseline becomes parallel to the imaging plane. The construction uses the translation (offset) T of the right camera relative to the left camera.
Construct e_1 so that the transformation takes the left epipole to infinity and the epipolar lines become horizontal; the translation vector between the projection centers of the left and right cameras gives the left epipole direction:
e_1 = T / ||T||
The direction e_2 must be orthogonal to the main optical axis direction, lie along the image direction and be perpendicular to e_1; it can be obtained by taking the cross product of the main optical axis direction with e_1 and normalizing:
e_2 = [-T_y, T_x, 0]^T / sqrt(T_x^2 + T_y^2)
Having obtained e_1 and e_2, e_3 must be orthogonal to both, so it is naturally their cross product:
e_3 = e_1 × e_2
The matrix that takes the left camera's epipole to infinity is then R_rect, whose rows are e_1, e_2 and e_3:
R_rect = [e_1^T; e_2^T; e_3^T]
the integral rotation matrix of the left and right cameras is obtained by multiplying the synthesized rotation matrix by the transformation matrix. The left and right camera coordinate systems are multiplied by the respective overall rotation matrices such that the main optical axes of the left and right cameras are parallel and the image plane is parallel to the base line. By the above two integral rotation matrices, an ideal binocular stereoscopic image arranged in parallel can be obtained.
E l =R rect R l
E r =R rect R r
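The construction of R_rect from the baseline direction can be sketched in numpy (an illustrative sketch of the rectification equations above, not the patent's code; it assumes the baseline is not purely along the optical axis, so T_x and T_y are not both zero).

```python
import numpy as np

def rect_transform(T):
    """Build the transform R_rect that takes the left epipole to infinity,
    making the baseline parallel to the image plane.
    T: translation between the two projection centers, shape (3,)."""
    e1 = T / np.linalg.norm(T)              # left epipole (baseline) direction
    e2 = np.array([-T[1], T[0], 0.0])       # in-image direction, perpendicular
    e2 /= np.linalg.norm(e2)                # to both e1 and the optical axis
    e3 = np.cross(e1, e2)                   # completes the orthonormal frame
    return np.vstack([e1, e2, e3])          # rows e1, e2, e3
```

Applying this matrix to the baseline direction maps it onto the image x-axis, which is exactly the row-alignment property required of the coplanar line alignment plane.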
In step S150, according to the first conversion matrix and the second conversion matrix, depth recognition is performed on the face image to be detected included in the first captured image and the second captured image, and depth information corresponding to the face image to be detected is determined.
In this embodiment, according to the first conversion matrix and the second conversion matrix, face detection and depth recognition may be performed on face images to be detected included in the first captured image and the second captured image, so as to determine depth information corresponding to the face images to be detected.
Specifically, face detection may be performed on the first shot image and the second shot image, first coordinate information of face contour mark points corresponding to the shot images may be identified, for example, each face may correspondingly identify 68 face contour mark points, and coordinate information of each face contour mark point may be determined as the first coordinate information.
And then converting the first coordinate information of the face contour marking points corresponding to the identified first shooting image and the second shooting image into a coplanar line alignment plane through a first conversion matrix and a second conversion matrix respectively so as to determine the second coordinate information of the face contour marking points corresponding to the first shooting image and the second shooting image.
Furthermore, the depth information corresponding to each face contour mark point is determined from the X-direction coordinate difference between the second coordinate information of the same face contour mark point in the two images. Specifically, the depth information Z = B·f/d, where B is the center distance between the first camera and the second camera, also called the baseline distance, which is known; f is the camera focal length, which can be obtained from the camera's intrinsic parameter matrix; and d = X_l - X_r, i.e. the difference between the X values in the second coordinate information of the same face contour mark point. In this way, the depth information corresponding to each face contour mark point can be calculated.
Based on the depth information of each face contour mark point, the target depth information of the face to be detected can be determined: for example, the average of all the depth values can be taken as the target depth information of the face image to be detected, or the median of all the depth values can be selected, or one depth value can be chosen at random as the target depth information corresponding to the face image to be detected, and so on. Those skilled in the art may choose the corresponding determination method according to actual implementation needs, which is not particularly limited in this application.
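A minimal sketch of the depth computation Z = B·f/d and the aggregation step (numpy; the function names and the choice of the median as the aggregate are illustrative; the text equally allows the mean or a single sample).

```python
import numpy as np

def landmark_depths(x_left, x_right, baseline, focal):
    """Z = B*f/d per landmark, with d = X_l - X_r in the rectified
    (row-aligned) coordinates. x_left/x_right: X coordinates of the same
    contour landmarks in each rectified image; baseline is in the same
    unit as the desired depth; focal is in pixels."""
    d = np.asarray(x_left, dtype=float) - np.asarray(x_right, dtype=float)
    return baseline * focal / d

def target_depth(depths):
    """Aggregate per-landmark depths into one value for the face; the
    median is robust to a few bad landmark matches."""
    return float(np.median(depths))
```

For instance, with a 6 cm baseline and an 800-pixel focal length, a 48-pixel disparity corresponds to a depth of 1 m, which would then be compared against the known placement distance of the face image.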
Referring to fig. 1, in step S160, the depth information is compared with an actual distance between the face image to be detected and the binocular camera, if the comparison result meets a predetermined rule, the calibration result is determined to be valid, and the target calibration information, the first conversion matrix and the second conversion matrix are stored in association.
In this embodiment, after the target depth information corresponding to the face image to be detected is determined, it can be compared with the actual distance between the face image to be detected and the binocular camera. Those skilled in the art can preset the predetermined rule according to prior experience to determine whether the comparison result meets actual use requirements. For example, the predetermined rule may be that the target depth information equals the actual distance, or that the error between the target depth information and the actual distance falls within a certain tolerance, etc.
Therefore, if the comparison result meets the predetermined rule, the calibration result is accurate, so the target calibration information, the first conversion matrix and the second conversion matrix can be stored in association for later use. If the comparison result does not meet the predetermined rule, the calibration result is poor, and calibration can be carried out again after adjustment.
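For example, a tolerance-based predetermined rule of the kind described above might be sketched as follows (the 2% tolerance is an assumed value for illustration; the application does not fix a particular threshold):

```python
def calibration_valid(target_depth_mm, actual_distance_mm, tol_ratio=0.02):
    """Return True if the reconstructed target depth matches the measured
    camera-to-face distance within a relative tolerance (assumed 2% here)."""
    return abs(target_depth_mm - actual_distance_mm) <= tol_ratio * actual_distance_mm
```

Under this rule, 598 mm reconstructed against 600 mm measured passes, while 650 mm fails and would trigger adjustment and recalibration.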
Based on the embodiment shown in fig. 1, a first shooting image and a second shooting image, respectively shot by a first camera and a second camera in a binocular camera for the same shooting content, are acquired, where the shooting content includes a face image to be detected and a plurality of calibration images set at different distances, the calibration images include a plurality of feature points equidistant in the transverse and longitudinal directions, and the calibration images and the face image to be detected do not occlude one another. Image recognition is performed on the first shooting image and the second shooting image, and the feature point information corresponding to each calibration image in the first shooting image and the second shooting image is determined respectively; the target calibration information corresponding to the binocular camera is then determined, and a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to the coplanar line alignment plane are determined respectively based on the calibration information and the Bouguet algorithm. Depth recognition is performed on the face images to be detected in the first shooting image and the second shooting image according to the first conversion matrix and the second conversion matrix, the depth information corresponding to the face image to be detected is determined, and the depth information is compared with the actual distance between the face image to be detected and the binocular camera, so as to determine whether the calibration result is valid. Therefore, the depth information required by calibration can be obtained through a plurality of calibration images set at different distances, and the binocular camera can complete both calibration and verification with a single shot, which improves the calibration efficiency while ensuring the accuracy of the calibration result.
According to the calibration method for the binocular camera provided by the present application, the parameter calibration and the calibration check of the cameras can be completed with only one shot. The calibration images and the face image to be detected are fixed and the cameras do not need to move, so no motion error is introduced; and since the distances between the plurality of calibration images and the cameras differ, the calibration precision is higher, the depth information of faces at different distances can be well reconstructed, and the accuracy of the calibration effect is guaranteed.
The following describes an embodiment of the device of the present application, which may be used to perform the calibration method of the binocular camera in the above embodiment of the present application. For details not disclosed in the embodiments of the device of the present application, please refer to the embodiments of the calibration method of the binocular camera described in the present application.
Fig. 3 shows a block diagram of a calibration device for a binocular camera according to one embodiment of the present application.
Referring to fig. 3, a calibration device for a binocular camera according to an embodiment of the present application includes:
the acquiring module 310 is configured to acquire a first shooting image and a second shooting image respectively shot by a first camera and a second camera in the binocular camera for the same shooting content, where the shooting content includes a plurality of calibration images set at different distances and a face image to be detected, the calibration images include a plurality of feature points equidistant in the transverse and longitudinal directions, and the calibration images and the face image to be detected do not occlude one another;
A first determining module 320, configured to perform image recognition on the first captured image and the second captured image, and determine feature point information corresponding to each of the calibration images in the first captured image and the second captured image respectively;
the second determining module 330 is configured to determine target calibration information corresponding to the binocular camera according to feature point information corresponding to each calibration image in the first captured image and the second captured image, where the target calibration information includes monocular calibration information and binocular stereo calibration information;
a third determining module 340, configured to determine, according to the target calibration information, a first conversion matrix and a second conversion matrix from the first captured image and the second captured image to a coplanar line alignment plane based on a Bouguet algorithm;
a fourth determining module 350, configured to perform depth recognition on face images to be detected included in the first captured image and the second captured image according to the first conversion matrix and the second conversion matrix, and determine depth information corresponding to the face images to be detected;
and the processing module 360 is configured to compare the depth information with an actual distance between the face image to be detected and the binocular camera, determine that the calibration result is valid if the comparison result meets a predetermined rule, and store the target calibration information, the first conversion matrix and the second conversion matrix in an associated manner.
In one embodiment of the present application, the second determining module 330 is configured to: monocular calibration is carried out on the first camera and the second camera respectively according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, and an internal parameter matrix, an external parameter matrix and radial distortion parameters corresponding to each of the first camera and the second camera are determined; and according to the internal parameter matrix, the external parameter matrix and the radial distortion parameters corresponding to the first camera and the second camera respectively, binocular stereo calibration is carried out on the first camera and the second camera, and the relative relation between the first camera and the second camera coordinate system is determined, wherein the relative relation comprises a rotation matrix and a translation matrix between the first camera and the second camera.
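As an illustrative sketch of how the relative relation between the two camera coordinate systems can be composed from the per-camera extrinsics obtained by monocular calibration (the helper name is hypothetical; in practice a library routine such as OpenCV's stereoCalibrate performs this jointly with a global optimization over all views):

```python
import numpy as np

def relative_pose(R_l, T_l, R_r, T_r):
    """Compose the rotation matrix and translation vector taking points
    from the first (left) camera frame to the second (right) camera frame,
    given each camera's extrinsics with respect to the same calibration
    target: R = R_r * R_l^T, T = T_r - R * T_l."""
    R = R_r @ R_l.T
    T = T_r - R @ T_l
    return R, T

# Left camera at the target frame origin, right camera offset 60 mm along X:
R_l, T_l = np.eye(3), np.zeros(3)
R_r, T_r = np.eye(3), np.array([-60.0, 0.0, 0.0])
R, T = relative_pose(R_l, T_l, R_r, T_r)
```

The resulting rotation matrix and translation matrix are exactly the relative relation that the binocular stereo calibration step stores for the two cameras.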
In one embodiment of the present application, the fourth determining module 350 is configured to: face detection is carried out on the first shooting image and the second shooting image respectively, so that first coordinate information of face contour marking points corresponding to the first shooting image and the second shooting image respectively is obtained; converting first coordinate information of the face contour marking points corresponding to the first shooting image and the second shooting image to a coplanar line alignment plane through the first conversion matrix and the second conversion matrix respectively, and determining second coordinate information of the face contour marking points corresponding to the first shooting image and the second shooting image respectively; determining depth information corresponding to each face contour mark point according to the coordinate difference in the X direction between the second coordinate information of the same face contour mark point; and determining target depth information of the face image to be detected according to the depth information corresponding to each face contour mark point.
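A minimal sketch of the point-conversion step — mapping a face contour mark point onto the coplanar line alignment plane through a conversion matrix — assuming the matrix acts on homogeneous pixel coordinates (the names are illustrative, not from the application):

```python
import numpy as np

def rectify_point(H, x, y):
    """Map a pixel (x, y) through a 3x3 conversion matrix H onto the
    coplanar line alignment plane, with homogeneous normalization."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# With the identity matrix the point is unchanged:
u, v = rectify_point(np.eye(3), 320.0, 240.0)
```

After both images' mark points are converted this way, corresponding points lie on the same image row, so their depth follows directly from the X-coordinate difference.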
In one embodiment of the present application, the fourth determining module 350 is configured to: and determining the average value of all the depth information as the target depth information of the face image to be detected according to the depth information corresponding to each face contour mark point.
In one embodiment of the present application, the calibration image is a checkerboard image, and the corner points of the checkerboard image are the feature points; the first determining module 320 is configured to: perform image recognition on the areas where the calibration images are located in the first shooting image and the second shooting image respectively, and determine the number of corner points and the corner point coordinates corresponding to each calibration image in the first shooting image and the second shooting image as the feature point information.
In one embodiment of the present application, after image recognition is performed on the areas where the calibration images are located in the first shooting image and the second shooting image, and the number of corner points and the corner point coordinates corresponding to each calibration image in the first shooting image and the second shooting image are determined as the feature point information, the acquiring module 310 is further configured to: compare the numbers of corner points corresponding to the same calibration image in the first shooting image and the second shooting image, and if the numbers of corner points corresponding to the same calibration image are different, capture the images again through the first camera and the second camera.
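An illustrative sketch of this consistency check (corner detection itself would typically use a routine such as OpenCV's findChessboardCorners; the helper below is a hypothetical name):

```python
def needs_recapture(corner_counts_first, corner_counts_second):
    """Compare per-calibration-image corner counts between the first and
    second shooting images; any mismatch means a checkerboard was only
    partially detected in one view and the pair should be captured again."""
    return any(n1 != n2 for n1, n2 in zip(corner_counts_first, corner_counts_second))

# Three boards with 54 corners each in both images: no recapture needed.
ok = needs_recapture([54, 54, 54], [54, 54, 54])
# A board seen with 48 corners in one image but 54 in the other: recapture.
bad = needs_recapture([54, 48, 54], [54, 54, 54])
```

Only when every calibration image yields the same corner count in both views does the flow proceed to monocular and binocular stereo calibration.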
Fig. 4 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
It should be noted that, the computer system of the electronic device shown in fig. 4 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 4, the computer system includes a central processing unit (Central Processing Unit, CPU) 401 which can perform various appropriate actions and processes, such as performing the method described in the above embodiment, according to a program stored in a Read-Only Memory (ROM) 402 or a program loaded from a storage section 408 into a random access Memory (Random Access Memory, RAM) 403. In the RAM 403, various programs and data required for the system operation are also stored. The CPU 401, ROM 402, and RAM 403 are connected to each other by a bus 404. An Input/Output (I/O) interface 405 is also connected to bus 404.
The following components are connected to the I/O interface 405: an input section 406 including a keyboard, a mouse, and the like; an output section 407 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, and the like; a storage section 408 including a hard disk and the like; and a communication section 409 including a network interface card such as a LAN (Local Area Network) card, a modem, and the like. The communication section 409 performs communication processing via a network such as the Internet. A drive 410 is also connected to the I/O interface 405 as needed. A removable medium 411, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 410 as needed, so that a computer program read therefrom is installed into the storage section 408 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 409 and/or installed from the removable medium 411. When the computer program is executed by the central processing unit (CPU) 401, the various functions defined in the system of the present application are performed.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with a computer-readable computer program embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
A computer program embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, in accordance with embodiments of the present application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a usb disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. The method for calibrating the binocular camera is characterized by comprising the following steps of:
acquiring a first shooting image and a second shooting image respectively shot by a first camera and a second camera in a binocular camera for the same shooting content, wherein the shooting content comprises a plurality of calibration images set at different distances and a face image to be detected, the calibration images comprise a plurality of feature points equidistant in the transverse and longitudinal directions, and the calibration images and the face image to be detected do not occlude one another;
Performing image recognition on the first shooting image and the second shooting image, and respectively determining characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image;
determining target calibration information corresponding to the binocular camera according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, wherein the target calibration information comprises monocular calibration information and binocular three-dimensional calibration information;
according to the target calibration information, based on a Bouguet algorithm, respectively determining a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to a coplanar line alignment plane;
carrying out depth recognition on face images to be detected contained in the first shooting image and the second shooting image according to the first conversion matrix and the second conversion matrix, and determining depth information corresponding to the face images to be detected;
and comparing the depth information with the actual distance between the face image to be detected and the binocular camera, if the comparison result meets a preset rule, determining that the calibration result is effective, and carrying out association storage on the target calibration information, the first conversion matrix and the second conversion matrix.
2. The method according to claim 1, wherein determining target calibration information corresponding to the binocular camera according to feature point information corresponding to each calibration image in the first captured image and the second captured image comprises:
monocular calibration is carried out on the first camera and the second camera according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image respectively, and an internal parameter matrix, an external parameter matrix and radial distortion parameters corresponding to each of the first camera and the second camera are determined;
and according to the internal parameter matrix, the external parameter matrix and the radial distortion parameters corresponding to the first camera and the second camera respectively, binocular stereo calibration is carried out on the first camera and the second camera, and the relative relation between the first camera and the second camera coordinate system is determined, wherein the relative relation comprises a rotation matrix and a translation matrix between the first camera and the second camera.
3. The method according to claim 1, wherein performing depth recognition on face images to be detected included in the first captured image and the second captured image according to the first conversion matrix and the second conversion matrix, and determining depth information corresponding to the face images to be detected includes:
Face detection is carried out on the first shooting image and the second shooting image respectively, so that first coordinate information of face contour marking points corresponding to the first shooting image and the second shooting image respectively is obtained;
converting first coordinate information of the face contour marking points corresponding to the first shooting image and the second shooting image to a coplanar line alignment plane through the first conversion matrix and the second conversion matrix respectively, and determining second coordinate information of the face contour marking points corresponding to the first shooting image and the second shooting image respectively;
determining depth information corresponding to each face contour mark point according to the coordinate difference in the X direction between the second coordinate information of the same face contour mark point;
and determining target depth information of the face image to be detected according to the depth information corresponding to each face contour mark point.
4. A method according to claim 3, wherein determining the target depth information of the face image to be detected according to the depth information corresponding to each of the face contour mark points comprises:
and determining the average value of all the depth information as the target depth information of the face image to be detected according to the depth information corresponding to each face contour mark point.
5. The method according to claim 1, wherein the calibration image is a checkerboard image, the corner points of which are the feature points;
performing image recognition on the first shot image and the second shot image, and determining feature point information corresponding to each calibration image in the first shot image and the second shot image respectively, wherein the feature point information comprises:
and performing image recognition on the areas where the calibration images are located in the first shooting image and the second shooting image respectively, and determining the number of corner points and the corner point coordinates corresponding to each calibration image in the first shooting image and the second shooting image as the feature point information.
6. The method according to claim 5, wherein after performing image recognition on the areas where the calibration images are located in the first captured image and the second captured image, and determining the number of corner points and the coordinates of the corner points corresponding to the calibration images in the first captured image and the second captured image as the feature point information, the method further comprises:
comparing the numbers of corner points corresponding to the same calibration image in the first shooting image and the second shooting image, and if the numbers of corner points corresponding to the same calibration image are different, capturing the images again through the first camera and the second camera.
7. A calibration device for a binocular camera, the device comprising:
the acquisition module is used for acquiring a first shooting image and a second shooting image respectively shot by a first camera and a second camera in the binocular camera for the same shooting content, wherein the shooting content comprises a plurality of calibration images set at different distances and a face image to be detected, the calibration images comprise a plurality of feature points equidistant in the transverse and longitudinal directions, and the calibration images and the face image to be detected do not occlude one another;
the first determining module is used for carrying out image recognition on the first shooting image and the second shooting image, and determining characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image respectively;
the second determining module is used for determining target calibration information corresponding to the binocular camera according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, wherein the target calibration information comprises monocular calibration information and binocular three-dimensional calibration information;
the third determining module is used for respectively determining a first conversion matrix and a second conversion matrix from the first shooting image and the second shooting image to a coplanar line alignment plane based on a Bouguet algorithm according to the target calibration information;
A fourth determining module, configured to perform depth recognition on face images to be detected included in the first captured image and the second captured image according to the first conversion matrix and the second conversion matrix, and determine depth information corresponding to the face images to be detected;
and the processing module is used for comparing the depth information with the actual distance between the face image to be detected and the binocular camera, determining that the calibration result is effective if the comparison result meets a preset rule, and carrying out association storage on the target calibration information, the first conversion matrix and the second conversion matrix.
8. The apparatus of claim 7, wherein the second determining module is configured to:
monocular calibration is carried out on the first camera and the second camera respectively according to the characteristic point information corresponding to each calibration image in the first shooting image and the second shooting image, and an internal parameter matrix, an external parameter matrix and radial distortion parameters corresponding to each of the first camera and the second camera are determined;
and according to the internal parameter matrix, the external parameter matrix and the radial distortion parameters corresponding to the first camera and the second camera respectively, binocular stereo calibration is carried out on the first camera and the second camera, and the relative relation between the first camera and the second camera coordinate system is determined, wherein the relative relation comprises a rotation matrix and a translation matrix between the first camera and the second camera.
9. A computer readable medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements a method for calibrating a binocular camera according to any of claims 1-6.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of calibrating a binocular camera according to any of claims 1 to 6.
CN202211623863.5A 2022-12-16 2022-12-16 Calibration method and device for binocular camera, computer readable medium and equipment Pending CN115994950A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211623863.5A CN115994950A (en) 2022-12-16 2022-12-16 Calibration method and device for binocular camera, computer readable medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211623863.5A CN115994950A (en) 2022-12-16 2022-12-16 Calibration method and device for binocular camera, computer readable medium and equipment

Publications (1)

Publication Number Publication Date
CN115994950A true CN115994950A (en) 2023-04-21

Family

ID=85994717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211623863.5A Pending CN115994950A (en) 2022-12-16 2022-12-16 Calibration method and device for binocular camera, computer readable medium and equipment

Country Status (1)

Country Link
CN (1) CN115994950A (en)

Similar Documents

Publication Publication Date Title
CN106709899B (en) Method, device and equipment for calculating relative positions of two cameras
CN112634374B (en) Stereoscopic calibration method, device and system for binocular camera and binocular camera
CN110809786B (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
US11282232B2 (en) Camera calibration using depth data
US20160117820A1 (en) Image registration method
CN106570907B (en) Camera calibration method and device
CN110611767B (en) Image processing method and device and electronic equipment
WO2019232793A1 (en) Two-camera calibration method, electronic device and computer-readable storage medium
CN108734738B (en) Camera calibration method and device
CN111340737B (en) Image correction method, device and electronic system
US20120162387A1 (en) Imaging parameter acquisition apparatus, imaging parameter acquisition method and storage medium
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
US20200267297A1 (en) Image processing method and apparatus
US10937180B2 (en) Method and apparatus for depth-map estimation
CN113436269B (en) Image dense stereo matching method, device and computer equipment
CN111160233B (en) Human face in-vivo detection method, medium and system based on three-dimensional imaging assistance
CN111260574B (en) Seal photo correction method, terminal and computer readable storage medium
CN111353945B (en) Fisheye image correction method, device and storage medium
CN115994950A (en) Calibration method and device for binocular camera, computer readable medium and equipment
CN115002345B (en) Image correction method, device, electronic equipment and storage medium
CN116843759A (en) Calibration verification method and system for binocular camera, computer equipment and medium
US20110228141A1 (en) Distance acquisition device, lens correcting system and method applying the distance acquisition device
CN112446928B (en) External parameter determining system and method for shooting device
CN113450398B (en) Method, device, equipment and readable medium for matching marker in calibration object
CN117649454B (en) Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination