CN115761007A - Real-time binocular camera self-calibration method

Real-time binocular camera self-calibration method

Info

Publication number
CN115761007A
Authority
CN
China
Prior art keywords
static key
image
key point
binocular camera
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211498713.6A
Other languages
Chinese (zh)
Inventor
王鑫
郑继川
陈俊明
杨青海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Metoak Technology Beijing Co ltd
Original Assignee
Metoak Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Metoak Technology Beijing Co ltd filed Critical Metoak Technology Beijing Co ltd
Priority to CN202211498713.6A priority Critical patent/CN115761007A/en
Publication of CN115761007A publication Critical patent/CN115761007A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a real-time binocular camera self-calibration method comprising the following steps: step 1, acquiring, according to a sampling period, two adjacent frames of images captured by a binocular camera while a mobile device moves, and recording them as a first image and a second image; step 2, extracting a first static key point in the first image and a second static key point in the second image; step 3, calculating an angle rotation matrix R and a coordinate translation matrix T from IMU information gathered during the movement of the mobile device; step 4, performing a first coordinate transformation on the second static key point using the three-dimensional coordinate transformation matrix, and a second coordinate transformation on the first static key point using the inter-frame motion transformation matrix; step 5, introducing a corrected parallax offset into the first and second coordinate transformation results, generating a correction function calculation formula, and calculating the corrected parallax offset that satisfies a preset condition. The corrected parallax offset is used to correct the parallax value d of the binocular camera.

Description

Real-time binocular camera self-calibration method
Technical Field
The invention relates to the field of image processing, in particular to a real-time binocular camera self-calibration method.
Background
Principle of the binocular camera: ranging exploits the fact that the difference between the horizontal coordinates of a target point imaged in the left and right views (namely, the disparity) is inversely proportional to the distance from the target point to the imaging plane. That is, once the disparity of the target point is known, the distance Z from the target point to the camera follows. Specifically, Z is solved as Z = fB/d, where f denotes the focal length of the camera, B denotes the center distance of the binocular camera, and d denotes the disparity of the target point.
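As a minimal numerical illustration of this relation (function and variable names are ours, not from the patent):

```python
def depth_from_disparity(f_px: float, baseline_m: float, d_px: float) -> float:
    """Distance Z from the camera to a target point, Z = f*B/d.

    f_px: focal length in pixels; baseline_m: center distance B of the
    binocular camera in meters; d_px: disparity of the target point in pixels.
    """
    if d_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / d_px

# Example: f = 1000 px, B = 0.12 m, d = 20 px  ->  Z = 6.0 m
print(depth_from_disparity(1000.0, 0.12, 20.0))
```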
Binocular rectification eliminates distortion and aligns the rows of the left and right views using the monocular intrinsic data (focal length, imaging origin, distortion coefficients) and the relative binocular pose (rotation matrix and translation vector) obtained from camera calibration, so that the imaging origin coordinates of the two views coincide, the optical axes of the two cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are row-aligned. Any pixel in one view (such as the right view) and the pixel imaging the same object in the other view (such as the left view) then necessarily share the same row number, so matching requires only a one-dimensional search along the corresponding row of the two views.
For a binocular camera, over the whole life cycle after factory calibration, factors such as ambient temperature cause the row alignment of the left and right views to drift. This phenomenon is commonly referred to as binocular disparity degradation; it causes disparity matching errors and thus degrades the ranging function.
For example, patent CN112861940A proposes a binocular disparity estimation model based on a neural network, comprising: obtaining a sample left image, a sample right image and a disparity label; a feature extraction network module, a matching cost calculation module, a single-scale cost aggregation module, a multi-scale cost aggregation module and a disparity regression model; the error between the disparity label and the estimated disparity is used to train the model, addressing the drawbacks of heavy computation, long runtime and poor performance at object edges and in texture-less regions. Although training on the label-estimate error improves the estimation, the method does not account for the influence of binocular disparity degradation.
Patent CN111225201A proposes a parallax correction method, comprising: acquiring two original images containing a target object with a binocular camera; determining a first parallax of the target object in the imaging areas of the two original images; adjusting the positions of the imaging areas according to the first parallax and a preset parallax; and determining a target image from the position-adjusted imaging areas. The method corrects the parallax error of the binocular camera, avoids the extra computation of correcting parallax by re-calibrating the camera, and improves the imaging consistency of the binocular camera. However, the imaging-area positions are adjusted only according to the first parallax and the preset parallax; when binocular disparity degrades, the preset parallax itself changes by an amount that is difficult to determine, so the imaging carries errors and the correction is inaccurate.
In summary, the disparity matching errors and inaccurate calibration results caused by binocular disparity degradation remain unsolved in the prior art, and addressing these technical problems is of significance.
Disclosure of Invention
In this regard, the present application proposes the following technical solutions:
a real-time binocular camera self-calibration method is disclosed, a binocular camera is installed on a mobile device, and the method comprises the following steps:
step 1, acquiring two adjacent frames of images acquired by a binocular camera in the moving process of a mobile device according to a sampling period, and respectively recording the two adjacent frames of images as a first image and a second image;
step 2, respectively extracting a first static key point in the first image and a second static key point in the second image;
step 3, calculating an interframe motion transformation matrix according to IMU information in the moving process of the mobile device, wherein the interframe motion transformation matrix comprises an angle rotation matrix R and a coordinate translation matrix T;
step 4, performing first coordinate transformation on the second static key point by using the three-dimensional coordinate transformation matrix, and performing second coordinate transformation on the first static key point by using the inter-frame motion transformation matrix;
step 5, respectively introducing corrected parallax offset into the first coordinate transformation result and the second coordinate transformation result, generating a correction function calculation formula, and calculating the corrected parallax offset meeting the preset conditions;
the corrected parallax offset is used for correcting the parallax value d of the binocular camera.
Preferably, before step 4, the method further comprises:
presetting a threshold for static key point extraction, recorded as a preset threshold; accumulating the numbers of first static key points and second static key points selected in step 1; judging whether each number is greater than or equal to the preset threshold; if so, executing step 4; otherwise, discarding the image and executing step 1 again.
Preferably, the preset condition in step 5 is that a value result of the correction function calculation formula is 0;
the step 5 specifically comprises the following steps:
pairing the first static key point and the second static key point;
respectively performing the first coordinate transformation and the second coordinate transformation on the paired first static key point and second static key point, and introducing the corrected parallax offset, the correction function F(offset) being calculated as:

F_i(offset) = R · P_{i,t}(offset) + T - P_{i,t+1}(offset)

with the corrected back-projections

P_{i,t}(offset) = [ B(u_{i,t} - c_x)/(d_{i,t} + offset), B(v_{i,t} - c_y)/(d_{i,t} + offset), fB/(d_{i,t} + offset) ]^T
P_{i,t+1}(offset) = [ B(u_{i,t+1} - c_x)/(d_{i,t+1} + offset), B(v_{i,t+1} - c_y)/(d_{i,t+1} + offset), fB/(d_{i,t+1} + offset) ]^T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t}, d_{i,t+1} are the disparity values of the i-th static key point group at times t and t+1; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t}, v_{i,t}), (u_{i,t+1}, v_{i,t+1}) are the pixel coordinates of the i-th static key point group at times t and t+1;
when the value result of the calculation formula of the correction function is 0, recording the corrected parallax calculation result corresponding to each pair of static key point groups as a corrected parallax intermediate value;
and accumulating the corrected parallax intermediate values, calculating the average value of the corrected parallax intermediate values, and recording the calculation result of the average value as the corrected parallax offset.
Preferably, the preset condition corresponds to an objective function:
offset* = argmin_{offset} Σ_{i=1}^{w} ||F_i(offset)||², where w is the number of static key point groups
the step 5 specifically comprises the following steps:
pairing the first static key point and the second static key point;
respectively performing the first coordinate transformation and the second coordinate transformation on the paired first static key point and second static key point, and introducing the corrected parallax offset, the correction function F(offset) being calculated as:

F_i(offset) = R · P_{i,t}(offset) + T - P_{i,t+1}(offset)

with the corrected back-projections

P_{i,t}(offset) = [ B(u_{i,t} - c_x)/(d_{i,t} + offset), B(v_{i,t} - c_y)/(d_{i,t} + offset), fB/(d_{i,t} + offset) ]^T
P_{i,t+1}(offset) = [ B(u_{i,t+1} - c_x)/(d_{i,t+1} + offset), B(v_{i,t+1} - c_y)/(d_{i,t+1} + offset), fB/(d_{i,t+1} + offset) ]^T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t}, d_{i,t+1} are the disparity values of the i-th static key point group at times t and t+1; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t}, v_{i,t}), (u_{i,t+1}, v_{i,t+1}) are the pixel coordinates of the i-th static key point group at times t and t+1;
a corrected disparity offset is calculated that satisfies the objective function.
Preferably, in the step 2, in the process of selecting the static keypoints, two adjacent frames of images are further filtered, and the filtering method is as follows:
identifying the image information contained in the two adjacent frames of images based on a deep learning method, thereby obtaining regions of interest in the two frames; when the image information category in a region of interest is judged to be a first target, cutting the corresponding image region out of both frames, and marking the image corner points of the two cut adjacent frames as static key points; wherein the first target is a moving object.
Preferably, in the step 2, in the process of selecting the static key points, the two adjacent frames of images are filtered, and the filtering method is as follows:
identifying the image information contained in the two adjacent frames of images based on a deep learning method, thereby obtaining regions of interest in the two frames; when the image information category in a region of interest is judged to be a second target, acquiring the image corner points of the corresponding image in that region of interest and marking them as static key points; wherein the second target is a static object.
Preferably, the mobile device is a vehicle, and the calculation method of the angular rotation matrix R in step 3 is as follows:
R = R_z(q) · R_y(p) · R_x(r), where R_x, R_y and R_z denote the elementary rotation matrices about the X, Y and Z axes
wherein p, r and q are respectively a pitch angle, a roll angle and a yaw angle in the attitude angle.
Preferably, the solution of the attitude angles r, p, q comprises the following steps:
step 3-1, mounting an accelerometer on the vehicle, and reading its acceleration values along the X, Y and Z axes at time t+1: α_{x,t+1}, α_{y,t+1}, α_{z,t+1}; r_{acc,t+1} and p_{acc,t+1} at time t+1 are obtained by solving:

r_{acc,t+1} = arctan2(α_{y,t+1}, α_{z,t+1})
p_{acc,t+1} = arctan( -α_{x,t+1} / √(α_{y,t+1}² + α_{z,t+1}²) )
Wherein g represents a gravitational acceleration value;
step 3-2: reading, through an accelerometer and a gyroscope installed on the vehicle, the acceleration values of the accelerometer along the X, Y and Z axes at any time t: α_{x,t}, α_{y,t}, α_{z,t}, and solving for r_{acc,t}, p_{acc,t} at time t:

r_{acc,t} = arctan2(α_{y,t}, α_{z,t})
p_{acc,t} = arctan( -α_{x,t} / √(α_{y,t}² + α_{z,t}²) )
reading the angular velocities of the gyroscope about the X, Y and Z axes at any time t: ω_{x,t}, ω_{y,t}, ω_{z,t}; r_gyro, p_gyro, q_gyro at time t+1 are obtained by solving:

r_{gyro,t+1} = r_t + ω_{x,t}·Δt
p_{gyro,t+1} = p_t + ω_{y,t}·Δt
q_{gyro,t+1} = q_t + ω_{z,t}·Δt
Step 3-3, fusing the postures based on the results of the step 3-1 and the step 3-2:
r_{t+1} = K·r_{acc,t+1} + (1-K)·r_{gyro,t+1}
p_{t+1} = K·p_{acc,t+1} + (1-K)·p_{gyro,t+1}
q_{t+1} = q_{gyro,t+1}
wherein K represents a proportionality coefficient.
Preferably, the mobile device is a vehicle, and the coordinate translation matrix T in step 3 is the relative displacement matrix from time t to time t+1, calculated from a vehicle speed signal;
T = [T_x, T_y, T_z]^T
wherein T_x and T_z are the transverse and longitudinal displacement components derived from the vehicle speed, and T_y is the displacement component in the vertical direction;
a gyroscope is arranged on the vehicle, and its angular velocities of rotation about the X, Y and Z axes are read at time t+1: ω_{x,t+1}, ω_{y,t+1}, ω_{z,t+1}; the matrix T is solved by:
T = [ V·Δt·sin(ω_{y,t+1}·Δt), 0, V·Δt·cos(ω_{y,t+1}·Δt) ]^T
wherein V represents the vehicle speed; Δt is the time interval from time t to time t+1.
Preferably, the first and second static keypoints are image corner points in the first and second images, respectively.
The beneficial effect of this application:
(1) Image corner points of static objects in two adjacent frames of images are selected as the key points for correction; the static key points selected in the two frames and the corresponding inter-frame motion transformation matrices are accumulated, the accumulated parallax offsets are then averaged, and the mean is taken as the final parallax offset, ensuring the accuracy of the data.
(2) The method and the device solve the problem of parallax matching errors caused by binocular parallax degradation, and greatly improve the measurement accuracy of the binocular camera.
The foregoing description is only an overview of the technical solutions of the present application, so that the technical means of the present application can be more clearly understood and the present application can be implemented according to the content of the description, and in order to make the above and other objects, features and advantages of the present application more clearly understood, the following detailed description is made with reference to the preferred embodiments of the present application and the accompanying drawings.
The above and other objects, advantages and features of the present application will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the following detailed description of the present application is made with reference to the embodiments in the drawings, but not to be construed as limiting the present application in any way.
FIG. 1 is a schematic flow chart illustrating the steps of the binocular camera self-calibration method of the present application;
fig. 2 is a schematic view of a coordinate system of the binocular stereo camera of the present application with reference to calibration.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. In the following description, specific details such as specific configurations and components are provided only to help the embodiments of the present application be fully understood. Accordingly, it will be apparent to those skilled in the art that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. In addition, descriptions of well-known functions and constructions are omitted in the embodiments for clarity and conciseness.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, B exists alone, and A and B exist at the same time, and the term "/and" is used herein to describe another association object relationship, which means that two relationships may exist, for example, A/and B, may mean: the presence of a alone, and both cases a and B alone, and further, the character "/" herein generally means that the former and latter associated objects are in an "or" relationship.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion.
Example 1
The embodiment provides a real-time binocular camera self-calibration method, and solves the problems of parallax matching error and inaccurate calibration result caused by parallax degradation of a binocular camera.
For convenience of understanding, in this embodiment, a sampling period is set, that is, sampling times corresponding to two adjacent frames of images acquired by a binocular camera are set as a time t and a time t +1, and then the two adjacent frames of images at the time t and the time t +1 are sequentially sampled according to the sampling period, where the two adjacent frames of images in the sampling are sequentially recorded as a first image and a second image.
It should be noted that the size of the sampling period may be set manually as needed.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating steps of a binocular camera self-calibration method according to the present application. In the present embodiment, a vehicle is taken as a mobile device for example, wherein a binocular camera is mounted on the vehicle, and the binocular camera includes a left camera and a right camera. The binocular camera parallax calibration method in the embodiment comprises the following steps:
step 1, acquiring two adjacent frames of images acquired by a binocular camera at a time t and a time t +1 in a vehicle driving process according to a sampling period, and respectively recording the two adjacent frames of images as a first image and a second image, wherein the two adjacent frames of images can be acquired by any one of a left camera and a right camera;
and 2, respectively extracting a first static key point in the first image and a second static key point in the second image.
Preferably, the first static key point and the second static key point in this embodiment are image corner points in the first image and the second image, respectively.
In this embodiment, the image information included in two adjacent frames of images is identified by means of image identification, and the identified image information may be a pedestrian, a vehicle, a lane line, characters on a road, a roadside traffic signboard, or the like.
In this embodiment, the angular points are used as static key points in the correction process in consideration of the characteristics of uniqueness, rotation non-deformation and the like of the angular points, so that the characteristic values of the angular points are used for matching and positioning between the images.
It should be noted that, in this embodiment, by means of pixel point matching, corner points in the first image and the second image may be paired to form a static key point group, and details of the specific process are not repeated.
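As an illustrative sketch of corner extraction and pairing (the patent leaves the pixel-matching method open; sparse optical flow via OpenCV is one common stand-in, and all names here are ours):

```python
import cv2
import numpy as np

def extract_and_pair_corners(img_t, img_t1, max_pts=200):
    """Detect corner key points in the grayscale frame at time t and pair
    them with the frame at time t+1 by sparse optical flow."""
    pts_t = cv2.goodFeaturesToTrack(img_t, maxCorners=max_pts,
                                    qualityLevel=0.01, minDistance=8)
    if pts_t is None:
        return np.empty((0, 2)), np.empty((0, 2))
    pts_t1, status, _err = cv2.calcOpticalFlowPyrLK(img_t, img_t1, pts_t, None)
    ok = status.ravel() == 1  # keep only corners successfully tracked into t+1
    return pts_t[ok].reshape(-1, 2), pts_t1[ok].reshape(-1, 2)
```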
However, during correction the result of the binocular camera was found to be unsatisfactory, especially in complex environments with many pedestrians and vehicles. By studying the image data, the developers found that image corner points on moving objects such as pedestrians and vehicles introduce noise into the correction result of the binocular camera.
When only the image corner points of stationary objects (such as lane lines and traffic signboards) were selected as the correction basis, the accuracy of the correction result improved markedly.
Therefore, in the process of selecting the static key points in step 2, the two adjacent images (the first image and the second image) need to be filtered. The filtering may adopt either of the following methods:
In one method, based on deep learning, the image information contained in the two adjacent frames is identified and the regions of interest in both frames are obtained; when the image information category in a region of interest is judged to be a first target, the corresponding image region is cut out of both frames, and the image corner points of the two cut frames are marked as static key points, where the first target may be a moving object such as a pedestrian or a vehicle;
In the other method, likewise based on deep learning, the image information contained in the two adjacent frames is identified and the regions of interest are obtained; when the image information category in a region of interest is judged to be a second target, the image corner points of the corresponding image in that region are acquired and recorded as static key points, where the second target is a static object such as a lane line, a traffic signboard or text.
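A minimal sketch of the first filtering method, assuming some detector has already produced (x1, y1, x2, y2) boxes for moving objects (the detector and all names here are ours):

```python
import numpy as np

def filter_static_keypoints(corners, moving_boxes):
    """Drop corners falling inside regions of interest classified as moving
    objects (pedestrians, vehicles); `corners` is an (N, 2) array of (u, v)."""
    keep = np.ones(len(corners), dtype=bool)
    for x1, y1, x2, y2 in moving_boxes:
        inside = ((corners[:, 0] >= x1) & (corners[:, 0] <= x2) &
                  (corners[:, 1] >= y1) & (corners[:, 1] <= y2))
        keep &= ~inside  # this corner lies on a moving object
    return corners[keep]
```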
Further, w static key points are selected from the two adjacent frames of images read in step 1, and the pixel coordinate matrices of the static key points at times t+1 and t are stored as A_{t+1} and A_t.
The key point pixel disparity sets at times t+1 and t are stored as (d_{1,t+1}, d_{2,t+1}, …, d_{i,t+1}, …, d_{w,t+1}) and (d_{1,t}, d_{2,t}, …, d_{i,t}, …, d_{w,t}), i = 1, 2, …, w, where i is the index of the key point; the corner points sharing the same index at times t+1 and t form a static key point group, whose index is also i;

A_{t+1} = [ [u_{1,t+1}, v_{1,t+1}], [u_{2,t+1}, v_{2,t+1}], …, [u_{w,t+1}, v_{w,t+1}] ]    (1)

A_t = [ [u_{1,t}, v_{1,t}], [u_{2,t}, v_{2,t}], …, [u_{w,t}, v_{w,t}] ]    (2)

In the above formulas, u_{i,t+1} represents the X-axis pixel coordinate of the i-th static key point at time t+1, and v_{i,t+1} its Y-axis pixel coordinate;
u_{i,t} represents the X-axis pixel coordinate of the i-th static key point at time t, and v_{i,t} its Y-axis pixel coordinate;
d_{i,t} is the disparity value of the i-th static key point at time t, and d_{i,t+1} the disparity value at time t+1.
And 3, calculating an interframe motion transformation matrix according to IMU information in the vehicle driving process, wherein the interframe motion transformation matrix comprises an angle rotation matrix R and a coordinate translation matrix T.
The angle rotation matrix R in the step 3 represents an angle rotation matrix from a camera coordinate system at the moment t to a camera coordinate system of a binocular camera at the moment t + 1;
calculating an angle rotation matrix R from t to t +1 as follows:
R = R_z(q) · R_y(p) · R_x(r)    (3)

where R_x, R_y and R_z denote the elementary rotation matrices about the X, Y and Z axes.
in the formula (3), p, r and q are respectively a pitch angle, a roll angle and a yaw angle in an attitude angle.
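A sketch of composing R from the attitude angles (the Z-Y-X composition order follows step 3-1 below; the exact axis-to-angle assignment is an assumption and may need adapting to the actual frame):

```python
import numpy as np

def rotation_from_attitude(p, r, q):
    """Angle rotation matrix R from pitch p, roll r and yaw q (radians)."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cq, sq = np.cos(q), np.sin(q)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # about X
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # about Y
    Rz = np.array([[cq, -sq, 0], [sq, cq, 0], [0, 0, 1]])   # about Z
    return Rz @ Ry @ Rx
```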
In the automatic driving process of the vehicle, the attitude angle and the acceleration of the vehicle can be obtained by adopting an attitude fusion mode through an Inertial Measurement Unit (IMU) arranged on the vehicle or a combination of an accelerometer and a gyroscope. Wherein the attitude angles r, p, q of the formula (3) adopt the following calculation formula:
r_{t+1} = K·r_{acc,t+1} + (1-K)·r_{gyro,t+1},  p_{t+1} = K·p_{acc,t+1} + (1-K)·p_{gyro,t+1},  q_{t+1} = q_{gyro,t+1}    (4)
it should be noted that, in this embodiment, the coordinate systems have been aligned uniformly by way of coordinate system transformation, and the origin of the coordinate system after being aligned is set at the vehicle head position, where the process of coordinate system transformation is not described again.
Further, the attitude angles p, r and q in the angular rotation matrix R in step 3 are specifically calculated as follows:
Step 3-1, solving r_{acc,t+1} and p_{acc,t+1} through the accelerometer:
When the accelerometer is rotated to a certain attitude, the acceleration of gravity produces corresponding components on the 3 acceleration axes; these are essentially the coordinates (0, 0, g) of the geodetic coordinate system expressed in the accelerometer's own new coordinate system, and the 3 values read by the accelerometer are the new coordinates (α_x, α_y, α_z).

[α_x, α_y, α_z]^T = (M_z · M_y · M_x)^{-1} · [0, 0, g]^T    (5)

where M_z, M_y and M_x are, in sequence, the rotation matrices about the Z, Y and X coordinate axes at time t; g represents the gravitational acceleration value. Solving the above equation yields the angles r_acc and p_acc.
By mounting an accelerometer on the vehicle, its acceleration values along the X, Y and Z axes at time t+1 are read as α_{x,t+1}, α_{y,t+1}, α_{z,t+1}; using equation (5), r_{acc,t+1} and p_{acc,t+1} at time t+1 are obtained as follows:

r_{acc,t+1} = arctan2(α_{y,t+1}, α_{z,t+1})
p_{acc,t+1} = arctan( -α_{x,t+1} / √(α_{y,t+1}² + α_{z,t+1}²) )
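A sketch of this accelerometer tilt solution (standard formulas; sign conventions may need adapting to the sensor mounting):

```python
import numpy as np

def attitude_from_accel(ax, ay, az):
    """Roll r_acc and pitch p_acc from the gravity components read by the
    accelerometer."""
    r_acc = np.arctan2(ay, az)
    p_acc = np.arctan2(-ax, np.sqrt(ay * ay + az * az))
    return r_acc, p_acc
```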
Step 3-2, calculating the attitude angles r_gyro, p_gyro, q_gyro through the gyroscope:
Through the accelerometer and gyroscope installed on the vehicle, the acceleration values of the accelerometer along the X, Y and Z axes at any time t are read as α_{x,t}, α_{y,t}, α_{z,t}, and r_{acc,t}, p_{acc,t} at time t are solved as:

r_{acc,t} = arctan2(α_{y,t}, α_{z,t})
p_{acc,t} = arctan( -α_{x,t} / √(α_{y,t}² + α_{z,t}²) )
The angular velocities of rotation about the 3 axes measured by the gyroscope can be integrated to obtain the angles. Specifically, the angular velocities of the gyroscope about the X, Y and Z axes measured at any time t are read as ω_{x,t}, ω_{y,t}, ω_{z,t}; the angular rates used for the attitude update, [ṙ, ṗ, q̇]^T, satisfy the following relationship:

[ṙ, ṗ, q̇]^T = [ [1, sin r·tan p, cos r·tan p], [0, cos r, -sin r], [0, sin r/cos p, cos r/cos p] ] · [ω_{x,t}, ω_{y,t}, ω_{z,t}]^T
By simplifying the above equation, r_gyro, p_gyro, q_gyro at time t+1 are obtained as follows:

r_{gyro,t+1} = r_t + ω_{x,t}·Δt
p_{gyro,t+1} = p_t + ω_{y,t}·Δt
q_{gyro,t+1} = q_t + ω_{z,t}·Δt
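A sketch of this first-order gyroscope update (valid for a small sampling interval dt):

```python
def attitude_from_gyro(r_t, p_t, q_t, wx, wy, wz, dt):
    """Integrate the gyroscope body rates over one sampling interval."""
    return r_t + wx * dt, p_t + wy * dt, q_t + wz * dt
```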
step 3-3, fusing the postures based on the results of the step 3-1 and the step 3-2:
r_{t+1} = K·r_{acc,t+1} + (1-K)·r_{gyro,t+1}
p_{t+1} = K·p_{acc,t+1} + (1-K)·p_{gyro,t+1}
q_{t+1} = q_{gyro,t+1}

where K is a proportionality coefficient. K needs to be tuned to the actual system; a value of 0.4 is preferred.
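A sketch of this complementary fusion with the preferred K = 0.4 (taking yaw from the gyroscope alone is an assumption, consistent with step 3-1 solving only r and p):

```python
def fuse_attitude(r_acc, p_acc, r_gyro, p_gyro, q_gyro, K=0.4):
    """Complementary fusion of accelerometer and gyroscope attitudes."""
    r = K * r_acc + (1.0 - K) * r_gyro
    p = K * p_acc + (1.0 - K) * p_gyro
    q = q_gyro  # the accelerometer cannot observe yaw
    return r, p, q
```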
In addition, it should be noted that:
the X axis, the Y axis and the Z axis measured by the accelerometer and the gyroscope are the X axis, the Y axis and the Z axis under a camera coordinate system:
the definition of the camera coordinate system is as follows:
1) Establishing a three-dimensional coordinate system by using a depth camera as an origin point, wherein the depth camera is the origin point of the camera coordinate system,
2) The X axis is along the transverse direction of the depth camera-color camera, and the Z axis is perpendicular to the X axis and points to the shooting direction;
3) The Y axis is vertical to both the X axis and the Z axis;
4) A proposed right-hand regular coordinate system.
Further, the coordinate translation matrix T in step 3 represents a relative displacement matrix of the camera coordinate system. Since the IMU information cannot provide location information, it is necessary to additionally introduce a signal providing location or speed, typically a vehicle speed signal or a GPS signal. Through the vehicle speed signal, a relative displacement matrix T in the time from T to T +1 can be obtained;
T = [T_x, T_y, T_z]^T

where T_x, T_y, T_z are the components of the displacement from the origin of the camera coordinate system at time t to the origin of the camera coordinate system at time t+1.
The angular velocities of rotation about the X, Y and Z axes at time t+1 are read from the gyroscope arranged on the vehicle: ω_{x,t+1}, ω_{y,t+1}, ω_{z,t+1}; the solving formula is:

T = [ V·Δt·sin(ω_{y,t+1}·Δt), 0, V·Δt·cos(ω_{y,t+1}·Δt) ]^T

where V represents the vehicle speed (obtained from the vehicle speedometer; the speed at time t+1 may be used) and Δt represents the time interval from time t to time t+1.
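A sketch of this dead-reckoned displacement; the yaw-rate-based heading update is an assumption, and the vertical component is taken as zero as stated above:

```python
import numpy as np

def translation_from_speed(v_mps, wy, dt):
    """Relative displacement T over [t, t+1] from vehicle speed and the
    gyroscope yaw rate wy."""
    heading = wy * dt                  # heading change over the interval
    tx = v_mps * dt * np.sin(heading)  # transverse component T_x
    tz = v_mps * dt * np.cos(heading)  # longitudinal component T_z
    return np.array([tx, 0.0, tz])     # vertical displacement T_y = 0
```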
And 4, performing first coordinate transformation on the second static key point by using the three-dimensional coordinate transformation matrix, and performing second coordinate transformation on the first static key point by using the inter-frame motion transformation matrix.
Before step 4, the number of the first static key points and the number of the second static key points extracted from step 1 are also determined, specifically:
A threshold for static key point extraction is preset, namely the preset threshold. The first static key points and second static key points selected in step 1 are accumulated, together with the first and second images and the corresponding angle rotation matrix R and coordinate translation matrix T; whether the accumulated numbers of first and second static key points are both greater than or equal to the preset threshold is then judged. If so, step 4 is executed; if not, the image is discarded and step 1 is executed again. The discarded image may be the first image, or both the first image and the second image, as determined by the sampling period/interval.
By presetting and checking the number of static key points, the poor correction results that arise when too few corner points are found, or when the key points in the acquired images are unstable, are avoided.
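A trivial sketch of this gate (the threshold value 30 is illustrative; the patent leaves the threshold to the user):

```python
def enough_keypoints(n_first, n_second, threshold=30):
    """Proceed to step 4 only if both frames yield enough static key points."""
    return n_first >= threshold and n_second >= threshold
```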
Step 5, respectively introducing corrected parallax offset into the first coordinate transformation result and the second coordinate transformation result, generating a correction function calculation formula, and calculating the corrected parallax offset meeting the preset condition; the corrected parallax offset is used for correcting the parallax value d of the binocular camera.
It should be noted that the corrected result may be written into the binocular camera in real time to take effect immediately; the configuration file in memory is updated at the same time, so that the correction remains effective after the camera is powered off and restarted.
For any pixel point in the image, based on the binocular ranging principle, a two-dimensional coordinate can be converted into a point in a three-dimensional coordinate system through a three-dimensional coordinate transformation mode.
Let the two-dimensional coordinates of any pixel in the image be (u, v), and the three-dimensional coordinates corresponding to the point be (p_x, p_y, p_z); the corresponding calculation formula is:

p_x = B(u - c_x)/d,  p_y = B(v - c_y)/d,  p_z = fB/d    (13)

where B represents the center distance between the left and right cameras, d is the disparity value, f represents the focal length of the camera, and c_x, c_y indicate the offsets of the camera optical axis in the image pixel coordinate system.
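A direct transcription of equation (13) (names ours):

```python
import numpy as np

def pixel_to_3d(u, v, d, f, B, cx, cy):
    """Back-project pixel (u, v) with disparity d into the camera frame."""
    return np.array([B * (u - cx) / d,   # p_x
                     B * (v - cy) / d,   # p_y
                     f * B / d])         # p_z
```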
Similarly, after the angle rotation matrix R and the coordinate translation matrix T are obtained, if the binocular camera needs no correction, the three-dimensional coordinates of the same pixel point (the static key point) in the next frame at time t+1 can be calculated from its three-dimensional coordinates in the previous frame at time t by the following formula:

[p_x, p_y, p_z]^T_{t+1} = R · [p_x, p_y, p_z]^T_t + T    (14)

For an ideal binocular camera requiring no correction, substituting the corresponding static key points of two adjacent frames into the above formulas makes both sides equal.
That is, in the ideal state, the left and right sides are equal for corresponding static key points in two adjacent images, as shown in the following formula:

[ B(u_{i,t+1} - c_x)/d_{i,t+1}, B(v_{i,t+1} - c_y)/d_{i,t+1}, fB/d_{i,t+1} ]^T = R · [ B(u_{i,t} - c_x)/d_{i,t}, B(v_{i,t} - c_y)/d_{i,t}, fB/d_{i,t} ]^T + T
however, due to factors such as ambient temperature and vibration, the physical structure of the camera is deformed, so that the relationship between the corresponding static key points in two adjacent frames no longer conforms to the above formula, and fig. 2 is a schematic diagram of a coordinate system for external reference calibration of the binocular stereo camera according to the present application, shown in fig. 2.
Based on the above formula, for two adjacent frames of images, the pixel coordinates of the i-th second static key point in the second image at time t+1 undergo the first coordinate transformation (a three-dimensional coordinate transformation) with the corrected parallax offset introduced, converting the two-dimensional pixel coordinates into a coordinate point in the three-dimensional space coordinate system; the first coordinate transformation may adopt the following formula:

P_{i,t+1}(offset) = [ B(u_{i,t+1} - c_x)/(d_{i,t+1} + offset), B(v_{i,t+1} - c_y)/(d_{i,t+1} + offset), fB/(d_{i,t+1} + offset) ]^T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t+1} is the disparity value of the i-th static key point group at time t+1; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t+1}, v_{i,t+1}) are the pixel coordinates of the i-th static key point group at time t+1;
For the i-th first static key point in the first image at time t, the second coordinate transformation is performed based on the inter-frame motion transformation matrix, with the corrected parallax offset introduced; the three-dimensional coordinates are calculated by the following formula:

P'_{i,t}(offset) = R · [ B(u_{i,t} - c_x)/(d_{i,t} + offset), B(v_{i,t} - c_y)/(d_{i,t} + offset), fB/(d_{i,t} + offset) ]^T + T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t} is the disparity value of the i-th static key point group at time t; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t}, v_{i,t}) are the pixel coordinates of the i-th static key point group at time t;
Therefore, the corrected parallax offset calculation formula is as follows:

F_i(offset) = P'_{i,t}(offset) - P_{i,t+1}(offset)    (15)

where B represents the center distance between the left and right cameras; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f denotes the focal length of the camera. B, c_x and c_y obtain initial values from stereo calibration and are known quantities.
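A sketch of F(offset) for one static key point group (R is a 3x3 numpy rotation matrix, T a length-3 translation vector; all names are ours):

```python
import numpy as np

def residual(offset, kp_t, kp_t1, d_t, d_t1, R, T, f, B, cx, cy):
    """F(offset) of equation (15): gap between the motion-transformed point
    at time t and the back-projected point at time t+1, both computed with
    the corrected disparity d + offset."""
    def back_project(u, v, d):
        # stereo back-projection of equation (13)
        return np.array([B * (u - cx) / d, B * (v - cy) / d, f * B / d])
    p_t = back_project(kp_t[0], kp_t[1], d_t + offset)
    p_t1 = back_project(kp_t1[0], kp_t1[1], d_t1 + offset)
    return R @ p_t + T - p_t1
```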
Further, when calculating the corrected parallax offset in step 5, the accumulated static key points and the inter-frame motion transformation matrices of the two adjacent frames, namely the angle rotation matrix R and the coordinate translation matrix T, may be substituted into the formula F(offset). For each pair of static key point groups, the offset at which the correction function evaluates to 0 is recorded as a corrected parallax intermediate value; the intermediate values are then accumulated and averaged, and the mean is recorded as the corrected parallax offset, which ensures the accuracy of the data and reduces the correction error.
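A sketch of this mean-based variant, reusing `residual` from the sketch above (scipy is assumed available; the search interval is illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def offset_by_mean(pairs, R, T, f, B, cx, cy, search=(-5.0, 5.0)):
    """Per-group corrected-disparity solution followed by averaging; each
    element of `pairs` is (kp_t, kp_t1, d_t, d_t1)."""
    mids = []
    for kp_t, kp_t1, d_t, d_t1 in pairs:
        res = minimize_scalar(
            lambda o: np.linalg.norm(
                residual(o, kp_t, kp_t1, d_t, d_t1, R, T, f, B, cx, cy)),
            bounds=search, method="bounded")  # drive ||F_i(offset)|| toward 0
        mids.append(res.x)                    # corrected parallax intermediate value
    return float(np.mean(mids))               # mean = corrected parallax offset
```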
Furthermore, a least square principle can be adopted to set an objective function, and the corrected parallax offset corresponding to the corrected parallax offset calculation formula is solved by optimizing the objective function.
The objective function corresponding to the preset condition is as follows; the optimization objective is to minimize the overall deviation:

offset* = argmin_{offset} Σ_{i=1}^{w} ||F_i(offset)||²
The specific solving process is not described in detail.
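Under the same assumptions, a sketch of the least-squares variant, again reusing `residual` and `minimize_scalar` from the sketches above:

```python
def offset_by_least_squares(pairs, R, T, f, B, cx, cy, search=(-5.0, 5.0)):
    """Minimize the summed squared residual over all static key point groups."""
    def objective(o):
        return sum(np.linalg.norm(residual(o, kp_t, kp_t1, d_t, d_t1,
                                           R, T, f, B, cx, cy)) ** 2
                   for kp_t, kp_t1, d_t, d_t1 in pairs)
    return float(minimize_scalar(objective, bounds=search,
                                 method="bounded").x)
```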
Then, pairing the first static key point and the second static key point; sequentially performing first coordinate transformation and second coordinate transformation on the paired first static key point and second static key point, and introducing corrected parallax offset, wherein a calculation formula corresponding to a correction function calculation formula is as follows:
F_i(offset) = R · P_{i,t}(offset) + T - P_{i,t+1}(offset), with P_{i,t}(offset) and P_{i,t+1}(offset) the corrected back-projections defined above
a corrected disparity offset is calculated that satisfies the objective function.
In addition, in this embodiment, a calculation method of a convergence function may also be used to solve the corrected parallax offset corresponding to the calculation formula of the correction function F (offset), and the detailed process is not repeated.
Here, it should be noted that:
key point pixel coordinate matrix A at t +1 moment and t moment in step 2 t+1 、A t If the pixel in the left image is read, c in equation (15) x 、c y Is also the offset of the left camera optical axis in the left image pixel coordinate system; key point pixel coordinate matrix A at t +1 moment and t moment in step 2 t+1 、A t If the pixel in the right image is read, c in equation (15) x 、c y Also the amount of shift of the right camera optical axis in the right image pixel coordinate system.
The above-mentioned embodiments are merely preferred embodiments of the present application, which are not intended to limit the present application in any way, and it will be understood by those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the present application.

Claims (10)

1. A real-time binocular camera self-calibration method is disclosed, wherein the binocular camera is mounted on a mobile device, and the method is characterized by comprising the following steps:
step 1, acquiring two adjacent frames of images acquired by the binocular camera in the moving process of the mobile device according to a sampling period, and respectively recording the two adjacent frames of images as a first image and a second image;
step 2, respectively extracting a first static key point in the first image and a second static key point in the second image;
step 3, calculating an interframe motion transformation matrix according to IMU information in the moving process of the mobile device, wherein the interframe motion transformation matrix comprises an angle rotation matrix R and a coordinate translation matrix T;
step 4, performing first coordinate transformation on the second static key point by using a three-dimensional coordinate transformation matrix, and performing second coordinate transformation on the first static key point by using the inter-frame motion transformation matrix;
step 5, respectively introducing corrected parallax offset into the first coordinate transformation result and the second coordinate transformation result, generating a correction function calculation formula, and calculating the corrected parallax offset meeting the preset conditions;
wherein the corrected parallax offset is used for correcting the parallax value d of the binocular camera.
2. The real-time binocular camera self-calibration method according to claim 1, further comprising, before the step 4:
presetting a threshold value for extracting static key points, recording the threshold value as a preset threshold value, accumulating the number of the first static key points and the second static key points selected in the step 1, respectively judging whether the number of the first static key points and the number of the second static key points are greater than or equal to the preset threshold value, if so, executing the step 4, otherwise, discarding the image, and executing the step 1 again.
3. The real-time binocular camera self-calibration method according to claim 1, wherein the preset condition in the step 5 is that a value result of the correction function calculation formula is 0;
the step 5 specifically comprises:
pairing the first static key point and the second static key point;
respectively performing first coordinate transformation and second coordinate transformation on the paired first static key point and second static key point, and introducing the corrected parallax offset, wherein a calculation formula corresponding to a calculation formula of a correction function F (offset) is as follows:
F_i(offset) = R · P_{i,t}(offset) + T - P_{i,t+1}(offset)

with the corrected back-projections

P_{i,t}(offset) = [ B(u_{i,t} - c_x)/(d_{i,t} + offset), B(v_{i,t} - c_y)/(d_{i,t} + offset), fB/(d_{i,t} + offset) ]^T
P_{i,t+1}(offset) = [ B(u_{i,t+1} - c_x)/(d_{i,t+1} + offset), B(v_{i,t+1} - c_y)/(d_{i,t+1} + offset), fB/(d_{i,t+1} + offset) ]^T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t}, d_{i,t+1} are the disparity values of the i-th static key point group at times t and t+1; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t}, v_{i,t}), (u_{i,t+1}, v_{i,t+1}) are the pixel coordinates of the i-th static key point group at times t and t+1;
when the value result of the correction function calculation formula is calculated to be 0, recording the correction parallax calculation result corresponding to each pair of static key point groups as a correction parallax intermediate value;
and accumulating the corrected parallax intermediate values, calculating the average value of the corrected parallax intermediate values, and recording the average value calculation result as the corrected parallax offset.
4. A real-time binocular camera self-calibration method according to any one of claims 1 or 3, wherein the objective function corresponding to the preset condition is:
offset* = argmin_{offset} Σ_{i=1}^{w} ||F_i(offset)||², where w is the number of static key point groups
the step 5 specifically includes:
pairing the first static keypoints and the second static keypoints;
respectively performing first coordinate transformation and second coordinate transformation on the paired first static key point and second static key point, and introducing the corrected parallax offset, wherein a calculation formula corresponding to a calculation formula of a correction function F (offset) is as follows:
F_i(offset) = R · P_{i,t}(offset) + T - P_{i,t+1}(offset)

with the corrected back-projections

P_{i,t}(offset) = [ B(u_{i,t} - c_x)/(d_{i,t} + offset), B(v_{i,t} - c_y)/(d_{i,t} + offset), fB/(d_{i,t} + offset) ]^T
P_{i,t+1}(offset) = [ B(u_{i,t+1} - c_x)/(d_{i,t+1} + offset), B(v_{i,t+1} - c_y)/(d_{i,t+1} + offset), fB/(d_{i,t+1} + offset) ]^T

in the formula, B represents the center distance between the left and right cameras in the binocular camera; d_{i,t}, d_{i,t+1} are the disparity values of the i-th static key point group at times t and t+1; c_x, c_y represent the offsets of the camera optical axis in the image pixel coordinate system, in pixels; f represents the focal length of the camera; (u_{i,t}, v_{i,t}), (u_{i,t+1}, v_{i,t+1}) are the pixel coordinates of the i-th static key point group at times t and t+1;
calculating the corrected disparity offset that satisfies the objective function.
5. The binocular camera self-calibration method in real time according to any one of claims 1 to 4, wherein in the step 2, during the static keypoint selection, the two adjacent frames of images are further filtered, and the filtering method is as follows:
identifying the image information contained in the two adjacent frames of images based on a deep learning method, thereby obtaining the regions of interest in the two frames; when the image information category in a region of interest is judged to be a first target, cutting the corresponding image region out of both frames, and marking the image corner points of the two cut adjacent frames as static key points; wherein the first target is a moving object.
6. The self-calibration method for the binocular camera in real time according to any one of claims 1 to 4, wherein in the step 2, during the static key point selection, the two adjacent frames of images are filtered, and the filtering method is as follows:
identifying image information contained in the two adjacent frames of images based on a deep learning method, further acquiring the region of interest in the two adjacent frames of images, and acquiring image corner points corresponding to the corresponding images in the region of interest and marking the image corner points as static key points when the type of the image information in the region of interest is judged to be a second target; wherein the second target is a static object.
7. The real-time binocular camera self-calibration method according to claim 1, wherein the mobile device is a vehicle, and the calculation method of the angular rotation matrix R in the step 3 is as follows:
R = R_z(q) · R_y(p) · R_x(r), where R_x, R_y and R_z denote the elementary rotation matrices about the X, Y and Z axes
wherein p, r and q are respectively a pitch angle, a roll angle and a yaw angle in the attitude angle.
8. The real-time binocular camera self-calibration method of claim 7, wherein the solution of the pose angles r, p, q comprises the steps of:
step 3-1, mounting an accelerometer on the vehicle, and reading its acceleration values along the X, Y and Z axes at time t+1: α_{x,t+1}, α_{y,t+1}, α_{z,t+1}; r_{acc,t+1} and p_{acc,t+1} at time t+1 are obtained by solving:

r_{acc,t+1} = arctan2(α_{y,t+1}, α_{z,t+1})
p_{acc,t+1} = arctan( -α_{x,t+1} / √(α_{y,t+1}² + α_{z,t+1}²) )
Wherein g represents a gravitational acceleration value;
step 3-2: reading, through an accelerometer and a gyroscope installed on the vehicle, the acceleration values of the accelerometer along the X, Y and Z axes at any time t: α_{x,t}, α_{y,t}, α_{z,t}, and solving for r_{acc,t}, p_{acc,t} at time t:

r_{acc,t} = arctan2(α_{y,t}, α_{z,t})
p_{acc,t} = arctan( -α_{x,t} / √(α_{y,t}² + α_{z,t}²) )
reading the angular velocities of the gyroscope about the X, Y and Z axes at any time t: ω_{x,t}, ω_{y,t}, ω_{z,t}; r_gyro, p_gyro, q_gyro at time t+1 are obtained by solving:

r_{gyro,t+1} = r_t + ω_{x,t}·Δt
p_{gyro,t+1} = p_t + ω_{y,t}·Δt
q_{gyro,t+1} = q_t + ω_{z,t}·Δt
Step 3-3, fusing the postures based on the results of the step 3-1 and the step 3-2:
r_{t+1} = K·r_{acc,t+1} + (1-K)·r_{gyro,t+1}
p_{t+1} = K·p_{acc,t+1} + (1-K)·p_{gyro,t+1}
q_{t+1} = q_{gyro,t+1}
wherein K represents a proportionality coefficient.
9. The real-time binocular camera self-calibration method according to claim 1, wherein the mobile device is a vehicle, and the coordinate translation matrix T in step 3 is the relative displacement matrix over the period from time t to time t+1, calculated from a vehicle speed signal;
T = [T_x, T_y, T_z]^T
wherein T_x and T_z are the transverse and longitudinal displacement components derived from the vehicle speed, and T_y is the displacement component in the vertical direction;
a gyroscope is arranged on the vehicle, and its angular velocities of rotation about the X, Y and Z axes are read at time t+1: ω_{x,t+1}, ω_{y,t+1}, ω_{z,t+1}; and the matrix T is solved by:
T = [ V·Δt·sin(ω_{y,t+1}·Δt), 0, V·Δt·cos(ω_{y,t+1}·Δt) ]^T
wherein V represents the vehicle speed; Δt is the time interval from time t to time t+1.
10. A real-time binocular camera self-calibration method according to any one of claims 1,2 or 3, wherein the first and second static keypoints are image corner points in the first and second images, respectively.
CN202211498713.6A 2022-11-28 2022-11-28 Real-time binocular camera self-calibration method Pending CN115761007A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211498713.6A CN115761007A (en) 2022-11-28 2022-11-28 Real-time binocular camera self-calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211498713.6A CN115761007A (en) 2022-11-28 2022-11-28 Real-time binocular camera self-calibration method

Publications (1)

Publication Number Publication Date
CN115761007A true CN115761007A (en) 2023-03-07

Family

ID=85338982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211498713.6A Pending CN115761007A (en) 2022-11-28 2022-11-28 Real-time binocular camera self-calibration method

Country Status (1)

Country Link
CN (1) CN115761007A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111862235A (en) * 2020-07-22 2020-10-30 中国科学院上海微***与信息技术研究所 Binocular camera self-calibration method and system
CN116503492A (en) * 2023-06-27 2023-07-28 北京鉴智机器人科技有限公司 Binocular camera module calibration method and calibration device in automatic driving system
CN117061719A (en) * 2023-08-11 2023-11-14 元橡科技(北京)有限公司 Parallax correction method for vehicle-mounted binocular camera
CN117061719B (en) * 2023-08-11 2024-03-08 元橡科技(北京)有限公司 Parallax correction method for vehicle-mounted binocular camera

Similar Documents

Publication Publication Date Title
CN110569704B (en) Multi-strategy self-adaptive lane line detection method based on stereoscopic vision
CN115761007A (en) Real-time binocular camera self-calibration method
CN112734852B (en) Robot mapping method and device and computing equipment
CN111830953B (en) Vehicle self-positioning method, device and system
WO2020000137A1 (en) Integrated sensor calibration in natural scenes
WO2018142900A1 (en) Information processing device, data management device, data management system, method, and program
CN113903011B (en) Semantic map construction and positioning method suitable for indoor parking lot
CN108759823B (en) Low-speed automatic driving vehicle positioning and deviation rectifying method on designated road based on image matching
CN112347205B (en) Updating method and device for vehicle error state
CN112819711B (en) Monocular vision-based vehicle reverse positioning method utilizing road lane line
CN112150448B (en) Image processing method, device and equipment and storage medium
CN112136021A (en) System and method for constructing landmark-based high-definition map
CN114719873B (en) Low-cost fine map automatic generation method and device and readable medium
DE112018004529T5 (en) POSITION ESTIMATING DEVICE AND POSITION ESTIMATING METHOD OF A MOBILE UNIT
CN114758504A (en) Online vehicle overspeed early warning method and system based on filtering correction
CN111723778A (en) Vehicle distance measuring system and method based on MobileNet-SSD
CN114565510A (en) Lane line distance detection method, device, equipment and medium
CN111238490B (en) Visual positioning method and device and electronic equipment
CN114821530A (en) Deep learning-based lane line detection method and system
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN112424568A (en) System and method for constructing high-definition map
CN115388880B (en) Low-cost parking map construction and positioning method and device and electronic equipment
CN114998436A (en) Object labeling method and device, electronic equipment and storage medium
CN115294211A (en) Vehicle-mounted camera installation external parameter calibration method, system, device and storage medium
CN114511841B (en) Multi-sensor fusion idle parking space detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination