CN117649454B - Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium - Google Patents


Info

Publication number: CN117649454B
Application number: CN202410117685.1A
Authority: CN (China)
Prior art keywords: rotation, filtering, error, correction, determining
Legal status: Active (granted)
Other versions: CN117649454A
Other languages: Chinese (zh)
Inventors: 田越, 梁满, 姚宏志, 徐靖, 海松
Assignee (current and original): BEIJING YOYO TIANYU SYSTEM TECHNOLOGY CO LTD
Application filed by BEIJING YOYO TIANYU SYSTEM TECHNOLOGY CO LTD
Priority: CN202410117685.1A
Classification: Image Analysis (AREA)

Abstract

The invention discloses an automatic correction method for the external parameters of a binocular camera, which comprises the following steps: acquiring a plurality of groups of sample images acquired by a binocular camera; extracting key points from the two sample images of each group of sample images respectively; matching the key points of the four sample images of adjacent frames to obtain a plurality of key point matching pairs; determining, according to the key point matching pairs and the camera internal parameters, a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector; filtering the plurality of rotation matrices and the plurality of unit direction vectors to obtain a plurality of filtering rotation matrices and a plurality of corresponding filtering unit direction vectors; and performing optimization calculation according to five angle initial values, the filtering rotation matrices and the filtering unit direction vectors to determine a correction rotation matrix and a correction translation vector. The correction method is simple and convenient and improves the correction precision. The invention also discloses a device, an electronic device and a computer-readable storage medium for implementing the method.

Description

Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of binocular camera technologies, and in particular, to a binocular camera external parameter automatic correction method, apparatus, electronic device, and storage medium.
Background
Camera calibration and correction is one of the core research topics in computer vision and a key technology for reconstructing three-dimensional information from pairs of two-dimensional images. Calibration is the process of establishing the precise geometric mapping between object points in space and their image points.
The most common conventional camera calibration method is the Zhang Zhengyou checkerboard calibration method. It uses a checkerboard calibration plate: after an image of the calibration plate is captured, the pixel coordinates of each corner point can be obtained with a corresponding image detection algorithm. The Zhang Zhengyou method fixes the world coordinate system on the checkerboard; since the world coordinate system of the calibration plate is defined manually in advance and the size of each grid on the plate is known, the physical coordinates of each corner point in the world coordinate system can be calculated. The camera is then calibrated from the pixel coordinates of each corner point and its physical coordinates in the world coordinate system, yielding the intrinsic and extrinsic parameter matrices and the distortion parameters of the camera. Conventional camera calibration methods also include the three-dimensional calibration field method, the coded marker block method, and the like. The three-dimensional calibration field method uses a three-dimensional calibration field: the three-dimensional spatial coordinates of the calibration points in the field are set manually or measured with a high-precision electronic theodolite, the pixel coordinates of the calibration points in the image are obtained with a corresponding detection algorithm, and the intrinsic parameter matrix, the extrinsic parameter matrix and the distortion coefficients of the camera are calculated from the obtained three-dimensional coordinates and pixel coordinates of the calibration points. To establish a one-to-one correspondence between the calibration points in the image and those in the three-dimensional calibration field, the calibration points can be encoded, which is the coded marker block method; its idea is as follows: unique identity information is loaded (i.e., encoded) onto the object-side marker blocks, and each marker block containing the encoded information is then identified (i.e., decoded) in the image.
Apart from the camera itself, the existing calibration methods require equipment such as a calibration plate and a theodolite, as well as manual participation; the procedure is cumbersome and the calibration precision is limited.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a binocular camera external parameter automatic correction method, a binocular camera external parameter automatic correction device, electronic equipment and a storage medium. The technical problems to be solved by the invention are realized by the following technical scheme:
a first aspect of an embodiment of the present invention provides, including: an automatic correction method for external parameters of a binocular camera comprises the following steps:
acquiring a plurality of groups of sample images acquired by a binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera;
Respectively extracting key points from the two sample images of each group of sample images by adopting a key point extraction algorithm; wherein the key points comprise feature points and/or corner points;
Matching key points of the four sample images of adjacent frames to obtain a plurality of key point matching pairs;
Determining, according to the key point matching pairs and the camera internal parameters, a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector by adopting the recoverPose algorithm;
determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors; wherein the filtering errors comprise: a first filtered rotation error, a second filtered rotation error, a filtered translation error, and a filtered cycle error;
fitting the filtering errors with a Gaussian distribution;
Screening out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and unit direction vectors as the filtering rotation matrices and the corresponding filtering unit direction vectors;
And carrying out optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector.
In one embodiment of the present invention, the determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors includes:
Determining a plurality of first filtered rotation errors according to the plurality of rotation matrices and the Rodrigues formula;
Determining a plurality of second filtering rotation errors according to the rotation matrix corresponding to each group of sample images and the Rodrigues formula;
Determining a plurality of filtering translation errors according to the unit direction vectors corresponding to each group of sample images and the preset unit direction vector;
and determining a plurality of filtering cycle errors according to the plurality of rotation matrices, the unit direction vectors and the preset unit direction vector.
In one embodiment of the present invention, the optimizing calculation is performed according to the five initial angles, the filtering rotation matrix and the filtering unit direction vector, to determine a correction rotation matrix and a correction translation vector, including:
determining an initial rotation matrix and an initial translation vector according to the five initial angle values and the baseline length of the binocular camera;
Determining a plurality of corresponding correction errors according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors; wherein the correction errors include: a first correction rotation error, a second correction rotation error, a correction translation error, and a correction cycle error;
determining an objective function according to the first correction rotation error, the second correction rotation error, the correction translation error, the correction cycle error and the corresponding preset weights;
adjusting the initial values of the five angles to determine the five target angle optimization values corresponding to the minimum value of the objective function;
and determining a correction rotation matrix and a correction translation vector according to the five target angle optimization values and the baseline length of the binocular camera.
In one embodiment of the present invention, the determining a plurality of first filtered rotation errors according to a plurality of rotation matrices and the Rodrigues formula includes:
determining a plurality of first matrices from the plurality of rotation matrices;
Calculating radian values corresponding to the rotation angles of the first matrixes through a Rodrigues formula; wherein, the radian value is a first filtered rotation error.
In one embodiment of the present invention, the determining a plurality of second filtered rotation errors according to the rotation matrix corresponding to each set of sample images and the Rodrigues formula includes:
determining a plurality of second matrixes according to the rotation matrixes corresponding to each group of sample images;
calculating radian values of the rotation angles of the second matrixes respectively through a Rodrigues formula;
And calculating the average value of radian values of the rotation angles of the two second matrixes corresponding to each group of sample images of the adjacent frames, and taking the average value as a second filtering rotation error.
In one embodiment of the present invention, the determining a plurality of filtering translation errors according to the unit direction vector corresponding to each set of sample images and a preset unit direction vector includes:
and calculating the average value of the difference values of the unit direction vectors corresponding to each group of sample images of the adjacent frames and the preset unit direction vectors respectively, and taking the average value as a filtering translation error.
A second aspect of an embodiment of the present invention provides an automatic correction device for binocular camera external parameters, including:
the acquisition module is used for acquiring a plurality of groups of sample images acquired by the binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera;
The extraction module is used for respectively extracting key points from the two sample images of each group of sample images by adopting a key point extraction algorithm; wherein the key points comprise feature points and/or corner points;
the matching module is used for matching key points of the four sample images of adjacent frames to obtain a plurality of key point matching pairs;
the determining module is used for determining, according to the key point matching pairs and the camera internal parameters, a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector by adopting the recoverPose algorithm;
An error module for determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors; wherein the filtering errors comprise: a first filtered rotation error, a second filtered rotation error, a filtered translation error, and a filtered cycle error;
the fitting module is used for fitting the filtering errors with a Gaussian distribution;
the screening module is used for screening out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and corresponding unit direction vectors as the filtering rotation matrices and filtering unit direction vectors;
And the optimization module is used for carrying out optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector.
A third aspect of the embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the method for automatically correcting the external parameters of the binocular camera provided in the first aspect of the embodiment of the present invention when the processor executes the program.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a binocular camera external parameter automatic correction method provided by the first aspect of the embodiments of the present invention.
The invention has the beneficial effects that:
According to the method, rotation matrices are calculated and filtered from the corresponding feature points in the binocular camera images, and the correction rotation matrix and the correction translation vector are finally obtained through optimization calculation, so that automatic correction of the binocular camera external parameters is realized without manual participation or external mechanical equipment. The correction method is simple and convenient, reduces the influence of external factors, improves the correction precision, and expands the applicable scenarios, making it suitable for online automatic correction in industrial scenarios.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
Fig. 1 is a schematic flow chart of a binocular camera external parameter automatic correction method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a binocular camera according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of three-dimensional vectors of filtered cyclic errors according to an embodiment of the present invention;
Fig. 4 is a block diagram of an automatic correction device for binocular camera external parameters according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples, but embodiments of the present invention are not limited thereto.
As shown in fig. 1, a first aspect of the embodiment of the present invention provides a binocular camera external parameter automatic correction method, which includes the following steps:
step 11, acquiring a plurality of groups of sample images acquired by a binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera.
Step 12, respectively extracting key points from two sample images of each group of sample images by adopting a key point extraction algorithm; wherein the key points comprise characteristic points and/or corner points.
And step 13, matching key points of four sample images of adjacent frames to obtain a plurality of key point matching pairs.
And step 14, determining, according to the key point matching pairs and the camera internal parameters, a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector by adopting the recoverPose algorithm.
Step 15, determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors.
Wherein, the filtering errors include: the first filtered rotation error, the second filtered rotation error, the filtered translation error, and the filtered cycle error.
Step 16, fitting the filtering errors with a Gaussian distribution.
And step 17, screening out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and unit direction vectors as the filtering rotation matrices and the corresponding filtering unit direction vectors.
And step 18, performing optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector.
In this embodiment, rotation matrices are calculated and filtered from the corresponding feature points in the binocular camera images, and the correction rotation matrix and the correction translation vector are finally obtained through optimization calculation, so that automatic correction of the binocular camera external parameters is realized without manual participation or external mechanical equipment. The correction method is simple and convenient, reduces the influence of external factors, improves the correction precision, and expands the applicable scenarios, making it suitable for online automatic correction in industrial scenarios.
The second aspect of the embodiment of the invention provides an automatic correction method for external parameters of a binocular camera, which comprises the following steps:
step 21, a plurality of groups of sample images acquired by the binocular camera are acquired.
Each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera. In this step, each group of sample images includes two sample images, a left camera sample image and a right camera sample image, and the two sample images are one frame of images acquired by the left camera and the right camera at the same time, that is, each group of sample images is a left camera sample image and a right camera sample image of each frame.
Specifically, the binocular camera is placed in a suitable scene and a sufficient number of binocular camera sample images are acquired while the binocular device is moved or rotated (for example, 100 groups, each group containing one left camera sample image and one right camera sample image). The images or the scene should contain as much texture as possible, and no moving object should be present in the scene while the images are captured (or the synchronization error of the two cameras should be less than 1 microsecond). This step also requires that the camera poses corresponding to two adjacent frames do not change too much, that is, a sufficiently large shared area exists between the two adjacent frames of the left camera and of the right camera (this requirement is generally met when the moving speed of the binocular device is not very high or the frame rate of the cameras is relatively high).
In step 22, key points are extracted for the two sample images of each group of sample images.
Wherein the key points include feature points and/or corners.
In this step, for any one of the left and right camera sample images acquired in step 21, feature points and/or corner points of each set of sample images are extracted by using a feature point extraction algorithm and/or a corner point extraction algorithm, respectively.
Specifically, the algorithms for extracting the feature points and/or corner points of the images include, but are not limited to, feature point extraction algorithms such as SIFT and SURF, corner extraction algorithms such as Harris and Shi-Tomasi, and various feature point or corner point extraction algorithms based on machine learning or deep learning; several of these algorithms can be used in combination, or only one of them can be used.
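As an illustrative, non-limiting sketch of this step, the following Python code extracts SIFT feature points and Shi-Tomasi corner points from one grayscale sample image with OpenCV; the function name and the parameter values (maxCorners, qualityLevel, minDistance) are assumptions chosen for illustration, not values required by the method.

```python
import cv2
import numpy as np

def extract_keypoints(gray_image, max_corners=2000):
    """Extract SIFT feature points and Shi-Tomasi corner points from one grayscale sample image."""
    # SIFT feature points (built into OpenCV since 4.4)
    sift = cv2.SIFT_create()
    sift_kps, sift_desc = sift.detectAndCompute(gray_image, None)

    # Shi-Tomasi corner points, returned as an (N, 2) array of pixel coordinates
    corners = cv2.goodFeaturesToTrack(gray_image, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=7)
    corners = corners.reshape(-1, 2) if corners is not None else np.empty((0, 2))

    return sift_kps, sift_desc, corners
```

Either or both kinds of key points can then be fed into the matching step; in this sketch the SIFT descriptors are the ones carried forward.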
And step 23, matching key points of four sample images of each adjacent frame to obtain a plurality of key point matching pairs.
In this step, one frame of sample images includes one left camera sample image and one right camera sample image, so two adjacent frames have four images in total, namely the front and rear images of the left camera and the front and rear images of the right camera. These four images are divided into four image pairs, and the feature points and/or corner points of each image pair are matched by using a feature point and/or corner point matching algorithm, so that each image pair yields key point matching pairs.
Wherein, the $i$-th frame left camera sample image and the $i$-th frame right camera sample image form an image pair, the $(i+1)$-th frame left camera sample image and the $(i+1)$-th frame right camera sample image form an image pair, the $i$-th frame left camera sample image and the $(i+1)$-th frame left camera sample image form an image pair, and the $i$-th frame right camera sample image and the $(i+1)$-th frame right camera sample image form an image pair; the adjacent frames are the first and second frames, the second and third frames, and so on.
For example, the four images of the first frame and the second frame are a first frame left camera sample image, a first frame right camera sample image, a second frame left camera sample image and a second frame right camera sample image, respectively, and then the first frame left camera sample image and the first frame right camera sample image are an image pair, the second frame left camera sample image and the second frame right camera sample image are an image pair, the first frame left camera sample image and the second frame left camera sample image are an image pair, the first frame right camera sample image and the second frame right camera sample image are an image pair, and feature points and/or corner points of each pair of images are matched by using a feature point and/or corner point matching algorithm, so as to obtain four pairs of key point matching pairs.
The feature point and/or corner point matching algorithms include, but are not limited to, brute-force matching, FLANN and other matching algorithms, and various matching algorithms based on machine learning or deep learning can also be adopted.
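As a hedged illustration of the matching described above, the sketch below matches the SIFT descriptors of one image pair with a brute-force matcher and Lowe's ratio test; the ratio threshold of 0.75 is an assumed value, not one specified by the method, and FLANN could be substituted for the brute-force matcher.

```python
import cv2

def match_keypoints(desc_a, desc_b, kps_a, kps_b, ratio=0.75):
    """Match the descriptors of one image pair and return the matched pixel coordinates."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)              # brute-force matcher; FLANN is an alternative
    knn_matches = matcher.knnMatch(desc_a, desc_b, k=2)

    pts_a, pts_b = [], []
    for pair in knn_matches:
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < ratio * n.distance:           # Lowe's ratio test to reject ambiguous matches
            pts_a.append(kps_a[m.queryIdx].pt)
            pts_b.append(kps_b[m.trainIdx].pt)
    return pts_a, pts_b
```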
And step 24, determining a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector according to the key point matching pairs and the camera internal parameters.
In this step, the camera internal parameters include the camera intrinsic matrices and distortion coefficients of the left and right cameras, which are calculated in advance by using a camera intrinsic calibration method in the prior art. For each image pair corresponding to the key point matching pairs, the rotation matrix of the camera pose and the unit direction vector of the translation vector can be calculated with the recoverPose algorithm: the rotation matrix corresponding to the $i$-th frame left camera sample image and the $i$-th frame right camera sample image is $R_i^{lr}$ and the unit direction vector of the translation vector is $t_i^{lr}$; the rotation matrix corresponding to the $(i+1)$-th frame left camera sample image and the $(i+1)$-th frame right camera sample image is $R_{i+1}^{lr}$ and the unit direction vector of the translation vector is $t_{i+1}^{lr}$; the rotation matrix corresponding to the $i$-th frame left camera sample image and the $(i+1)$-th frame left camera sample image is $R_i^{l}$ and the unit direction vector of the translation vector is $t_i^{l}$; and the rotation matrix corresponding to the $i$-th frame right camera sample image and the $(i+1)$-th frame right camera sample image is $R_i^{r}$ and the unit direction vector of the translation vector is $t_i^{r}$.
Illustratively, the rotation matrix corresponding to the first frame left camera sample image and the first frame right camera sample image is $R_1^{lr}$ and the unit direction vector of the translation vector is $t_1^{lr}$; the rotation matrix corresponding to the second frame left camera sample image and the second frame right camera sample image is $R_2^{lr}$ and the unit direction vector of the translation vector is $t_2^{lr}$; the rotation matrix corresponding to the first frame left camera sample image and the second frame left camera sample image is $R_1^{l}$ and the unit direction vector of the translation vector is $t_1^{l}$; and the rotation matrix corresponding to the first frame right camera sample image and the second frame right camera sample image is $R_1^{r}$ and the unit direction vector of the translation vector is $t_1^{r}$.
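The following sketch shows how the rotation matrix and the unit translation direction of one image pair can be obtained with OpenCV's findEssentialMat and recoverPose from the matched key points and the camera intrinsic matrix; it assumes the two point lists come from the matching step above and that the images are already undistorted, which is a simplification for illustration.

```python
import cv2
import numpy as np

def relative_pose(pts_a, pts_b, K):
    """Estimate the rotation matrix and unit translation direction of one image pair."""
    pts_a = np.asarray(pts_a, dtype=np.float64)
    pts_b = np.asarray(pts_b, dtype=np.float64)

    # Essential matrix with RANSAC to reject outlier matches
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Decompose E into R and t; t is recovered only up to scale, i.e. a unit direction vector
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t / np.linalg.norm(t)
```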
And step 25, filtering the plurality of rotation matrixes and the plurality of unit direction vectors to obtain a plurality of filtering rotation matrixes and a plurality of corresponding filtering unit direction vectors.
Specifically, step 25 includes steps 251-253:
Step 251, determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors.
Wherein, the filtering errors include: the first filtered rotation error, the second filtered rotation error, the filtered translation error, and the filtered cycle error.
Here, as shown in fig. 2, the relationship between the attitudes of the cameras of the adjacent frames of the left and right cameras will be described:
It is known that the binocular camera captures two groups of left and right camera images at two adjacent positions, namely the left and right camera images of frame 1 at position 1 and of frame 2 at position 2, as shown in fig. 2. Assume that at position 1 the rotation matrix and translation vector between the left and right cameras are $R_1$ and $T_1$, and at position 2 they are $R_2$ and $T_2$; in the process of moving the binocular device from position 1 to position 2, the rotation matrix and translation vector of the left camera motion are $R_l$ and $T_l$, and those of the right camera motion are $R_r$ and $T_r$.
Ideally, the pose relationship between the cameras of adjacent frames of the left camera and the right camera satisfies:
$$R_r R_1 = R_2 R_l \qquad (1)$$
$$R_r T_1 + T_r = R_2 T_l + T_2 \qquad (2)$$
Considering that the binocular device undergoes a rigid-body transformation during the movement, the relative pose between the left and right cameras remains unchanged. Let $R_1 = R_2 = R$ and $T_1 = T_2 = T$; the above equations become:
$$R_r R = R R_l \qquad (3)$$
$$R_r T + T_r = R T_l + T \qquad (4)$$
Considering further that the displacement length of the left camera movement and the displacement length of the right camera movement are unknown in the process of moving the binocular device from position 1 to position 2, while the length of $T$ is known (i.e., the straight-line distance $d$ between the left and right cameras), let $T_l = s_l t_l$ and $T_r = s_r t_r$, where $d$ is the straight-line distance between the left and right cameras, $t_l$ and $t_r$ are the unit direction vectors of the translation vectors of the left and right camera movements, and $s_l$ and $s_r$ are the unknown displacement lengths. Equation (4) then becomes:
$$R_r T + s_r t_r = s_l R t_l + T \qquad (5)$$
In an ideal case, the pose relationship between the cameras of adjacent frames of the left camera and the right camera satisfies formulas (1) and (2); in practice, however, the pose relationship calculated from the image feature points cannot satisfy them strictly, and four errors exist: two rotation errors, a translation error and a cycle error.
Specifically, step 251 includes steps 2511-2514:
At step 2511, a plurality of first filtered rotation errors are determined according to the plurality of rotation matrices and the Rodrigues formula.
In this step, the first filtered rotation error is the error generated by the above formula (3). Formula (3) requires the rotation matrices on its two sides to be equal, i.e., it requires the matrix $E_1 = R R_l R^{\top} R_r^{\top}$ to be the identity matrix, which means that the rotation angle corresponding to this matrix must be zero. Therefore, the radian value corresponding to the rotation angle of the matrix can be calculated through the Rodrigues formula, and this radian value is the first filtered rotation error.
Here, a plurality of first matrices are first determined from the plurality of rotation matrices. For each pair of adjacent frames $i$ and $i+1$, the first matrix is calculated as $E_1^i = R_{i+1}^{lr} R_i^{l} \left(R_i^{lr}\right)^{\top} \left(R_i^{r}\right)^{\top}$, where $R_i^{lr}$ and $R_{i+1}^{lr}$ are the rotation matrices corresponding to the $i$-th and $(i+1)$-th groups of sample images and $R_i^{l}$ and $R_i^{r}$ are the rotation matrices of the left and right camera motions between the two frames; a first matrix is calculated for every pair of adjacent frames, and a plurality of first matrices are thus obtained from all the rotation matrices obtained in step 24. Illustratively, taking the first and second frame sample images as an example, the first matrix is $E_1^1 = R_2^{lr} R_1^{l} (R_1^{lr})^{\top} (R_1^{r})^{\top}$. The radian value corresponding to the rotation angle of each first matrix is then calculated through the Rodrigues formula, giving all the first filtered rotation errors.
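A minimal sketch of the first filtered rotation error as reconstructed above: the loop-closure matrix of one adjacent-frame pair is composed and its rotation angle is obtained with cv2.Rodrigues. The argument order follows the reconstruction in this step and should be treated as an assumption, since the original formula image is not preserved.

```python
import cv2
import numpy as np

def first_rotation_error(R_lr_i, R_lr_next, R_left, R_right):
    """Rotation angle (radians) of the loop-closure matrix R_{i+1} R_l R_i^T R_r^T, ideally zero."""
    E1 = R_lr_next @ R_left @ R_lr_i.T @ R_right.T
    rvec, _ = cv2.Rodrigues(E1)          # Rodrigues vector; its norm is the rotation angle
    return float(np.linalg.norm(rvec))
```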
Step 2512, determining a plurality of second filtered rotation errors according to the rotation matrix corresponding to each group of sample images and the Rodrigues formula.
The second filtered rotation error is defined as the error between the rotation matrix between the left and right cameras that is to be calculated and the rotation matrix calculated from the left and right camera images captured at either of the two adjacent camera poses. The radian value of the rotation angle of the corresponding deviation matrix can be calculated through the Rodrigues formula, and the average of the two radian values is the second filtered rotation error.
Specifically, a plurality of corresponding second matrices are first determined according to the rotation matrices corresponding to each group of sample images: the second matrix of the $i$-th group of sample images is determined from the rotation matrix $R_i^{lr}$ of that group, the second matrix of the $(i+1)$-th group is determined from $R_{i+1}^{lr}$, and so on. The radian value of the rotation angle of each second matrix is then calculated through the Rodrigues formula, and finally the average of the radian values of the rotation angles of the two second matrices corresponding to each pair of adjacent groups of sample images is taken as the second filtered rotation error. Illustratively, taking the first and second frame sample images as an example, the average of the two radian values obtained from the second matrices of the first and second groups is taken as the second filtered rotation error, and the plurality of second filtered rotation errors are obtained in turn.
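A sketch of the second filtered rotation error under an explicit assumption: the reference rotation between the left and right cameras is represented here by R_ref (for a nominally parallel stereo rig this could simply be the identity matrix). The original formula image is not preserved, so this choice of deviation matrix is illustrative only.

```python
import cv2
import numpy as np

def second_rotation_error(R_lr_i, R_lr_next, R_ref=np.eye(3)):
    """Average rotation angle (radians) of the two stereo rotations relative to a reference rotation."""
    angles = []
    for R in (R_lr_i, R_lr_next):
        rvec, _ = cv2.Rodrigues(R @ R_ref.T)   # deviation of the measured stereo rotation from R_ref
        angles.append(np.linalg.norm(rvec))
    return float(np.mean(angles))
```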
Step 2513, determining a plurality of filtering translational errors according to the unit direction vector corresponding to each group of sample images and the preset unit direction vector.
In this step, the filtered translation error is defined as the error between the translation vector between the left and right cameras that is to be calculated (taking its unit vector, denoted here by the preset unit direction vector $t_0$) and the translation vector calculated from the left and right camera images captured at either of the two adjacent camera poses. The error between two unit vectors can be calculated as the length of the difference between the two vectors, i.e. $\left\|t_i^{lr} - t_0\right\|$.
Specifically, for each pair of adjacent frames, the average of the lengths of the differences between the unit direction vectors $t_i^{lr}$ and $t_{i+1}^{lr}$ corresponding to the two groups of sample images and the preset unit direction vector $t_0$ is calculated as the filtered translation error, i.e. $e_t^i = \tfrac{1}{2}\left(\left\|t_i^{lr} - t_0\right\| + \left\|t_{i+1}^{lr} - t_0\right\|\right)$. Illustratively, the filtered translation error of the first and second frames is $e_t^1 = \tfrac{1}{2}\left(\left\|t_1^{lr} - t_0\right\| + \left\|t_2^{lr} - t_0\right\|\right)$, and the filtered translation errors between every two adjacent frames are calculated in turn to obtain a plurality of filtered translation errors.
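A short sketch of the filtered translation error as described above, assuming t0 is the preset unit direction vector and t_i, t_next are the unit translation directions estimated for the two stereo pairs of an adjacent-frame pair.

```python
import numpy as np

def translation_error(t_i, t_next, t0):
    """Average length of the differences between the measured unit directions and the preset one."""
    t_i, t_next, t0 = (np.ravel(v) for v in (t_i, t_next, t0))
    return 0.5 * (np.linalg.norm(t_i - t0) + np.linalg.norm(t_next - t0))
```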
Step 2514, determining a plurality of filtering cycle errors according to the plurality of rotation matrices, the unit direction vectors and the preset unit direction vector.
The filtering cycle error is defined as the error generated by formula (5). In formula (5), $T$, $t_l$ and $t_r$ are all three-dimensional vectors and $R$ and $R_r$ are rotation matrices, so formula (5) can be understood as the difference between the three-dimensional vector $(R_r - I)T$ and the three-dimensional vector $s_l R t_l - s_r t_r$, which share a common origin, as shown in fig. 3. Since $s_l$ and $s_r$ are unknown, the problem can be resolved spatially using geometric knowledge (as shown in fig. 3): formula (5) requires the three-dimensional vectors $(R_r - I)T$, $R t_l$ and $t_r$ to lie in the same plane, and the error of formula (5) is minimized when the spatial distance from $(R_r - I)T$ to the plane spanned by $R t_l$ and $t_r$ is minimized. This minimum spatial distance can therefore be taken as the cycle error. Let $n = R t_l \times t_r$; the filtering cycle error can then be calculated by the following formula:
$$e_{cyc} = \frac{\left| n \cdot (R_r - I)\,T \right|}{\left\| n \right\|} \qquad (6)$$
Specifically, when calculating the filtering cycle error for the pair of adjacent frames $i$ and $i+1$, $R_r$ is taken as the rotation matrix $R_i^{r}$ of the right camera motion, $t_r$ as its unit direction vector $t_i^{r}$, $R$ as the rotation matrix $R_{i+1}^{lr}$ corresponding to the $(i+1)$-th group of sample images, $t_l$ as the unit direction vector $t_i^{l}$ of the left camera motion, and $T$ as $d\,t_0$, where $t_0$ is the preset unit direction vector. Illustratively, taking the first and second frame sample images as an example, the filtering cycle error is calculated from $R_1^{r}$, $t_1^{r}$, $R_2^{lr}$, $t_1^{l}$ and $d\,t_0$.
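A sketch of the filtering cycle error of formula (6) as reconstructed above: the distance from the vector $(R_r - I)\,d\,t_0$ to the plane spanned by $R_{i+1} t_l$ and $t_r$. Since the original formula image is not preserved, this point-to-plane form is a reconstruction, and the inclusion of the baseline length d is an assumption (it only rescales the error).

```python
import numpy as np

def cycle_error(R_right, t_right, R_lr_next, t_left, t0, d=1.0):
    """Distance from (R_r - I) d t0 to the plane spanned by R_{i+1} t_l and t_r."""
    t_right, t_left, t0 = (np.ravel(v) for v in (t_right, t_left, t0))
    lhs = (R_right - np.eye(3)) @ (d * t0)           # left-hand side of rearranged formula (5)
    normal = np.cross(R_lr_next @ t_left, t_right)   # normal of the plane spanned by R t_l and t_r
    return float(abs(normal @ lhs) / np.linalg.norm(normal))
```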
Step 252, fitting the filtering errors with a Gaussian distribution.
In this step, the first filtered rotation errors, the second filtered rotation errors, the filtered translation errors and the filtered cycle errors are each fitted with a Gaussian distribution.
Step 253, screening out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and corresponding unit direction vectors as the filtering rotation matrices and filtering unit direction vectors.
Preferably, for any adjacent-frame pair in which any of the four errors is greater than 2 times the standard deviation of its fitted distribution, the related data (i.e., the set of four rotation matrices and four unit direction vectors of the translation vectors corresponding to that adjacent-frame pair) are deleted, and the remaining data are used as the filtering rotation matrices and the corresponding filtering unit direction vectors.
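An illustrative sketch of the Gaussian fitting and screening step: each error type is fitted by its sample mean and standard deviation, and any adjacent-frame pair whose error deviates from the mean by more than 2 standard deviations for any error type is removed. The 2x multiple follows the preferred embodiment above; the dictionary layout and the use of |error - mean| are assumptions for illustration.

```python
import numpy as np

def screen_outliers(errors, factor=2.0):
    """errors: dict mapping error name -> 1-D array with one value per adjacent-frame pair.
    Returns a boolean mask over the pairs (True = keep)."""
    n_pairs = len(next(iter(errors.values())))
    keep = np.ones(n_pairs, dtype=bool)
    for values in errors.values():
        mu, sigma = np.mean(values), np.std(values)     # Gaussian fit of this error type
        keep &= np.abs(values - mu) <= factor * sigma   # drop pairs beyond the preset multiple of std
    return keep
```

The resulting mask is then applied to the lists of rotation matrices and unit direction vectors to obtain the filtering rotation matrices and filtering unit direction vectors.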
And step 26, performing optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector.
In this step, the correction rotation matrix $R_c$ and the correction translation vector $T_c$ can be expressed by five angle values $\theta_{ly}, \theta_{lz}, \theta_{rx}, \theta_{ry}, \theta_{rz}$ and one length value $d$. Since the distance between the left and right cameras, i.e., the baseline length $d$ of the binocular camera, is known, only the five angle values need to be optimized.
Here, consider that the correction rotation matrix $R_c$ can be decomposed into rotation matrices about the $X$, $Y$ and $Z$ coordinate axes. Let $R_l^c$ be the rotation matrix by which the left camera rotates to the equilibrium position and $R_r^c$ the rotation matrix by which the right camera rotates to the equilibrium position, so that $R_l^c$ and $R_r^c$ can be expressed as $R_l^c = R_x(\theta_{lx}) R_y(\theta_{ly}) R_z(\theta_{lz})$ and $R_r^c = R_x(\theta_{rx}) R_y(\theta_{ry}) R_z(\theta_{rz})$ respectively, where $R_x$, $R_y$ and $R_z$ are rotation matrices about the three coordinate axes and $\theta_{lx}, \theta_{ly}, \theta_{lz}, \theta_{rx}, \theta_{ry}, \theta_{rz}$ are the corresponding rotation angles.
Finally, the correction rotation matrix $R_c$ and the correction translation vector $T_c$ can be expressed as:
$$R_c = \left(R_r^c\right)^{\top} R_l^c, \qquad T_c = \left(R_r^c\right)^{\top} B,$$
where $B$ is the vector along the $X$ axis whose length is the baseline length $d$ of the binocular camera, also referred to as the binocular camera baseline vector. From the formula of the correction rotation matrix $R_c$, the rotations of the left and right cameras about the $X$ axis can be combined and uniformly expressed by the rotation of the right camera about the $X$ axis, i.e., $R_x(-\theta_{rx}) R_x(\theta_{lx}) = R_x(\theta_{lx} - \theta_{rx})$. The correction rotation matrix $R_c$ and the correction translation vector $T_c$ have a total of 6 degrees of freedom; from their calculation formulas it can be seen that they can be represented by $\theta_{ly}$, $\theta_{lz}$, $\theta_{rx}$, $\theta_{ry}$, $\theta_{rz}$ and the baseline vector $B$.
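A minimal sketch of this five-angle parameterization, using the relations $R_c = (R_r^c)^\top R_l^c$ and $T_c = (R_r^c)^\top B$ reconstructed above; the axis decomposition order and sign conventions are assumptions, since the original formula images are not preserved.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def angles_to_extrinsics(theta_ly, theta_lz, theta_rx, theta_ry, theta_rz, baseline):
    """Build the correction rotation matrix Rc and translation vector Tc from the five angles."""
    R_left = rot_y(theta_ly) @ rot_z(theta_lz)                    # left camera X rotation merged into the right camera
    R_right = rot_x(theta_rx) @ rot_y(theta_ry) @ rot_z(theta_rz)
    B = np.array([baseline, 0.0, 0.0])                            # baseline vector along the X axis
    Rc = R_right.T @ R_left
    Tc = R_right.T @ B
    return Rc, Tc
```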
Step 26 includes steps 261-265:
step 261, determining an initial rotation matrix and an initial translation vector according to the five angle initial values and the baseline length of the binocular camera.
In this step, the five initial angle values $\theta_{ly}^0, \theta_{lz}^0, \theta_{rx}^0, \theta_{ry}^0, \theta_{rz}^0$ are set and, combined with the baseline length $d$ of the binocular camera, are converted into an initial rotation matrix and an initial translation vector according to the above relationship between the angles and the rotation matrix and translation vector.
Step 262, determining correction errors according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors; wherein the correction errors include: the first correction rotation error, the second correction rotation error, the correction translation error, and the correction cycle error.
In this step, a plurality of optimization errors are first calculated according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors. The calculation method of the optimization errors is the same as the calculation method of the filtering errors described above, except that the initial rotation matrix, the initial translation vector, the filtering rotation matrices and the filtering unit direction vectors are used. The optimization errors include a first optimized rotation error, a second optimized rotation error, an optimized translation error and an optimized cycle error, which are calculated in the same way as the first filtered rotation error, the second filtered rotation error, the filtered translation error and the filtered cycle error, respectively.
Then, the average of the calculated first optimized rotation errors is taken as the first correction rotation error, the average of the second optimized rotation errors as the second correction rotation error, the average of the optimized translation errors as the correction translation error, and the average of the optimized cycle errors as the correction cycle error.
Step 263, determining an objective function according to the first corrected rotation error, the second corrected rotation error, the corrected translation error, the corrected cyclic error and the corresponding preset weights.
In this step, the four errors are multiplied by corresponding preset weights and added as final errors, i.e., objective functions.
In step 264, the initial values of the five angles are adjusted to determine the five target angle optimization values corresponding to the minimum value of the objective function. The five angle values are continuously adjusted by an optimization algorithm, and the final error is calculated according to the above process of calculating the correction errors, so that the objective function value (final error) is minimized; the five angle values corresponding to the minimum objective function value are taken as the target angle optimization values.
Preferably, the optimization algorithm may employ the Nelder-Mead algorithm.
Step 265, calculating the correction rotation matrix $R_c$ and the correction translation vector $T_c$ from the five target angle optimization values and the baseline length of the binocular camera, according to the above-mentioned conversion between the angles and the rotation matrix and translation vector.
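An illustrative sketch of the optimization step with SciPy's Nelder-Mead implementation. The weights and the data layout are placeholders, and compute_mean_errors is a hypothetical helper standing in for the averaged correction errors described above (evaluated with the current Rc and Tc); this is a scaffold under those assumptions rather than the patented implementation.

```python
import numpy as np
from scipy.optimize import minimize

def objective(angles, baseline, filtered_data, weights=(1.0, 1.0, 1.0, 1.0)):
    """Weighted sum of the four averaged correction errors for the current five angles."""
    Rc, Tc = angles_to_extrinsics(*angles, baseline)       # parameterization sketched above
    t0 = Tc / np.linalg.norm(Tc)
    e_rot1, e_rot2, e_trans, e_cyc = compute_mean_errors(Rc, t0, filtered_data)  # hypothetical helper
    return (weights[0] * e_rot1 + weights[1] * e_rot2 +
            weights[2] * e_trans + weights[3] * e_cyc)

def optimize_angles(initial_angles, baseline, filtered_data):
    result = minimize(objective, x0=np.asarray(initial_angles, dtype=float),
                      args=(baseline, filtered_data), method="Nelder-Mead")
    return angles_to_extrinsics(*result.x, baseline)        # correction Rc, Tc from the optimized angles
```

Nelder-Mead is derivative-free, which fits this objective: the four averaged errors involve Rodrigues angles and point-to-plane distances whose gradients are awkward to derive analytically.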
As shown in fig. 4, a third aspect of the embodiment of the present invention provides an automatic binocular camera external parameter correction apparatus, including:
An acquisition module 31 for acquiring a plurality of sets of sample images acquired by the binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera;
An extraction module 32, configured to extract key points from the two sample images of each group of sample images by using a key point extraction algorithm; wherein the key points comprise feature points and/or corner points;
The matching module 33 is configured to match key points of the four sample images of adjacent frames to obtain a plurality of key point matching pairs;
The determining module 34 is configured to determine, according to the key point matching pairs and the camera internal parameters, a rotation matrix of the corresponding camera pose and a unit direction vector of the corresponding translation vector by using the recoverPose algorithm;
An error module 35, configured to determine a plurality of filtering errors based on the plurality of rotation matrices and the plurality of unit direction vectors; wherein, the filtering errors include: a first filtered rotation error, a second filtered rotation error, a filtered translation error, and a filtered cycle error;
A fitting module 36, configured to fit the filtering errors with a Gaussian distribution;
The screening module 37 is configured to screen out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and to take the remaining rotation matrices and unit direction vectors as the filtering rotation matrices and corresponding filtering unit direction vectors;
The optimization module 38 is configured to perform optimization calculation according to the five angle initial values, the filter rotation matrix, and the filter unit direction vector, and determine a correction rotation matrix and a correction translation vector.
In one embodiment, the error module 35 is further configured to determine a plurality of first filtered rotation errors based on the plurality of rotation matrices and the Rodrigues formula;
determine a plurality of second filtering rotation errors according to the rotation matrix corresponding to each group of sample images and the Rodrigues formula;
Determine a plurality of filtering translation errors according to the unit direction vectors corresponding to each group of sample images and the preset unit direction vector;
and determine a plurality of filtering cycle errors according to the plurality of rotation matrices, the unit direction vectors and the preset unit direction vector.
In one embodiment, the optimization module 38 is further configured to determine an initial rotation matrix and an initial translation vector based on the five initial angle values and a baseline length of the binocular camera;
Determining a plurality of corresponding correction errors according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors; wherein the correction errors include: a first correction rotation error, a second correction rotation error, a correction translation error, and a correction cycle error;
determining an objective function according to the first correction rotation error, the second correction rotation error, the correction translation error, the correction cycle error and the corresponding preset weights;
adjusting the initial values of the five angles to determine the five target angle optimization values corresponding to the minimum value of the objective function;
And determining a correction rotation matrix and a correction translation vector according to the five target angle optimization values and the baseline length of the binocular camera.
In one embodiment, the error module 35 is further configured to determine a plurality of first matrices from the plurality of rotation matrices;
calculating an radian value corresponding to the rotation angle of each first matrix through a Rodrigues formula; wherein, the radian value is the first filtering rotation error.
In one embodiment, the error module 35 is further configured to determine a plurality of second matrices from the rotation matrices corresponding to each set of sample images;
Calculating radian values of the rotation angles of each second matrix respectively through a Rodrigues formula;
and calculating the average value of radian values of rotation angles of two second matrixes corresponding to each group of sample images of the adjacent frames, and taking the average value as a second filtering rotation error.
In one embodiment, the error module 35 is further configured to calculate an average value of differences between the unit direction vectors corresponding to each group of sample images of the adjacent frames and the preset unit direction vectors, as the filtered translation error.
The fourth aspect of the embodiment of the invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the steps of the binocular camera external parameter automatic correction method when executing the program.
The fifth aspect of the embodiment of the present invention further provides a computer readable storage medium having a computer program stored thereon, where the computer program when executed by a processor implements the steps of the above-mentioned method for automatically correcting a binocular camera external parameter.
The memory may include a random access memory (Random Access Memory, RAM) or a non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; it may also be a digital signal processor (Digital Signal Processing, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The method provided by the embodiment of the invention can be applied to electronic equipment. Specifically, the electronic device may be: desktop computers, portable computers, intelligent mobile terminals, servers, etc. Any electronic device capable of implementing the present invention is not limited herein, and falls within the scope of the present invention.
For the apparatus/electronics embodiments, the description is relatively simple as it is substantially similar to the method embodiments, with reference to the description of the method embodiments in part.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (5)

1. The binocular camera external parameter automatic correction method is characterized by comprising the following steps of:
acquiring a plurality of groups of sample images acquired by a binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera;
Respectively extracting key points from the two sample images of each group of sample images by adopting a key point extraction algorithm; wherein the key points comprise feature points and/or corner points;
taking two of the four sample images of adjacent frames as an image pair, and performing key point matching on each image pair to obtain a plurality of key point matching pairs;
Determining a plurality of rotation matrices of the corresponding camera poses and unit direction vectors of a plurality of translation vectors by adopting the recoverPose algorithm according to the plurality of key point matching pairs and the camera internal parameters;
determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors; wherein the filtering errors comprise: a first filtered rotation error, a second filtered rotation error, a filtered translation error, and a filtered cycle error;
fitting the filtering errors with a Gaussian distribution;
Screening out the rotation matrices and unit direction vectors corresponding to filtering errors larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and unit direction vectors as the filtering rotation matrices and the corresponding filtering unit direction vectors;
Performing optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector;
The determining a plurality of filtering errors from the plurality of rotation matrices and the plurality of unit direction vectors includes:
Determining a plurality of first filtered rotation errors according to the plurality of rotation matrices and the Rodrigues formula;
Determining a plurality of second filtering rotation errors according to the rotation matrix corresponding to each group of sample images and the Rodrigues formula;
Determining a plurality of filtering translation errors according to the unit direction vectors corresponding to each group of sample images and the preset unit direction vectors;
determining a plurality of filtering cycle errors according to the plurality of rotation matrixes, the unit direction vector and the preset unit direction vector;
The optimizing calculation is performed according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and the determining of the correction rotation matrix and the correction translation vector comprises the following steps:
Determining an initial rotation matrix and an initial translation vector according to the five initial angle values and the baseline length of the binocular camera; wherein the five angles comprise: a rotation angle $\theta_{ly}$ by which the left camera rotates about the $Y$ coordinate axis to the equilibrium position, a rotation angle $\theta_{lz}$ by which the left camera rotates about the $Z$ coordinate axis to the equilibrium position, a rotation angle $\theta_{rx}$ by which the right camera rotates about the $X$ coordinate axis to the equilibrium position, a rotation angle $\theta_{ry}$ by which the right camera rotates about the $Y$ coordinate axis to the equilibrium position, and a rotation angle $\theta_{rz}$ by which the right camera rotates about the $Z$ coordinate axis to the equilibrium position;
Determining a plurality of corresponding correction errors according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors; wherein the correction errors include: a first correction rotation error, a second correction rotation error, a correction translation error, and a correction cycle error;
determining an objective function according to the first correction rotation error, the second correction rotation error, the correction translation error, the correction cycle error and the corresponding preset weights;
adjusting the initial values of the five angles to determine the five target angle optimization values corresponding to the minimum value of the objective function;
determining a correction rotation matrix and a correction translation vector according to the five target angle optimization values and the baseline length of the binocular camera;
the determining a plurality of first filtered rotation errors from the plurality of rotation matrices and the Rodrigues formula includes:
determining a plurality of first matrices from the plurality of rotation matrices;
calculating radian values corresponding to the rotation angles of the first matrixes through a Rodrigues formula; wherein the radian value is a first filtered rotation error;
the determining a plurality of second filtering rotation errors according to the rotation matrix corresponding to each group of sample images and the Rodrigues formula comprises the following steps:
determining a plurality of second matrixes according to the rotation matrixes corresponding to each group of sample images;
calculating radian values of the rotation angles of the second matrixes respectively through a Rodrigues formula;
Calculating the average value of radian values of the rotation angles of the two second matrixes corresponding to each group of sample images of the adjacent frames, and taking the average value as a second filtering rotation error;
The determining a plurality of filtering cyclic errors according to the plurality of rotation matrices, the unit direction vectors and the preset unit direction vector comprises: determining a plurality of filtering cyclic errors according to the rotation matrix and the unit direction vector of the translation vector between the i-th frame and the (i+1)-th frame right camera sample images, the preset unit direction vector, the rotation matrix between the (i+1)-th frame left camera sample image and the (i+1)-th frame right camera sample image, and the unit direction vector of the translation vector between the i-th frame and the (i+1)-th frame left camera sample images.
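For orientation, the Rodrigues-based rotation errors and the cyclic error recited in claim 1 can be sketched in Python with OpenCV and NumPy. The claim does not fully specify how the "first matrices", "second matrices" or the cyclic composition are constructed, so `R_first`, the two `R_second_*` inputs and the loop-closure term below are illustrative placeholders under that caveat, not the patented definitions.

```python
import cv2
import numpy as np


def rotation_angle_rad(R):
    """Rotation angle (radians) of a 3x3 rotation matrix via the Rodrigues formula."""
    rvec, _ = cv2.Rodrigues(R)          # 3x1 rotation vector; its norm is the rotation angle
    return float(np.linalg.norm(rvec))


def first_filtering_rotation_error(R_first):
    """First filtering rotation error: rotation angle of an (illustrative) first matrix."""
    return rotation_angle_rad(R_first)


def second_filtering_rotation_error(R_second_left, R_second_right):
    """Second filtering rotation error: mean rotation angle of the two second matrices
    belonging to one group of adjacent-frame sample images."""
    return 0.5 * (rotation_angle_rad(R_second_left) + rotation_angle_rad(R_second_right))


def filtering_cyclic_error(R_right_i_to_i1, t_right_i_to_i1,
                           R_left_i1_to_right_i1, t_preset,
                           R_left_i_to_i1, t_left_i_to_i1):
    """Illustrative loop-closure term: conjugate the left-camera motion by the
    left-right transform at frame i+1, compare it with the right-camera motion,
    and measure how far the translation directions drift from the preset one."""
    R_loop = (R_left_i1_to_right_i1 @ R_left_i_to_i1
              @ R_left_i1_to_right_i1.T @ R_right_i_to_i1.T)
    rot_term = rotation_angle_rad(R_loop)
    trans_term = 0.5 * (np.linalg.norm(np.asarray(t_right_i_to_i1) - np.asarray(t_preset))
                        + np.linalg.norm(np.asarray(t_left_i_to_i1) - np.asarray(t_preset)))
    return rot_term + trans_term
```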
2. The method of claim 1, wherein the determining a plurality of filtering translation errors according to the unit direction vectors corresponding to each group of sample images and the preset unit direction vector comprises:
calculating, for each group of sample images of adjacent frames, the average of the differences between the corresponding unit direction vectors and the preset unit direction vector, and taking the average as a filtering translation error.
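A minimal reading of claim 2, assuming each group of adjacent-frame sample images contributes one unit direction vector per camera and that the deviation is measured by the norm of the vector difference (the claim does not fix the measure), might look as follows; all names are hypothetical.

```python
import numpy as np


def filtering_translation_error(t_unit_left, t_unit_right, t_preset):
    """Average the deviations of a group's two unit direction vectors from the
    preset unit direction vector; the result serves as that group's filtering
    translation error in this sketch."""
    d_left = np.linalg.norm(np.asarray(t_unit_left) - np.asarray(t_preset))
    d_right = np.linalg.norm(np.asarray(t_unit_right) - np.asarray(t_preset))
    return 0.5 * (d_left + d_right)
```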
3. An automatic binocular camera external parameter correction device, comprising:
the acquisition module is used for acquiring a plurality of groups of sample images acquired by the binocular camera; each group of sample images comprises a frame of sample images which are simultaneously and respectively acquired by a left camera and a right camera;
The extraction module is used for respectively extracting key points from the two sample images of each group of sample images by adopting a key point extraction algorithm; wherein the key points comprise characteristic points and/or angular points;
The matching module is used for taking the four sample images of adjacent frames two by two as image pairs and matching the key points of each image pair, so as to obtain a plurality of key point matching pairs;
The determining module is used for determining a plurality of rotation matrices of the corresponding camera poses and the unit direction vectors of a plurality of translation vectors by adopting the recoverPose algorithm according to the plurality of key point matching pairs and the camera internal parameters;
an error module for determining a plurality of filtering errors according to the plurality of rotation matrices and the plurality of unit direction vectors; wherein the filtering errors include: a first filtering rotation error, a second filtering rotation error, a filtering translation error, and a filtering cyclic error;
the fitting module is used for fitting the plurality of filtering errors with a Gaussian distribution;
the screening module is used for screening out the rotation matrices and unit direction vectors whose corresponding filtering errors are larger than a preset multiple of the standard deviation, and taking the remaining rotation matrices and the corresponding unit direction vectors as the filtering rotation matrices and the filtering unit direction vectors;
the optimization module is used for carrying out optimization calculation according to the five angle initial values, the filtering rotation matrix and the filtering unit direction vector, and determining a correction rotation matrix and a correction translation vector;
the error module is further configured to determine a plurality of first filtering rotation errors according to the plurality of rotation matrices and the Rodrigues formula;
Determining a plurality of second filtering rotation errors according to the rotation matrices corresponding to each group of sample images and the Rodrigues formula;
Determining a plurality of filtering translation errors according to the unit direction vectors corresponding to each group of sample images and the preset unit direction vectors;
determining a plurality of filtering cyclic errors according to the plurality of rotation matrices, the plurality of unit direction vectors and the preset unit direction vector;
The optimization module is also used for determining an initial rotation matrix and an initial translation vector according to the five angle initial values and the baseline length of the binocular camera; wherein the five angles include: the rotation angles of the left camera about two coordinate axes when it is rotated to the equilibrium position, and the rotation angles of the right camera about three coordinate axes when it is rotated to the equilibrium position;
Determining a plurality of corresponding correction errors according to the initial rotation matrix, the initial translation vector, the plurality of filtering rotation matrices and the plurality of filtering unit direction vectors; wherein the correction errors include: a first correction rotation error, a second correction rotation error, a correction translation error, and a correction cyclic error;
determining an objective function according to the first correction rotation error, the second correction rotation error, the correction translation error, the correction cyclic error and the corresponding preset weights;
adjusting the initial values of the five angles to determine five target angle optimization values corresponding to the minimum value of the objective function;
determining a correction rotation matrix and a correction translation vector according to the five target angle optimization values and the baseline length of the binocular camera;
the determining a plurality of first filtering rotation errors according to the plurality of rotation matrices and the Rodrigues formula comprises:
determining a plurality of first matrices from the plurality of rotation matrices;
calculating radian values corresponding to the rotation angles of the first matrices through the Rodrigues formula; wherein the radian values are the first filtering rotation errors;
the determining a plurality of second filtering rotation errors according to the rotation matrices corresponding to each group of sample images and the Rodrigues formula comprises the following steps:
determining a plurality of second matrixes according to the rotation matrixes corresponding to each group of sample images;
calculating radian values of the rotation angles of the second matrixes respectively through a Rodrigues formula;
Calculating the average value of radian values of the rotation angles of the two second matrixes corresponding to each group of sample images of the adjacent frames, and taking the average value as a second filtering rotation error;
The determining a plurality of filtering cyclic errors according to the plurality of rotation matrices, the unit direction vectors and the preset unit direction vector comprises: determining a plurality of filtering cyclic errors according to the rotation matrix and the unit direction vector of the translation vector between the i-th frame and the (i+1)-th frame right camera sample images, the preset unit direction vector, the rotation matrix between the (i+1)-th frame left camera sample image and the (i+1)-th frame right camera sample image, and the unit direction vector of the translation vector between the i-th frame and the (i+1)-th frame left camera sample images.
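The determining, fitting and screening modules of claim 3 map naturally onto standard OpenCV/NumPy calls. The sketch below assumes RANSAC-based essential-matrix estimation before recoverPose and a one-sided mean + k·σ threshold on the fitted Gaussian; neither choice is pinned down by the claim, and `K`, `k_sigma` and the helper names are hypothetical.

```python
import cv2
import numpy as np


def recover_relative_pose(pts1, pts2, K):
    """Estimate the rotation matrix and the unit direction vector of the translation
    between two matched keypoint sets (Nx2 float arrays) using recoverPose."""
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t / np.linalg.norm(t)     # t from recoverPose is only defined up to scale


def screen_by_gaussian(errors, samples, k_sigma=3.0):
    """Fit a Gaussian to the filtering errors and drop samples whose error exceeds
    mean + k_sigma * std; the survivors play the role of the filtering rotation
    matrices and filtering unit direction vectors."""
    errors = np.asarray(errors, dtype=float)
    mu, sigma = errors.mean(), errors.std()
    keep = errors <= mu + k_sigma * sigma
    return [s for s, ok in zip(samples, keep) if ok]
```

A two-sided test on |e − μ| would fit the claim wording equally well; the essential point is only that samples whose filtering error exceeds a preset multiple of the standard deviation are discarded before the optimization.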
4. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the binocular camera external parameter automatic correction method according to claim 1 or 2 when executing the program.
5. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements a binocular camera external parameter automatic correction method as claimed in claim 1 or 2.
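As a closing illustration, the five-angle optimization recited in claims 1 and 3 can be prototyped with a generic numerical optimizer. The sketch below is not the patented procedure: it assumes the left camera is parameterized by two Euler angles and the right camera by three, that the corrected extrinsics are composed as R_right · R_leftᵀ with the translation fixed by the baseline length, and that SciPy's Nelder-Mead minimizer stands in for whatever solver is actually used; `correction_errors`, `filt_R`, `filt_t` and the weights are hypothetical names.

```python
import numpy as np
from scipy.optimize import minimize


def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])


def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])


def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])


def extrinsics_from_angles(angles, baseline):
    """Assumed composition: two angles rotate the left camera and three rotate the
    right camera towards the equilibrium (rectified) position; the relative
    extrinsics follow from those rotations and the baseline length."""
    a1, a2, b1, b2, b3 = angles
    R_left = rot_y(a2) @ rot_x(a1)
    R_right = rot_z(b3) @ rot_y(b2) @ rot_x(b1)
    R = R_right @ R_left.T                        # candidate correction rotation matrix
    t = R_left @ np.array([-baseline, 0.0, 0.0])  # candidate correction translation vector
    return R, t


def objective(angles, baseline, filt_R, filt_t, weights, correction_errors):
    """Weighted sum of the four correction error terms; `correction_errors` is a
    user-supplied callable returning (first rotation error, second rotation error,
    translation error, cyclic error) for the current extrinsics."""
    R, t = extrinsics_from_angles(angles, baseline)
    e1, e2, e3, e4 = correction_errors(R, t, filt_R, filt_t)
    w1, w2, w3, w4 = weights
    return w1 * e1 + w2 * e2 + w3 * e3 + w4 * e4


def correct_extrinsics(init_angles, baseline, filt_R, filt_t, weights, correction_errors):
    """Adjust the five angles until the objective reaches a (local) minimum and
    return the corresponding corrected extrinsics."""
    res = minimize(objective, np.asarray(init_angles, dtype=float),
                   args=(baseline, filt_R, filt_t, weights, correction_errors),
                   method="Nelder-Mead")
    return extrinsics_from_angles(res.x, baseline)
```

A gradient-based solver or bound constraints could replace Nelder-Mead if the error terms are differentiable; the claims only require that the five angles minimizing the objective be found.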
CN202410117685.1A 2024-01-29 2024-01-29 Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium Active CN117649454B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410117685.1A CN117649454B (en) 2024-01-29 2024-01-29 Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117649454A CN117649454A (en) 2024-03-05
CN117649454B (en) 2024-05-31

Family

ID=90049949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410117685.1A Active CN117649454B (en) 2024-01-29 2024-01-29 Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117649454B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106251305A (en) * 2016-07-29 2016-12-21 长春理工大学 A kind of realtime electronic image stabilizing method based on Inertial Measurement Unit IMU
WO2018196391A1 (en) * 2017-04-28 2018-11-01 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted camera
RU2017124310A3 (en) * 2017-07-07 2019-01-10
CN110956661A (en) * 2019-11-22 2020-04-03 大连理工大学 Method for calculating dynamic pose of visible light and infrared camera based on bidirectional homography matrix
CN110992392A (en) * 2019-11-20 2020-04-10 北京影谱科技股份有限公司 Key frame selection method and device based on motion state
JP2020107938A (en) * 2018-12-26 2020-07-09 株式会社デンソーアイティーラボラトリ Camera calibration device, camera calibration method, and program
CN112070814A (en) * 2020-08-31 2020-12-11 杭州迅蚁网络科技有限公司 Target angle identification method and device
CN112132894A (en) * 2020-09-08 2020-12-25 大连理工大学 Mechanical arm real-time tracking method based on binocular vision guidance
CN112907680A (en) * 2021-02-22 2021-06-04 上海数川数据科技有限公司 Automatic calibration method for rotation matrix of visible light and infrared double-light camera
CN115464657A (en) * 2022-09-29 2022-12-13 杭州电子科技大学 Hand-eye calibration method of rotary scanning device driven by motor
CN115471534A (en) * 2022-08-31 2022-12-13 华南理工大学 Underwater scene three-dimensional reconstruction method and equipment based on binocular vision and IMU
CN115578466A (en) * 2021-07-05 2023-01-06 武汉Tcl集团工业研究院有限公司 Camera calibration method and device, computer readable storage medium and electronic equipment
CN116244386A (en) * 2023-02-10 2023-06-09 北京友友天宇系统技术有限公司 Identification method of entity association relation applied to multi-source heterogeneous data storage system
CN116342674A (en) * 2019-01-21 2023-06-27 重庆交通大学 Method for calculating asphalt pavement construction depth by three-dimensional model
CN116957987A (en) * 2023-08-28 2023-10-27 深圳大学 Multi-eye polar line correction method, device, computer equipment and storage medium
CN117218210A (en) * 2023-08-29 2023-12-12 上海大学 Binocular active vision semi-dense depth estimation method based on bionic eyes
CN117274960A (en) * 2023-08-01 2023-12-22 哈尔滨工业大学 Non-driving gesture recognition method and system for L3-level automatic driving vehicle driver

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A new technique of camera calibration: a geometric approach based on principal lines; Jen-Hui Chuang et al.; arXiv:1908.06539; 2019-08-18; full text *
Improved calibration and stereo matching method for heterogeneous dual cameras; He Pingzheng; Fujian Computer; 2017-06-25 (06); full text *
Binocular vision ranging method combined with the SURF algorithm; Sun Peng; Ma Pengbo; Lang Yubo; Shan Daguo; Lai Wei; Zhao Yiming; Journal of Criminal Investigation Police University of China; 2020-10-20 (05); full text *

Also Published As

Publication number Publication date
CN117649454A (en) 2024-03-05

Similar Documents

Publication Publication Date Title
CN110349251B (en) Three-dimensional reconstruction method and device based on binocular camera
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
CN108225216B (en) Structured light system calibration method and device, structured light system and mobile device
CN108470370A (en) The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds
CN108038886B (en) Binocular camera system calibration method and device and automobile
CN111243035B (en) Camera calibration method and device, electronic equipment and computer-readable storage medium
CN107507277B (en) Three-dimensional point cloud reconstruction method and device, server and readable storage medium
CN111897349A (en) Underwater robot autonomous obstacle avoidance method based on binocular vision
JP2010513907A (en) Camera system calibration
EP2622576A1 (en) Method and apparatus for solving position and orientation from correlated point features in images
CN111899282A (en) Pedestrian trajectory tracking method and device based on binocular camera calibration
CN112802124A (en) Calibration method and device for multiple stereo cameras, electronic equipment and storage medium
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN106570907B (en) Camera calibration method and device
CN111768449B (en) Object grabbing method combining binocular vision with deep learning
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN112470192A (en) Dual-camera calibration method, electronic device and computer-readable storage medium
CN108053375A (en) Image data correction method, device and its automobile
Eichhardt et al. Affine correspondences between central cameras for rapid relative pose estimation
CN109215118B (en) Incremental motion structure recovery optimization method based on image sequence
CN116433843A (en) Three-dimensional model reconstruction method and device based on binocular vision reconstruction route
CN107067441B (en) Camera calibration method and device
CN111915681B (en) External parameter calibration method, device, storage medium and equipment for multi-group 3D camera group
CN117333367A (en) Image stitching method, system, medium and device based on image local features
CN117649454B (en) Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant