CN104167001B - Large-visual-field camera calibration method based on orthogonal compensation - Google Patents

Large-visual-field camera calibration method based on orthogonal compensation

Info

Publication number
CN104167001B
CN104167001B (application CN201410432768.6A)
Authority
CN
China
Prior art keywords
camera
points
prime
orthogonal
scale factor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410432768.6A
Other languages
Chinese (zh)
Other versions
CN104167001A (en)
Inventor
贾振元
刘巍
王亮
刘阳
杨景豪
王灵丽
张弛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201410432768.6A priority Critical patent/CN104167001B/en
Publication of CN104167001A publication Critical patent/CN104167001A/en
Application granted granted Critical
Publication of CN104167001B publication Critical patent/CN104167001B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a large-visual-field camera calibration method based on orthogonal compensation. It belongs to the fields of computer vision inspection and image processing and relates to a method for calibrating the internal and external parameters of a camera in a large-forging dimension measurement system. In a typical active-vision calibration process, the two translation guide rails of the electric-control platform are not perfectly orthogonal, yet calibration is usually performed by simply selecting a high-precision electric-control platform and treating its translation rails as approximately orthogonal. In the disclosed method, the internal and external parameters of the camera are obtained by solving for the principal point, the scale factors, the translation vector, and the rotation matrix. The rail orthogonality is taken into account during calibration, that is, orthogonal compensation is introduced, which reduces the calibration errors caused by imperfect rail orthogonality.

Description

Large-view-field camera calibration method based on orthogonal compensation
Technical Field
The invention belongs to the field of computer vision detection and image processing, relates to a method for calibrating internal and external parameters of a camera in a large forging dimension measurement system, and particularly relates to a method for calibrating a large-field-of-view camera based on orthogonal compensation.
Background
One of the basic tasks of large-forging dimension visual measurement is to recover three-dimensional geometric information from two-dimensional image information of the forging during the forging process and to obtain real-time dimensional parameters. To find the surface points of a spatial object that correspond to given image points, a geometric model of camera imaging must be determined; the parameters of this model are referred to as the camera parameters. The intrinsic parameters are properties of the camera itself related to its geometric and optical characteristics, while the extrinsic parameters are the three-dimensional position and orientation of the camera relative to a world coordinate system.
One of the main factors influencing camera calibration accuracy is the accurate and robust extraction of the calibration feature points. The process of determining the internal and external parameters of the camera is called camera calibration, and the precision of the calibration method directly affects the precision of computer vision measurement. Research on fast, simple, and accurate camera calibration is therefore of great significance.
At present, camera calibration methods can be divided into traditional calibration methods, camera self-calibration methods, and calibration methods based on active vision. Although active-vision calibration offers high stability and robustness, the two translation guide rails of the electric-control platform are not perfectly orthogonal in practice, and calibration is usually carried out by blindly selecting a high-precision platform and treating its rails as approximately orthogonal. To address this problem, the rail orthogonality is taken into account in the calibration process, that is, orthogonal compensation is introduced, so that the error contributed by imperfect rail orthogonality to the calibration result is reduced.
Disclosure of Invention
The invention relates to a large-field-of-view camera calibration method based on orthogonal compensation. First, the principal-point coordinates are obtained using a zoom-based principal-point solving technique. Then, 12 images are acquired through four groups of orthogonal motions, and the scale factors are solved from these 12 images while taking the orthogonality of the four-dimensional electric-control platform into account, which completes the calibration of the internal parameters. Finally, the external parameters are computed using the calibrated internal parameters.
The technical scheme adopted by the invention is as follows:
A large-field-of-view camera calibration method based on orthogonal compensation. In the usual active-vision calibration process, the two translation guide rails of the electric-control platform are not perfectly orthogonal, yet calibration simply selects a high-precision platform and treats its rails as approximately orthogonal. The proposed method therefore obtains the internal and external parameters of the camera by solving, in turn, for the principal point and scale factors, the translation vector, and the rotation matrix. The specific steps are as follows:
step 1: principal point of zoom method
The method comprises the steps of shooting a two-dimensional target by a camera under different focal lengths, moving an optical center along an optical axis when the focal length is changed, and expanding and contracting the two-dimensional target in a camera view field in an equal proportion to obtain target images under different view fields, wherein images with one point in different images are kept unchanged and are called expansion and contraction centers, and the expansion and contraction centers in the images of different view fields are main points, so that intersection points of all characteristic points of the target under the least square meaning of connecting lines under different focal lengths are the main points.
Step 2: acquire target images using the active-vision calibration method on a four-dimensional electric-control platform
The four-dimensional electric-control platform is controlled to drive the camera through four groups of orthogonal motions in a plane. Each group consists of two translations, and the camera stops at three positions during each group, giving three images per group. Between groups, the camera is given a pitch or pan angle of no less than 5 degrees while keeping the target within the field of view. The four groups of paired pure translations thus yield 4 × 3 = 12 target images in total.
Step 3: determine the scale factors
Parallel lines in space intersect the image plane at the same point at infinity, the vanishing point. The 12 distortion-corrected images are grouped for vanishing-point estimation: connecting corresponding points within each group of 3 images yields 2 vanishing points per group, i.e. 8 vanishing points over the 4 groups. The 8 vanishing-point coordinates, the principal-point coordinates, and the inter-rail angle are substituted into the following formula:
cos θ = (v_{i1}^T ω v_{i2}) / √((v_{i1}^T ω v_{i1})(v_{i2}^T ω v_{i2})),  i = 1, 2, 3, 4   (1)
where θ is the angle between the two translation guide rails of the electric-control platform, v_{i1} and v_{i2} are the homogeneous pixel coordinates of the pair of vanishing points obtained from the two translational motions of the i-th orthogonal-motion group, and ω = K^{-T} K^{-1} is the image of the absolute conic, whose explicit form is:
ω = [ 1/f_x²      0          −u_0/f_x²
      0           1/f_y²     −v_0/f_y²
      −u_0/f_x²   −v_0/f_y²  u_0²/f_x² + v_0²/f_y² + 1 ]   (2)
These two equations together with the eight vanishing-point coordinates give a system of four nonlinear equations in two unknowns, and the scale factors f_x and f_y are solved iteratively by nonlinear least squares.
Step 4: calibrate the external parameters
Each pair of corresponding image points, in homogeneous coordinates, yields one linear equation:
u u′ f_{11} + v u′ f_{12} + u′ f_{13} + u v′ f_{21} + v v′ f_{22} + v′ f_{23} + u f_{31} + v f_{32} + f_{33} = 0   (3)
where f_{ij} is the element in the i-th row and j-th column of the fundamental matrix. Equivalently,
[u u′, v u′, u′, u v′, v v′, v′, u, v, 1] f = 0   (4)
where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional column vector.
Given n sets of corresponding points {m_i, m′_i}, the following homogeneous linear system is obtained:
A f = [ u_1u′_1  v_1u′_1  u′_1  u_1v′_1  v_1v′_1  v′_1  u_1  v_1  1
        …
        u_nu′_n  v_nu′_n  u′_n  u_nv′_n  v_nv′_n  v′_n  u_n  v_n  1 ] f = 0   (5)
Given at least 8 pairs of corresponding image points, this homogeneous linear system can be solved linearly, up to an unknown non-zero scale factor, under the constraint ||f|| = 1; when the correspondences are matched accurately, the fundamental matrix F can be obtained from just 8 pairs. From F, an essential matrix Ẽ = K′^T F K that differs from the true essential matrix by a scale factor s can then be obtained, where K and K′ are the intrinsic matrices of the two cameras. Taking the singular value decomposition Ẽ = U diag(σ_1, σ_2, σ_3) V^T with singular values σ_1 ≥ σ_2 ≥ σ_3, the matrix E* = U diag((σ_1+σ_2)/2, (σ_1+σ_2)/2, 0) V^T is used as the estimate of the essential matrix.
From the essential matrix E*, the external parameters [R | t] between the cameras can be estimated; there are the following 4 possible combinations:
[U W V^T | u_3]; [U W V^T | −u_3]; [U W^T V^T | u_3]; [U W^T V^T | −u_3]   (6)
where W = [0, −1, 0; 1, 0, 0; 0, 0, 1] and u_3 is the 3rd column vector of U.
Using the chirality constraint, together with the conditions that the pan angle is a positive acute angle and that the X component of the translation vector t is negative, the correct parameter combination is selected. Finally, two feature points whose separation is accurately known are reconstructed using the obtained camera intrinsics and the scale-ambiguous extrinsics between the cameras; the ratio of the reconstructed distance to the actual distance gives the scale factor s.
Drawings
Fig. 1 is a schematic diagram of a principal point of zoom.
FIG. 2 is a schematic structural diagram of the apparatus of the present invention.
Fig. 3 is a schematic diagram of epipolar geometry in binocular stereo vision.
In the figure: 1, platform; 2, left-camera electric-control platform; 3, right-camera electric-control platform; 4, left camera; 5, right camera; 6, two-dimensional checkerboard calibration plate.
Detailed Description
The following describes the embodiments of the present invention in further detail with reference to the drawings and technical solutions.
Camera calibration typically employs the classical pinhole imaging model:
p_c [u, v, 1]^T = K [R | t] [X_w, Y_w, Z_w, 1]^T,  K = [ α_x  0  u_0; 0  α_y  v_0; 0  0  1 ]
where (X_w, Y_w, Z_w, 1)^T is the homogeneous coordinate of a space point in the world coordinate system, (u, v, 1)^T is the corresponding homogeneous coordinate in the image pixel coordinate system o_0-uv, α_x = f/dx is the scale factor on the u axis and α_y = f/dy the scale factor on the v axis of the o_0-uv coordinate system, f is the focal length of the camera lens, dx and dy are the horizontal and vertical physical sizes of a pixel, (u_0, v_0) are the principal-point coordinates, p_c is a scale factor, K is the camera intrinsic parameter matrix, and [R | t] is the camera extrinsic parameter matrix, with rotation matrix R and translation vector t.
The camera intrinsic parameters comprise the principal-point coordinates (u_0, v_0), the scale factors α_x and α_y, the radial distortion coefficients k_1 and k_2, and the tangential distortion coefficients p_1 and p_2. The extrinsic parameters are the pose of the camera coordinate system relative to the world coordinate system and comprise the rotation matrix R and the translation vector t.
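The pinhole model above can be made concrete with a short numerical sketch. All intrinsic and extrinsic values below are illustrative placeholders, not calibration results from the invention:

```python
import numpy as np

# Illustrative placeholder values (not results from this patent).
fx, fy = 1200.0, 1200.0                 # alpha_x = f/dx, alpha_y = f/dy, in pixels
u0, v0 = 640.0, 512.0                   # principal point (pixels)

K = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])         # intrinsic matrix K

R = np.eye(3)                           # extrinsic rotation (world axes = camera axes)
t = np.array([0.0, 0.0, 1000.0])        # extrinsic translation: origin 1000 units ahead

def project(Xw):
    """p_c * (u, v, 1)^T = K [R | t] (Xw, 1)^T; returns (u, v) after dividing by p_c."""
    m = K @ (R @ Xw + t)                # homogeneous pixel coordinates, p_c = m[2]
    return m[:2] / m[2]

uv = project(np.array([0.0, 0.0, 0.0])) # with this pose, the world origin lands on the principal point
```

Dividing by the third homogeneous coordinate is exactly the elimination of the scale factor p_c in the model.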
Step 1: compute the principal point by the zoom method.
The two-dimensional target is imaged by the camera at different focal lengths, as shown in fig. 1. When the focal length changes, the optical center moves along the optical axis and the target expands or contracts in equal proportion within the camera field of view, yielding target images at different fields of view. One point keeps the same image across the different views; it is called the expansion-contraction (zoom) center, and this zoom center is the principal point. Therefore, the least-squares intersection of the lines connecting each target feature point across the images taken at different focal lengths is the principal point.
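The least-squares intersection in this step can be sketched as follows: each feature point traces a line through the zoom center as the focal length varies, and the principal point minimizes the summed squared distances to all such lines. The data below are synthetic, with a hypothetical zoom center at (320, 240), purely for illustration:

```python
import numpy as np

def intersect_lines_lsq(points, directions):
    """Least-squares intersection of 2-D lines given by (point, direction) pairs.

    Minimizes the sum of squared perpendicular distances, i.e. solves the
    normal equations  sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)      # projector onto the line's normal direction
        A += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(A, b)

# Synthetic data: three feature points whose connecting lines (across focal
# lengths) all pass through a hypothetical zoom center at (320, 240).
center = np.array([320.0, 240.0])
feats = [np.array([100.0, 60.0]), np.array([500.0, 400.0]), np.array([320.0, 10.0])]
dirs = [f - center for f in feats]          # direction of each feature's line
principal_point = intersect_lines_lsq(feats, dirs)
```

With noisy real images the lines do not meet exactly, and the same normal equations return the least-squares intersection.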
Step 2: acquire target images using the active-vision calibration method on a four-dimensional electric-control platform.
As shown in fig. 2, the experimental equipment is set up and the two cameras are fixed on two four-dimensional electric-control platforms. Each platform is controlled to drive its camera through four groups of orthogonal motions in a plane. Each group consists of two translations, and the camera stops at three positions during each group, giving three images per group. Between groups, the camera is given a pitch or pan angle of no less than 5 degrees while keeping the target within the field of view. The four groups of paired pure translations thus yield 4 × 3 = 12 target images in total.
Step 3: compute the scale factors.
Parallel lines in space intersect the image plane at the same point at infinity, the vanishing point. The 12 distortion-corrected images are grouped for vanishing-point estimation: connecting corresponding points within each group of 3 images yields 2 vanishing points per group, i.e. 8 vanishing points over the 4 groups. The 8 vanishing-point coordinates, the principal-point coordinates, and the inter-rail angle are substituted into the following formula:
cos θ = (v_{i1}^T ω v_{i2}) / √((v_{i1}^T ω v_{i1})(v_{i2}^T ω v_{i2})),  i = 1, 2, 3, 4   (1)
where θ is the angle between the two translation guide rails of the electric-control platform, v_{i1} and v_{i2} are the homogeneous pixel coordinates of the pair of vanishing points obtained from the two translational motions of the i-th orthogonal-motion group, and ω = K^{-T} K^{-1} is the image of the absolute conic, whose explicit form is:
ω = [ 1/f_x²      0          −u_0/f_x²
      0           1/f_y²     −v_0/f_y²
      −u_0/f_x²   −v_0/f_y²  u_0²/f_x² + v_0²/f_y² + 1 ]   (2)
These two equations together with the eight vanishing-point coordinates give a system of four nonlinear equations in two unknowns, and the scale factors f_x and f_y are solved iteratively by nonlinear least squares.
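The nonlinear least-squares step above can be sketched numerically as below, minimizing the residual of the rail-angle constraint over (f_x, f_y) with SciPy. All numbers (principal point, rail angle, and the synthetic vanishing-point pairs) are illustrative assumptions, not values from the invention:

```python
import numpy as np
from scipy.optimize import least_squares

u0, v0 = 640.0, 512.0              # principal point from step 1 (illustrative)
theta = np.deg2rad(89.95)          # measured rail angle: close to, but not exactly, 90 degrees

def omega(fx, fy):
    """Image of the absolute conic, w = K^-T K^-1, for a zero-skew intrinsic matrix."""
    return np.array([
        [1/fx**2,    0.0,        -u0/fx**2],
        [0.0,        1/fy**2,    -v0/fy**2],
        [-u0/fx**2,  -v0/fy**2,  u0**2/fx**2 + v0**2/fy**2 + 1.0]])

def residuals(params, vp_pairs):
    """One residual per motion group: cos(theta) minus the right-hand side of the constraint."""
    fx, fy = params
    w = omega(fx, fy)
    return [np.cos(theta) - (v1 @ w @ v2) / np.sqrt((v1 @ w @ v1) * (v2 @ w @ v2))
            for v1, v2 in vp_pairs]

# Synthetic vanishing-point pairs: the vanishing point of a direction d is K d,
# and each group gets two rail directions separated by exactly theta.
fx_true, fy_true = 1500.0, 1480.0
K = np.array([[fx_true, 0, u0], [0, fy_true, v0], [0, 0, 1.0]])
rng = np.random.default_rng(0)
vp_pairs = []
for _ in range(4):
    d1 = rng.normal(size=3); d1 /= np.linalg.norm(d1)
    n = np.cross(d1, rng.normal(size=3)); n /= np.linalg.norm(n)
    d2 = np.cos(theta) * d1 + np.sin(theta) * n     # exactly theta away from d1
    vp_pairs.append((K @ d1, K @ d2))

sol = least_squares(residuals, x0=[1000.0, 1000.0], args=(vp_pairs,))
fx_est, fy_est = sol.x
```

With noise-free synthetic vanishing points the residuals vanish at the true (f_x, f_y); with real data the four equations are satisfied only in the least-squares sense.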
Step 4: calibrate the external parameters.
Fig. 3 is a schematic diagram of the epipolar geometry in binocular stereo vision.
Each pair of corresponding image points, in homogeneous coordinates, yields one linear equation:
u u′ f_{11} + v u′ f_{12} + u′ f_{13} + u v′ f_{21} + v v′ f_{22} + v′ f_{23} + u f_{31} + v f_{32} + f_{33} = 0   (3)
where f_{ij} is the element in the i-th row and j-th column of the fundamental matrix. Equivalently,
[u u′, v u′, u′, u v′, v v′, v′, u, v, 1] f = 0   (4)
where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional column vector.
Given n sets of corresponding points {m_i, m′_i}, the following homogeneous linear system is obtained:
A f = [ u_1u′_1  v_1u′_1  u′_1  u_1v′_1  v_1v′_1  v′_1  u_1  v_1  1
        …
        u_nu′_n  v_nu′_n  u′_n  u_nv′_n  v_nv′_n  v′_n  u_n  v_n  1 ] f = 0   (5)
Given at least 8 pairs of corresponding image points, this homogeneous linear system can be solved linearly, up to an unknown non-zero scale factor, under the constraint ||f|| = 1; when the correspondences are matched accurately, the fundamental matrix F can be obtained from just 8 pairs. From F, an essential matrix Ẽ = K′^T F K that differs from the true essential matrix by a scale factor s can then be obtained, where K and K′ are the intrinsic matrices of the two cameras. Taking the singular value decomposition Ẽ = U diag(σ_1, σ_2, σ_3) V^T with singular values σ_1 ≥ σ_2 ≥ σ_3, the matrix E* = U diag((σ_1+σ_2)/2, (σ_1+σ_2)/2, 0) V^T is used as the estimate of the essential matrix.
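The linear eight-point solution and the singular-value correction described above can be sketched on synthetic correspondences. For compactness the sketch assumes identity intrinsics, so the fundamental and essential matrices coincide; the poses, points, and counts are all illustrative:

```python
import numpy as np

def fundamental_8pt(m, mp):
    """Linear n-point (n >= 8) estimate of the fundamental matrix.

    Stacks one design-matrix row per correspondence and takes the right
    singular vector of A with the smallest singular value, which enforces
    ||f|| = 1, then projects F to rank 2.
    """
    u, v = m[:, 0], m[:, 1]
    up, vp = mp[:, 0], mp[:, 1]
    A = np.column_stack([u*up, v*up, up, u*vp, v*vp, vp, u, v, np.ones_like(u)])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)
    U, s, Vt = np.linalg.svd(F)
    return U @ np.diag([s[0], s[1], 0.0]) @ Vt

def closest_essential(E):
    """Replace the singular values by ((s1+s2)/2, (s1+s2)/2, 0), as in the text."""
    U, s, Vt = np.linalg.svd(E)
    mean = (s[0] + s[1]) / 2.0
    return U @ np.diag([mean, mean, 0.0]) @ Vt

# Synthetic two-view setup: camera 1 is [I | 0]; camera 2 is rotated 0.1 rad
# about z and translated. 12 random points lie in front of both cameras.
rng = np.random.default_rng(1)
X = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 8.0], size=(12, 3))
c, s_ = np.cos(0.1), np.sin(0.1)
R2 = np.array([[c, -s_, 0.0], [s_, c, 0.0], [0.0, 0.0, 1.0]])
t2 = np.array([1.0, 0.2, 0.0])
m = X[:, :2] / X[:, 2:]                     # projections in camera 1 (K = I)
X2 = X @ R2.T + t2
mp = X2[:, :2] / X2[:, 2:]                  # projections in camera 2 (K' = I)
F = fundamental_8pt(m, mp)
E_star = closest_essential(F)
```

On exact correspondences the recovered F satisfies the epipolar constraint m′^T F m = 0 to machine precision, and E_star has two equal singular values and one zero singular value by construction.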
From the essential matrix E*, the external parameters [R | t] between the cameras can be estimated; there are the following 4 possible combinations:
[U W V^T | u_3]; [U W V^T | −u_3]; [U W^T V^T | u_3]; [U W^T V^T | −u_3]   (6)
where W = [0, −1, 0; 1, 0, 0; 0, 0, 1] and u_3 is the 3rd column vector of U.
Using the chirality constraint, together with the conditions that the pan angle is a positive acute angle and that the X component of the translation vector t is negative, the correct parameter combination is selected. Finally, two feature points whose separation is accurately known are reconstructed using the obtained camera intrinsics and the scale-ambiguous extrinsics between the cameras; the ratio of the reconstructed distance to the actual distance gives the scale factor s.
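The four-candidate decomposition and the chirality test above can be sketched as follows. The ground-truth pose and point are synthetic and illustrative; the W matrix and the sign handling follow the standard SVD-based essential-matrix decomposition, given here as a sketch rather than the patent's exact implementation:

```python
import numpy as np

W = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

def decompose_essential(E):
    """Return the four candidate (R, t) combinations [UWV^T|+-u3], [UW^TV^T|+-u3]."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:  U = -U        # keep U and V proper rotations
    if np.linalg.det(Vt) < 0: Vt = -Vt
    u3 = U[:, 2]
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    return [(R1, u3), (R1, -u3), (R2, u3), (R2, -u3)]

def triangulate(R, t, m, mp):
    """Linear (DLT) triangulation of one correspondence for P1 = [I|0], P2 = [R|t]."""
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([R, t.reshape(3, 1)])
    A = np.array([m[0]*P1[2] - P1[0], m[1]*P1[2] - P1[1],
                  mp[0]*P2[2] - P2[0], mp[1]*P2[2] - P2[1]])
    Xh = np.linalg.svd(A)[2][-1]
    return Xh[:3] / Xh[3]

def pick_by_chirality(candidates, m, mp):
    """Keep the combination that places the point in front of BOTH cameras."""
    for R, t in candidates:
        X = triangulate(R, t, m, mp)
        if X[2] > 0 and (R @ X + t)[2] > 0:
            return R, t
    raise ValueError("no candidate passed the chirality test")

# Synthetic ground truth: rotation about y, translation mostly along x.
a = 0.2
R_true = np.array([[np.cos(a), 0.0, np.sin(a)],
                   [0.0,       1.0, 0.0],
                   [-np.sin(a), 0.0, np.cos(a)]])
t_true = np.array([1.0, 0.1, 0.05])
tx = np.array([[0, -t_true[2], t_true[1]],
               [t_true[2], 0, -t_true[0]],
               [-t_true[1], t_true[0], 0]])
E = tx @ R_true                              # essential matrix E = [t]_x R
Xp = np.array([0.3, -0.2, 5.0])              # a point in front of both cameras
m = Xp[:2] / Xp[2]
X2 = R_true @ Xp + t_true
mp = X2[:2] / X2[2]
R_est, t_est = pick_by_chirality(decompose_essential(E), m, mp)
```

Note that t_est recovers only the translation direction; as in the text, the absolute scale s is fixed afterwards from a pair of points with known separation.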

Claims (1)

1. A calibration method based on orthogonal compensation is characterized by comprising the following steps:
Step 1: solve the principal point by the zoom method
The two-dimensional target is imaged by the camera at different focal lengths; when the focal length changes, the optical center moves along the optical axis and the target expands or contracts in equal proportion within the camera field of view, yielding target images at different fields of view; one point keeps the same image across the different views, called the expansion-contraction (zoom) center, and this zoom center is the principal point; therefore the least-squares intersection of the lines connecting each target feature point across the images taken at different focal lengths is the principal point;
Step 2: acquire target images using the active-vision calibration method on a four-dimensional electric-control platform
The four-dimensional electric-control platform is controlled to drive the camera through four groups of orthogonal motions in a plane; each group consists of two translations, and the camera stops at three positions during each group, giving three images per group; between groups the camera is given a pitch or pan angle of no less than 5 degrees while keeping the target within the field of view, so that the four groups of paired pure translations yield 4 × 3 = 12 target images in total;
and step 3: determining a scale factor
The parallel straight line intersects with the plane at the same infinite point, namely a vanishing point, the 12 images after distortion correction are grouped for the vanishing point estimation, the connecting line of the corresponding points in each group of 3 images obtains 2 vanishing points, and 4 groups of 8 vanishing points; the obtained 8 blanking point coordinates, principal point coordinates and inter-rail angles are substituted into the following formula:
cos θ = v i 1 T wv i 2 v i 1 T wv i 1 v i 2 T wv i 2 , i = 1 , 2 , 3 , 4 - - - ( 1 )
wherein theta is the angle between the two translation guide rails of the electric control platform, vi1、vi2Pixel coordinates of pairs of blanking points obtained for two translational movements under the ith set of orthogonal movements, w ═ K-TK-1Is an image of an absolute quadratic curve, which is specified in the form:
w = 1 / f x 2 0 - u 0 / f x 2 0 1 / f y 2 - v 0 / f y 2 - u 0 / f x 2 - v 0 / f y 2 u 0 2 / f x 2 + v 0 2 / f y 2 + 1 - - - ( 2 )
through the twoObtaining a binary equation set containing four nonlinear equations by the equation and the coordinates of eight blanking points, and iteratively solving a scale factor f by using a nonlinear least square methodxAnd fyWherein u is0,v0Is the coordinate of the principal point of the camera;
and 4, step 4: calibrating external parameters;
a linear equation can be obtained from a homogeneous set of coordinates of corresponding image points, as follows:
uu′f11+vu′f12+u′f13+uv′f21+vv′f22+v′f23+uf31+vf32+f33=0 (3)
wherein f isijThe element in the ith row and the jth column of the basic matrix;
[uu′,vu′,u′,uv′,vv′,v′,u,v,1]f=0 (4)
wherein f ═ f11,f12,f13,f21,f22,f23,f31,f32,f33]TIs a 9-dimensional column vector;
if given n sets of corresponding points mi,m′iThe following homogeneous linear equation set
A f = u 1 u 1 ′ v 1 u 1 ′ u 1 ′ u 1 v 1 ′ v 1 v 1 ′ v 1 ′ u 1 v 1 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . u n u n ′ v n u n ′ u n ′ u n v n ′ v n v n ′ v n ′ u n v n 1 f = 0 - - - ( 5 )
At least 8 groups of corresponding image points are given, the homogeneous linear equation set can be linearly solved under the condition of an unknown non-zero scale factor difference, constraint | | | f | | | | 1 is applied, and when the corresponding image points are accurately matched, a basic matrix is solved by using only 8 groups of corresponding points; further obtaining the essential matrix under the condition of a non-zero scale factor differenceWhereins is a phase difference scale factor; by using E*=Udiag((σ12)/2,(σ12)/2,0)VTAsWherein the eigenvalue σ is1≥σ2≥σ3
From the essence matrix E*Estimating extrinsic parameters between camerasHas the following 4 possible combinations:
[UWVT|u3];[UWVT|-u3];[UWTVT|u3];[UWTVT|-u3]. (6)
wherein,u3is the 3 rd column vector of U,
by using Chirality constraint, the yaw angle is taken as a positive acute angle and a translational vectorComponent of (A) in the X directionJudging the correct parameter combination if the value is negative; and (3) reconstructing two characteristic points with accurately known distance information by using the obtained internal parameters of the cameras and the external parameters between the cameras with a difference of a scale factor, wherein the ratio of the reconstructed distance to the actual distance is the scale factor s.
CN201410432768.6A 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation Active CN104167001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410432768.6A CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation


Publications (2)

Publication Number Publication Date
CN104167001A CN104167001A (en) 2014-11-26
CN104167001B true CN104167001B (en) 2017-02-15

Family

ID=51910792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410432768.6A Active CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation

Country Status (1)

Country Link
CN (1) CN104167001B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751452A (en) * 2015-03-08 2015-07-01 大连理工大学 Monocular camera calibration method based on any known movement
CN105716542B (en) * 2016-04-07 2018-04-10 大连理工大学 A kind of three-dimensional data joining method based on flexible characteristic point
CN106157322B (en) * 2016-07-28 2019-04-26 北京航空航天大学 A kind of camera installation site scaling method based on plane mirror
CN109727291B (en) * 2018-12-28 2020-10-23 北京航空航天大学 High-precision online calibration method for zoom camera
CN114152190B (en) * 2021-11-15 2023-10-24 苏州铸正机器人有限公司 Industrial camera precision and working space test platform
CN114663527A (en) * 2022-03-22 2022-06-24 张慧 Camera self-calibration method under general motion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101285676A (en) * 2008-06-10 2008-10-15 北京航空航天大学 Multi-visual sense sensor calibration method based on one-dimensional target
CN101581569A (en) * 2009-06-17 2009-11-18 北京信息科技大学 Calibrating method of structural parameters of binocular visual sensing system
WO2013111229A1 (en) * 2012-01-23 2013-08-01 日本電気株式会社 Camera calibration device, camera calibration method, and camera calibration program
CN103530880A (en) * 2013-10-16 2014-01-22 大连理工大学 Camera calibration method based on projected Gaussian grid pattern


Also Published As

Publication number Publication date
CN104167001A (en) 2014-11-26

Similar Documents

Publication Publication Date Title
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
CN109767476B (en) Automatic focusing binocular camera calibration and depth calculation method
CN104167001B (en) Large-visual-field camera calibration method based on orthogonal compensation
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
CN106981083B (en) The substep scaling method of Binocular Stereo Vision System camera parameters
CN104268876B (en) Camera calibration method based on partitioning
CN104374338B (en) Single-axis rotation angle vision measurement method based on fixed camera and single target
CN103759670B (en) A kind of object dimensional information getting method based on numeral up short
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
CN104537707B (en) Image space type stereoscopic vision moves real-time measurement system online
CN103278138B (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN111192235B (en) Image measurement method based on monocular vision model and perspective transformation
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN109859272A (en) A kind of auto-focusing binocular camera scaling method and device
CN109974618B (en) Global calibration method of multi-sensor vision measurement system
CN105809706B (en) A kind of overall calibration method of the more camera systems of distribution
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN109215118B (en) Incremental motion structure recovery optimization method based on image sequence
CN109084959B (en) Optical axis parallelism correction method based on binocular distance measurement algorithm
CN114332191A (en) Three-dimensional point cloud error compensation method and device
CN114812558B (en) Monocular vision unmanned aerial vehicle autonomous positioning method combining laser ranging
CN111583342A (en) Target rapid positioning method and device based on binocular vision
CN104123726B (en) Heavy forging measuring system scaling method based on vanishing point

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant