CN109754434B - Camera calibration method, device, user equipment and storage medium - Google Patents

Camera calibration method, device, user equipment and storage medium

Info

Publication number
CN109754434B
Authority
CN
China
Prior art keywords
coordinates
sample
pixel point
coordinate conversion
distortion correction
Prior art date
Legal status
Active
Application number
CN201811618418.3A
Other languages
Chinese (zh)
Other versions
CN109754434A (en)
Inventor
宋秀峰
Current Assignee
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date
Filing date
Publication date
Application filed by Goertek Techology Co Ltd
Priority to CN201811618418.3A
Publication of CN109754434A
Application granted
Publication of CN109754434B
Legal status: Active

Landscapes

  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a camera calibration method, a camera calibration device, user equipment and a storage medium. In the method, the tangential distortion coefficients are removed from the distortion amount of the initial coordinate conversion formula between world coordinates and pixel point coordinates to obtain a distortion correction coordinate conversion formula; the distortion correction coordinate conversion formula is then trained according to sample world coordinates and sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, thereby realizing camera calibration. Because the tangential distortion coefficients are removed from the initial conversion formula, the number of camera parameters is reduced, the convergence speed is improved, and the calibration time is shortened.

Description

Camera calibration method, device, user equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a camera calibration method, a camera calibration device, a user equipment, and a storage medium.
Background
Generally, a camera can be roughly divided into three parts: a lens, a photosensitive element and a processing circuit. When light passes through the lens, an "image" of the object is formed on the photosensitive element, in a process similar to the principle of pinhole imaging, and a series of processing steps then turns it into an electronic picture. Inevitably, certain errors arise during imaging, because the lens surface is curved, much as when a person looks through a door peephole. In order to eliminate or correct these errors, camera calibration is usually performed before the camera leaves the factory, so as to obtain the camera parameters used to eliminate or correct them.
Camera calibration can be divided into planar calibration and three-dimensional calibration. Planar calibration requires changes of viewing angle: generally at least 3 pictures taken at different angles are needed, and more than ten pictures are often required for better precision. A typical planar method is Zhang's algorithm, which is mature and integrated into MATLAB, and is therefore widely used in laboratories. Three-dimensional calibration uses a three-dimensional calibration block, and the photographed target must contain points distributed in space, i.e. the feature points cannot all lie on the same plane; this approach is simple and easy to carry out, but places high demands on the tooling and the algorithm, and is widely used in factory production, with algorithms such as Direct Linear Transformation (DLT) and the Radial Alignment Constraint (RAC). At present, many companies calibrate cameras with the RAC algorithm: the existing RAC algorithm establishes a coordinate conversion formula between world coordinates and pixel coordinates and then trains this formula to obtain the camera parameters in it, thereby realizing camera calibration. However, because the coordinate conversion formula contains too many camera parameters, convergence is slow when the formula is trained, so the calibration time is long.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The main object of the present invention is to provide a camera calibration method, a camera calibration device, user equipment and a storage medium, so as to solve the technical problem in the prior art that the slow convergence of camera calibration leads to a long calibration time.
In order to achieve the above object, the present invention provides a camera calibration method, which includes the steps of:
removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion formula;
training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration.
Preferably, the distortion correction coordinate conversion formula is, in scalar form:
u = u0 + (f/dx)·η·x', v = v0 + (f/dy)·η·y'
wherein (u, v) is the sample pixel point coordinates, (Xw, Yw, Zw) is the sample world coordinate, R is a rotation matrix, T is a translation matrix, f is the focal length, (u0, v0) is the coordinate of the origin of the image coordinates in the coordinate system in which the pixel point coordinates are located, dx is the physical dimension of a pixel point in the x-axis direction of the coordinate system in which the image coordinates are located, dy is the physical dimension of a pixel point in the y-axis direction of the coordinate system in which the image coordinates are located, η = 1 + k1·r + k2·r² + k3·r³ + b, k1, k2 and k3 are radial distortion coefficients, b is the tangential distortion compensation value, r = x'² + y'², x' = Xc/Zc, y' = Yc/Zc, and (Xc, Yc, Zc) is the sample world coordinate converted, through the rotation matrix R and the translation matrix T, into the coordinate system in which the camera coordinates are located.
Preferably, the training of the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration, specifically includes:
training the distortion correction coordinate conversion formula through the fminsearch function according to the sample world coordinates and the sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration.
Preferably, the fminsearch function takes minimizing the reprojection error between the sample world coordinates and the sample pixel point coordinates as the training condition, wherein the reprojection error is the difference between the predicted pixel point coordinates and the sample pixel point coordinates, and the predicted pixel point coordinates are obtained by substituting the sample world coordinates into the distortion correction coordinate conversion formula.
Preferably, the method for calibrating the camera further includes, before removing the tangential distortion coefficient in the distortion of the initial coordinate conversion formula between the world coordinate and the pixel point coordinate and obtaining the distortion correction coordinate conversion formula:
and obtaining the sample world coordinates and the sample pixel point coordinates of each corner point on the target calibration plate.
Preferably, the target calibration plate is a three-dimensional calibration plate, and the three-dimensional calibration plate comprises two plane calibration plates which are vertically connected.
Preferably, the acquiring the sample world coordinates and the sample pixel coordinates of each corner point on the target calibration board specifically includes:
and acquiring an image to be processed at the joint of the two plane calibration plates on the three-dimensional calibration plate, determining sample pixel point coordinates of each corner point on the three-dimensional calibration plate according to the image to be processed, and receiving sample world coordinates of each corner point on the three-dimensional calibration plate input by a user.
In addition, in order to achieve the above object, the present invention also provides a camera calibration apparatus, including:
the distortion correction module is used for removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion formula;
and the formula training module is used for training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration.
In addition, to achieve the above object, the present invention also provides a user equipment, including: the camera calibration system comprises a memory, a processor and a camera calibration program stored on the memory and capable of running on the processor, wherein the camera calibration program realizes the steps of the camera calibration method when being executed by the processor.
In addition, in order to achieve the above object, the present invention also provides a storage medium having stored thereon a camera calibration program which, when executed by a processor, implements the steps of the camera calibration method as described above.
According to the method, tangential distortion coefficients in the distortion of the initial coordinate conversion formula between world coordinates and pixel point coordinates are removed, a distortion correction coordinate conversion formula is obtained, and then the distortion correction coordinate conversion formula is trained according to sample world coordinates and sample pixel point coordinates, so that camera parameters in the distortion correction coordinate conversion formula are obtained, and camera calibration is achieved. Therefore, through the technical scheme of the invention, the tangential distortion coefficient in the initial conversion formula is removed, so that the number of camera parameters is reduced, the convergence speed is improved, and the calibration time is shortened.
Drawings
FIG. 1 is a schematic diagram of a user equipment architecture of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart of a camera calibration method according to a first embodiment of the present invention;
FIG. 3 is a flowchart of a camera calibration method according to a second embodiment of the present invention;
fig. 4 is a block diagram of a first embodiment of the camera calibration apparatus according to the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of a user equipment structure of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the user equipment may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display (Display), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage server separate from the aforementioned processor 1001.
It can be understood that the user device is a device capable of calibrating the camera, and may be a personal computer, a tablet computer, or a dedicated calibration device, which is not limited in this embodiment.
Those skilled in the art will appreciate that the structure shown in fig. 1 is not limiting of the user device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 1, the memory 1005, as a storage medium, may include an operating system, a network communication module, a user interface module, and a camera calibration program.
The user device invokes the camera calibration program stored in the memory 1005 via the processor 1001 and performs the following operations:
removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion formula;
training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration.
Further, the processor 1001 may call a camera calibration program stored in the memory 1005, and further perform the following operations:
training the distortion correction coordinate conversion formula through the fminsearch function according to the sample world coordinates and the sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration.
Further, the processor 1001 may call a camera calibration program stored in the memory 1005, and further perform the following operations:
and obtaining the sample world coordinates and the sample pixel point coordinates of each corner point on the target calibration plate.
Further, the processor 1001 may call a camera calibration program stored in the memory 1005, and further perform the following operations:
and acquiring an image to be processed at the joint of the two plane calibration plates on the three-dimensional calibration plate, determining sample pixel point coordinates of each corner point on the three-dimensional calibration plate according to the image to be processed, and receiving sample world coordinates of each corner point on the three-dimensional calibration plate input by a user.
In this embodiment, the tangential distortion coefficient in the distortion amount of the initial coordinate conversion formula between the world coordinate and the pixel point coordinate is removed, a distortion correction coordinate conversion formula is obtained, and then the distortion correction coordinate conversion formula is trained according to the sample world coordinate and the sample pixel point coordinate, so as to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration. Therefore, through the technical scheme of the invention, the tangential distortion coefficient in the initial conversion formula is removed, so that the number of camera parameters is reduced, the convergence speed is improved, and the calibration time is shortened.
Based on the hardware structure, the embodiment of the camera calibration method is provided.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a camera calibration method according to the present invention.
In a first embodiment, the camera calibration method includes the steps of:
s10: and removing tangential distortion coefficients in the distortion of the initial coordinate conversion type between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion type.
The initial coordinate conversion formula is generally generated by the following steps (1) to (3):
step (1), converting a world coordinate system (namely, a coordinate system where world coordinates are located) into a camera coordinate system (namely, a coordinate system where camera coordinates are located) through an external parameter matrix;
hypothesis (X) w ,Y w ,Z w ) For the sample world coordinate of a certain angular point in the world coordinate system, the origin of the world coordinate can be set as the vertex of a certain plane calibration plate in the three-dimensional calibration plate, wherein the X axis, the Y axis and the Z axis of the world coordinate system are mutually perpendicular, and the Z axis of the world coordinate system is perpendicular to the plane calibration plate where the origin of the world coordinate is located.
When the sample world coordinates are converted into the camera coordinate system, the sample world coordinates can be expressed in homogeneous coordinates by adding one dimension (so that translation can be handled conveniently). The conversion from the world coordinate system to the camera coordinate system is realized by left-multiplying by a 3×4 external parameter matrix, and the conversion formula is as follows:
wherein (Xc, Yc, Zc) = R·(Xw, Yw, Zw) + T, i.e. (Xc, Yc, Zc) is the sample world coordinate converted into coordinates in the camera coordinate system, R is a rotation matrix, and T is a translation matrix.
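As a reading aid, the following minimal Python sketch carries out step (1) on one corner point; R and T here are illustrative placeholders rather than calibrated values.

import numpy as np

# Step (1): left-multiply the homogeneous sample world coordinate by the 3x4 external
# parameter matrix [R | T], which is the same as R * (Xw, Yw, Zw)^T + T.
R = np.eye(3)                      # 3x3 rotation matrix (identity used only as a placeholder)
T = np.array([0.0, 0.0, 100.0])    # translation vector, in the same units as the world coordinates

def world_to_camera(p_world, R, T):
    return R @ np.asarray(p_world, dtype=float) + T

p_w = np.array([10.0, 20.0, 0.0])  # sample world coordinate of one corner point
print(world_to_camera(p_w, R, T))  # -> [ 10.  20. 100.], i.e. the camera coordinates (Xc, Yc, Zc)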
Step (2), converting a camera coordinate system into an image physical coordinate system through an internal reference matrix;
In a specific implementation, the image physical coordinate system (i.e. the coordinate system in which the image coordinates are located) may take the center of the camera's image plane as its coordinate origin, with its x axis and y axis respectively parallel to two perpendicular edges of the image plane; its coordinate values are expressed as (x, y). The image physical coordinate system represents the position of a pixel in the image in physical units (e.g., millimeters).
In this embodiment, the amount of distortion can be calculated by the following formula:
r = x'² + y'²    formula (3)
wherein η1 is the distortion amount of the x axis in the camera coordinate system and η2 is the distortion amount of the y axis in the camera coordinate system; x' is the normalized value in the x-axis direction of the camera coordinates and y' is the normalized value in the y-axis direction of the camera coordinates; x'' is the value in the x-axis direction of the camera coordinates after distortion and y'' is the value in the y-axis direction after distortion; r is the distance of the camera-coordinate point from the center of the image plane; k1, k2 and k3 are radial distortion coefficients, and p1 and p2 are tangential distortion coefficients.
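For illustration, the sketch below applies the distortion to the normalized coordinates (x', y'). The tangential terms use the standard Brown-Conrady form; this is an assumption made here because equations (4) to (6) are not reproduced in this text, but it matches the roles of k1, k2, k3, p1 and p2 described above, and r follows the document's convention r = x'² + y'².

def distort(x_n, y_n, k1, k2, k3, p1, p2):
    r = x_n ** 2 + y_n ** 2                                   # r = x'^2 + y'^2
    radial = 1.0 + k1 * r + k2 * r ** 2 + k3 * r ** 3         # radial part shared by both axes
    x_d = x_n * radial + 2.0 * p1 * x_n * y_n + p2 * (r + 2.0 * x_n ** 2)   # x'' (distortion amount eta1 applied)
    y_d = y_n * radial + p1 * (r + 2.0 * y_n ** 2) + 2.0 * p2 * x_n * y_n   # y'' (distortion amount eta2 applied)
    return x_d, y_d

# Small illustrative coefficients only:
print(distort(0.1, -0.05, k1=-0.2, k2=0.05, k3=0.0, p1=1e-4, p2=-5e-5))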
The camera coordinate system is then converted into the image physical coordinate system through the focal length and the distortion amount, and distortion arises in this conversion from the camera coordinate system to the image physical coordinate system. The operation acts on the camera coordinates; this step adds the focal-length transformation and the distortion.
Converting the above formulas (5) and (6) into matrix form gives formula (7):
wherein f is the focal length, (Xc, Yc, Zc) is the coordinate in the camera coordinate system, and (x, y) is the coordinate in the image physical coordinate system.
And (3) converting the physical coordinate system of the image into a pixel coordinate system (namely, the coordinate system where the coordinates of the pixel points are located) through a pixel conversion matrix.
The pixel coordinate system takes the top-left vertex of the camera image plane as its origin, with its axes respectively parallel to the x axis and y axis of the image physical coordinate system, and its coordinate values can be expressed as (u, v). The image acquired by the camera is first in the form of a standard electrical signal and is then converted into a digital image by analog-to-digital conversion. Each image is stored as an M × N array, and the value of each element in an image of M rows and N columns represents the gray level of that image point. Each such element is called a pixel, and the pixel coordinate system is the image coordinate system whose unit is the pixel.
This step performs the conversion within the same plane; only the unit of representation and the position of the coordinate origin change.
wherein (u, v) is the sample pixel point coordinates in the pixel coordinate system, (u0, v0) is the coordinate of the origin of the image coordinates in the pixel coordinate system, dx is the physical dimension of a pixel point in the x-axis direction of the image coordinate system, and dy is the physical dimension of a pixel point in the y-axis direction of the image coordinate system.
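A short sketch of step (3) under the definitions above; the pixel size and principal point values are illustrative only, not calibrated.

def image_to_pixel(x, y, dx, dy, u0, v0):
    # u = x / dx + u0, v = y / dy + v0
    return x / dx + u0, y / dy + v0

# Illustrative values: 3.2 um square pixels, principal point at the centre of a 1280 x 720 image.
print(image_to_pixel(1.25, -0.40, dx=0.0032, dy=0.0032, u0=640.0, v0=360.0))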
After the above formulas (1) to (10) are combined, the initial coordinate conversion formula can be determined as:
when training the initial coordinate conversion formula, the convergence rate of the initial coordinate conversion formula is slow due to the excessive camera parameters in the initial coordinate conversion formula, so that the calibration time is long.
Therefore, the tangential distortion coefficients p1 and p2 can be removed and replaced with a single compensation value. That is, the distortion amount can be unified as η = 1 + k1·r + k2·r² + k3·r³ + b, where b is the compensation value corresponding to the removed tangential distortion coefficients and may be called the tangential distortion compensation value. The above equation (5) can accordingly be modified to x'' = η·x', equation (6) can be modified to y'' = η·y', and equation (7) can be modified into the corresponding matrix form.
Correspondingly, the distortion correction coordinate conversion formula can be expressed in scalar form as:
u = u0 + (f/dx)·η·x', v = v0 + (f/dy)·η·y'
wherein (u, v) is the sample pixel point coordinates, (Xw, Yw, Zw) is the sample world coordinate, R is the rotation matrix, T is the translation matrix, f is the focal length, (u0, v0) is the coordinate of the origin of the image coordinates in the coordinate system in which the pixel point coordinates are located, dx is the physical dimension of a pixel point in the x-axis direction of the coordinate system in which the image coordinates are located, dy is the physical dimension of a pixel point in the y-axis direction of the coordinate system in which the image coordinates are located, η = 1 + k1·r + k2·r² + k3·r³ + b, k1, k2 and k3 are radial distortion coefficients, b is the tangential distortion compensation value, r = x'² + y'², x' = Xc/Zc, y' = Yc/Zc, and (Xc, Yc, Zc) is the sample world coordinate converted, through the rotation matrix R and the translation matrix T, into the coordinate system in which the camera coordinates are located.
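Putting the three steps together, the sketch below projects one sample world coordinate to a predicted pixel coordinate with the distortion correction coordinate conversion formula as assembled from the parameter definitions above (tangential coefficients removed, single compensation value b retained); the exact matrix layout of the original formula is not reproduced in this text, so treat this as an assumed but equivalent scalar form.

import numpy as np

def project_point(p_world, R, T, f, dx, dy, u0, v0, k1, k2, k3, b):
    Xc, Yc, Zc = R @ np.asarray(p_world, dtype=float) + T    # world -> camera coordinates
    x_n, y_n = Xc / Zc, Yc / Zc                              # x' = Xc/Zc, y' = Yc/Zc
    r = x_n ** 2 + y_n ** 2                                  # document's convention: r = x'^2 + y'^2
    eta = 1.0 + k1 * r + k2 * r ** 2 + k3 * r ** 3 + b       # unified distortion amount
    x, y = f * eta * x_n, f * eta * y_n                      # camera -> image physical coordinates
    return x / dx + u0, y / dy + v0                          # image physical -> pixel coordinates (u, v)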
S20: training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration.
In order to obtain the sample world coordinates and the sample pixel coordinates, in this embodiment, before step S10, the method may further include:
and obtaining the sample world coordinates and the sample pixel point coordinates of each corner point on the target calibration plate.
It should be noted that the target calibration plate is the calibration plate used when the camera calibration is performed. Generally, a calibration plate carries a number of points, which are the corner points used for camera calibration.
It will be appreciated that, in order to improve calibration efficiency, in this embodiment the target calibration plate may be a three-dimensional calibration plate, and the three-dimensional calibration plate includes two vertically connected planar calibration plates.
It should be understood that, in order to obtain the sample world coordinates and sample pixel point coordinates of more corner points without requiring the camera to shoot multiple times, in this embodiment an image to be processed may be acquired at the joint of the two planar calibration plates of the three-dimensional calibration plate, the sample pixel point coordinates of each corner point on the three-dimensional calibration plate may be determined from the image to be processed, and the sample world coordinates of each corner point on the three-dimensional calibration plate, as input by the user, may be received.
Specifically, the sample pixel point coordinates can be determined from the image to be processed, while the sample world coordinates, which generally need to be obtained by measurement, can be input by the user; accordingly, the sample world coordinates of each corner point on the three-dimensional calibration plate in the world coordinate system, as input by the user, can be received.
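One possible way to obtain the sample pixel point coordinates from the image to be processed is sketched below. It assumes each planar calibration plate carries a checkerboard pattern and that OpenCV is available; neither is prescribed by this embodiment, and the file name and pattern size are hypothetical.

import cv2

img = cv2.imread("stereo_target.png")                        # image of the joint of the two plates (hypothetical file)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, (9, 6))     # (columns, rows) of inner corners, illustrative
if found:
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    pixel_pts = corners.reshape(-1, 2)                       # sample pixel point coordinates (u, v)
# The matching sample world coordinates are measured on the physical target and entered by the user.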
After the distortion correction coordinate conversion formula is obtained, it can be trained according to the sample world coordinates and the sample pixel point coordinates. Once the camera parameters of the camera to be calibrated are obtained, the camera parameters and the distortion correction coordinate conversion formula can be written into the camera to be calibrated, so that in subsequent use the camera can eliminate errors in its imaging process according to these camera parameters and the distortion correction coordinate conversion formula.
In a specific implementation, the sample world coordinates and sample pixel point coordinates of 7200 corner points can be obtained, i.e. 7200 groups of data in which u, v, Xw, Yw and Zw are known. The distortion correction coordinate conversion formula can then be trained on these 7200 groups of data to obtain the camera parameters dx, dy, u0, v0, f, R, T, k1, k2, k3 and b of the camera to be calibrated.
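For training, these camera parameters have to be flattened into a single vector of unknowns. The packing below is one possible sketch; representing the rotation matrix R by a 3-element rotation vector is an assumption made only so that R remains a valid rotation during optimisation.

import numpy as np
from scipy.spatial.transform import Rotation

# Assumed layout: [dx, dy, u0, v0, f, rvec (3), T (3), k1, k2, k3, b] -> 15 unknowns.
def unpack(params):
    dx, dy, u0, v0, f = params[0:5]
    R = Rotation.from_rotvec(params[5:8]).as_matrix()        # rotation vector -> 3x3 rotation matrix
    T = np.asarray(params[8:11])
    k1, k2, k3, b = params[11:15]
    return dx, dy, u0, v0, f, R, T, k1, k2, k3, b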
In the embodiment, tangential distortion coefficients in the distortion of an initial coordinate conversion formula between world coordinates and pixel point coordinates are removed to obtain a distortion correction coordinate conversion formula, and then the distortion correction coordinate conversion formula is trained according to sample world coordinates and sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration. Therefore, through the technical scheme of the invention, the tangential distortion coefficient in the initial conversion formula is removed, so that the number of camera parameters is reduced, the convergence speed is improved, and the calibration time is shortened.
Referring to fig. 3, fig. 3 is a flowchart illustrating a second embodiment of the camera calibration method according to the present invention, and based on the embodiment shown in fig. 2, the second embodiment of the camera calibration method according to the present invention is proposed.
In the second embodiment, step S20 specifically includes:
s21: and training the distortion correction coordinate conversion type through an fminearch function according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters of the camera to be calibrated so as to realize camera calibration.
The fminsearch function is a function in the MATLAB toolbox that uses a derivative-free method to find the minimum of an unconstrained multivariable function, so the distortion correction coordinate conversion formula can be trained relatively quickly.
In order to reduce the reprojection error in the camera calibration process, in this embodiment the fminsearch function takes minimizing the reprojection error between the sample world coordinates and the sample pixel point coordinates as the training condition, the reprojection error being the difference between the predicted pixel point coordinates and the sample pixel point coordinates, where the predicted pixel point coordinates are obtained by substituting the sample world coordinates into the distortion correction coordinate conversion formula. When the RAC algorithm in the prior art is used for training, the reprojection error is 0.6034; in this embodiment, in which the distortion correction coordinate conversion formula is trained by the fminsearch function, the reprojection error is 0.9084, and the reprojection error is significantly reduced.
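A sketch of this training step, assuming the project_point() and unpack() helpers from the earlier sketches and arrays world_pts (N x 3) and pixel_pts (N x 2) holding the sample coordinates; scipy's Nelder-Mead method is used as a stand-in for MATLAB's fminsearch, both being derivative-free simplex methods for unconstrained multivariable minimisation, and x0 is a hypothetical initial guess.

import numpy as np
from scipy.optimize import minimize

def reprojection_error(params, world_pts, pixel_pts):
    dx, dy, u0, v0, f, R, T, k1, k2, k3, b = unpack(params)
    pred = np.array([project_point(p, R, T, f, dx, dy, u0, v0, k1, k2, k3, b)
                     for p in world_pts])
    return np.mean(np.linalg.norm(pred - pixel_pts, axis=1))   # mean distance between predicted and sample pixel coordinates

# result = minimize(reprojection_error, x0, args=(world_pts, pixel_pts), method="Nelder-Mead")
# result.x then holds the trained camera parameters dx, dy, u0, v0, f, R, T, k1, k2, k3 and b.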
In addition, an embodiment of the present invention further provides a camera calibration device, referring to fig. 4, where the camera calibration device includes:
the distortion correction module 10 is configured to remove the tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates, so as to obtain a distortion correction coordinate conversion formula;
the formula training module 20 is configured to train the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates, and obtain the camera parameters in the distortion correction coordinate conversion formula, so as to achieve camera calibration.
The modules in the above apparatus may be used to implement the steps in the above method, which are not described herein.
In addition, the embodiment of the invention also provides a storage medium, wherein the storage medium stores a camera calibration program, and the camera calibration program realizes the following operations when being executed by a processor:
removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion formula;
training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration.
Further, the camera calibration program when executed by the processor further performs the following operations:
training the distortion correction coordinate conversion formula through the fminsearch function according to the sample world coordinates and the sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration.
Further, the camera calibration program when executed by the processor further performs the following operations:
and obtaining the sample world coordinates and the sample pixel point coordinates of each corner point on the target calibration plate.
Further, the camera calibration program when executed by the processor further performs the following operations:
and acquiring an image to be processed at the joint of the two plane calibration plates on the three-dimensional calibration plate, determining sample pixel point coordinates of each corner point on the three-dimensional calibration plate according to the image to be processed, and receiving sample world coordinates of each corner point on the three-dimensional calibration plate input by a user.
In this embodiment, the tangential distortion coefficient in the distortion amount of the initial coordinate conversion formula between the world coordinate and the pixel point coordinate is removed, a distortion correction coordinate conversion formula is obtained, and then the distortion correction coordinate conversion formula is trained according to the sample world coordinate and the sample pixel point coordinate, so as to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration. Therefore, through the technical scheme of the invention, the tangential distortion coefficient in the initial conversion formula is removed, so that the number of camera parameters is reduced, the convergence speed is improved, and the calibration time is shortened.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description covers only the preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structure or equivalent process transformation made using the contents of this specification and the drawings, or any direct or indirect application thereof in other related technical fields, likewise falls within the scope of protection of the invention.

Claims (5)

1. A camera calibration method, characterized by comprising the following steps:
removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between world coordinates and pixel point coordinates to obtain a distortion correction coordinate conversion formula;
training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration;
the distortion correction coordinate conversion is as follows:
wherein, the method comprises the following steps ofuv) Is the coordinates of sample pixel pointsX wY wZ w ) For the sample world coordinates,Rin order to rotate the matrix is rotated,Tin order to translate the matrix,fis focal length [ (]u 0 , v 0 ) Is the coordinate of the origin of the image coordinates in the coordinate system where the pixel point coordinates are located,dxis the physical dimension of the pixel point coordinates in the x-axis direction of the coordinate system in which the image coordinates are located,dyis the physical dimension of the pixel point coordinates in the y-axis direction of the coordinate system in which the image coordinates are located,k1、k2 andk3 are radial distortion coefficients, b is tangential distortion compensation value, +.>,/>,/>,(X cY cZ c ) Converting the world coordinates of the sample into coordinates in a coordinate system where the camera coordinates are located;
training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration, and specifically comprising the following steps:
training the distortion correction coordinate conversion formula through the fminsearch function according to the sample world coordinates and the sample pixel point coordinates to obtain the camera parameters in the distortion correction coordinate conversion formula, so as to realize camera calibration;
the method comprises the steps of removing tangential distortion coefficients in the distortion of the initial coordinate conversion formula between world coordinates and pixel point coordinates, and before obtaining the distortion correction coordinate conversion formula, further comprising:
acquiring sample world coordinates and sample pixel point coordinates of each corner point on a target calibration plate;
the target calibration plate is a three-dimensional calibration plate, and the three-dimensional calibration plate comprises two plane calibration plates which are vertically connected;
the obtaining the sample world coordinates and the sample pixel point coordinates of each corner point on the target calibration plate specifically comprises the following steps:
and acquiring an image to be processed at the joint of the two plane calibration plates on the three-dimensional calibration plate, determining sample pixel point coordinates of each corner point on the three-dimensional calibration plate according to the image to be processed, and receiving sample world coordinates of each corner point on the three-dimensional calibration plate input by a user.
2. The camera calibration method according to claim 1, wherein the fminsearch function takes minimizing the reprojection error between the sample world coordinates and the sample pixel point coordinates as the training condition, the reprojection error being the difference between a predicted pixel point coordinate and a sample pixel point coordinate, and the predicted pixel point coordinate being obtained by substituting the sample world coordinates into the distortion correction coordinate conversion formula.
3. A camera calibration apparatus, comprising:
the distortion correction module is used for removing tangential distortion coefficients from the distortion amount of the initial coordinate conversion formula between the world coordinates and the pixel point coordinates to obtain a distortion correction coordinate conversion formula;
the formula training module is used for training the distortion correction coordinate conversion formula according to the sample world coordinates and the sample pixel point coordinates to obtain camera parameters in the distortion correction coordinate conversion formula so as to realize camera calibration;
the distortion correction coordinate conversion is as follows:
wherein, the method comprises the following steps ofuv) Is the coordinates of sample pixel pointsX wY wZ w ) For the sample world coordinates,Rin order to rotate the matrix is rotated,Tin order to translate the matrix,fis focal length [ (]u 0 , v 0 ) Is the coordinate of the origin of the image coordinates in the coordinate system where the pixel point coordinates are located,dxis the physical dimension of the pixel point coordinates in the x-axis direction of the coordinate system in which the image coordinates are located,dyis the physical dimension of the pixel point coordinates in the y-axis direction of the coordinate system in which the image coordinates are located,k1、k2 andk3 are all radial distortionsCoefficient b is tangential distortion compensation value, +.>,/>,/>,(X cY cZ c ) Converting the world coordinates of the sample into coordinates in a coordinate system where the camera coordinates are located;
the formula training module is further configured to train the distortion correction coordinate conversion formula through the fminsearch function according to the sample world coordinates and the sample pixel point coordinates, so as to obtain the camera parameters in the distortion correction coordinate conversion formula and thereby achieve camera calibration;
the distortion correction module is further used for obtaining an image to be processed at the joint of two plane calibration plates on the three-dimensional calibration plate, determining sample pixel point coordinates of each corner point on the three-dimensional calibration plate according to the image to be processed, and receiving sample world coordinates of each corner point on the three-dimensional calibration plate input by a user, wherein the three-dimensional calibration plate comprises two plane calibration plates which are vertically connected.
4. A user device, the user device comprising: a memory, a processor and a camera calibration program stored on the memory and executable on the processor, wherein the camera calibration program, when executed by the processor, implements the steps of the camera calibration method according to any one of claims 1 to 2.
5. A storage medium having stored thereon a camera calibration program which, when executed by a processor, implements the steps of the camera calibration method according to any one of claims 1 to 2.
CN201811618418.3A 2018-12-27 2018-12-27 Camera calibration method, device, user equipment and storage medium Active CN109754434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811618418.3A CN109754434B (en) 2018-12-27 2018-12-27 Camera calibration method, device, user equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109754434A CN109754434A (en) 2019-05-14
CN109754434B (en) 2023-08-29

Family

ID=66404257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811618418.3A Active CN109754434B (en) 2018-12-27 2018-12-27 Camera calibration method, device, user equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109754434B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110503691B (en) * 2019-07-01 2024-02-20 广州超音速自动化科技股份有限公司 Pole piece lamination calibration method of lithium battery, terminal equipment and storage device
CN110539748B (en) * 2019-08-27 2023-05-16 北京纵目安驰智能科技有限公司 Congestion car following system and terminal based on look-around
CN111256707A (en) * 2019-08-27 2020-06-09 北京纵目安驰智能科技有限公司 Congestion car following system and terminal based on look around
CN110555402A (en) * 2019-08-27 2019-12-10 北京纵目安驰智能科技有限公司 congestion car following method, system, terminal and storage medium based on look-around
CN110595382A (en) * 2019-09-20 2019-12-20 苏州德尔富自动化科技有限公司 3D space vision curved surface measuring equipment and measured data processing method
CN112862895B (en) * 2019-11-27 2023-10-10 杭州海康威视数字技术股份有限公司 Fisheye camera calibration method, device and system
CN111027522B (en) * 2019-12-30 2023-09-01 华通科技有限公司 Bird detection positioning system based on deep learning
CN113516717A (en) * 2020-04-10 2021-10-19 富华科精密工业(深圳)有限公司 Camera device external parameter calibration method, electronic equipment and storage medium
CN111665493A (en) * 2020-06-12 2020-09-15 江苏卫国防务技术有限公司 Low-slow small target detection method based on digital beam forming technology
CN114677445A (en) * 2020-12-25 2022-06-28 北京小米移动软件有限公司 Camera calibration method, camera calibration device and storage medium
CN112562014B (en) * 2020-12-29 2024-07-02 纵目科技(上海)股份有限公司 Camera calibration method, system, medium and device
CN112991465A (en) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 Camera calibration method and device, electronic equipment and computer readable medium
CN113157092B (en) * 2021-04-08 2023-03-24 海信视像科技股份有限公司 Visualization method, terminal device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104729429B (en) * 2015-03-05 2017-06-30 深圳大学 A kind of three dimensional shape measurement system scaling method of telecentric imaging
WO2017080451A1 (en) * 2015-11-11 2017-05-18 Zhejiang Dahua Technology Co., Ltd. Methods and systems for binocular stereo vision

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011143813A1 (en) * 2010-05-19 2011-11-24 深圳泰山在线科技有限公司 Object projection method and object projection sysytem
CN201838036U (en) * 2010-11-09 2011-05-18 姜光 Bi-planar calibration plate
CN102592124A (en) * 2011-01-13 2012-07-18 汉王科技股份有限公司 Geometrical correction method, device and binocular stereoscopic vision system of text image
CN102930544A (en) * 2012-11-05 2013-02-13 北京理工大学 Parameter calibration system of vehicle-mounted camera
CN105080023A (en) * 2015-09-02 2015-11-25 南京航空航天大学 Automatic tracking and positioning jet flow extinguishing method
WO2017092631A1 (en) * 2015-11-30 2017-06-08 宁波舜宇光电信息有限公司 Image distortion correction method for fisheye image, and calibration method for fisheye camera
WO2017215018A1 (en) * 2016-06-15 2017-12-21 上海葡萄纬度科技有限公司 Educational toy kit and convex mirror imaging correction method thereof
CN106709955A (en) * 2016-12-28 2017-05-24 天津众阳科技有限公司 Space coordinate system calibrate system and method based on binocular stereo visual sense
CN206460515U (en) * 2017-01-24 2017-09-01 长沙全度影像科技有限公司 A kind of multichannel fisheye camera caliberating device based on stereo calibration target
CN106548477A (en) * 2017-01-24 2017-03-29 长沙全度影像科技有限公司 A kind of multichannel fisheye camera caliberating device and method based on stereo calibration target
CN107025670A (en) * 2017-03-23 2017-08-08 华中科技大学 A kind of telecentricity camera calibration method
CN107170345A (en) * 2017-04-11 2017-09-15 广东工业大学 Towards the teaching method and device based on machine vision and gyroscope of industrial robot
WO2018196391A1 (en) * 2017-04-28 2018-11-01 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted camera
CN107317953A (en) * 2017-06-30 2017-11-03 上海兆芯集成电路有限公司 Camera bearing calibration and the device using this method
CN107507246A (en) * 2017-08-21 2017-12-22 南京理工大学 A kind of camera marking method based on improvement distortion model
CN107610185A (en) * 2017-10-12 2018-01-19 长沙全度影像科技有限公司 A kind of fisheye camera fast calibration device and scaling method
CN108447095A (en) * 2018-01-31 2018-08-24 潍坊歌尔电子有限公司 A kind of fisheye camera scaling method and device
CN108269235A (en) * 2018-02-26 2018-07-10 江苏裕兰信息科技有限公司 A kind of vehicle-mounted based on OPENGL looks around various visual angles panorama generation method
CN108550171A (en) * 2018-04-20 2018-09-18 东北大学 The line-scan digital camera scaling method containing Eight Diagrams coding information based on Cross ration invariability

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"一种改进的相机标定方法";刘旭宏等;《计算机工程与应用》;第44卷(第29期);第229-230页第2节 *

Also Published As

Publication number Publication date
CN109754434A (en) 2019-05-14

Similar Documents

Publication Publication Date Title
CN109754434B (en) Camera calibration method, device, user equipment and storage medium
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
US11997397B2 (en) Method, apparatus, and device for processing images, and storage medium
WO2020097851A1 (en) Image processing method, control terminal and storage medium
CN110996081B (en) Projection picture correction method and device, electronic equipment and readable storage medium
CN110738707A (en) Distortion correction method, device, equipment and storage medium for cameras
WO2019171984A1 (en) Signal processing device, signal processing method, and program
JP2020523703A5 (en)
CN112862895B (en) Fisheye camera calibration method, device and system
CN111709999A (en) Calibration plate, camera calibration method and device, electronic equipment and camera system
CN113052868A (en) Cutout model training and image cutout method and device
CN116580103A (en) Lithium battery measurement calibration method and device
US10510163B2 (en) Image processing apparatus and image processing method
CN111415314A (en) Resolution correction method and device based on sub-pixel level visual positioning technology
CN113034565B (en) Depth calculation method and system for monocular structured light
CN111222446B (en) Face recognition method, face recognition device and mobile terminal
CN107645634A (en) A kind of undistorted wide angle network video camera and safety defense monitoring system
CN109177138B (en) Method and device for aligning glass and membrane
CN115100225B (en) Method and device for determining error field of camera view field, electronic equipment and medium
WO2023070862A1 (en) Method and apparatus for correcting image distortion of wide-angle lens, and photographing device
CN112894154B (en) Laser marking method and device
CN111353945A (en) Fisheye image correction method, fisheye image correction device and storage medium
CN116012242A (en) Camera distortion correction effect evaluation method, device, medium and equipment
CN112614194B (en) Data processing method, system and device of image acquisition equipment
CN114693769A (en) Calibration method and device for C-arm machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant