CN109658459B - Camera calibration method, device, electronic equipment and computer-readable storage medium - Google Patents


Info

Publication number
CN109658459B
CN109658459B (application CN201811453896.3A)
Authority
CN
China
Prior art keywords
camera
calibration
image
images
group
Prior art date
Legal status
Active
Application number
CN201811453896.3A
Other languages
Chinese (zh)
Other versions
CN109658459A (en)
Inventor
方攀
陈岩
Current Assignee
Zeku Technology Shanghai Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811453896.3A
Publication of CN109658459A
Application granted
Publication of CN109658459B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The application relates to a camera calibration method, a camera calibration apparatus, an electronic device, and a computer-readable storage medium. The method includes: shooting through a first camera and a second camera to obtain a first group of calibration images; calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information; shooting through the first camera and a third camera to obtain a second group of calibration images, the third camera being a depth camera; and calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information. The method can improve the accuracy of camera calibration.

Description

Camera calibration method, device, electronic equipment and computer-readable storage medium
Technical Field
The present application relates to the field of image technologies, and in particular, to a method and an apparatus for calibrating a camera, an electronic device, and a computer-readable storage medium.
Background
Before a camera leaves the factory, it needs to be calibrated to obtain its calibration information, so that images captured by the camera can be processed according to that information and the processed images can restore objects in three-dimensional space.
Disclosure of Invention
The embodiment of the application provides a camera calibration method, a camera calibration device, electronic equipment and a computer-readable storage medium, which can improve the accuracy of camera calibration.
A camera calibration method, comprising:
shooting through a first camera and a second camera to obtain a first group of calibration images;
calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
shooting through the first camera and a third camera to obtain a second group of calibration images, wherein the third camera is a depth camera;
and calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
A camera calibration device comprises:
the first image acquisition module is used for shooting through the first camera and the second camera to obtain a first group of calibration images;
the first calibration module is used for calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
the second image acquisition module is used for shooting through the first camera and a third camera to obtain a second group of calibration images, wherein the third camera is a depth camera;
and the second calibration module is used for calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
An electronic device comprising a camera, a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
shooting through a first camera and a second camera to obtain a first group of calibration images;
calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
shooting through the first camera and a third camera to obtain a second group of calibration images, wherein the third camera is a depth camera;
and calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
shooting through a first camera and a second camera to obtain a first group of calibration images;
calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
shooting through the first camera and a third camera to obtain a second group of calibration images, wherein the third camera is a depth camera;
and calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
According to the camera calibration method and apparatus, the electronic device, and the computer-readable storage medium described above, a first group of calibration images is captured by the first camera and the second camera, and the first camera and the second camera are calibrated according to the first group of calibration images to obtain first calibration information; a second group of calibration images is then captured by the first camera and the third camera, the third camera being a depth camera, and the first camera and the third camera are calibrated according to the second group of calibration images to obtain second calibration information. The accuracy of camera calibration can thereby be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment for a camera calibration method in one embodiment;
FIG. 2 is a flow diagram of a camera calibration method in one embodiment;
FIG. 3 is a flow diagram of testing first calibration information in one embodiment;
FIG. 4 is a flow diagram of a calibration test performed in one embodiment;
FIG. 5 is a block diagram of a camera calibration apparatus in one embodiment;
FIG. 6 is a block diagram showing an internal configuration of an electronic apparatus according to an embodiment;
FIG. 7 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like, as used herein, may describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first camera may be referred to as a second camera, and similarly, a second camera may be referred to as a first camera, without departing from the scope of the present application. The first camera and the second camera are both cameras, but they are not the same camera.
Fig. 1 is a schematic diagram of an application environment of the camera calibration method in one embodiment. As shown in fig. 1, the application environment includes an electronic device 110 and a calibration board 120. The calibration board (chart board) 120 carries chart patterns and can be rotated to hold poses at different angles. The electronic device 110 can capture images containing the chart patterns of the calibration board 120 through its cameras. The electronic device 110 includes a first camera, a second camera, and a third camera arranged in an arbitrary order. The electronic device 110 may capture a first group of calibration images with the first camera and the second camera, calibrate the first camera and the second camera according to the first group of calibration images to obtain first calibration information, capture a second group of calibration images with the first camera and the third camera, and calibrate the first camera and the third camera according to the second group of calibration images to obtain second calibration information. The first camera may be a color (RGB) camera, and the second camera and the third camera may each be, but are not limited to, a color camera, a black-and-white camera, a telephoto camera, a wide-angle camera, or a depth camera. The calibration board 120 may be a three-dimensional or a two-dimensional calibration board. The electronic device 110 may be, but is not limited to, a mobile phone, a computer, a portable device, and the like.
Fig. 2 is a flow diagram of a camera calibration method in one embodiment. The camera calibration method in the present embodiment is described by taking the electronic device in fig. 1 as an example. As shown in fig. 2, the camera calibration method includes steps 202 to 208. Wherein:
Step 202, shooting through the first camera and the second camera to obtain a first group of calibration images.
The electronic device may include a first camera, a second camera, and a third camera. Each camera may be built into the electronic device or externally connected to it, and may be a front camera or a rear camera. The first camera may be any camera, which is not limited here. For example, the first camera and the second camera may both be color cameras, may be a combination of a color camera and a black-and-white camera, may be a combination of a telephoto camera and a black-and-white camera, and so on, but are not limited thereto.
The electronic device captures images with the first camera and the second camera. Specifically, the electronic device can photograph the calibration board with the first camera and the second camera to obtain a first group of calibration images. The first group of calibration images includes the calibration images captured by the first camera and by the second camera. The calibration board may be two-dimensional or three-dimensional; a three-dimensional calibration board is one that has at least three calibration surfaces. When the calibration board is three-dimensional, each calibration image obtained by the electronic device contains at least three different calibration patterns.
Step 204, calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information.
Calibration processing refers to solving the parameters of the camera's imaging geometric model; with this model, a captured image can restore an object in space. Calibration information refers to the camera parameters obtained after the camera is calibrated, and can be used to correct images captured by the camera so that the corrected images restore objects in space. Specifically, the calibration information may include the camera's intrinsic parameters, extrinsic parameters, distortion coefficients, and the like. The first calibration information is the calibration information corresponding to the first camera and the second camera. It includes the monocular calibration information of each of the first camera and the second camera and the binocular calibration information of the pair; the monocular calibration information may include intrinsic parameters, extrinsic parameters, and distortion coefficients, and the binocular calibration information may include the extrinsic parameters between the first camera and the second camera.
The electronic device calibrates the first camera and the second camera according to the first group of calibration images. Specifically, the electronic device may calibrate the cameras with a conventional camera calibration method, a camera self-calibration method, or Zhang Zhengyou's calibration method (which lies between the conventional and self-calibration methods), among others, to obtain the first calibration information corresponding to the first camera and the second camera. The electronic device can obtain the first calibration image captured by the first camera in the first group of calibration images, detect feature points in the first calibration image, and derive the intrinsic parameters, extrinsic parameters, and distortion coefficients of the first camera from those feature points; similarly, the electronic device can obtain the intrinsic parameters, extrinsic parameters, and distortion coefficients of the second camera from the second calibration image captured by the second camera, and then calculate the extrinsic parameters between the first camera and the second camera from the extrinsic parameters of each camera.
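A minimal sketch of this monocular plus binocular calibration step is shown below, using OpenCV's Zhang-style chessboard calibration; the pattern size, square size, and the assumption that the chart is detected in every image of both cameras are illustrative choices, not details from the patent.

```python
import cv2
import numpy as np

PATTERN = (9, 6)     # assumed inner-corner count of a chessboard chart
SQUARE_MM = 25.0     # assumed physical square size

def find_corners(images):
    # Build the board's 3D reference points once; reuse them for every view.
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
            size = gray.shape[::-1]
    return obj_pts, img_pts, size

def calibrate_pair(images_cam1, images_cam2):
    # Assumes the chart is detected in every view of both cameras,
    # so the object-point lists line up one-to-one.
    obj_pts, pts1, size = find_corners(images_cam1)
    _, pts2, _ = find_corners(images_cam2)
    # Monocular calibration: intrinsics and distortion for each camera.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, pts1, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, pts2, size, None, None)
    # Binocular calibration: rotation R and translation T between the cameras.
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts1, pts2, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```

The intrinsics and distortion coefficients correspond to the monocular calibration information described above, and R and T correspond to the binocular extrinsic parameters between the two cameras.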
Step 206, shooting through the first camera and a third camera to obtain a second group of calibration images, wherein the third camera is a depth camera.
The third camera may be a depth camera. Specifically, the third camera may be a structured-light camera, a TOF (Time of Flight) camera, or another camera capable of acquiring depth information. A structured-light camera projects controllable light spots, light stripes, or light-plane structures onto the surface of the measured object, receives their reflections, and obtains a depth image from the deformation of the projected light. A TOF camera emits near-infrared light toward the scene, receives the reflected near-infrared light, obtains the depth information of the scene by calculating the time difference or phase difference of the reflection, and renders different distances in different colors to produce a depth image. The electronic device can photograph the calibration board with the first camera and the third camera, and the resulting second group of calibration images includes the calibration images captured by the first camera and by the third camera. The calibration image captured by the third camera may be a visual image, that is, an image that contains the scene information seen by the camera, for example a color image or a grayscale image.
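For reference, the sketch below shows the standard relationship between a TOF camera's measured time (or phase) difference and depth; the 20 MHz modulation frequency is an assumed example value, not a parameter from the patent.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_time(delta_t):
    # The measured time covers the round trip, so halve the distance.
    return C * delta_t / 2.0

def depth_from_phase(phase_rad, f_mod=20e6):
    # Phase-shift TOF with an assumed modulation frequency:
    # depth = c * phase / (4 * pi * f_mod)
    return C * phase_rad / (4.0 * math.pi * f_mod)
```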
Step 208, calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
The second calibration information comprises internal parameters, external parameters and distortion coefficients corresponding to the first camera and the third camera respectively, and the external parameters between the first camera and the third camera. Similar to the electronic device performing calibration processing on the first camera and the second camera according to the first group of calibration images, the electronic device may perform calibration processing on the first camera and the third camera according to the second group of calibration images to obtain second calibration information. The second calibration information contains the calibration information of the depth camera and can be used for correcting the depth image shot by the depth camera, so that the corrected depth image can accurately represent the depth information of the image collected by the first camera.
In an embodiment, the electronic device may instead first capture the second group of calibration images with the first camera and the third camera and calibrate those two cameras to obtain the second calibration information, and then capture the first group of calibration images with the first camera and the second camera and calibrate them to obtain the first calibration information; the order of camera calibration is not limited.
In the embodiment of the application, the electronic device captures a first group of calibration images with the first camera and the second camera, calibrates the first camera and the second camera according to the first group of calibration images to obtain first calibration information, captures a second group of calibration images with the first camera and the third camera, where the third camera is a depth camera, and calibrates the first camera and the third camera according to the second group of calibration images to obtain second calibration information. Because all three cameras of the electronic device are calibrated, and one of them is a depth camera, the first calibration information and the second calibration information (which contains the depth camera's calibration information) improve the accuracy of camera calibration. Later, when the cameras are used, the calibration information corresponding to the cameras in use can be retrieved to correct the captured images; when the depth information of a photographed object is needed, the third camera can capture a depth image, and that depth image can be corrected according to the second calibration information, improving the accuracy of image processing.
In one embodiment, capturing the first group of calibration images with the first camera and the second camera includes: photographing the calibration board at at least three angles with the first camera and the second camera to obtain the first group of calibration images. Capturing the second group of calibration images with the first camera and the third camera includes: photographing the calibration board at at least three angles with the first camera and the third camera to obtain the second group of calibration images.
When the calibration board is two-dimensional, i.e., a flat board with only one calibration surface, it can be rotated to multiple angles about a rotation axis. As shown in fig. 1, one of the at least three angles may be 0 degrees and the others ±θ degrees; the value of θ can be chosen according to actual requirements, provided the poses remain decoupled. The angle of the calibration board can be controlled by the electronic device; for example, after the electronic device finishes photographing the board at a first angle, it can command the board to move to a second angle and then photograph it at that angle. In some embodiments, the angle of the calibration board may instead be controlled manually or by another electronic device, without limitation. The electronic device can photograph the calibration board at at least three angles with the first camera and the second camera to obtain the first group of calibration images, and photograph it at at least three angles with the first camera and the third camera to obtain the second group of calibration images. For example, the electronic device photographs the calibration board at the five angles 0, -20, +20, -30, and +30 degrees with the first camera and the second camera to obtain the first group of calibration images, which thus contains calibration images of the board at different angles captured by the first camera and by the second camera.
In an embodiment, in the provided camera calibration method, the third camera is a depth camera, and the step of acquiring the second set of calibration images captured by the first camera and the third camera may include: shooting the calibration plate through the first camera and the third camera to obtain a third calibration image corresponding to the first camera and a fourth calibration image corresponding to the third camera, wherein the fourth calibration image is a gray image; and acquiring a second group of calibration images comprising a third calibration image and a fourth calibration image.
A grayscale image is an image in which each pixel has a single sampled value, representing many levels of intensity between black and white. The image acquired by the electronic device through the third camera may be a grayscale image. In the embodiment of the application, the electronic device photographs the calibration board with the first camera and the third camera to obtain a third calibration image captured by the first camera and a fourth calibration image captured by the third camera, the fourth calibration image being a grayscale image containing the calibration pattern of the board. The electronic device may then take the third calibration image and the fourth calibration image as the second set of calibration images. Capturing a second set of calibration images that includes the grayscale image from the third camera and calibrating the first camera and the third camera with it improves the calibration accuracy of the cameras.
FIG. 3 is a flow diagram of testing first calibration information in one embodiment. As shown in fig. 3, in an embodiment, before capturing by the first camera and the third camera to obtain the second set of calibration images, the method for calibrating a camera further includes:
Step 302, acquiring a first calibration image shot by the first camera and a second calibration image shot by the second camera in the first group of calibration images.
The electronic device may obtain, from the first set of calibration images, a first calibration image captured by the first camera and a second calibration image captured by the second camera. When the first set of calibration images contains images of the calibration board at different angles, the electronic device can select a first calibration image from the first camera and a second calibration image from the second camera that correspond to the same board angle.
Step 304, correcting the first calibration image and the second calibration image according to the first calibration information to obtain a corresponding first target image and a corresponding second target image.
The electronic device corrects the first calibration image and the second calibration image according to the first calibration information. Specifically, it corrects them according to the monocular calibration information and the binocular calibration information contained in the first calibration information to obtain a processed first target image and second target image, in which corresponding feature points are horizontally aligned.
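A sketch of this correction (stereo rectification) step is given below, under the assumption that the first calibration information has the form produced by the earlier calibration sketch (K1, d1, K2, d2, R, T); after remapping, corresponding points lie on the same image rows.

```python
import cv2

def rectify_pair(img1, img2, K1, d1, K2, d2, R, T):
    size = (img1.shape[1], img1.shape[0])
    # Compute rectification transforms and projection matrices for both cameras.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
    map1x, map1y = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)
    target1 = cv2.remap(img1, map1x, map1y, cv2.INTER_LINEAR)
    target2 = cv2.remap(img2, map2x, map2y, cv2.INTER_LINEAR)
    return target1, target2, Q  # Q maps disparity to depth if needed later
```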
Step 306, calculating first depth information according to the first target image and the second target image.
Depth information is the distance between the photographed object and the camera. The electronic device calculates the first depth information from the first target image and the second target image. Specifically, it can find corresponding feature points in the two target images, obtain the parallax of the photographed object between the first camera and the second camera from those correspondences, and convert the parallax into the depth of the photographed object.
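The following sketch illustrates this conversion, assuming rectified target images and using the usual relation Z = f * B / d (focal length f in pixels, baseline B, disparity d); the block-matcher settings are illustrative.

```python
import cv2
import numpy as np

def first_depth_map(target1, target2, focal_px, baseline_m):
    gray1 = cv2.cvtColor(target1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(target2, cv2.COLOR_BGR2GRAY)
    # Semi-global block matching; numDisparities must be a multiple of 16.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(gray1, gray2).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]  # Z = f * B / d
    return depth
```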
Step 308, when the first depth information matches the preset depth information, shooting through the first camera and the third camera to obtain a second group of calibration images.
The preset depth information may be a distance between the camera and the photographed object measured in advance. Specifically, the electronic device and the calibration board may be placed according to the preset depth information, the electronic device captures a first calibration image and a second calibration image with the first camera and the second camera, and corrects them to obtain the first target image and the second target image; in theory, the distance to the calibration board recovered from the first and second target images should equal the preset depth information. The electronic device can compute the first depth information from the first and second target images and then match it against the preset depth information. The matching degree may be calculated from the per-pixel depth difference between the first depth information and the preset depth information, for example the mean and variance of the per-pixel depth differences, or the number of pixels whose depth difference is smaller than a preset difference, without limitation.
If the first depth information matches the preset depth information, the first calibration information obtained by calibrating the first camera and the second camera is accurate, and after the electronic device corrects the first and second calibration images according to the first calibration information, objects in three-dimensional space can be accurately restored from the first and second target images. When the first depth information matches the preset depth information, the electronic device captures the second group of calibration images with the first camera and the third camera; when it does not match, the electronic device can capture images with the first camera and the second camera again and recalibrate them. This avoids continuing to calibrate the first camera and the third camera while the calibration of the first camera and the second camera is inaccurate, and thus improves camera calibration efficiency.
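A sketch of the matching test is shown below; the tolerance and the required ratio of in-tolerance pixels are assumed thresholds, and the preset depth is taken to be a per-pixel depth map such as one captured by the third camera.

```python
import numpy as np

def depth_matches(first_depth, preset_depth, tol_m=0.01, min_ratio=0.95):
    # Compare only pixels where both depth maps report a value.
    valid = (first_depth > 0) & (preset_depth > 0)
    diff = np.abs(first_depth[valid] - preset_depth[valid])
    mean_err, var_err = float(diff.mean()), float(diff.var())
    in_tolerance = float((diff < tol_m).mean())
    return in_tolerance >= min_ratio, mean_err, var_err
```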
In one embodiment, a camera calibration method is provided in which the third camera is a depth camera, and the method may further include: when a first group of calibration images are obtained by shooting through the first camera and the second camera, shooting through the third camera to obtain corresponding depth images; and taking the depth information contained in the depth image as preset depth information.
The third camera is a depth camera and can capture a depth image, which contains the depth information between the photographed object and the camera. When the electronic device captures the first group of calibration images with the first camera and the second camera, it can simultaneously capture, with the third camera, the depth image corresponding to the first and second calibration images and use the depth information contained in that depth image as the preset depth information. This avoids the measurement error introduced when the distance between the electronic device and the calibration board is measured manually, and improves the accuracy of the preset depth information.
FIG. 4 is a flow diagram of a calibration test performed in one embodiment. As shown in fig. 4, in an embodiment, the second set of calibration images includes a third calibration image captured by the first camera and a fourth calibration image captured by the third camera, and the camera calibration method may further include:
and step 402, correcting the third calibration image and the fourth calibration image according to the second calibration information to obtain a third target image and a fourth target image.
The electronic device corrects the third calibration image and the fourth calibration image according to the second calibration information. Specifically, it corrects them according to the monocular calibration information and the binocular calibration information contained in the second calibration information to obtain a processed third target image and fourth target image, in which corresponding feature points are horizontally aligned.
Step 404, extracting a first pixel point in the first target image, extracting a second pixel point corresponding to the first pixel point in the second target image, and obtaining a first parallax according to the first pixel point and the second pixel point.
The electronic device may extract pixel points in the first target image and the second target image using the scale-invariant feature transform (SIFT) or speeded-up robust features (SURF) method. Specifically, the electronic device may extract a first pixel point in the first target image, extract the corresponding second pixel point in the second target image with a stereo matching algorithm, and then calculate the first parallax from the first pixel point and the second pixel point.
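A sketch of this feature-based parallax measurement is shown below, using SIFT keypoints and a ratio test; on rectified images the parallax of a matched pair is its horizontal offset. The matcher settings and the averaging over matches are illustrative assumptions, not details from the patent.

```python
import cv2
import numpy as np

def mean_disparity(left_img, right_img):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(left_img, None)
    kp2, des2 = sift.detectAndCompute(right_img, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = []
    for pair in matches:
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])  # Lowe's ratio test
    # On rectified images, disparity is the horizontal offset of matched points.
    disparities = [kp1[m.queryIdx].pt[0] - kp2[m.trainIdx].pt[0] for m in good]
    return float(np.mean(disparities)) if disparities else 0.0
```

The same routine can be applied to the third and fourth target images in step 406 to obtain the second parallax.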
Step 406, extracting a third pixel point in the third target image, extracting a fourth pixel point corresponding to the third pixel point in the fourth target image, and obtaining a second parallax according to the third pixel point and the fourth pixel point.
The electronic device may extract pixel points in the third target image and the fourth target image using a scale invariant feature transformation method or an accelerated robust feature method. Specifically, the electronic device may extract a third pixel point in the third target image, extract a fourth pixel point corresponding to the third pixel point in the fourth target image by using a stereo matching algorithm, and then calculate the second parallax according to the third pixel point and the fourth pixel point.
Step 408, calculating a parallax difference value between the first parallax and the second parallax, and generating a prompt signal that the calibration test is passed when the parallax difference value is smaller than a difference threshold value.
Specifically, the electronic device may take the absolute value of the difference between the first parallax and the second parallax as the parallax difference and compare it with a difference threshold. The difference threshold is preset by an engineer during calibration of the camera module and is chosen according to the specific situation. If the parallax difference is smaller than the difference threshold, the actual error of the camera calibration result is within the allowable range, and a prompt signal indicating that the calibration test has passed is generated. The prompt signal notifies the calibration test processing unit of the electronic device that the camera calibration result has passed the calibration test, and the electronic device may store the first calibration information and the second calibration information obtained from the calibration process according to the prompt signal.
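A minimal sketch of this pass/fail check follows; the threshold value is only a placeholder, since the patent leaves it to the engineer.

```python
def calibration_test_passes(first_parallax, second_parallax, diff_threshold=1.0):
    # Placeholder threshold in pixels; the actual value is set during module calibration.
    return abs(first_parallax - second_parallax) < diff_threshold
```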
In an embodiment, the electronic device may also capture new images with the first camera, the second camera, and the third camera; the images collected by each camera need not be images of the calibration board. The captured images are then corrected according to the first calibration information and the second calibration information, and the parallax between the corrected images is used to verify whether the calibration test passes.
In one embodiment, the provided camera calibration method may further include: when the parallax difference is greater than or equal to the difference threshold, generating a prompt signal indicating that the calibration test fails; and performing a second calibration process.
When the parallax difference is greater than or equal to the difference threshold, the actual error of the camera calibration result exceeds the allowable range, that is, the first calibration information and the second calibration information obtained from the calibration process are inaccurate. A prompt signal indicating that the calibration test has failed is then generated; it notifies the calibration test processing unit of the electronic device that the camera calibration result has not passed the calibration test and that the cameras need to be calibrated a second time.
The electronic equipment corrects the calibration images according to the calibration information obtained by the calibration processing, obtains the parallax between the corrected target images, and compares the parallax information to check whether the calibration result of the camera is qualified, so that the accuracy of the camera calibration can be improved.
It should be understood that although the steps in the flow charts of figs. 2-4 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in figs. 2-4 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or in alternation with other steps or with sub-steps or stages of other steps.
Fig. 5 is a block diagram of a camera calibration apparatus according to an embodiment. As shown in fig. 5, the camera calibration apparatus includes a first image obtaining module 502, a first calibration module 504, a second image obtaining module 506, and a second calibration module 508. Wherein:
a first image obtaining module 502, configured to perform shooting through a first camera and a second camera to obtain a first set of calibration images;
the first calibration module 504 is configured to perform calibration processing on the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
a second image obtaining module 506, configured to take a picture through the first camera and the third camera to obtain a second set of calibration images;
the second calibration module 508 is configured to perform calibration processing on the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
The camera calibration apparatus provided by the embodiment of the application captures a first group of calibration images with the first camera and the second camera, calibrates the first camera and the second camera according to the first group of calibration images to obtain first calibration information, captures a second group of calibration images with the first camera and the third camera, where the third camera is a depth camera, and calibrates the first camera and the third camera according to the second group of calibration images to obtain second calibration information. Calibration information that includes the depth camera's calibration is thereby obtained, and the accuracy of camera calibration can be improved.
In an embodiment, the first image obtaining module 502 may be further configured to shoot calibration boards at least three angles through the first camera and the second camera to obtain a first set of calibration images; the second image obtaining module 506 may further be configured to shoot calibration boards at least three angles through the first camera and the third camera to obtain a second set of calibration images.
In an embodiment, the provided camera calibration apparatus may further include a calibration testing module 510, where the calibration testing module 510 may be configured to obtain a first calibration image captured by a first camera and a second calibration image captured by a second camera in the first set of calibration images; correcting the first calibration image and the second calibration image according to the first calibration information to obtain a corresponding first target image and a corresponding second target image; calculating first depth information according to the first target image and the second target image; and when the first depth information is matched with the preset depth information, performing shooting through the first camera and the third camera to obtain a second group of calibration images.
In an embodiment, the calibration testing module 510 may be further configured to, when a first set of calibration images is obtained by shooting with the first camera and the second camera, obtain a corresponding depth image by shooting with the third camera; taking depth information contained in the depth image as preset depth information; and when the first depth information is matched with the preset depth information, performing shooting through the first camera and the third camera to obtain a second group of calibration images.
In an embodiment, the calibration testing module 510 may be further configured to perform correction processing on the third calibration image and the fourth calibration image according to the second calibration information to obtain a third target image and a fourth target image; extracting a first pixel point in the first target image, extracting a second pixel point corresponding to the first pixel point in the second target image, and acquiring a first parallax according to the first pixel point and the second pixel point; extracting a third pixel point in the third target image, extracting a fourth pixel point corresponding to the third pixel point in the fourth target image, and acquiring a second parallax according to the third pixel point and the fourth pixel point; and calculating a parallax difference value of the first parallax and the second parallax, and generating a prompt signal that the calibration test is passed when the parallax difference value is smaller than a difference threshold value.
In one embodiment, the calibration test module 510 may be further configured to generate a prompt signal indicating that the calibration test fails when the parallax difference is greater than or equal to the difference threshold, and to perform a second calibration process.
In an embodiment, the second image obtaining module 506 may be further configured to shoot the calibration board through the first camera and the third camera to obtain a third calibration image corresponding to the first camera and a fourth calibration image corresponding to the third camera, where the fourth calibration image is a grayscale image; and acquiring a second group of calibration images comprising a third calibration image and a fourth calibration image.
The division of the modules in the camera calibration device is merely used for illustration, and in other embodiments, the camera calibration device may be divided into different modules as needed to complete all or part of the functions of the camera calibration device.
Fig. 6 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in fig. 6, the electronic device includes a processor and a memory connected by a system bus. The processor provides computing and control capability and supports the operation of the whole electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the camera calibration method provided in the embodiments. The internal memory provides a cached execution environment for the operating system and computer programs in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
Each module in the camera calibration apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules constituted by it may be stored in the memory of the terminal or server. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 7 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 7, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 7, the image processing circuit includes an ISP processor 740 and control logic 750. The image data captured by the imaging device 710 is first processed by the ISP processor 740, and the ISP processor 740 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 710. The imaging device 710 may include a camera having one or more lenses 712 and an image sensor 714. The image sensor 714 may include an array of color filters (e.g., Bayer filters), and the image sensor 714 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 714 and provide a set of raw image data that may be processed by the ISP processor 740. The sensor 720 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 740 based on the type of sensor 720 interface. The sensor 720 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, image sensor 714 may also send raw image data to sensor 720, sensor 720 may provide raw image data to ISP processor 740 based on the type of sensor 720 interface, or sensor 720 may store raw image data in image memory 730.
ISP processor 740 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 740 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 740 may also receive image data from image memory 730. For example, sensor 720 interface sends raw image data to image memory 730, and the raw image data in image memory 730 is then provided to ISP processor 740 for processing. The image Memory 730 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
ISP processor 740 may perform one or more image processing operations, such as temporal filtering, upon receiving raw image data from image sensor 714 interface or from sensor 720 interface or from image memory 730. The processed image data may be sent to image memory 730 for additional processing before being displayed. ISP processor 740 receives processed data from image memory 730 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by ISP processor 740 may be output to display 770 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of ISP processor 740 may also be sent to image memory 730 and display 770 may read image data from image memory 730. In one embodiment, image memory 730 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 740 may be transmitted to the encoder/decoder 760 for encoding/decoding image data. The encoded image data may be saved and decompressed before being displayed on the display 770 device. The encoder/decoder 760 may be implemented by a CPU or GPU or coprocessor.
The statistical data determined by ISP processor 740 may be sent to control logic 750 unit. For example, the statistical data may include image sensor 714 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 712 shading correction, and the like. Control logic 750 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 710 and control parameters of ISP processor 740 based on the received statistical data. For example, the control parameters of imaging device 710 may include sensor 720 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 712 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 712 shading correction parameters.
In an embodiment of the present application, the image processing circuit may include at least three imaging devices (cameras) 710, and the above-described camera calibration method may be implemented using the image processing technique of fig. 7.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the camera calibration method.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform a camera calibration method.
Any reference to memory, storage, a database, or other medium used in the embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A camera calibration method, comprising:
shooting through a first camera and a second camera to obtain a first group of calibration images;
calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
acquiring a first calibration image shot by the first camera and a second calibration image shot by the second camera in the first group of calibration images;
correcting the first calibration image and the second calibration image according to the first calibration information to obtain a corresponding first target image and a corresponding second target image;
calculating first depth information according to the first target image and the second target image;
when shooting is carried out through a first camera and a second camera to obtain a first group of calibration images, shooting is carried out through a third camera to obtain corresponding depth images, depth information contained in the depth images is used as preset depth information, and when the first depth information is matched with the preset depth information, shooting is carried out through the first camera and the third camera to obtain a second group of calibration images, wherein the third camera is a depth camera;
and calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
2. The method according to claim 1, wherein the capturing by the first camera and the second camera to obtain the first set of calibration images comprises:
shooting calibration plates at least three angles through the first camera and the second camera to obtain a first group of calibration images;
the shooting is carried out through the first camera and the third camera to obtain a second group of calibration images, and the method comprises the following steps:
and shooting the calibration plates at least three angles through the first camera and the third camera to obtain a second group of calibration images.
3. The method according to claim 1, wherein the second set of calibration images includes a third calibration image captured by the first camera and a fourth calibration image captured by the third camera, and the method further comprises:
correcting the third calibration image and the fourth calibration image according to the second calibration information to obtain a third target image and a fourth target image;
extracting a first pixel point in the first target image, extracting a second pixel point corresponding to the first pixel point in the second target image, and acquiring a first parallax according to the first pixel point and the second pixel point;
extracting a third pixel point in the third target image, extracting a fourth pixel point corresponding to the third pixel point in the fourth target image, and acquiring a second parallax according to the third pixel point and the fourth pixel point;
and calculating a parallax difference value of the first parallax and the second parallax, and generating a prompt signal that the calibration test is passed when the parallax difference value is smaller than a difference threshold value.
4. The method of claim 3, further comprising:
when the parallax difference is larger than or equal to the difference threshold, generating a prompt signal indicating that the calibration test fails; and performing a second calibration process.
5. The method of claim 1, wherein the capturing by the first camera and the third camera to obtain the second set of calibration images comprises:
shooting a calibration board through the first camera and the third camera to obtain a third calibration image corresponding to the first camera and a fourth calibration image corresponding to the third camera, wherein the fourth calibration image is a gray image;
and acquiring a second group of calibration images comprising the third calibration image and the fourth calibration image.
6. A camera calibration device is characterized by comprising:
the first image acquisition module is used for shooting through the first camera and the second camera to obtain a first group of calibration images;
the first calibration module is used for calibrating the first camera and the second camera according to the first group of calibration images to obtain first calibration information;
the calibration testing module is used for acquiring a first calibration image shot by the first camera and a second calibration image shot by the second camera in the first group of calibration images; correcting the first calibration image and the second calibration image according to the first calibration information to obtain a corresponding first target image and a corresponding second target image; calculating first depth information according to the first target image and the second target image; when a first group of calibration images are obtained by shooting through a first camera and a second camera, shooting through a third camera to obtain a corresponding depth image, and taking depth information contained in the depth image as preset depth information, wherein the third camera is a depth camera;
the second image acquisition module is used for shooting through the first camera and the third camera to obtain a second group of calibration images when the first depth information is matched with the preset depth information;
and the second calibration module is used for calibrating the first camera and the third camera according to the second group of calibration images to obtain second calibration information.
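The pre-check performed by the calibration testing module in claim 6 reduces to the pinhole relation Z = f·B/d: depth triangulated from the first camera pair should agree with the depth reported by the third (depth) camera at the same point before the second group of images is captured. The numbers and the matching tolerance in this sketch are illustrative assumptions only.

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classic pinhole triangulation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Illustrative values, not figures from the patent.
focal_px = 1400.0        # focal length of the rectified pair, in pixels
baseline_m = 0.012       # distance between first and second camera, in metres
disparity_px = 33.6      # disparity measured at matched pixels of the target images

first_depth = stereo_depth(disparity_px, focal_px, baseline_m)   # ~0.5 m
preset_depth = 0.505     # depth-camera reading at the same point, in metres

TOLERANCE = 0.02         # metres, assumed matching tolerance
if abs(first_depth - preset_depth) <= TOLERANCE:
    print("depth matches: capture the second group of calibration images")
else:
    print("depth mismatch: repeat the first calibration")
```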
7. The apparatus of claim 6, wherein
the first image acquisition module is further configured to shoot the calibration plate from at least three angles through the first camera and the second camera to obtain the first group of calibration images;
and the second image acquisition module is further configured to shoot the calibration plate from at least three angles through the first camera and the third camera to obtain the second group of calibration images.
8. The apparatus according to claim 6, wherein the second image acquisition module is further configured to shoot the calibration plate through the first camera and the third camera to obtain a third calibration image corresponding to the first camera and a fourth calibration image corresponding to the third camera, wherein the fourth calibration image is a grayscale image; and to acquire the second group of calibration images comprising the third calibration image and the fourth calibration image.
9. An electronic device comprising a camera, a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the camera calibration method according to any one of claims 1 to 5.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201811453896.3A 2018-11-30 2018-11-30 Camera calibration method, device, electronic equipment and computer-readable storage medium Active CN109658459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811453896.3A CN109658459B (en) 2018-11-30 2018-11-30 Camera calibration method, device, electronic equipment and computer-readable storage medium


Publications (2)

Publication Number Publication Date
CN109658459A CN109658459A (en) 2019-04-19
CN109658459B true CN109658459B (en) 2020-11-24

Family

ID=66111980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811453896.3A Active CN109658459B (en) 2018-11-30 2018-11-30 Camera calibration method, device, electronic equipment and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN109658459B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832425A (en) * 2020-06-22 2020-10-27 五邑大学 Target ranging method and device for agricultural trolley and storage medium
CN112598751A (en) * 2020-12-23 2021-04-02 Oppo(重庆)智能科技有限公司 Calibration method and device, terminal and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107610185A (en) * 2017-10-12 2018-01-19 长沙全度影像科技有限公司 A kind of fisheye camera fast calibration device and scaling method
CN107730462A (en) * 2017-09-30 2018-02-23 努比亚技术有限公司 A kind of image processing method, terminal and computer-readable recording medium
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes



Similar Documents

Publication Title
CN109767467B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
CN109712192B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
CN111246089B (en) Jitter compensation method and apparatus, electronic device, computer-readable storage medium
CN109584312B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN110610465B (en) Image correction method and device, electronic equipment and computer readable storage medium
US10306165B2 (en) Image generating method and dual-lens device
CN109598763B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109600548B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109598764B (en) Camera calibration method and device, electronic equipment and computer-readable storage medium
CN110290323B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109559353B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN108924426B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109660718B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109963080B (en) Image acquisition method and device, electronic equipment and computer storage medium
CN109951641B (en) Image shooting method and device, electronic equipment and computer readable storage medium
CN112470192A (en) Dual-camera calibration method, electronic device and computer-readable storage medium
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN112257713A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109697737B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109584311B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109658459B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210615

Address after: Room 01, 8th Floor, No. 1, Lane 61, Shengxia Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai, 200120

Patentee after: Zheku Technology (Shanghai) Co., Ltd

Address before: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan, Guangdong 523860

Patentee before: OPPO Guangdong Mobile Communications Co.,Ltd.
