WO2021129437A1 - Method and system for calibrating a light field camera without white images - Google Patents

Method and system for calibrating a light field camera without white images Download PDF

Info

Publication number
WO2021129437A1
Authority
WO
WIPO (PCT)
Prior art keywords
light field
microlens
microlens array
image
line feature
Prior art date
Application number
PCT/CN2020/136062
Other languages
English (en)
French (fr)
Inventor
关鸿亮
段福洲
孟祥慈
Original Assignee
首都师范大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 首都师范大学 filed Critical 首都师范大学
Priority to AU2020413529A priority Critical patent/AU2020413529B2/en
Publication of WO2021129437A1 publication Critical patent/WO2021129437A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera

Definitions

  • The invention relates to the technical field of image measurement and computer vision, and in particular to a method and system for calibrating a light field camera without a white image.
  • The existing methods describe the main lens with a thin lens model and the microlenses with a pinhole model. Changes in shooting parameters (such as aperture, zoom and focus), and in particular changes in focus, alter the distance between the lens and the sensor plane, so both the absolute sensor coordinates of a given projection point within a microlens and the position of the projection of the microlens center relative to the center of the CCD (charge-coupled device) array change.
  • Therefore, when an existing calibration method is used for a light field camera, the shooting parameters must be kept fixed after the white image needed to calibrate the center point grid has been captured, and only then can the other data required for calibrating the light field camera be acquired.
  • The final camera calibration result is only valid for those shooting parameters. If the shooting parameters change during data acquisition, the white image and the required light field data must be captured again, and after importing the light field data into the computer, care must also be taken to store the corresponding white image.
  • When Lytro or Raytrix light field cameras are used, the software provided by the manufacturer approximately matches a built-in white image: if the shooting parameters of the data do not match those of any built-in white image, the white image with the closest shooting parameters is used as the source of the center point grid for that data. Although this approximate matching of built-in white images is convenient, it cannot guarantee the calibration accuracy of the center point grid.
  • the purpose of the present invention is to provide a light field camera calibration method and system that does not require a white image, so as to solve the problem that the existing methods for calibrating non-focused light field cameras generally rely on white images and the camera calibration accuracy is low.
  • the present invention provides the following solutions:
  • A method for calibrating a light field camera without a white image comprises: acquiring an original light field image of an electronic checkerboard captured by the light field camera, the light field camera including a lens, a microlens array and an image sensor; calibrating the microlens array from the original light field image to generate the calibration result of the microlens array and the center point grid of the microlens array; extracting line features of the original light field image by a template matching method; and using the line features as calibration data to calibrate the internal and external parameters of the projection model of the light field camera.
  • the performing the calibration of the microlens array according to the original image of the light field, and generating the calibration result of the microlens array and the center point grid of the microlens array specifically includes:
  • the physical parameters include the physical spacing of the microlenses in the microlens array and the physical spacing of pixels in the original image of the light field;
  • The optimal attitude parameters are the calibration result of the microlens array.
  • the image projection points of the physical centers of all the microlenses in the microlens array constitute a grid of center points of the microlens images of the microlens array.
  • the extracting line features of the original light field image by using a template matching method specifically includes:
  • the optimal line feature template is converted into the line feature of the original image of the light field.
  • the calibrating the internal and external parameters of the projection model of the light field camera using the line feature as calibration data specifically includes:
  • the internal and external parameters that minimize the value of the cost function are the calibration values of the internal and external parameters.
  • a light field camera calibration system that does not require a white image, the system includes:
  • the light field original image acquisition module is used to acquire the light field original image of the electronic checkerboard taken by the light field camera;
  • the light field camera includes a lens, a micro lens array and an image sensor;
  • the microlens array calibration module is used to calibrate the microlens array according to the original image of the light field, and generate the calibration result of the microlens array and the center point grid of the microlens array;
  • a line feature extraction module for extracting line features of the original image of the light field by using a template matching method
  • the internal and external parameter calibration module is used to calibrate the internal and external parameters of the projection model of the light field camera using the line feature as calibration data.
  • a physical parameter acquisition unit for acquiring physical parameters of the microlens array; the physical parameters include the physical spacing of the microlenses in the microlens array and the physical spacing of pixels in the original image of the light field;
  • a microlens physical center determining unit configured to determine the physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
  • a physical center image projection point determination unit configured to determine an image projection point of the physical center of each microlens in the microlens array according to the original image of the light field
  • a posture parameter acquisition unit for acquiring the posture parameters and the range of the posture parameters of the microlens array
  • a mapping relationship establishment unit configured to determine the mapping relationship between the physical center of each microlens in the microlens array, the image projection point of the physical center of the microlens, and the posture parameters of the microlens array;
  • An objective function establishing unit configured to establish an objective function according to the mapping relationship
  • An objective function optimization unit configured to optimize the attitude parameter within the scope of the attitude parameter, so that the objective function reaches a global minimum
  • the microlens array calibration unit is used to determine that the posture parameter when the objective function reaches the global minimum is the optimal posture parameter; the optimal posture parameter is the check result of the microlens array;
  • the center point grid determination unit is used to bring the optimal attitude parameter into the mapping relationship to obtain the image projection point of the physical center of each microlens in the microlens array; all of the microlens arrays
  • the image projection points of the physical center of the microlens constitute a grid of center points of the microlens image of the microlens array.
  • the line feature extraction module specifically includes:
  • the line feature template obtaining unit is used to obtain the preset line feature template and template parameter range;
  • a normalized cross-correlation value calculation unit configured to calculate a normalized cross-correlation value between the center coordinates of the microlens in the microlens image and the center pixel of the line feature template;
  • a line feature template optimization unit configured to optimize the template parameters of the line feature template within the range of the template parameters to maximize the normalized cross-correlation value
  • An optimal line feature template determining unit configured to determine the line feature template that maximizes the normalized cross-correlation value as the optimal line feature template of the microlens image
  • the line feature conversion unit is used to convert the optimal line feature template into the line feature of the original image of the light field.
  • the internal and external parameter calibration module specifically includes:
  • a light field camera projection model acquisition unit configured to acquire a light field camera projection model of the light field camera
  • a cost function establishing unit configured to establish a cost function according to the line feature and the light field camera projection model
  • a cost function optimization unit for adjusting the internal and external parameters of the projection model of the light field camera to minimize the value of the cost function
  • the internal and external parameter calibration unit is used to determine that the internal and external parameters that minimize the value of the cost function are the calibration values of the internal and external parameters.
  • the present invention discloses the following technical effects:
  • The invention discloses a method and system for calibrating a light field camera without a white image.
  • The method first acquires an original light field image of an electronic checkerboard captured by a light field camera, then calibrates the microlens array from the original light field image, generating the calibration result of the microlens array and the center point grid of the microlens array; a template matching method is used to extract line features from the original light field image, and the line features are used as calibration data to calibrate the internal and external parameters of the projection model of the light field camera.
  • The method does not rely on a white image; processing only the original checkerboard light field yields the microlens center point grid, the array attitude, and the internal and external parameters of the camera projection model, so the light field camera is calibrated with high accuracy and the method has a wide range of application.
  • FIG. 1 is a schematic diagram of the influence of changes of different focusing parameters on the coordinates of the projection point of the light field camera in the prior art
  • Fig. 2 is a flow chart of a method for checking and calibrating a light field camera without a white image provided by the present invention
  • Fig. 3 is a schematic diagram of the technical route of the light field camera calibration method without white image provided by the present invention
  • FIG. 4 is a schematic diagram of the technical process of the microlens array calibration provided by the present invention.
  • FIG. 5 is a schematic diagram of the attitude parameters of the microlens array provided by the present invention: the rotation angle θ_1 of the microlens array, the tilt parameters σ_1, σ_2 in the direction perpendicular to the optical axis, and the offsets T_x, T_y;
  • FIG. 6 is a schematic diagram of the mapping relationship between the physical center of the microlens and the image projection points of the physical center of the microlens provided by the present invention
  • FIG. 7 is a schematic diagram of the posture parameter optimization process provided by the present invention.
  • Fig. 8 is a schematic diagram of the line feature provided by the present invention.
  • FIG. 9 is a schematic diagram of the expression of line feature templates of different parameter combinations provided by the present invention.
  • FIG. 10 is a schematic diagram of the normalized cross-correlation matching process provided by the present invention.
  • FIG. 11 is a schematic diagram of the process of establishing a projection model of a light field camera provided by the present invention.
  • Fig. 12 is a block diagram of a light field camera calibration system provided by the present invention that does not require a white image.
  • the purpose of the present invention is to provide a light field camera calibration method and system that does not require a white image, so as to solve the problem that the existing methods for calibrating non-focused light field cameras generally rely on white images and the camera calibration accuracy is low.
  • FIG. 2 is a flow chart of a method for checking and calibrating a light field camera without a white image provided by the present invention.
  • FIG. 3 is a schematic diagram of the technical route of the light field camera calibration method without white image provided by the present invention.
  • a method for checking and calibrating a light field camera without a white image provided by the present invention specifically includes:
  • Step 1 Obtain the original light field image of the electronic checkerboard taken by the light field camera.
  • the light field camera is a camera composed of a lens, a micro lens array and an image sensor, and can capture a four-dimensional light field.
  • the microlens array is a two-dimensional array composed of a plurality of microlens units.
  • the invention uses a light field camera to photograph an electronic checkerboard to obtain the original light field data (light field original image), and uses a screen measurement software to obtain the physical size of the checkerboard.
  • Step 2 Perform calibration of the microlens array according to the original image of the light field, and generate a calibration result of the microlens array and a grid of center points of the microlens array.
  • FIG. 4 is a schematic diagram of the technical process of the microlens array calibration provided by the present invention. Specifically, as shown in Figure 4, step 2 specifically includes:
  • Step 201 Obtain physical parameters of the microlens array.
  • The physical parameters include the physical spacing of the microlenses in the microlens array and the physical spacing of the pixels in the original light field image; the physical parameters of the microlens array are used to determine the physical center of each microlens in the array.
  • The physical center C_ij of each microlens in the microlens array is computed from these parameters, where i is the column index, j is the row index, x_cij and y_cij are the abscissa and ordinate of C_ij, d is the physical spacing of the microlenses in the microlens array, and l is the physical spacing of the pixels in the original light field image.
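  • The exact expression for C_ij is published as an image in the original document; the following minimal sketch only illustrates the kind of computation it describes, assuming a hexagonally packed array in which every other row is offset by half a pitch (a common layout that is an assumption here, not stated in the text), with d and l as defined above and all numeric values hypothetical.

```python
import numpy as np

def ideal_center_grid(rows, cols, d, l):
    """Ideal microlens centers C_ij in pixel units.

    d: physical microlens pitch, l: physical pixel pitch (same units).
    Assumes hexagonal packing with odd rows shifted by half a pitch and
    row spacing sqrt(3)/2 * pitch -- an assumption, not the patent formula.
    """
    pitch_px = d / l                           # microlens pitch in pixels
    centers = np.zeros((rows, cols, 2))
    for j in range(rows):                      # j: row index
        for i in range(cols):                  # i: column index
            x = i * pitch_px + (0.5 * pitch_px if j % 2 else 0.0)
            y = j * pitch_px * np.sqrt(3.0) / 2.0
            centers[j, i] = (x, y)
    return centers

# hypothetical values, e.g. 14 um lens pitch on 1.4 um pixels
ideal = ideal_center_grid(rows=8, cols=8, d=14e-6, l=1.4e-6)
```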
  • Step 202 Determine the image projection point of the physical center of each microlens in the microlens array according to the original image of the light field.
  • the invention converts the original image of the light field into the frequency domain through Fourier transform, and calculates the coordinates of the projection point of the actual physical center of the microlens on the image plane.
  • According to the geometry of the hexagon, the corner coordinates (p0, p1, p2, p3, p4, p5) can be expressed in terms of the radius of the circumscribed circle:
  • p0 to p5 are the coordinates of the intersections of the circumscribed circle with the hexagon, i.e. the six corner coordinates of the hexagonal microlens; R is the radius of the circumscribed circle.
  • The original light field data is converted to the frequency domain through a Fourier transform, and the coordinates of six peaks are located near the corner coordinates of the microlens hexagon, that is, the six darkest pixel positions around each microlens image are found.
  • A local mapping P is defined as the sum of the distances between a point in the microlens image and the six darkest pixels around it.
  • The point that minimizes the local mapping P, i.e. the point whose summed distance to the six surrounding darkest pixels is smallest, is, by geometry, the center of the hexagon, and its coordinates are the image projection point of the physical center of the microlens.
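  • A minimal sketch of the local mapping P and the center search it implies is given below; the search window and step size are illustrative choices, and the six corner positions would in practice come from the frequency-domain peak search described above.

```python
import numpy as np

def local_mapping_P(point, corners):
    """P(point): sum of Euclidean distances from a candidate point to the
    six darkest pixels (approximate hexagon corners) of one microlens."""
    return float(np.sum(np.linalg.norm(corners - point, axis=1)))

def microlens_center(corners, radius=2.0, step=0.25):
    """Return the point minimizing P over a small window around the corner
    centroid; by the geometric argument above, this minimizer is the image
    projection of the microlens's physical center."""
    c0 = corners.mean(axis=0)
    best, best_p = c0, local_mapping_P(c0, corners)
    for dx in np.arange(-radius, radius + step, step):
        for dy in np.arange(-radius, radius + step, step):
            cand = c0 + np.array([dx, dy])
            p = local_mapping_P(cand, corners)
            if p < best_p:
                best, best_p = cand, p
    return best

# six darkest-pixel positions around one microlens image (hypothetical)
corners = np.array([[10., 5.], [15., 8.], [15., 14.],
                    [10., 17.], [5., 14.], [5., 8.]])
center = microlens_center(corners)   # close to the hexagon center (10, 11)
```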
  • Step 203 Obtain the attitude parameter and the attitude parameter range of the microlens array.
  • FIG. 5 is a schematic diagram of the attitude parameters of the micro lens array provided by the present invention.
  • The attitude parameters set by the present invention include the rotation angle θ_1 of the microlens array, the tilt parameters σ_1 and σ_2 in the direction perpendicular to the optical axis, and the offsets T_x and T_y.
  • Taking the ideal center of the microlens array as the origin, a first spatial rectangular coordinate system is established with its z-axis parallel to the optical axis; taking the actual center of the microlens array as the origin, a second spatial rectangular coordinate system is established, also with its z-axis parallel to the optical axis. The ideal center of the microlens array is offset from the actual center by T_x along the x-axis and by T_y along the y-axis.
  • The angle between the xoy plane of the first coordinate system and the y-axis of the xoy plane of the second coordinate system is θ_1; the angle between the xoz plane of the first coordinate system and the x-axis of the xoz plane of the second coordinate system is σ_1; and the angle between the yoz plane of the first coordinate system and the y-axis of the yoz plane of the second coordinate system is σ_2.
  • Because the ideal microlens image center differs only slightly from the actual one, the attitude parameter ranges are set as follows: the offsets T_x and T_y do not exceed one microlens, and the tilt parameters σ_1, σ_2 perpendicular to the optical axis as well as the rotation angle θ_1 are within ±0.1 degrees.
  • Step 204 Determine the mapping relationship between the physical center of each microlens in the microlens array, the image projection point of the physical center of the microlens, and the posture parameters of the microlens array.
  • Based on the projection process inside the light field camera, the present invention derives the mapping relationship among the physical center of the microlens, the image projection point of the physical center of the microlens, and the attitude parameters of the microlens array.
  • FIG. 6 is a schematic diagram of the mapping relationship between the physical center of the microlens and the image projection point of the physical center of the microlens provided by the present invention.
  • Since the microlens is approximated by a pinhole model, the center of the main lens, the physical center of the microlens, and the image projection point of the physical center of the microlens lie on a straight line.
  • In FIG. 6, (x_c, y_c) is the image projection point of the physical center of the microlens, and (x_c', y_c') is the actual physical center of the microlens.
  • Step 205 Establish an objective function according to the mapping relationship.
  • To quantify how closely the computed center point grid approximates the ideal center point grid, the present invention defines an objective function F that sums, over the grid, the distance between each center point and the ideal center point C_ij:
  • In the objective function, s, σ_1, σ_2, ε, T_x, T_y are the attitude parameters of the microlens array from step 203; T is the calculation model defined in step 204 that yields the actual center point grid coordinates from the attitude parameters; P is the local mapping defined in step 202; M is the number of microlenses in each row of the microlens array; and N is the number of microlenses in each column of the microlens array.
  • Step 206 Optimize the pose parameters within the range of the pose parameters, so that the objective function reaches the global minimum.
  • Fig. 7 is a schematic diagram of the posture parameter optimization process provided by the present invention.
  • As shown in FIG. 7, combinations of the attitude parameters within the ranges set in step 203 are substituted into the function F of step 205 and evaluated.
  • When F reaches its global minimum, that is, when the local mapping P of all microlens images is minimized, the center point grid at that moment is the calibrated microlens grid, and the corresponding attitude parameters are the calibration result of the microlens array.
  • Step 207 Determine the attitude parameters at which the objective function reaches the global minimum to be the optimal attitude parameters; the optimal attitude parameters are the calibration result of the microlens array.
  • Step 208 Substitute the optimal attitude parameters into the mapping relationship to obtain the image projection point of the physical center of each microlens in the microlens array.
  • the image projection points of the physical centers of all the microlenses in the microlens array constitute a grid of center points of the microlens images of the microlens array.
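  • A sketch of the exhaustive search over the attitude parameters is shown below. The mapping T and the objective F are published as images in the original document, so the simple scale-plus-small-rotation-plus-offset model used for T here (with sinθ approximated by ε and cosθ by 1, as in the text, and the tilt terms σ_1, σ_2 omitted) is an assumption, and the sampling densities are likewise illustrative.

```python
import itertools
import numpy as np

def mapping_T(C, s, eps, tx, ty):
    """Simplified stand-in for the mapping T: scale s, small rotation eps
    (sin(theta) ~ eps, cos(theta) ~ 1) and an offset (tx, ty) applied to
    the ideal centers C of shape (n, 2)."""
    R = np.array([[1.0, -eps],
                  [eps,  1.0]])
    return s * (C @ R.T) + np.array([tx, ty])

def objective_F(params, ideal_centers, local_P):
    """F = sum over all microlenses of the local mapping P evaluated at the
    mapped center T(C_ij); local_P(k, point) must return P for lens k."""
    s, eps, tx, ty = params
    mapped = mapping_T(ideal_centers.reshape(-1, 2), s, eps, tx, ty)
    return sum(local_P(k, c) for k, c in enumerate(mapped))

def search_attitude(ideal_centers, local_P, pitch_px):
    """Exhaustive search within the ranges given in the text: offsets within
    one microlens, rotation within +/-0.1 degrees."""
    best, best_f = None, np.inf
    for s, eps, tx, ty in itertools.product(
            [1.0],
            np.deg2rad(np.linspace(-0.1, 0.1, 5)),
            np.linspace(-pitch_px, pitch_px, 5),
            np.linspace(-pitch_px, pitch_px, 5)):
        f = objective_F((s, eps, tx, ty), ideal_centers, local_P)
        if f < best_f:
            best, best_f = (s, eps, tx, ty), f
    return best, best_f

# usage sketch: `measured[k]` holds the darkest-pixel-based estimate for lens k
# local_P = lambda k, c: np.linalg.norm(c - measured[k])
# (s, eps, tx, ty), F_min = search_attitude(ideal.reshape(-1, 2), local_P, 10.0)
```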
  • Step 3 Use a template matching method to extract line features of the original image of the light field.
  • Next, the center point grid of the microlens array calibrated in step 2 without a white image is used to calibrate the parameters of the projection model of the light field camera.
  • the step 3 specifically includes:
  • Step 301 Obtain a preset line feature template and template parameter range.
  • Fig. 8 is a schematic diagram of the line feature provided by the present invention.
  • FIG. 9 is a schematic diagram of the expression of the line feature templates of different parameter combinations provided by the present invention.
  • As shown in FIG. 8 and FIG. 9, a straight line is represented by the equation x·sinθ_2 + y·cosθ_2 + t = 0, where θ_2 is the angle between the line and the horizontal axis and t is the shortest distance from the line to the origin. The template parameters of the line feature template are θ_2 and t, and the template parameter ranges are set as -90° ≤ θ_2 ≤ 90° and -r ≤ t ≤ r, where r is the microlens radius. Taking the center of a square with side length 2r as the origin, straight lines with different parameter combinations are drawn to obtain the preset line feature templates, as shown in FIG. 9.
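  • A sketch of how such a template bank can be generated follows; the one-pixel line width, the sampling steps for θ_2 and t, and the function names are illustrative choices, not values from the text.

```python
import numpy as np

def line_template(theta_deg, t, r):
    """(2r+1) x (2r+1) binary template, origin at the center, whose bright
    pixels lie within half a pixel of the line x*sin(theta)+y*cos(theta)+t = 0."""
    theta = np.deg2rad(theta_deg)
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    dist = np.abs(xs * np.sin(theta) + ys * np.cos(theta) + t)
    return (dist <= 0.5).astype(float)

def template_bank(r, theta_step=5.0, t_step=1.0):
    """All (theta_2, t) combinations over the ranges given in the text:
    -90 deg <= theta_2 <= 90 deg and -r <= t <= r."""
    thetas = np.arange(-90.0, 90.0 + theta_step, theta_step)
    ts = np.arange(-r, r + t_step, t_step)
    return [((th, t), line_template(th, t, r)) for th in thetas for t in ts]

bank = template_bank(r=5)   # r = microlens radius in pixels (hypothetical)
```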
  • Step 302 Calculate the normalized cross-correlation value between the center coordinate of the microlens and the center pixel of the line feature template in the microlens image.
  • The center point grid of the microlens images was obtained in step 208; the line feature templates generated in step 301 are matched against the microlens images using the normalized cross-correlation (NCC) method to fit the line features of the original light field image.
  • Normalized cross-correlation is a measure of the similarity or linear relationship between two images. It is a matching method based on image gray information. The specific formula is:
  • In the formula, I is the target image, T is the template image, and M×N is the size of the template.
  • Step 303 Optimize the template parameters of the line feature template within the range of the template parameters to maximize the normalized cross-correlation value.
  • FIG. 10 is a schematic diagram of the normalized cross-correlation matching process provided by the present invention.
  • In FIG. 10, (x_c, y_c) denotes the center coordinates of the microlens image in the camera coordinate system, (x_t, y_t) is the center pixel of the template (x_t = y_t = r), and (x_r, y_r) is the fractional part left after rounding (x_c, y_c). The center pixel of the template and the center point of the microlens image are used as the reference points for normalized cross-correlation matching.
  • the template parameters of the line feature template are optimized within the range of the template parameters to maximize the normalized cross-correlation value.
  • The template with the largest correlation (NCC) value is selected as the optimal line feature template of the microlens image, and its line feature is converted into the form x·sinθ_2 + y·cosθ_2 + t + x_r·sinθ_2 + y_r·cosθ_2, giving the line feature of the original light field image.
  • Step 304 Determine the line feature template that maximizes the normalized cross-correlation value to be the optimal line feature template of the microlens image, and convert the optimal line feature template into the line feature of the original light field image.
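  • A sketch of steps 302–304 is given below, assuming a grey-scale raw light field image as a NumPy array and the template_bank() sketched above; the patch extraction and the handling of the fractional center (x_r, y_r) follow the description, but the function names are hypothetical.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equal-size grey-level
    arrays (the standard NCC measure given in the formula above)."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_line_for_lens(raw, center, r, bank):
    """Match every line template against the (2r+1)x(2r+1) patch centered
    on the rounded microlens center and return the winning (theta_2, t)
    together with the fractional remainder (x_r, y_r) of the center, which
    shifts the fitted line back into raw-image coordinates as
    x*sin(theta_2) + y*cos(theta_2) + t + x_r*sin(theta_2) + y_r*cos(theta_2)."""
    xc, yc = center
    xi, yi = int(round(xc)), int(round(yc))
    x_r, y_r = xc - xi, yc - yi            # fractional part of the center
    patch = raw[yi - r:yi + r + 1, xi - r:xi + r + 1].astype(float)
    (theta, t), _ = max(bank, key=lambda item: ncc(patch, item[1]))
    return theta, t, x_r, y_r
```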
  • Step 4 Use the line feature as calibration data to calibrate the internal and external parameters of the projection model of the light field camera.
  • the step 4 specifically includes:
  • Step 401 Obtain a light field camera projection model of the light field camera.
  • FIG. 11 is a schematic diagram of the process of establishing the projection model of the light field camera provided by the present invention. As shown in FIG. 11, since the main lens of the light field camera is described by a thin lens model, the microlens is described by a pinhole model, and light travels in straight lines in space, the process by which a point (X, Y, Z) is imaged on the light field camera sensor can be described according to FIG. 11, establishing the initial projection model of the light field camera:
  • In the projection model, (u, v) are the coordinates of a point on the imaging plane; (u_c, v_c) are the coordinates of the microlens center point on the imaging plane; f is the focal length of the main lens; (X, Y, Z) are the object point coordinates, which after passing through the main lens form the image point coordinates (X', Y', Z'); L_m is the distance from the lens to the lens array; and L_c is the distance from the lens to the sensor.
  • R is a 3×3 rotation matrix; R_11–R_13, R_21–R_23 and R_31–R_33 are the elements of R, with R_13, R_23 and R_33 all equal to 0; t is a 3×1 translation matrix, and t_1, t_2, t_3 are the elements of t.
  • The world coordinate system, also known as the measurement coordinate system, is a three-dimensional rectangular coordinate system relative to which the spatial positions of the camera and the object to be measured can be described; its position can be chosen freely according to the actual situation.
  • The camera coordinate system is a three-dimensional rectangular coordinate system whose origin is at the optical center of the lens; its x- and y-axes are parallel to the two sides of the image plane, and its z-axis is the optical axis of the lens, perpendicular to the image plane.
  • The line features obtained by template matching are substituted into formula (6) and combined with formula (7) to derive the camera parameter calculation formulas described by line features, yielding: the focal length f, the rotation matrix R, the translation matrix t, the first radial distortion coefficient k_1, the second radial distortion coefficient k_2, the distance from the microlens array to the main lens, and the distance from the CCD sensor to the main lens.
  • Step 402 Establish a cost function according to the line feature and the light field camera projection model.
  • Specifically, formula (7) is used to transform the adjacent checkerboard corners (u_1, v_1) and (u_2, v_2) into the camera coordinate system; they are substituted into the radial distortion model to compute the distorted coordinates, and the light field camera projection model of formula (6) is used to convert the distorted corner coordinates into the image coordinate system.
  • X_C, Y_C, Z_C are the coordinates of the microlens center point in the camera coordinate system; f_x is the component of the focal length f on the x-axis, and f_y is its component on the y-axis.
  • A cost function g is defined as the sum of squared distances between the line feature in the world coordinate system and the line feature obtained by template matching, where k′ is the slope of the line feature and a, b, c are the parameters of the line feature template obtained by matching.
  • Step 403 Adjust the internal and external parameters of the projection model of the light field camera to minimize the value of the cost function; determine that the internal and external parameters that minimize the value of the cost function are the calibration values of the internal and external parameters.
  • Specifically, the value of the cost function g is adjusted according to the camera parameter calculation formulas described by the line features and the distorted corner coordinates in the image coordinate system.
  • Minimizing the value of the cost function g yields the calibration values of the internal and external parameters of the camera, including the focal length f, the principal point coordinates (C_x, C_y), the first radial distortion coefficient k_1, the second radial distortion coefficient k_2, the rotation matrix R and the translation matrix t, thereby completing the calibration of the projection model of the light field camera.
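  • Because formulas (6)–(8) are published as images, the final least-squares step can only be sketched in generic form. The sketch below minimizes point-to-line distances of projected checkerboard corners against the template-matched lines a·x + b·y + c = 0, with a user-supplied project() callable standing in for the projection model and distortion equations; this is an assumed simplification of the cost g, not the patent's exact formulation.

```python
import numpy as np
from scipy.optimize import least_squares

def line_residuals(params, corner_pairs_world, matched_lines, project):
    """For each microlens, project a pair of neighbouring checkerboard
    corners through the camera model and measure their signed distance to
    the template-matched line a*x + b*y + c = 0.  project(P, params)
    stands in for formulas (6)-(7) plus the distortion model."""
    res = []
    for (P1, P2), (a, b, c) in zip(corner_pairs_world, matched_lines):
        n = np.hypot(a, b)
        for P in (P1, P2):
            u, v = project(P, params)
            res.append((a * u + b * v + c) / n)
    return np.asarray(res)

def calibrate(params0, corner_pairs_world, matched_lines, project):
    """Adjust the packed intrinsic/extrinsic parameter vector (some chosen
    packing of f, R, t, k1, k2, ...) so that the summed squared residuals,
    the analogue of the cost g, are minimized."""
    sol = least_squares(line_residuals, np.asarray(params0, float),
                        args=(corner_pairs_world, matched_lines, project))
    return sol.x
```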
  • Based on the light field camera calibration method without a white image provided by the present invention, the present invention also provides a light field camera calibration system that does not require a white image. As shown in FIG. 12, the system specifically includes:
  • the light field original image acquisition module 501 is used to acquire the light field original image of the electronic checkerboard taken by the light field camera;
  • the light field camera includes a lens, a microlens array, and an image sensor;
  • the microlens array calibration module 502 is configured to calibrate the microlens array according to the original image of the light field, and generate a calibration result of the microlens array and a grid of center points of the microlens array;
  • the microlens array calibration module 502 specifically includes:
  • a physical parameter acquisition unit for acquiring physical parameters of the microlens array; the physical parameters include the physical spacing of the microlenses in the microlens array and the physical spacing of pixels in the original image of the light field;
  • a microlens physical center determining unit configured to determine the physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
  • a physical center image projection point determination unit configured to determine an image projection point of the physical center of each microlens in the microlens array according to the original image of the light field
  • a posture parameter acquisition unit for acquiring the posture parameters and the range of the posture parameters of the microlens array
  • a mapping relationship establishment unit configured to determine the mapping relationship between the physical center of each microlens in the microlens array, the image projection point of the physical center of the microlens, and the posture parameters of the microlens array;
  • An objective function establishing unit configured to establish an objective function according to the mapping relationship
  • An objective function optimization unit configured to optimize the attitude parameter within the scope of the attitude parameter, so that the objective function reaches a global minimum
  • the microlens array calibration unit is used to determine that the posture parameter when the objective function reaches the global minimum is the optimal posture parameter; the optimal posture parameter is the check result of the microlens array;
  • the center point grid determination unit is used to bring the optimal attitude parameter into the mapping relationship to obtain the image projection point of the physical center of each microlens in the microlens array; all of the microlens arrays
  • the image projection points of the physical center of the microlens constitute a grid of center points of the microlens image of the microlens array;
  • the line feature extraction module 503 is used to extract the line features of the original image of the light field by using a template matching method
  • the line feature extraction module 503 specifically includes:
  • the line feature template obtaining unit is used to obtain the preset line feature template and template parameter range;
  • a normalized cross-correlation value calculation unit configured to calculate a normalized cross-correlation value between the center coordinates of the microlens in the microlens image and the center pixel of the line feature template;
  • a line feature template optimization unit configured to optimize the template parameters of the line feature template within the range of the template parameters to maximize the normalized cross-correlation value
  • An optimal line feature template determining unit configured to determine the line feature template that maximizes the normalized cross-correlation value as the optimal line feature template of the microlens image
  • a line feature conversion unit configured to convert the optimal line feature template into the line feature of the original image of the light field
  • the internal and external parameter calibration module 504 is used to calibrate the internal and external parameters of the projection model of the light field camera using the line feature as calibration data;
  • the internal and external parameter calibration module 504 specifically includes:
  • a light field camera projection model acquisition unit configured to acquire a light field camera projection model of the light field camera
  • a cost function establishing unit configured to establish a cost function according to the line feature and the light field camera projection model
  • a cost function optimization unit for adjusting the internal and external parameters of the projection model of the light field camera to minimize the value of the cost function
  • the internal and external parameter calibration unit is used to determine that the internal and external parameters that minimize the value of the cost function are the calibration values of the internal and external parameters.
  • In summary, the present invention discloses a method and system for calibrating a light field camera without a white image. The method first acquires an original light field image of an electronic checkerboard captured by a light field camera, then calibrates the microlens array from the original light field image, generating the calibration result of the microlens array and the center point grid of the microlens array; a template matching method is used to extract line features from the original light field image, and the line features are used as calibration data to calibrate the internal and external parameters of the projection model of the light field camera.
  • The method of the present invention is a light field camera calibration method that does not require a white image: it does not rely on white images, and from the original checkerboard light field alone it obtains the microlens center point grid, the array attitude, and the calibration values of the internal and external parameters of the camera projection model, thereby calibrating both the microlens array and the camera projection model. Moreover, because the method only needs the raw checkerboard light field data, it is suitable for calibrating the first-generation Lytro, the Lytro Illum, self-made light field cameras and the like, and thus has a wider range of application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

A method and system for calibrating a light field camera without a white image. An original light field image of an electronic checkerboard captured by the light field camera is first acquired (1); the microlens array is calibrated from the original light field image, generating the calibration result of the microlens array and the center point grid of the microlens array (2); line features of the original light field image are extracted by a template matching method (3); and the line features are used as calibration data to calibrate the internal and external parameters of the projection model of the light field camera (4). The method does not rely on a white image; processing only the original checkerboard light field yields the microlens center point grid, the array attitude, and the internal and external parameters of the camera projection model, so the light field camera is calibrated with high accuracy and the method has a wide range of application.

Description

一种无需白图像的光场相机检校方法及***
本申请要求于2019年12月23日提交中国专利局、申请号为201911338530.6、发明名称为“一种无需白图像的光场相机检校方法及***”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本发明涉及图像测量以及计算机视觉技术领域,特别是涉及一种无需白图像的光场相机检校方法及***。
背景技术
传统相机检校通过主距、主点、旋转矩阵与平移矩阵等参数描述物点到像点的转换过程。而光场相机通过微透镜与传感器形成的双平面模型记录光线,所以光场相机的检校除了要获得传统检校参数,还要获得微透镜中心点格网,微透镜阵列姿态,微透镜与传感器间距等。标定微透镜中心格网,就是求光线在双平面模型中与其中一个平面的交点,是各种应用与计算的基础。
国内外现有检校非聚焦性光场相机的方法,先使用基于极值的方法从白图像中获取微透镜中心点格网;再建立投影模型,将从子孔径图像或者全聚焦图像中识别的棋盘格角点坐标与棋盘格角点物理坐标带入投影模型,求得投影模型参数,完成光场相机的检校。
现有方法使用子孔径图像进行检校,首先要对光场原始数据进行旋转、重采样、排布方式修正等预处理,进而得到子孔径图像。然后从子孔径图像中选取角点特征作为像点,实际得到的相机检校参数描述的是预处理后的相机,所以使用子孔径图像进行检校牺牲了一定精度。
如图1所示,现有方法采用薄透镜模型描述主透镜,针孔模型描述微透镜,不同拍摄参数的变化(例如光圈、变焦、对焦等参数),尤其是光场相机不同对焦时,对焦参数变化引起的镜头到传感器平面的距离不同,使微透镜中同一投影点在传感器上的绝对坐标以及微透镜中心的投影点相对于CCD(电荷藕合器件)阵列中心的位置发生变化,所以使用现有的检校方法标定光场相机,在得到标定中心点格网所需要的白图像后,需要保持拍摄参数的固定,再获取检校光场相机所需的其他数据。最终的相 机检校结果是该拍摄参数下的相机参数。这样如果获取数据过程中,拍摄参数发生变化,则需重新拍摄白图像和所需的光场数据。将光场数据导入电脑后,还要注意将与之对应的白图像进行储存。在使用Lytro与Raytrix光场相机时,厂商提供的软件会近似匹配内置的白图像。若数据的拍摄参数与任何一张内置的白图像参数不吻合,则采用拍摄参数最相近的白图像作为数据的中心点格网数据源。这种近似匹配内置白图像的方式虽然便捷,但不能保证中心点格网的标定精度。
可见,现有检校非聚焦性光场相机的方法,普遍依赖白图像,且相机检校精度低。
发明内容
本发明的目的是提供一种无需白图像的光场相机检校方法及***,以解决现有检校非聚焦性光场相机的方法普遍依赖白图像,且相机检校精度低的问题。
为实现上述目的,本发明提供了如下方案:
一种无需白图像的光场相机检校方法,所述方法包括:
获取光场相机拍摄的电子棋盘格的光场原始图像;所述光场相机包括镜头、微透镜阵列和图像传感器;
根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;
采用模板匹配方法提取所述光场原始图像的线特征;
将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。
可选的,所述根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网,具体包括:
获取所述微透镜阵列的物理参数;所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;
根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心;
根据所述光场原始图像确定所述微透镜阵列中每个微透镜的物理中心的图像投影点;
获取所述微透镜阵列的姿态参数及姿态参数范围;
确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系;
根据所述映射关系建立目的函数;
在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值;
确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果;
将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点;
所述微透镜阵列中所有微透镜的物理中心的图像投影点构成所述微透镜阵列的微透镜图像的中心点格网。
可选的,所述采用模板匹配方法提取所述光场原始图像的线特征,具体包括:
获取预设的线特征模板及模板参数范围;
计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值;
在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大;
确定令所述归一化互相关值最大的所述线特征模板为所述微透镜图像的最优线特征模板;
将所述最优线特征模板转换为所述光场原始图像的线特征。
可选的,所述将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数,具体包括:
获取所述光场相机的光场相机投影模型;
根据所述线特征和所述光场相机投影模型建立代价函数;
调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;
确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
一种无需白图像的光场相机检校***,所述***包括:
光场原始图像获取模块,用于获取光场相机拍摄的电子棋盘格的光场原始图像;所述光场相机包括镜头、微透镜阵列和图像传感器;
微透镜阵列检校模块,用于根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;
线特征提取模块,用于采用模板匹配方法提取所述光场原始图像的线特征;
内外参数标定模块,用于将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。
可选的,所述微透镜阵列检校模块具体包括:
物理参数获取单元,用于获取所述微透镜阵列的物理参数;所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;
微透镜物理中心确定单元,用于根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心;
物理中心图像投影点确定单元,用于根据所述光场原始图像确定所述微透镜阵列中每个微透镜的物理中心的图像投影点;
姿态参数获取单元,用于获取所述微透镜阵列的姿态参数及姿态参数范围;
映射关系建立单元,用于确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系;
目的函数建立单元,用于根据所述映射关系建立目的函数;
目的函数优化单元,用于在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值;
微透镜阵列检校单元,用于确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果;
中心点格网确定单元,用于将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点;所述微透镜阵列中所有微透镜的物理中心的图像投影点构成所述微透镜阵列的微透镜图像的中心点格网。
可选的,所述线特征提取模块具体包括:
线特征模板获取单元,用于获取预设的线特征模板及模板参数范围;
归一化互相关值计算单元,用于计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值;
线特征模板优化单元,用于在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大;
最优线特征模板确定单元,用于确定令所述归一化互相关值最大的所述线特征模板为所述微透镜图像的最优线特征模板;
线特征转换单元,用于将所述最优线特征模板转换为所述光场原始图像的线特征。
可选的,所述内外参数标定模块具体包括:
光场相机投影模型获取单元,用于获取所述光场相机的光场相机投影模型;
代价函数建立单元,用于根据所述线特征和所述光场相机投影模型建立代价函数;
代价函数优化单元,用于调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;
内外参数标定单元,用于确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
根据本发明提供的具体实施例,本发明公开了以下技术效果:
本发明公开了一种无需白图像的光场相机检校方法及***,所述方法 首先获取光场相机拍摄的电子棋盘格的光场原始图像,然后根据所述光场原始图像进行微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;采用模板匹配方法提取所述光场原始图像的线特征并将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。本发明方法不依赖白图像,只需对棋盘格原始光场进行处理即可获得微透镜中心点格网、阵列姿态以及相机投影模型的内外参数,具有光场相机检校精度高、适应范围广的特点。
说明书附图
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为现有技术中不同对焦参数的变化对光场相机投影点坐标的影响示意图;
图2为本发明提供的无需白图像的光场相机检校方法流程图;
图3为本发明提供的无需白图像的光场相机检校方法的技术路线示意图;
图4为本发明提供的微透镜阵列检校的技术流程示意图;
图5为本发明提供的微透镜阵列的姿态参数:微透镜阵列的旋转角度θ 1,垂直光轴方向的倾斜参数σ 1、σ 2,以及偏移T x,T y示意图;
图6为本发明提供的微透镜的物理中心以及微透镜的物理中心的图像投影点之间的映射关系示意图;
图7为本发明提供的姿态参数优化过程示意图;
图8为本发明提供的线特征的示意图;
图9为本发明提供的不同参数组合的线特征模板的表达示意图;
图10为本发明提供的归一化互相关匹配过程示意图;
图11为本发明提供的光场相机投影模型建立过程示意图;
图12为本发明提供的无需白图像的光场相机检校***框图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进 行清楚、完整地描述,显然,所描述的实施例仅仅是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
本发明的目的是提供一种无需白图像的光场相机检校方法及***,以解决现有检校非聚焦性光场相机的方法普遍依赖白图像,且相机检校精度低的问题。
为使本发明的上述目的、特征和优点能够更加明显易懂,下面结合附图和具体实施方式对本发明作进一步详细的说明。
图2为本发明提供的无需白图像的光场相机检校方法流程图。图3为本发明提供的无需白图像的光场相机检校方法的技术路线示意图。如图2和图3所示,本发明提供的一种无需白图像的光场相机检校方法,具体包括:
步骤1:获取光场相机拍摄的电子棋盘格的光场原始图像。
所述光场相机是由镜头、微透镜阵列和图像传感器组成的相机,可以捕获四维光场。所述微透镜阵列是由多个微透镜单元所组成的二维阵列。
本发明使用光场相机拍摄电子棋盘格,得到光场原始数据(光场原始图像),并使用屏幕测量软件获取棋盘格物理尺寸。
步骤2:根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网。
获取棋盘格原始光场后首先进行微透镜阵列检校,本发明使用一种无需白图像的微透镜阵列中心点格网标定方法。图4为本发明提供的微透镜阵列检校的技术流程示意图。具体的,如图4所示,步骤2具体包括:
步骤201:获取所述微透镜阵列的物理参数。
所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心。所述微透镜阵列中每个微透镜的物理中心C ij为:
Figure PCTCN2020136062-appb-000001
其中i表示列数,j表示行数,
Figure PCTCN2020136062-appb-000002
表示所述微透镜阵列中第j行第i列的微透镜的物理中心坐标,x cij、y cij分别为C ij的横纵坐标,d为微透镜阵列中微透镜的物理间距,l为光场原始图像中像素的物理间距。
步骤202:根据所述光场原始图像确定所述微透镜阵列中每个微透镜的物理中心的图像投影点。
本发明将光场原始图像通过傅里叶变换转换到频域,计算微透镜实际物理中心在像平面上的投影点坐标。
根据六边形角点的几何关系,六边形角点坐标(p0、p1、p2、p3、p4、p5)可以用外切圆半径表示:
Figure PCTCN2020136062-appb-000003
Figure PCTCN2020136062-appb-000004
Figure PCTCN2020136062-appb-000005
Figure PCTCN2020136062-appb-000006
Figure PCTCN2020136062-appb-000007
Figure PCTCN2020136062-appb-000008
其中,p0~p5为所述外切圆与所述六边形的交点坐标,即所述六边形微透镜的六个角点坐标;R为所述外切圆半径。
将光场原始数据通过傅里叶变换转换到频域,在微透镜六边形角点坐标附近分别找到六个峰值所在的坐标,即找到每个微透镜图像周边六个最 暗的像素位置。
定义一个局部映射P,为微透镜图像中的某点与周围六个最暗像素的距离和。当某点使局部映射P值最小时,即该点与周围六个最暗像素的距离和最小,根据几何原理,该点为六边形的中心,其坐标即为微透镜的物理中心的图像投影点。
步骤203:获取所述微透镜阵列的姿态参数及姿态参数范围。
图5为本发明提供的微透镜阵列的姿态参数示意图。参见图5,本发明设定的姿态参数包括:微透镜阵列的旋转角度θ 1,垂直光轴方向的倾斜参数σ 1、σ 2,以及偏移T x,T y
具体的,以微透镜列阵的理想中心为原点,建立第一空间直角坐标系,其中z轴平行于光轴方向;以微透镜列阵的实际中心为原点,建立第二空间直角坐标系,其中z轴平行于光轴方向;所述微透镜列阵的理想中心相较于所述微透镜列阵的实际中心在x轴的方向偏移为T x,所述微透镜列阵的理想中心相较于所述微透镜列阵的实际中心在y轴的方向偏移为T y。所述第一空间直角坐标系在xoy面与第二空间直角坐标系的xoy面的y轴之间的夹角为θ 1;所述第一空间直角坐标系在xoz面与第二空间直角坐标系的xoz面的x轴之间的夹角为σ 1;所述第一空间直角坐标系在yoz面与第二空间直角坐标系的yoz面的y轴之间的夹角为σ 2
考虑到理想微透镜图像中心与实际微透镜图像中心之间只有很小的差异,因此设定所述姿态参数范围为:偏移T x和T y不超过一个微透镜的范围,垂直光轴方向的倾斜参数(包括所述第一空间直角坐标系在xoz面与第二空间直角坐标系的xoz面的x轴之间的夹角为σ 1,和所述第一空间直角坐标系在yoz面与第二空间直角坐标系的yoz面的y轴之间的夹角σ 2)以及旋转角度θ 1在±0.1度内。
步骤204:确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系。
本发明根据光场相机内部的投影过程,推导微透镜的物理中心、微透镜物理中心的图像投影点、微透镜阵列姿态参数,三者间的映射关系。图6为本发明提供的微透镜的物理中心以及微透镜的物理中心的图像投影 点之间的映射关系示意图。参见图6,由于微透镜近似为针孔模型,主透镜中心、微透镜物理中心、微透镜物理中心的图像投影点在一条直线上。图6中(x c,y c)为微透镜物理中心的图像投影点,(x c′,y c′)为透镜的实际物理中心。
由三角形相似可以推出:
Figure PCTCN2020136062-appb-000009
其中将式(3)中
Figure PCTCN2020136062-appb-000010
简化为s,由于微透镜阵列的安装工艺,可以将sinθ简化为ε,cosθ简化为1,从而得到所述微透镜物理中心、所述微透镜物理中心的图像投影点、所述微透镜阵列的姿态参数,三者间的映射关系T为:
Figure PCTCN2020136062-appb-000011
步骤205:根据所述映射关系建立目的函数。
为了计算得到中心点格网与理想中心点格网的近似程度,本发明定义一个目的函数F,用于计算格网中每个中心点与理想中心点Cij的距离和:
Figure PCTCN2020136062-appb-000012
式(5)中,s,σ 12,ε,T x,T y是步骤203中微透镜阵列的姿态参数,T是步骤204中定义的可以通过姿态参数得到对应的实际中心点格网坐标的计算模型。P是步骤202中定义的局部映射;M为微透镜阵列中每行包含的微透镜个数;N为微透镜阵列中每列包含的微透镜的个数。
步骤206:在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值。
图7为本发明提供的姿态参数优化过程示意图。如图7所示,将步骤203中设定的姿态参数范围内的各姿态参数进行优化组合,分别代入步骤205中的函数F中进行计算。当F达到全局最小值,即使所有微透镜图像的局部映射P都达到最小,此时的中心点格网即为标定的微透镜格网结果,对应的姿态参数即为检校微透镜阵列的结果。
步骤207:确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果。
步骤208:将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点。所述微透镜阵列中所有微透镜的物理中心的图像投影点即构成所述微透镜阵列的微透镜图像的中心点格网。
步骤3:采用模板匹配方法提取所述光场原始图像的线特征。
接下来,使用步骤2无需白图像方法标定出的微透镜阵列的中心点格网,进行光场相机投影模型参数的标定。
所述步骤3具体包括:
步骤301:获取预设的线特征模板及模板参数范围。
图8为本发明提供的线特征的示意图。图9为本发明提供的不同参数组合的线特征模板的表达示意图。如图8、图9所示,使用方程xsinθ 2+ycosθ 2+t=0表示直线,其中参数θ 2表示直线与横轴的夹角,参数t表示直线到原点的最短距离。
所述线特征模板的模板参数包括θ 2和t,设定模板参数范围为:-90°≤θ≤90°,-r≤t≤r,其中r为微透镜半径。以正方形中心为原点,在边长为2r的正方形中,画出不同参数组合的直线,得到预设的线特征模板,如图9所示。
步骤302:计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值。
步骤208中得到了所述微透镜图像的中心点格网,使用归一化互相关(NCC)方法将步骤301中生成的所述线特征模板与微透镜图像匹配,拟合光场原始图像中的线特征。归一化互相关是两个图像之间的相似性或线性关系的一种量度,是基于图像灰度信息的匹配方法,具体公式为:
Figure PCTCN2020136062-appb-000013
I为目标图像,T为模板图像,M*N为模板的大小。
步骤303:在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大。
图10为本发明提供的归一化互相关匹配过程示意图。图10中(x c,y c)表示相机坐标系中微透镜图像的中心坐标,(x t,y t)是模板的中心像素(x t=y t=r)。(x r,y r)是(x c,y c)取整后的小数部分结果,以模板的中心像素与微透镜图像的中心点坐标为参考点进行归一化互相关方法的匹配。
在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大。选取相关值(NCC值)最大的模板作为该微透镜图像的最优线特征模板,同时将最优线特征模板的线特征转换成xsinθ 2+ycosθ 2+t+x rsinθ 2+y rcosθ 2的形式,得到所述光场原始图像的线特征。
步骤304:确定令所述归一化互相关值最大的所述线特征模板为所述微透镜图像的最优线特征模板;将所述最优线特征模板转换为所述光场原始图像的线特征。
步骤4:将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。
所述步骤4具体包括:
步骤401:获取所述光场相机的光场相机投影模型。
图11为本发明提供的光场相机投影模型建立过程示意图,如图11所示,由于光场相机主透镜用薄透镜模型描述,微透镜用针孔模型描述,同时光在空间中沿直线传播,像点(X,Y,Z)在光场相机传感器上成像的过程可根据图11描述说明,从而建立初始光场相机的投影模型:
Figure PCTCN2020136062-appb-000014
其中
Figure PCTCN2020136062-appb-000015
(u,v)为成像平面上的点坐标,(u c,v c)为成像平面上的微透镜中心点坐标,f为主透镜的焦距,(X,Y,Z)为物点坐标,物点坐标(X,Y,Z)经过主透镜后形成像点坐标为(X',Y',Z'),L m为透镜到透镜阵列的距离,L c为透镜到传感器的距离。
获取世界坐标系与相机坐标系间的转换式:
Figure PCTCN2020136062-appb-000016
其中,R为3*3的旋转矩阵,R 11-R 13、R 21-R 23、R 31-R 33为旋转矩阵R内的元素,R 13、R 23和R 33的取值均为0,t为3*1的平移矩阵,t 1、t 2、t 3为平移矩阵t内的元素。世界界坐标系:也称为测量坐标系,是一个三维直角坐标系,以其为基准可以描述相机和待测物体的空间位置。世界坐标系的位置可以根据实际情况自由确定。相机坐标系是三维直角坐标系,原点位于镜头光心处,x、y轴分别与相面的两边平行,z轴为镜头光轴,与像平面垂直。
将模板匹配的线特征带入公式(6),并与公式(7)联立,推导使用线性特征描述的相机参数计算式,得到:焦距f,旋转矩阵R,平移矩阵t,第一径向畸变系数k 1,第二径向畸变系数k 2,微透镜阵列到主透镜的距离
Figure PCTCN2020136062-appb-000017
CCD传感器到主透镜的距离
Figure PCTCN2020136062-appb-000018
的线性特征描述的相机参数计算式。
步骤402:根据所述线特征和所述光场相机投影模型建立代价函数。
具体的,使用公式(7)把棋盘格邻近角点(u 1,v 1)和(u 2,v 2)转换到相机坐标系内,代入径向畸变模型
Figure PCTCN2020136062-appb-000019
并计算畸变后的坐标,并根据公式(6)建立光场相机投影模型
Figure PCTCN2020136062-appb-000020
将畸变后的角点坐标转换到图像坐标系。其中,X C、Y C、Z C为相机坐标系下的微透镜中心点的坐标,f x为焦距f在x轴上的分量,f y为焦距f在y轴上的分量。
定义一个代价函数g:
Figure PCTCN2020136062-appb-000021
所述代价函数g是世界坐标系中的线特征与由模板匹配得到的线特征之间的距离平方和,其中k′为线特征的斜率,a、b、c为模板匹配而得的线特征模板的参数。
步骤403:调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
具体的,根据线性特征描述的相机参数计算式以及图像坐标系中畸变后的角点坐标调节代价函数g的值。令所述代价函数g的值最小,得到相机内外参数的标定值,包括焦距f、像主点坐标(C x,C y)、第一径向畸变系数k 1、第二径向畸变系数k 2、旋转矩阵R以及平移矩阵的值t。从而完成了光场相机投影模型的标定。
基于本发明提供的一种无需白图像的光场相机检校方法,本发明还提供一种无需白图像的光场相机检校***。如图12所示,一种无需白图像的光场相机检校***,具体包括:
光场原始图像获取模块501,用于获取光场相机拍摄的电子棋盘格的光场原始图像;所述光场相机包括镜头、微透镜阵列和图像传感器;
微透镜阵列检校模块502,用于根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;
所述微透镜阵列检校模块502具体包括:
物理参数获取单元,用于获取所述微透镜阵列的物理参数;所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;
微透镜物理中心确定单元,用于根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心;
物理中心图像投影点确定单元,用于根据所述光场原始图像确定所述 微透镜阵列中每个微透镜的物理中心的图像投影点;
姿态参数获取单元,用于获取所述微透镜阵列的姿态参数及姿态参数范围;
映射关系建立单元,用于确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系;
目的函数建立单元,用于根据所述映射关系建立目的函数;
目的函数优化单元,用于在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值;
微透镜阵列检校单元,用于确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果;
中心点格网确定单元,用于将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点;所述微透镜阵列中所有微透镜的物理中心的图像投影点构成所述微透镜阵列的微透镜图像的中心点格网;
线特征提取模块503,用于采用模板匹配方法提取所述光场原始图像的线特征;
所述线特征提取模块503具体包括:
线特征模板获取单元,用于获取预设的线特征模板及模板参数范围;
归一化互相关值计算单元,用于计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值;
线特征模板优化单元,用于在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大;
最优线特征模板确定单元,用于确定令所述归一化互相关值最大的所述线特征模板为所述微透镜图像的最优线特征模板;
线特征转换单元,用于将所述最优线特征模板转换为所述光场原始图像的线特征;
内外参数标定模块504,用于将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数;
所述内外参数标定模块504具体包括:
光场相机投影模型获取单元,用于获取所述光场相机的光场相机投影模型;
代价函数建立单元,用于根据所述线特征和所述光场相机投影模型建立代价函数;
代价函数优化单元,用于调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;
内外参数标定单元,用于确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
本发明公开了一种无需白图像的光场相机检校方法及***,所述方法首先获取光场相机拍摄的电子棋盘格的光场原始图像,然后根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;采用模板匹配方法提取所述光场原始图像的线特征并将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。本发明方法为无需白图像的光场相机检校方法,不依赖白图像,只需棋盘格原始光场,即可获得微透镜中心点格网,阵列姿态与相机投影模型内外参数的标定值,实现微透镜阵列和相机投影模型的检校。并且,本发明方法只需棋盘格光场的原始数据,因此适用于Lytro一代,Lytro Illum,以及自制的光场相机等的校验,适用范围更广。
本说明书中各个实施例采用递进的方式描述,每个实施例重点说明的都是与其他实施例的不同之处,各个实施例之间相同相似部分互相参见即可。
本文中应用了具体个例对本发明的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本发明的方法及其核心思想;同时,对于本领域的一般技术人员,依据本发明的思想,在具体实施方式及应用范围上均会有改变之处。综上所述,本说明书内容不应理解为对本发明的限制。

Claims (8)

  1. 一种无需白图像的光场相机检校方法,其特征在于,所述方法包括:
    获取光场相机拍摄的电子棋盘格的光场原始图像;所述光场相机包括镜头、微透镜阵列和图像传感器;
    根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;
    采用模板匹配方法提取所述光场原始图像的线特征;
    将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。
  2. 根据权利要求1所述的光场相机检校方法,其特征在于,所述根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网,具体包括:
    获取所述微透镜阵列的物理参数;所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;
    根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心;
    根据所述光场原始图像确定所述微透镜阵列中每个微透镜的物理中心的图像投影点;
    获取所述微透镜阵列的姿态参数及姿态参数范围;
    确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系;
    根据所述映射关系建立目的函数;
    在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值;
    确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果;
    将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点;
    所述微透镜阵列中所有微透镜的物理中心的图像投影点构成所述微透镜阵列的微透镜图像的中心点格网。
  3. 根据权利要求2所述的光场相机检校方法,其特征在于,所述采用模板匹配方法提取所述光场原始图像的线特征,具体包括:
    获取预设的线特征模板及模板参数范围;
    计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值;
    在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大;
    确定令所述归一化互相关值最大的所述线特征模板为所述微透镜图像的最优线特征模板;
    将所述最优线特征模板转换为所述光场原始图像的线特征。
  4. 根据权利要求3所述的光场相机检校方法,其特征在于,所述将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数,具体包括:
    获取所述光场相机的光场相机投影模型;
    根据所述线特征和所述光场相机投影模型建立代价函数;
    调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;
    确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
  5. 一种无需白图像的光场相机检校***,其特征在于,所述***包括:
    光场原始图像获取模块,用于获取光场相机拍摄的电子棋盘格的光场原始图像;所述光场相机包括镜头、微透镜阵列和图像传感器;
    微透镜阵列检校模块,用于根据所述光场原始图像进行所述微透镜阵列的检校,生成所述微透镜阵列的检校结果以及所述微透镜阵列的中心点格网;
    线特征提取模块,用于采用模板匹配方法提取所述光场原始图像的线特征;
    内外参数标定模块,用于将所述线特征作为检校数据标定所述光场相机的投影模型的内外参数。
  6. 根据权利要求5所述的光场相机检校***,其特征在于,所述微透镜阵列检校模块具体包括:
    物理参数获取单元,用于获取所述微透镜阵列的物理参数;所述物理参数包括所述微透镜阵列中微透镜的物理间距以及所述光场原始图像中像素的物理间距;
    微透镜物理中心确定单元,用于根据所述微透镜阵列的物理参数确定所述微透镜阵列中每个微透镜的物理中心;
    物理中心图像投影点确定单元,用于根据所述光场原始图像确定所述微透镜阵列中每个微透镜的物理中心的图像投影点;
    姿态参数获取单元,用于获取所述微透镜阵列的姿态参数及姿态参数范围;
    映射关系建立单元,用于确定所述微透镜阵列中每个微透镜的物理中心、微透镜的物理中心的图像投影点以及所述微透镜阵列的姿态参数三者之间的映射关系;
    目的函数建立单元,用于根据所述映射关系建立目的函数;
    目的函数优化单元,用于在所述姿态参数范围内优化所述姿态参数,使所述目的函数达到全局最小值;
    微透镜阵列检校单元,用于确定使所述目的函数达到全局最小值时的姿态参数为最优姿态参数;所述最优姿态参数为所述微透镜阵列的检校结果;
    中心点格网确定单元,用于将所述最优姿态参数带入所述映射关系中,得到所述微透镜阵列中每个微透镜的物理中心的图像投影点;所述微透镜阵列中所有微透镜的物理中心的图像投影点构成所述微透镜阵列的微透镜图像的中心点格网。
  7. 根据权利要求6所述的光场相机检校***,其特征在于,所述线特征提取模块具体包括:
    线特征模板获取单元,用于获取预设的线特征模板及模板参数范围;
    归一化互相关值计算单元,用于计算所述微透镜图像中所述微透镜的中心坐标与所述线特征模板的中心像素的归一化互相关值;
    线特征模板优化单元,用于在所述模板参数范围内优化所述线特征模板的模板参数,令所述归一化互相关值最大;
    最优线特征模板确定单元,用于确定令所述归一化互相关值最大的所 述线特征模板为所述微透镜图像的最优线特征模板;
    线特征转换单元,用于将所述最优线特征模板转换为所述光场原始图像的线特征。
  8. 根据权利要求7所述的光场相机检校***,其特征在于,所述内外参数标定模块具体包括:
    光场相机投影模型获取单元,用于获取所述光场相机的光场相机投影模型;
    代价函数建立单元,用于根据所述线特征和所述光场相机投影模型建立代价函数;
    代价函数优化单元,用于调节所述光场相机投影模型的内外参数,令所述代价函数的值最小;
    内外参数标定单元,用于确定令所述代价函数的值最小的内外参数为所述内外参数的标定值。
PCT/CN2020/136062 2019-12-23 2020-12-14 一种无需白图像的光场相机检校方法及*** WO2021129437A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020413529A AU2020413529B2 (en) 2019-12-23 2020-12-14 Method and system for calibrating light field camera without white images

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911338530.6 2019-12-23
CN201911338530.6A CN111340888B (zh) 2019-12-23 2019-12-23 一种无需白图像的光场相机检校方法及***

Publications (1)

Publication Number Publication Date
WO2021129437A1 true WO2021129437A1 (zh) 2021-07-01

Family

ID=71186737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/136062 WO2021129437A1 (zh) 2019-12-23 2020-12-14 一种无需白图像的光场相机检校方法及***

Country Status (3)

Country Link
CN (1) CN111340888B (zh)
AU (1) AU2020413529B2 (zh)
WO (1) WO2021129437A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923445A (zh) * 2021-10-13 2022-01-11 中国航发湖南动力机械研究所 用于移轴成像条件下的光场相机校准方法及***
CN114066991A (zh) * 2021-10-11 2022-02-18 北京师范大学 一种基于空间平面单应不动点约束的光场相机标定方法

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111340888B (zh) * 2019-12-23 2020-10-23 首都师范大学 一种无需白图像的光场相机检校方法及***
CN114636385B (zh) * 2020-12-15 2023-04-28 奕目(上海)科技有限公司 基于光场相机的三维成像方法和***及三维成像测量产线
CN114666573A (zh) * 2022-03-23 2022-06-24 北京拙河科技有限公司 一种光场相机校准方法及***

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104537663A (zh) * 2014-12-26 2015-04-22 广东中科遥感技术有限公司 一种图像抖动的快速校正方法
EP3023826A1 (en) * 2014-11-20 2016-05-25 Thomson Licensing Light field imaging device
CN108093237A (zh) * 2017-12-05 2018-05-29 西北工业大学 高空间分辨率光场采集装置与图像生成方法
CN110060303A (zh) * 2019-03-18 2019-07-26 英特科利(江苏)医用内窥影像技术有限公司 一种光场相机的两步标定方法
CN111340888A (zh) * 2019-12-23 2020-06-26 首都师范大学 一种无需白图像的光场相机检校方法及***

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB704415A (en) * 1950-11-10 1954-02-24 Edgar Gretener Finished lenticulated film and process for producing the photographic recording thereon
US5680171A (en) * 1993-10-21 1997-10-21 Lo; Allen Kwok Wah Method and apparatus for producing composite images and 3D pictures
CN102157004A (zh) * 2011-04-18 2011-08-17 东华大学 用于超视场零件高精度影像测量仪的自动图像拼接方法
CN102930242B (zh) * 2012-09-12 2015-07-08 上海交通大学 一种公交车车型识别方法
CN104089628B (zh) * 2014-06-30 2017-02-08 中国科学院光电研究院 光场相机的自适应几何定标方法
CN105488810B (zh) * 2016-01-20 2018-06-29 东南大学 一种聚焦光场相机内外参数标定方法
CN106296661B (zh) * 2016-07-29 2019-06-28 深圳市未来媒体技术研究院 一种适用于光场相机的标定预处理方法
CN107230232B (zh) * 2017-04-27 2020-06-30 东南大学 聚焦型光场相机的f数匹配方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3023826A1 (en) * 2014-11-20 2016-05-25 Thomson Licensing Light field imaging device
CN104537663A (zh) * 2014-12-26 2015-04-22 广东中科遥感技术有限公司 一种图像抖动的快速校正方法
CN108093237A (zh) * 2017-12-05 2018-05-29 西北工业大学 高空间分辨率光场采集装置与图像生成方法
CN110060303A (zh) * 2019-03-18 2019-07-26 英特科利(江苏)医用内窥影像技术有限公司 一种光场相机的两步标定方法
CN111340888A (zh) * 2019-12-23 2020-06-26 首都师范大学 一种无需白图像的光场相机检校方法及***

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114066991A (zh) * 2021-10-11 2022-02-18 北京师范大学 一种基于空间平面单应不动点约束的光场相机标定方法
CN113923445A (zh) * 2021-10-13 2022-01-11 中国航发湖南动力机械研究所 用于移轴成像条件下的光场相机校准方法及***
CN113923445B (zh) * 2021-10-13 2023-09-26 中国航发湖南动力机械研究所 用于移轴成像条件下的光场相机校准方法及***

Also Published As

Publication number Publication date
CN111340888A (zh) 2020-06-26
AU2020413529B2 (en) 2023-04-06
AU2020413529A1 (en) 2021-08-26
CN111340888B (zh) 2020-10-23

Similar Documents

Publication Publication Date Title
WO2021129437A1 (zh) 一种无需白图像的光场相机检校方法及***
US11272161B2 (en) System and methods for calibration of an array camera
CN109859272B (zh) 一种自动对焦双目摄像头标定方法及装置
CN105488810B (zh) 一种聚焦光场相机内外参数标定方法
TWI761684B (zh) 影像裝置的校正方法及其相關影像裝置和運算裝置
CN110874854B (zh) 一种基于小基线条件下的相机双目摄影测量方法
CN114636385B (zh) 基于光场相机的三维成像方法和***及三维成像测量产线
US20210364288A1 (en) Optical measurement and calibration method for pose based on three linear array charge coupled devices (ccd) assisted by two area array ccds
CN112489137A (zh) 一种rgbd相机标定方法及***
CN111080705A (zh) 一种自动对焦双目摄像头标定方法及装置
JP2012198031A (ja) 画像補正方法及び画像補正装置
TW201719579A (zh) 影像擷取裝置及其產生深度資訊的方法與自動校正的方法
CN112489141B (zh) 车载摄像头单板单图带中继镜的产线标定方法及装置
CN113963065A (zh) 一种基于外参已知的镜头内参标定方法及装置、电子设备
CN112258581A (zh) 一种多鱼眼镜头全景相机的现场标定方法
CN111292380B (zh) 图像处理方法及装置
Ueno et al. Compound-Eye Camera Module as Small as 8.5$\times $8.5$\times $6.0 mm for 26 k-Resolution Depth Map and 2-Mpix 2D Imaging
Zhang et al. Improved camera calibration method and accuracy analysis for binocular vision
CN113345024B (zh) 判断相机模块的组装品质的方法
CN111754587A (zh) 一种基于单焦距聚焦拍摄图像的变焦镜头快速标定方法
CN112197701B (zh) 应用于大幅面工件的三维数据提取方法
Kumar et al. Non-frontal camera calibration using focal stack imagery
CN117745844A (zh) 一种基于三维背景导向纹影的多相机标定方法及***
CN114373019A (zh) 一种利用最优化方法对无公共视场相机进行标定的方法
CN115205392A (zh) 光场相机的空间位置标定方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20907854

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020413529

Country of ref document: AU

Date of ref document: 20201214

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20907854

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09.02.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20907854

Country of ref document: EP

Kind code of ref document: A1