WO2020232971A1 - Fisheye camera calibration system, method, device, electronic device and storage medium


Info

Publication number
WO2020232971A1 (PCT/CN2019/113441)
Authority
WO
WIPO (PCT)
Application number
PCT/CN2019/113441
Other languages
English (en)
French (fr)
Inventor
苏显渝
艾佳
Original Assignee
四川深瑞视科技有限公司
Priority claimed from CN201920745752.9U
Priority claimed from CN201910431510.7A
Application filed by 四川深瑞视科技有限公司
Priority to EP19929895.1A (EP3944194B1)
Publication of WO2020232971A1
Priority to US17/505,658 (US11380016B2)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • G06T3/047Fisheye or wide-angle transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix

Definitions

  • This application relates to the technical field of camera calibration, and more specifically, to a fisheye camera calibration system, method, device, electronic equipment and storage medium.
  • Camera calibration is one of the key technologies in machine vision, photogrammetry, 3D imaging, and image geometric correction. Its main function is to estimate the internal and external parameters of the camera. The accuracy of the calibration result and the stability of the calibration algorithm directly affect the accuracy of subsequent work. In a general calibration method, multiple images need to be collected, so the calibration board or the camera must be moved manually; in practical applications this is time-consuming and labor-intensive, and it also increases production costs.
  • This application proposes a fisheye camera calibration system, method, device, electronic equipment and storage medium to improve the above problems.
  • an embodiment of the present application provides a fish-eye camera calibration system, which includes a polyhedral target, a fish-eye camera, and an electronic device.
  • the polyhedral target includes an inner surface and a plurality of marking points arranged on the inner surface, the inner surface is composed of a plurality of hexagonal planes and pentagonal planes;
  • the fisheye camera is used to photograph the polyhedral target and collect a target image, the target image including the polyhedral target and images of a plurality of marking points arranged on the inner surface of the polyhedral target;
  • the electronic device is used to fit the selected radial distortion model to the equidistant projection model according to the target image, obtaining initial values of the distortion parameters k1, k2, k3, k4 and k5; to calculate the radius r_max from these initial values and the formula r_max = k1·θ_max + k2·θ_max^3 + k3·θ_max^5 + k4·θ_max^7 + k5·θ_max^9, where θ_max represents the maximum field-of-view angle of the fisheye camera; and to perform ellipse fitting on the polyhedral target in the target image, obtaining initial values of u0, v0, m_u and m_v from the fitted ellipse and the value of r_max, where (u0, v0) is the principal point and m_u and m_v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively;
  • the electronic device is also used to obtain from the polyhedral target the translation matrix T_j and rotation matrix R_j of the pentagonal or hexagonal plane on which each marking point is located, and to use the Levenberg-Marquardt algorithm to optimize the initial values of k1, k2, k3, k4, k5, u0, v0, m_u, m_v, T_j and R_j to determine the imaging model parameters of the fisheye camera.
  • an embodiment of the present application provides a fisheye camera calibration method.
  • an embodiment of the present application provides a fisheye camera calibration device.
  • the device includes: an image acquisition module for acquiring a target image, the target image including a polyhedral target and images of a plurality of marking points arranged on the inner surface of the polyhedral target; and a camera calibration module for fitting the selected radial distortion model to the equidistant projection model according to the target image, to obtain initial values of the distortion parameters k1, k2, k3, k4 and k5.
  • an embodiment of the present application provides an electronic device, including: one or more processors; a memory; and one or more programs.
  • the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to execute the aforementioned method.
  • embodiments of the present application provide a computer-readable storage medium, and the computer-readable storage medium stores program code, and the program code can be invoked by a processor to execute the above-mentioned method.
  • Fig. 1 shows a schematic diagram of a fisheye camera calibration system provided by an embodiment of the present application.
  • Fig. 2 shows a schematic diagram of a polyhedral target provided by an embodiment of the present application.
  • Fig. 3 shows a schematic view of the splicing of the inner surface of the polyhedral target provided by an embodiment of the present application.
  • Fig. 4 shows a plan view of the inner surface of the polyhedral target provided by an embodiment of the present application.
  • Fig. 5 shows a target provided by an embodiment of the present application in a first perspective.
  • FIG. 6 shows a schematic diagram of bonding a sub-marking layer provided by an embodiment of the present application to the pentagonal plane or the hexagonal plane.
  • Fig. 7 shows a schematic diagram of the arrangement of a light source provided by an embodiment of the present application.
  • Fig. 8 shows a schematic diagram of another polyhedral target provided by an embodiment of the present application.
  • Fig. 9 shows a flowchart of a fisheye camera calibration method provided by an embodiment of the present application.
  • FIG. 10 shows a schematic diagram of the relationship between the sphere center of the first virtual sphere and the pentagonal plane or the hexagonal plane provided by an embodiment of the present application.
  • Fig. 11 shows a structural block diagram of a fisheye camera calibration device provided by an embodiment of the present application.
  • Fig. 12 shows a structural block diagram of an electronic device proposed by an embodiment of the present application for executing a fisheye camera calibration method according to an embodiment of the present application.
  • Fig. 13 shows a storage medium provided by an embodiment of the present application for storing or carrying program code for implementing the fisheye camera calibration method according to the embodiment of the present application.
  • Camera calibration is one of the key technologies in machine vision, photogrammetry, 3D imaging, and image geometric correction. Its main function is to estimate the internal and external parameters of the camera. The accuracy of the calibration result and the stability of the calibration algorithm directly affect the accuracy of subsequent work.
  • the general perspective camera can be represented by a pinhole model, and can be calibrated by using perspective projection mapping and affine transformation.
  • fisheye cameras have been widely used in panoramic vision, video surveillance, car navigation, virtual reality and other fields due to their large field of view.
  • the large field of view also brings serious image distortion, which affects the intuitive visual perception of the human eye and the utilization of image information.
  • the fisheye camera needs to be calibrated.
  • the mature method that has been studied uses a planar target, and there is tool software based on planar targets, such as the Matlab calibration toolbox and the OpenCV calibration tools.
  • the plane calibration board is placed in different positions in front of the camera to collect multiple target images to obtain the calibration raw data with a larger distribution range.
  • This method requires multiple placement of the calibration board at different positions and acquisition of target images, or multiple rotation of the camera in different positions and acquisition of target images.
  • the inventor proposes the fisheye camera calibration system, method, device, electronic equipment and storage medium of the present application.
  • the target image includes an image of a polyhedral target and a plurality of marking points arranged on the inner surface of the polyhedral target;
  • the selected radial distortion model is fitted to the equidistant projection model according to the target image to obtain initial values of the distortion parameters k1, k2, k3, k4 and k5; the radius r_max is calculated from these initial values and the formula r_max = k1·θ_max + k2·θ_max^3 + k3·θ_max^5 + k4·θ_max^7 + k5·θ_max^9, where θ_max represents the maximum field-of-view angle of the fisheye camera; ellipse fitting is then performed on the polyhedral target in the target image, and initial values of u0, v0, m_u and m_v are obtained from the fitted ellipse and the value of r_max.
  • an embodiment of the present application provides a calibration system for a fisheye camera based on a polyhedral target.
  • the system may include a polyhedral target 100, a fisheye camera 200, and an electronic device 300, where the fisheye camera 200 and the electronic device 300 may be one device or two separate devices.
  • the fisheye camera 200 is a camera with a fisheye lens.
  • the polyhedral target 100 includes an inner surface and a plurality of marking points arranged on the inner surface.
  • the inner surface is formed by splicing a plurality of polygonal planes; the spliced polygonal planes form vertices, the vertices are located on a first virtual spherical surface, and the distances from the center of the first virtual spherical surface to the vertices are equal.
  • the fisheye camera 200 is set at the center of the sphere of the first virtual spherical surface and photographs the polyhedral target 100 to obtain a target image; the target image includes the polyhedral target and images of the multiple marking points arranged on its inner surface.
  • the system may further include a light source arranged on one side of the inner surface of the polyhedral target, so that the inner surface is sufficiently illuminated and the target image obtained by photographing the target 100 with the fisheye camera is clearer, thereby improving the accuracy of the fisheye camera calibration.
  • the polyhedral target 100 includes a housing, an inner surface, and a plurality of marking points arranged on the inner surface.
  • FIG. 2 shows a schematic diagram of a polyhedral target provided by an embodiment of the present application.
  • the polyhedral target 100 includes a housing, an inner surface 110 and a marking point 120.
  • the housing may be a polyhedron as shown in FIG. 2, a hemisphere, or a cuboid; it can be chosen as required and will not be repeated here.
  • the inner surface 110 is an approximately hemispherical inner surface formed by splicing a plurality of pentagonal and hexagonal planes. The pentagonal and hexagonal planes may be made of metal such as steel sheet, or of other materials that do not deform easily; the splicing may be a fixed connection such as welding, or a movable connection such as a hinge. The material and splicing method of the pentagonal and hexagonal planes can be chosen according to actual needs and are not limited here.
  • the inner surface 110 may be formed by splicing multiple pentagonal planes 111 and hexagonal planes 112. Please refer to FIG. 3, which shows a schematic view of the splicing of the inner surface of the polyhedral target.
  • the number of complete pentagonal planes used for splicing the inner surface 110 is 4, the number of complete hexagonal planes is 8, the number of partial pentagonal planes is 4, and the number of partial hexagonal planes is 4.
  • the vertex 113 of the inner surface 110 of the spliced polyhedron is located on the first virtual spherical surface, that is, the distance from the center of the first virtual spherical surface to each vertex 113 on the inner surface 110 is equal.
  • in the spliced inner surface 110, all pentagonal planes 111 are tangent to a second virtual sphere, that is, the line connecting the center of the second virtual sphere and the center of each pentagonal plane 111 is perpendicular to that pentagonal plane; likewise, all hexagonal planes 112 are tangent to a third virtual sphere, that is, the line connecting the center of the third virtual sphere and the center of each hexagonal plane 112 is perpendicular to that hexagonal plane.
  • the spherical center of the first virtual spherical surface, the spherical center of the second virtual spherical surface, and the spherical center of the third virtual spherical surface coincide.
  • the structure of the inner surface 110 of the polyhedral target is similar to that of a soccer ball, which can be understood as a soccer ball cut along a cross-section through the center of the ball.
  • the polyhedral target provided in this application may be cut along the cross-section through the center of the sphere that intersects the largest number of complete pentagonal planes 111 and hexagonal planes 112. It is understandable that it can also be cut along a different cross-section through the center of the sphere, in which case the partial pentagonal and hexagonal planes may not be exactly half of a complete pentagonal or hexagonal plane.
  • FIG. 4 shows a plan view of the inner surface of the polyhedral target provided by the embodiment of the present application. It can be seen from the expanded plan view that the part of the pentagonal plane 111 is half of the complete pentagonal plane, and the part of the hexagonal plane 112 is half of the complete hexagonal plane.
  • FIG. 5 shows the target provided by the embodiment of the present application in a first perspective.
  • the first viewing angle is defined as a viewing angle that passes through the center of the first virtual spherical surface and is perpendicular to the bottom surface of the inner surface 110.
  • the half hexagonal planes 112 numbered 1, 2, 6 and 7 are parallel to the direction of the first viewing angle, so their projections are line segments; the half pentagonal planes 111 numbered 3, 5, 8 and 10 are also parallel to the direction of the first viewing angle, and their projections are likewise line segments.
  • the marking points 120 may be arranged on a marking layer, which is arranged on the inner surface 110. The marking layer includes a plurality of sub-marking layers, and each sub-marking layer may be a circular mark image or a checkerboard mark image; each sub-marking layer includes a plurality of marking points 120, and the sizes of the marking points 120 may differ.
  • the marking points 120 are arranged on a plurality of pentagonal planes and hexagonal planes spliced into the inner surface 110. Please refer to FIG. 6, which shows a schematic diagram of the adhesion of the sub-marking layer where the marking points are located and the pentagonal plane or the hexagonal plane provided by the embodiment of the present application.
  • the sub-marking layers are attached to the pentagonal planes 111 or hexagonal planes 112, which may be one sub-marking layer attached to each pentagonal plane and one sub-marking layer attached to each hexagonal plane; the center of each sub-marking layer coincides with the center of its pentagonal plane 111 or hexagonal plane 112, and at least one side of the sub-marking layer is parallel to one side of the pentagonal plane 111 or hexagonal plane 112.
  • the marking layer is composed of a plurality of sub-marking layers
  • each sub-marking layer is composed of a plurality of marking points 120 of different sizes; the marking points 120 of different sizes in each sub-marking layer form a marking point combination mode, and different sub-marking layers have different marking point combination modes.
  • after the fisheye camera 200 captures the target 100 to obtain the target image, the correspondence between the marking points in the target image and the marking points 120 on the target can be determined according to the marking point combination mode of each sub-marking layer.
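The correspondence step above can be sketched in code: the sizes of the detected marking points form a signature that identifies the sub-marking layer. This is a minimal illustration; the size values, tolerance, and layer names in `LAYER_SIGNATURES` are hypothetical, since the patent does not specify the combination modes numerically.

```python
# Sketch: identify which sub-marking layer a detected cluster of marking
# points belongs to, using the sizes of the points as a signature.
# The size values and the layer table are illustrative, not from the patent.

def size_signature(point_sizes, tol=0.1):
    """Quantize and sort point sizes into a hashable signature."""
    return tuple(sorted(round(s / tol) for s in point_sizes))

# Hypothetical table: each sub-marking layer has a unique combination of
# marking-point sizes (its "marking point combination mode").
LAYER_SIGNATURES = {
    size_signature([1.0, 1.0, 2.0]): "pentagon_3",
    size_signature([1.0, 2.0, 2.0]): "hexagon_7",
}

def identify_layer(detected_sizes):
    """Map detected marking-point sizes to a sub-marking layer id (or None)."""
    return LAYER_SIGNATURES.get(size_signature(detected_sizes))

print(identify_layer([1.98, 1.02, 0.99]))  # -> pentagon_3
```

The quantization tolerance makes the lookup robust to small measurement errors in the detected point sizes.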
  • the polyhedral target may further include a light source 130.
  • the light source 130 may be arranged on one side of the inner surface of the polyhedral target; this arrangement makes the light on the inner surface 110 sufficient, so that the target image obtained by photographing the polyhedral target with the fisheye camera is clearer, thereby improving the accuracy of the fisheye camera calibration.
  • the marking point may be a plurality of holes on the inner surface 110.
  • FIG. 8 shows a schematic diagram of a target provided by an embodiment of the present application.
  • the sizes of the multiple holes on the inner surface can differ; the marking points 120 of different sizes on each pentagonal or hexagonal plane form a marking point combination mode, and the combination modes of different planes can be different.
  • the size and distribution of the holes can be set according to actual needs and is not limited here.
  • the target includes a light source 130 disposed inside the housing of the polyhedral target; the light emitted by the light source 130 through the holes forms the marking points 120 on the hemispherical inner surface 110. The fisheye camera can thus capture the target to obtain the target image and be calibrated.
  • an embodiment of the present application provides a fisheye camera calibration method, which is applied to the above fisheye camera calibration system.
  • the method may include:
  • Step S210 Obtain a target image, the target image including a polyhedral target and images of a plurality of marking points arranged on the inner surface of the polyhedral target.
  • the electronic device can obtain the target image taken by the fisheye camera. Specifically, the electronic device can receive the target image sent by the fisheye camera through a network or other means, or obtain it from a USB flash drive or memory card.
  • Target image includes an image of a polyhedral target and a plurality of marking points arranged on the inner surface of the polyhedral target.
  • Step S220 Fit the selected radial distortion model with the isometric projection model according to the target image to obtain initial values of distortion parameters k 1 , k 2 , k 3 , k 4 , and k 5 .
  • This application selects isometric projection as the fisheye camera model.
  • since the projection model of a fisheye camera allows camera distortion in order to project the largest possible scene onto a limited image plane, and the radial distortion of a fisheye camera is very severe, mainly the radial distortion is considered. If only the radial distortion of the fisheye camera is considered, the selected equidistant projection model serves as the fisheye camera model.
  • the specific radial distortion model of the fisheye camera can be expressed as:
  • r(θ) = k1·θ + k2·θ^3 + k3·θ^5 + k4·θ^7 + k5·θ^9
  • where r is the distance between a pixel on the image and the principal point of the camera system, θ is the angle between the incident ray and the optical axis of the system, and k1, k2, k3, k4 and k5 are the five distortion parameters in the radial distortion model of the fisheye camera; together with the four parameters u0, v0, m_u and m_v described above, they constitute the internal parameters of the fisheye camera.
  • the distortion parameters are not limited to k1, k2, k3, k4 and k5; k6, k7 and up to kn may also be used. In the embodiments of this application, k1 to k5 are used for explanation, and the 9 internal parameters k1, k2, k3, k4, k5, u0, v0, m_u and m_v are used to describe the imaging characteristics of the fisheye camera.
  • Step S240: Perform ellipse fitting on the polyhedral target in the target image, and obtain the initial values of u0, v0, m_u and m_v according to the fitted ellipse and the value of r_max, where (u0, v0) is the principal point, m_u is the number of pixels per unit distance in the horizontal direction of the image coordinates, and m_v is the number of pixels per unit distance in the vertical direction of the image coordinates.
  • the collected target image is an image of the inner surface of a polyhedral target
  • the inner surface is similar to a hemispherical inner surface
  • the boundary is similar to a circle in space
  • the image is similar to an ellipse in the image coordinate system, as shown in Figure 5.
  • the ellipse fitting equation is ((u − u0)/a)^2 + ((v − v0)/b)^2 = 1. The values of a and b can be obtained by measuring the ellipse fitted to the target image, and the value of the principal point (u0, v0) can be obtained from it. m_u and m_v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively.
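A minimal sketch of this step: given the fitted ellipse (center and semi-axes in pixels) and the metric radius r_max from the distortion model, the initial intrinsics follow. The relation m_u = a/r_max, m_v = b/r_max is an assumption consistent with m_u and m_v being pixels per unit distance (the semi-axis in pixels spans the radial distance r_max); the numbers are illustrative.

```python
# Sketch: recover initial intrinsics from the ellipse fitted to the target
# boundary. ellipse = (u0, v0, a, b): center and semi-axes in pixels.
# Assumption (not stated verbatim in the patent): m_u = a / r_max and
# m_v = b / r_max, i.e. the pixel semi-axes cover the metric radius r_max.

def initial_intrinsics(ellipse, r_max):
    """Return (u0, v0, m_u, m_v) from the fitted ellipse and r_max."""
    u0, v0, a, b = ellipse
    return u0, v0, a / r_max, b / r_max

u0, v0, m_u, m_v = initial_intrinsics((640.0, 512.0, 600.0, 590.0), r_max=1.5)
print(u0, v0, m_u, m_v)
```

A slightly unequal a and b, as here, yields different m_u and m_v, reflecting non-square pixels.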
  • Step S250 Obtain the translation matrix T j and the rotation matrix R j of the pentagonal plane and the hexagonal plane where each marking point is located according to the polyhedral target.
  • the pentagonal plane and the hexagonal plane where the marking point is located refer to the marking layer plane where the marking point is located (that is, the pentagonal plane or the hexagonal plane to which the marking layer is attached).
  • the translation matrix T j and the rotation matrix R j of the pentagonal plane and the hexagonal plane where the marking point is located may be obtained according to the splicing relationship of the polyhedral target.
  • the splicing method of the polyhedral target can refer to the corresponding content of the foregoing embodiment, and in order to avoid repetition, it will not be repeated here.
  • the line connecting the center of the first virtual sphere and the center of each pentagonal plane is perpendicular to that pentagonal plane, and the line connecting the center of the first virtual sphere and the center of each hexagonal plane is perpendicular to that hexagonal plane.
  • point O is the center of the first virtual sphere
  • the center point of the pentagonal plane 111 is point A
  • the line connecting the center of the first virtual sphere and the center of the pentagonal plane is the line segment OA
  • the line segment OA is perpendicular to the pentagonal plane 111
  • the center point of the hexagonal plane 112 is point B
  • the line connecting the center of the first virtual sphere and the center of the hexagonal plane is the line segment OB
  • the line segment OB is perpendicular to the hexagonal plane 112.
  • the angle between the pentagonal plane and the hexagonal plane and the angle between the hexagonal plane and the hexagonal plane can be calculated. Please refer to the planar expanded view of the inner surface of the polyhedral target shown in FIG. 4 and the target under the first viewing angle shown in FIG. 5.
  • the central angle corresponding to each line segment on a pentagonal or hexagonal plane can be expressed by a trigonometric function expression containing "R". For example, for the pentagonal and hexagonal planes numbered 1-10 in Figure 4, the central angle corresponding to each line segment on the bottom side can be expressed by such an expression, and it can be seen from Figure 5 that these bottom-side line segments of the planes numbered 1-10 form a closed decagon around the center of the first virtual sphere, so the sum of the central angles corresponding to the ten sides should be equal to 360°.
  • the value of R can be calculated by solving the inverse trigonometric function. From this, the value of the radius of the second virtual spherical surface, the value of the radius of the third virtual spherical surface, and the angle between each pentagonal plane and the hexagonal plane can be obtained.
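As a sketch of this closure condition, assume (for illustration only, beyond what the patent specifies) that all ten bottom edges have the same chord length s, so each subtends a central angle 2·arcsin(s/(2R)). Setting the sum to 360° and solving for R by bisection:

```python
import math

# Sketch: solve for the sphere radius R from the condition that the ten
# bottom-edge central angles sum to 360 degrees. Simplifying assumption:
# all ten bottom edges have equal chord length s.

def angle_sum(R, s=1.0):
    """Sum of the ten central angles subtended by chords of length s."""
    return 10 * 2 * math.asin(s / (2 * R))

def solve_R(s=1.0, lo=0.51, hi=10.0, iters=80):
    """Bisection: angle_sum is decreasing in R, root where it equals 2*pi."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if angle_sum(mid, s) > 2 * math.pi:
            lo = mid          # angle too large -> need a larger R
        else:
            hi = mid
    return 0.5 * (lo + hi)

R = solve_R(s=1.0)
# Under the equal-edge assumption the closed form is R = s / (2*sin(pi/10)),
# about 1.618*s (the regular decagon circumradius).
print(R)
```

With unequal pentagon- and hexagon-side central angles, as in the actual target, the same bisection applies; only `angle_sum` changes to the patent's mixed trigonometric expression.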
  • the pose of each pentagonal and hexagonal plane relative to the world coordinate system can then be calculated and expressed as a rotation matrix R_j and a translation vector T_j. These calculated R_j and T_j are used as initial values for the camera parameter optimization.
  • a world coordinate system is established according to the polyhedral target, with its origin set at the center of the first virtual spherical surface; according to this world coordinate system, the position of each spliced pentagonal and hexagonal plane of the polyhedral target relative to the world coordinate system can be expressed by the calculated rotation matrix R_j and translation matrix T_j.
  • the marking layer is arranged on the polygonal planes, where the polygonal planes include the pentagonal and hexagonal planes. Because the center of each marking layer coincides with the center of its pentagonal or hexagonal plane, and one side of the marking layer is parallel to one side of that plane, the position of each marking point on its pentagonal or hexagonal plane can be obtained, and hence the coordinates of each marking point relative to the world coordinate system.
  • the correspondence between the marking points in the target image and the marking points on the actual polyhedral target can be determined according to the marking point combination modes, so the marking layer corresponding to each marking point can be determined.
  • since the origin of the world coordinate system is set at the center of the first virtual spherical surface and the XY plane of the world coordinate system coincides with the bottom surface of the inner surface of the polyhedral target, the longitude and latitude of the i-th marking point on the j-th polygonal plane in the world coordinate system can be concluded. In this way, the longitude and latitude of each marking point on the polyhedral target in the world coordinate system can be obtained.
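The plane poses used above can be sketched as follows. Since the line from the sphere center to a plane's center is perpendicular to the plane, the plane's local z-axis can be aligned with the unit normal n, and the translation is d·n, with d the distance from the center to the plane (the radius of the second or third virtual sphere). The direction, distance, and marker position below are illustrative, not values from the patent.

```python
import numpy as np

# Sketch: pose (R_j, T_j) of the j-th polygonal plane in the world frame
# whose origin is the center of the first virtual sphere.

def plane_pose(n, d):
    """Return (R_j, T_j): rotation taking local z to n, translation d*n."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(z, n)                  # rotation axis (unnormalized)
    c = float(np.dot(z, n))             # cosine of the rotation angle
    if np.isclose(c, 1.0):
        R = np.eye(3)                   # already aligned
    else:
        vx = np.array([[0.0, -v[2], v[1]],
                       [v[2], 0.0, -v[0]],
                       [-v[1], v[0], 0.0]])
        R = np.eye(3) + vx + vx @ vx / (1.0 + c)   # Rodrigues' formula
    return R, d * n

R_j, T_j = plane_pose(n=[1.0, 1.0, 1.0], d=1.2)

# A marking point at local in-plane coordinates p_local maps to world
# coordinates via p_world = R_j @ p_local + T_j.
p_local = np.array([0.1, 0.05, 0.0])
p_world = R_j @ p_local + T_j
print(p_world)
```

The branch for c close to 1 avoids division by zero when the plane already faces the local z direction; the antipodal case (c = -1) is not needed for an inward-facing target and is omitted.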
  • Step S260: Use the Levenberg-Marquardt algorithm to optimize the initial values of k1, k2, k3, k4, k5, u0, v0, m_u, m_v, T_j and R_j to determine the imaging model parameters of the fisheye camera.
  • the Levenberg-Marquardt algorithm can be used to optimize the initial values of k1, k2, u0, v0, m_u, m_v, T_j and R_j, with the initial values of the higher-order distortion parameters k3, k4 and k5 set to 0, to determine the imaging model parameters of the fisheye camera.
  • the projection value of a marking point is the pixel coordinate calculated, according to the projection model, from the world coordinate of that marking point in the world coordinate system established by the polyhedral target;
  • the measurement point refers to the coordinates, in the image coordinate system, of the corresponding marking point measured in the target image after the polyhedral target is photographed with the fisheye camera.
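The refinement step minimizes the squared differences between projected and measured marking points. A sketch using SciPy's Levenberg-Marquardt solver is shown below on synthetic data; it uses a simplified radial-distortion-only model without the per-plane extrinsics T_j, R_j, and fixes k1 to 1 to remove the scale ambiguity with m_u and m_v, so names and structure are illustrative rather than the patent's exact parameterization.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch: Levenberg-Marquardt refinement of fisheye parameters by minimizing
# projection-minus-measurement residuals (simplified model; k1 fixed to 1).

def project(params, theta, phi):
    """Equidistant projection with radial distortion, in pixel coordinates."""
    k2, k3, k4, k5, u0, v0, m_u, m_v = params
    r = theta + k2*theta**3 + k3*theta**5 + k4*theta**7 + k5*theta**9
    return u0 + m_u * r * np.cos(phi), v0 + m_v * r * np.sin(phi)

def residuals(params, theta, phi, u_meas, v_meas):
    u, v = project(params, theta, phi)
    return np.concatenate([u - u_meas, v - v_meas])

# Synthetic marking points generated from known ground-truth parameters.
rng = np.random.default_rng(0)
true = np.array([-0.02, 0.001, 0.0, 0.0, 640.0, 512.0, 400.0, 400.0])
theta = rng.uniform(0.1, 1.4, 60)      # incidence angles of the points
phi = rng.uniform(-np.pi, np.pi, 60)   # azimuths of the points
u_meas, v_meas = project(true, theta, phi)

x0 = np.array([0.0, 0.0, 0.0, 0.0, 630.0, 520.0, 380.0, 390.0])
fit = least_squares(residuals, x0, args=(theta, phi, u_meas, v_meas),
                    method='lm')
print(f"final cost: {fit.cost:.3e}")
```

In the full method the residuals would also depend on T_j and R_j of each plane, which are optimized jointly with the intrinsics from the initial values computed above.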
  • the fisheye camera calibration method proposed in this application collects a target image of a polyhedral target.
  • the target image includes a polyhedral target and a plurality of marking layers arranged on the inner surface; the selected radial distortion model is fitted to the equidistant projection model according to the target image to obtain the initial values of the distortion parameters k1, k2, k3, k4 and k5;
  • ellipse fitting is performed on the polyhedral target in the target image to obtain the initial values of u0 and v0, and the initial values of m_u and m_v are obtained, where m_u and m_v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates;
  • the translation matrix T_j and rotation matrix R_j of the pentagonal and hexagonal planes on which the marking points are located are obtained according to the polyhedral target, and together with the other initial values are optimized to determine the imaging model parameters of the fisheye camera.
  • FIG. 11 shows a fisheye camera calibration device 400 provided by an embodiment of the present application.
  • the device 400 includes an image acquisition module 410, a camera calibration module 420, and a numerical optimization module 430.
  • the image acquisition module 410 is used to acquire a target image, the target image includes a polyhedral target and images of a plurality of marking points arranged on the inner surface of the polyhedral target;
  • the camera calibration module 420 is used to fit, according to the target image, the selected radial distortion model to the equidistant projection model, obtaining the initial values of the distortion parameters k1, k2, k3, k4 and k5;
  • the camera calibration module is also used to perform ellipse fitting on the polyhedral target in the target image and to obtain the initial values of u0, v0, m_u and m_v according to the fitted ellipse and the value of r_max.
  • the camera calibration module 420 is further configured to determine that the vertices of the polyhedral target are located on a first virtual sphere, the distance from the center of the first virtual sphere to each vertex is equal, and the first virtual
  • the relationship between the center of the sphere and the center of each pentagonal plane or hexagonal plane is perpendicular to the pentagonal plane or hexagonal plane to obtain each pentagonal plane or hexagonal plane relative to the world translation matrix T j and the rotation matrix R j coordinate system; each according to the pentagonal or hexagonal plane relative to the plane of the world coordinate system translation matrix T j and the rotation matrix R j, and the center of the marking layer
  • the centers of the pentagonal plane or the hexagonal plane coincide, and one side of the marking layer is parallel to one side of the pentagonal plane or the hexagonal plane to obtain the coordinates of each marking point relative to the world coordinate system.
  • the numerical optimization module 430 is further configured to use the Levenberg-Marquardt algorithm to minimize the sum of squares of the difference between the projection points of the points on the polyhedral target and the measurement points of the points on the polyhedral target.
  • the fisheye camera calibration method and device capture a target image of a polyhedral target.
  • the target image includes the polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target; the selected radial distortion model is fitted to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , and k 5 ;
  • the radius r max is calculated from the initial values of the distortion parameters and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max represents the maximum field angle of the fisheye camera; ellipse fitting is performed on the polyhedral target in the target image, and the initial values of u 0 , v 0 , m u , and m v are obtained from the fitted ellipse and the value of r max ;
  • the initial values of the higher-order parameters k 3 , k 4 , and k 5 are set to 0, and the Levenberg-Marquardt algorithm is used to optimize the initial values of the parameters (k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j ) to determine the imaging model parameters of the fisheye camera. With the polyhedral target there is no need to move the target or the camera to capture multiple target images; capturing a single target image suffices for fast, high-precision calibration of the fisheye camera.
  • the coupling or direct coupling or communication connection between the modules shown or discussed may be through some interfaces;
  • the indirect coupling or communication connection between the devices or modules may be electrical, mechanical, or in other forms.
  • each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules.
  • the electronic device 500 may be a mobile terminal capable of data processing, such as a mobile phone or a tablet computer.
  • the electronic device 500 in this application may include one or more of the following components: a processor 510, a memory 520, and one or more application programs, where one or more application programs may be stored in the memory 520 and configured to One or more processors 510 execute, and one or more programs are configured to execute the methods described in the foregoing method embodiments.
  • the processor 510 may include one or more processing cores.
  • the processor 510 uses various interfaces and lines to connect the various parts of the entire electronic device 500, and performs the various functions of the electronic device 500 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 520 and calling data stored in the memory 520.
  • the processor 510 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA).
  • the processor 510 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly processes the operating system, user interface, and application programs; the GPU is used for rendering and drawing of display content; the modem is used for processing wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 510, but may be implemented by a communication chip alone.
  • the memory 520 may include random access memory (RAM) or read-only memory (ROM).
  • the memory 520 may be used to store instructions, programs, codes, code sets or instruction sets.
  • the memory 520 may include a storage program area and a storage data area, where the storage program area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), instructions for implementing the following method embodiments, and so on.
  • the data storage area can also store data created by the electronic device 500 during use (such as phone book, audio and video data, chat record data), and the like.
  • FIG. 13 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
  • the computer-readable storage medium 600 stores program code, and the program code can be invoked by a processor to execute the method described in the foregoing method embodiment.
  • the computer-readable storage medium 600 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the computer-readable storage medium 600 includes a non-transitory computer-readable storage medium.
  • the computer-readable storage medium 600 has a storage space for the program code 610 for executing any method steps in the above methods. These program codes can be read out from or written into one or more computer program products.
  • the program code 610 may, for example, be compressed in a suitable form.


Abstract

This application discloses a fisheye camera calibration system, method, apparatus, electronic device, and storage medium. The system includes a polyhedral target, a fisheye camera, and an electronic device. The polyhedral target includes an inner surface and a plurality of marker points arranged on the inner surface, the inner surface being composed of a plurality of hexagonal planes and pentagonal planes. The fisheye camera is used to photograph the polyhedral target and capture a target image, the target image including the polyhedral target and images of the marker points arranged on its inner surface. The electronic device is used to obtain the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j and to optimize these initial values with the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera. With the polyhedral target there is no need to move the target or the camera to capture multiple target images; capturing a single target image suffices for fast, high-precision calibration of the fisheye camera.

Description

Fisheye camera calibration system, method, apparatus, electronic device, and storage medium
This application claims priority to Chinese patent applications No. CN201910431510.7 and No. CN201920745752.9, filed on May 22, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of camera calibration, and more particularly to a fisheye camera calibration system, method, apparatus, electronic device, and storage medium.
Background
Camera calibration is one of the key techniques in machine vision, photogrammetry, 3D imaging, and geometric image correction; its main purpose is to estimate a camera's intrinsic and extrinsic parameters. The accuracy of the calibration result and the stability of the calibration algorithm directly affect the accuracy of subsequent work. Common calibration methods require capturing multiple images, so the calibration board or the camera must be moved manually; in practical applications this is not only time-consuming and laborious but also increases production costs.
Summary
This application proposes a fisheye camera calibration system, method, apparatus, electronic device, and storage medium to address the above problems.
In a first aspect, an embodiment of this application provides a fisheye camera calibration system comprising a polyhedral target, a fisheye camera, and an electronic device. The polyhedral target includes an inner surface and a plurality of marker points arranged on the inner surface, the inner surface being composed of a plurality of hexagonal and pentagonal planes. The fisheye camera is used to photograph the polyhedral target and capture a target image, the target image including the polyhedral target and images of the marker points arranged on its inner surface. The electronic device is used to fit the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 , and to calculate the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera. The electronic device is further used to perform ellipse fitting on the polyhedral target in the target image and to obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively. The electronic device is further used to obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies, and to optimize the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
In a second aspect, an embodiment of this application provides a fisheye camera calibration method, comprising: acquiring a target image, the target image including a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target; fitting the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ; calculating the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; performing ellipse fitting on the polyhedral target in the target image and obtaining the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; obtaining, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies; and optimizing the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
In a third aspect, an embodiment of this application provides a fisheye camera calibration apparatus, comprising: an image acquisition module for acquiring a target image, the target image including a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target; and a camera calibration module for fitting the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 . The camera calibration module is further configured to calculate the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; to perform ellipse fitting on the polyhedral target in the target image and obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; and to obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies. The apparatus further comprises a numerical optimization module for optimizing the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
In a fourth aspect, an embodiment of this application provides an electronic device comprising: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the above method.
In a fifth aspect, an embodiment of this application provides a computer-readable storage medium storing program code that can be invoked by a processor to perform the above method.
These and other aspects of this application will become more readily apparent from the description of the following embodiments.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a fisheye camera calibration system provided by an embodiment of this application.
FIG. 2 is a schematic diagram of a polyhedral target provided by an embodiment of this application.
FIG. 3 is a schematic diagram of the tiling of the inner surface of the polyhedral target provided by an embodiment of this application.
FIG. 4 is a planar development of the inner surface of the polyhedral target provided by an embodiment of this application.
FIG. 5 shows the target under a first viewing angle provided by an embodiment of this application.
FIG. 6 is a schematic diagram of the attachment of a sub-marking layer to a pentagonal or hexagonal plane provided by an embodiment of this application.
FIG. 7 is a schematic diagram of the arrangement of a light source provided by an embodiment of this application.
FIG. 8 is a schematic diagram of another polyhedral target provided by an embodiment of this application.
FIG. 9 is a flowchart of a fisheye camera calibration method provided by an embodiment of this application.
FIG. 10 is a schematic diagram of the relationship between the center of the first virtual sphere and a pentagonal or hexagonal plane provided by an embodiment of this application.
FIG. 11 is a structural block diagram of a fisheye camera calibration apparatus provided by an embodiment of this application.
FIG. 12 is a structural block diagram of an electronic device for performing the fisheye camera calibration method according to an embodiment of this application.
FIG. 13 shows a storage medium for storing or carrying program code implementing the fisheye camera calibration method according to an embodiment of this application.
Detailed Description
To enable those skilled in the art to better understand the solutions of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings.
Camera calibration is one of the key techniques in machine vision, photogrammetry, 3D imaging, and geometric image correction; its main purpose is to estimate a camera's intrinsic and extrinsic parameters. The accuracy of the calibration result and the stability of the calibration algorithm directly affect the accuracy of subsequent work. A camera with an ordinary field of view can be represented by the pinhole model and calibrated using a perspective projection mapping and an affine transformation. In recent years, fisheye cameras have been widely used in panoramic vision, video surveillance, vehicle navigation, virtual reality, and other fields because of their extremely large field of view. However, the large field of view also introduces severe image distortion, which affects both the intuitive visual experience and the use of the image information. To correct the image distortion, the fisheye camera must be calibrated.
The currently mature methods use planar targets, and tool software based on planar targets already exists, for example the Matlab toolbox and the OpenCV tools. In these methods, a planar calibration board is placed at different positions in front of the camera and multiple target images are captured to obtain raw calibration data with a wide distribution. Such methods require placing the calibration board at different positions and capturing target images many times, or rotating the camera to different orientations and capturing target images many times. They are unsuitable for situations where a fisheye camera must be installed and calibrated quickly, such as production lines that mass-produce or assemble fisheye cameras.
The inventors therefore propose the fisheye camera calibration system, method, apparatus, electronic device, and storage medium of this application. A target image of a polyhedral target is captured, the target image including the polyhedral target and images of a plurality of marker points arranged on its inner surface; according to the target image, the selected radial distortion model is fitted to the equidistant projection model to obtain the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ; the radius r max is calculated from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; ellipse fitting is performed on the polyhedral target in the target image, and the initial values of u 0 , v 0 , m u , m v are obtained from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies are obtained from the polyhedral target; and the Levenberg-Marquardt algorithm is used to optimize the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j to determine the imaging model parameters of the fisheye camera. With the polyhedral target there is no need to move the target or the camera to capture multiple target images; a single target image suffices for fast, high-precision calibration of the fisheye camera.
Referring to FIG. 1, an embodiment of this application provides a polyhedral-target-based fisheye camera calibration system. Specifically, the system may include a polyhedral target 100, a fisheye camera 200, and an electronic device 300, where the fisheye camera 200 and the electronic device 300 may be one device or two separate devices, and the fisheye camera 200 is a camera with a fisheye lens. The polyhedral target 100 includes an inner surface and a plurality of marker points arranged on the inner surface. The inner surface is tiled from a plurality of polygonal planes; the tiled polygonal planes form vertices that lie on a first virtual sphere, and the distance from the center of the first virtual sphere to each vertex is equal. The fisheye camera 200 is placed at the center of the first virtual sphere and photographs the polyhedral target 100 to obtain a target image, which includes the polyhedral target and images of the marker points arranged on its inner surface.
The system may also include a light source arranged on the inner-surface side of the polyhedral target so that the inner surface is well lit; the target image obtained by photographing the target 100 with the fisheye camera is then clearer, which improves the accuracy of the fisheye camera calibration.
The fisheye camera 200 may send the captured target image to the electronic device 300. According to the target image, the electronic device 300 fits the selected radial distortion model to the equidistant projection model, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ; calculates the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; performs ellipse fitting on the polyhedral target in the target image and obtains the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point; obtains, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies; and optimizes the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters, thereby calibrating the fisheye camera 200.
The polyhedral target 100 includes a housing, an inner surface, and a plurality of marker points arranged on the inner surface. Referring to FIG. 2, a schematic diagram of the polyhedral target provided by an embodiment of this application is shown. The polyhedral target 100 includes a housing, an inner surface 110, and marker points 120. The housing may be a polyhedron as shown in FIG. 2, or hemispherical, or rectangular, and may be configured as needed, which will not be elaborated here. The inner surface 110 is an approximately hemispherical inner surface tiled from a plurality of pentagonal and hexagonal planes. The pentagonal and hexagonal planes may be made of metal sheets such as steel, or of other materials that do not deform easily; they may be joined by fixed connections such as welding, or by movable connections such as hinges. The material and joining method of the pentagonal and hexagonal planes can be chosen according to actual needs and are not limited here.
Specifically, the inner surface 110 may be tiled from a plurality of pentagonal planes 111 and hexagonal planes 112. Referring to FIG. 3, a schematic diagram of the tiling of the inner surface of the polyhedral target is shown. The inner surface 110 is tiled from 4 complete pentagonal planes, 8 complete hexagonal planes, 4 partial pentagonal planes, and 4 partial hexagonal planes. The vertices 113 of the tiled inner surface 110 lie on a first virtual sphere, i.e., the distance from the center of the first virtual sphere to each vertex 113 of the inner surface 110 is equal. In the tiled inner surface 110, all pentagonal planes 111 are tangent to a second virtual sphere, i.e., the line from the center of the second virtual sphere to the center of a pentagonal plane 111 is perpendicular to that pentagonal plane; all hexagonal planes 112 are tangent to a third virtual sphere, i.e., the line from the center of the third virtual sphere to the center of a hexagonal plane 112 is perpendicular to that hexagonal plane. The centers of the first, second, and third virtual spheres coincide.
It can be understood that the structure of the inner surface 110 of the polyhedral target resembles a soccer ball, as if a soccer ball were cut along a cross-section through its center. The polyhedral target provided by this application may be cut along the cross-section through the center that maximizes the number of complete pentagonal planes 111 and hexagonal planes 112. It can also be cut along a different cross-section through the center, in which case the partial pentagonal and hexagonal planes need not be exact halves of complete pentagonal or hexagonal planes.
Referring to FIG. 4, a planar development of the inner surface of the polyhedral target provided by an embodiment of this application is shown. From this development it can be seen that each partial pentagonal plane 111 is half of a complete pentagonal plane and each partial hexagonal plane 112 is half of a complete hexagonal plane. Referring also to FIG. 5, which shows the target under the first viewing angle, the first viewing angle is defined as the viewing angle through the center of the first virtual sphere and perpendicular to the opposite face of the inner surface 110. Under the first viewing angle, the half hexagonal planes 112 numbered 1, 2, 6, and 7 are parallel to the viewing direction and thus project as lines, and the half pentagonal planes 111 numbered 3, 5, 8, and 10 are also parallel to the viewing direction and thus project as lines.
In some embodiments, the marker points 120 may be arranged on a marking layer that is attached to the inner surface 110. The marking layer includes a plurality of sub-marking layers, and each sub-marking layer may be a circular marker image or a checkerboard marker image, where a sub-marking layer includes a plurality of marker points 120 that may differ in size. Specifically, the marker points 120 are arranged on the pentagonal and hexagonal planes that tile the inner surface 110. Referring to FIG. 6, a schematic diagram of the attachment of the sub-marking layer containing the marker points to a pentagonal or hexagonal plane is shown. Each pentagonal plane carries one sub-marking layer and each hexagonal plane carries one sub-marking layer; the center of the sub-marking layer coincides with the center of the pentagonal plane 111 or hexagonal plane 112, and at least one side of the sub-marking layer is parallel to one side of that plane.
Specifically, as shown in FIG. 6, the marking layer is composed of a plurality of sub-marking layers, and each sub-marking layer is composed of a plurality of marker points 120 of different sizes. The differently sized marker points 120 in each sub-marking layer form that layer's marker-point combination pattern, and different sub-marking layers have different combination patterns. When the fisheye camera 200 photographs the target 100 to obtain a target image, the correspondence between the marker points in the target image and the marker points 120 on the target can be determined from the marker-point combination patterns of the sub-marking layers.
Referring to FIG. 7, a schematic diagram of the light-source arrangement provided by an embodiment of this application is shown. The polyhedral target may further include a light source 130 arranged on the inner-surface side of the polyhedral target. With the light source 130, the inner surface 110 is well lit, so the target image obtained by photographing the polyhedral target with the fisheye camera is clearer, which improves the accuracy of the fisheye camera calibration.
In some embodiments, the marker points may be a plurality of holes in the inner surface 110. Referring to FIG. 8, a schematic diagram of such a target is shown. The holes in the inner surface may differ in size; the differently sized marker points 120 on each pentagonal or hexagonal plane form a marker-point combination pattern, and different pentagonal and hexagonal planes may form different patterns. Of course, the size and distribution of the holes can be set according to actual needs and are not limited here. Correspondingly, as shown in FIG. 7, the target includes a light source 130 arranged inside the housing of the polyhedral target; the light emitted by the light source 130 through the holes forms the marker points 120 on the hemispherical inner surface. The target image obtained by photographing the target with the fisheye camera can then be used to calibrate the fisheye camera.
Referring to FIG. 9, an embodiment of this application provides a fisheye camera calibration method applied to the above fisheye camera calibration system. Specifically, the method may include:
Step S210: acquire a target image, the target image including a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target.
The electronic device may acquire the target image captured with the fisheye camera. Specifically, the electronic device may receive the target image sent by the fisheye camera over a network, or obtain it via a USB drive or memory card. The target image includes the polyhedral target and images of the marker points arranged on its inner surface.
Step S220: according to the target image, fit the selected radial distortion model to the equidistant projection model, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 .
Ordinary camera imaging follows the pinhole camera model, in which straight lines in the actual scene are still projected as straight lines on the image plane. If a fisheye camera imaged according to the pinhole model, however, the projected image would become extremely large, and when the field angle of the camera reaches 180 degrees the image would become infinite. Because of this ultra-wide-angle characteristic, a hemispherical field of view cannot be projected onto a finite image plane by a perspective projection, and another model must be used.
This application selects the equidistant projection as the fisheye camera model. The equidistant projection mapping can be expressed as r=fθ, where r denotes the distance from any point in the target image to the distortion center, f denotes the focal length of the fisheye camera, and θ denotes the angle between the incident ray and the optical axis of the fisheye camera.
To project as large a scene as possible onto a finite image plane, the projection model of a fisheye camera allows camera distortion, and the radial distortion of a fisheye camera is very severe, so mainly radial distortion is considered. Considering only the radial distortion of the fisheye camera and taking the selected equidistant model as the fisheye camera model, the radial distortion model of the fisheye camera can be expressed as:
r(θ)=k 1 θ+k 2 θ 3 +k 3 θ 5 +k 4 θ 7 +k 5 θ 9 +…
where r is the distance between a pixel on the image and the principal point of the camera system, and θ is the angle between the incident ray and the optical axis of the system. Here k 1 , k 2 , k 3 , k 4 , k 5 are the five distortion parameters of the radial distortion model of the fisheye camera and, together with the four parameters u 0 , v 0 , m u , m v mentioned above, constitute the intrinsic parameters of the fisheye camera. In this model the distortion parameters are not limited to k 1 , k 2 , k 3 , k 4 , k 5 ; there may also be k 6 , k 7 , up to k n . The embodiments of this application take k 1 , k 2 , k 3 , k 4 , k 5 as an example and use the nine intrinsic parameters k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , and m v to describe the imaging characteristics of the fisheye camera.
The selected radial distortion model, i.e., r(θ)=k 1 θ+k 2 θ 3 +k 3 θ 5 +k 4 θ 7 +k 5 θ 9 …, is fitted to the equidistant projection model r=fθ. Among the distortion parameters, the higher-order ones have little influence, so when computing the initial values they can be set to 0, i.e., k 3 =0, k 4 =0, k 5 =0. The radial distortion model can therefore be written as r=k 1 θ+k 2 θ 3 . Fitting this model to the desired projection equation r=fθ and using the manufacturer-supplied focal length f and maximum field angle (for example, a fisheye camera with focal length 8 mm and maximum field angle θ max of 180 degrees) gives k 1 =f and k 2 =0.
Step S230: calculate the radius r max from the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle.
Since k 1 =f, k 2 =0, and k 3 , k 4 , k 5 are all set to 0, and the radius formula is r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , the radius r max =fθ max can be calculated. Taking the aforementioned fisheye camera with focal length f of 8 mm and maximum field angle of 180 degrees as an example, substituting the obtained distortion parameter values and θ max =π/2 (the half-angle of the 180-degree field of view) into the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 gives r max =f*π/2.
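The computation in step S230 can be checked numerically. The sketch below is a minimal illustration (NumPy assumed; `r_of_theta` and `k_init` are names chosen here for illustration, not from the patent), using the 8 mm / 180-degree example above:

```python
import numpy as np

def r_of_theta(theta, k):
    """Evaluate the radial distortion polynomial
    r(theta) = k1*theta + k2*theta^3 + k3*theta^5 + k4*theta^7 + k5*theta^9."""
    # odd powers 1, 3, 5, 7, 9 paired with k1..k5
    return sum(ki * theta ** p for ki, p in zip(k, (1, 3, 5, 7, 9)))

f = 8.0                            # focal length in mm (manufacturer value)
theta_max = np.pi / 2              # half-angle of the 180-degree field of view
k_init = (f, 0.0, 0.0, 0.0, 0.0)   # k1 = f, k2..k5 = 0 as obtained in step S220

r_max = r_of_theta(theta_max, k_init)   # reduces to f * pi / 2 ≈ 12.566 mm
```

With all higher-order terms zero the polynomial collapses to the equidistant mapping r = f·θ, which is exactly the simplification the text describes.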
Step S240: perform ellipse fitting on the polyhedral target in the target image, and obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively.
Here (u 0 , v 0 ) is the principal point, m u is the number of pixels per unit distance in the horizontal direction of the image coordinates, and m v is the number of pixels per unit distance in the vertical direction. Ellipse fitting is performed on the polyhedral target in the target image to obtain the initial values of a, b, u 0 , v 0 , where a is the length of the major axis of the ellipse and b is the length of the minor axis.
Since the captured target image shows the inner surface of the polyhedral target, which is approximately hemispherical, its boundary is approximately a circle in space and images approximately as an ellipse in the image coordinate system, as shown in FIG. 5. An ellipse is then fitted from the extracted boundary points of this circle, and the major-axis length a and minor-axis length b can be determined by extracting the boundary of the ellipse. Specifically, the ellipse fitting equation is
Figure PCTCN2019113441-appb-000001
where the values of a and b can be measured from the ellipse fitted to the target image, from which the value of the principal point (u 0 , v 0 ) can be determined.
In one embodiment, the initial values of m u and m v can be obtained from the values of a and b and the formulas m u =a/r max and m v =b/r max . The major-axis length a and minor-axis length b are obtained from the fitted ellipse; since the radius formula is r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 with k 1 =f, k 2 =0, and k 3 , k 4 , k 5 set to 0, it follows that r max =fθ max . The initial values of m u and m v then follow from r max =fθ max and the formulas m u =a/r max and m v =b/r max .
In another embodiment, for a full-frame camera, the principal point may be placed at the image center and the pixel-size values supplied by the fisheye camera manufacturer may be used to obtain the initial values of m u and m v , where m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively.
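Step S240 can be sketched as follows, assuming NumPy and assuming the imaged target boundary is close to an axis-aligned ellipse; `fit_axis_aligned_ellipse`, `initial_uv_m`, and the (N, 2) `points` array are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def fit_axis_aligned_ellipse(points):
    """Least-squares fit of an axis-aligned ellipse
    A*u^2 + B*v^2 + C*u + D*v = 1 to boundary points of shape (N, 2)."""
    u, v = points[:, 0], points[:, 1]
    M = np.column_stack([u ** 2, v ** 2, u, v])
    A, B, C, D = np.linalg.lstsq(M, np.ones(len(u)), rcond=None)[0]
    # complete the squares to recover center and semi-axis lengths
    u0, v0 = -C / (2 * A), -D / (2 * B)
    F = 1 + C ** 2 / (4 * A) + D ** 2 / (4 * B)
    a, b = np.sqrt(F / A), np.sqrt(F / B)
    return u0, v0, a, b

def initial_uv_m(points, r_max):
    """Initial principal point (u0, v0) and scale factors mu = a/r_max,
    mv = b/r_max, as described for step S240."""
    u0, v0, a, b = fit_axis_aligned_ellipse(points)
    return u0, v0, a / r_max, b / r_max
```

A general (rotated) ellipse fit, such as OpenCV's `cv2.fitEllipse`, could be substituted for the axis-aligned fit without changing the rest of the step.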
Step S250: obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies.
The pentagonal or hexagonal plane on which a marker point lies means the marking-layer plane of that marker point (i.e., the pentagonal or hexagonal plane to which the marking layer is attached). Specifically, the translation matrix T j and rotation matrix R j of these planes can be obtained from the tiling relations of the polyhedral target. For the tiling of the polyhedral target, refer to the corresponding content of the foregoing embodiments, which is not repeated here.
The line from the center of the first virtual sphere to the center of each pentagonal plane is perpendicular to that pentagonal plane, and the line from the center of the first virtual sphere to the center of each hexagonal plane is perpendicular to that hexagonal plane; see FIG. 10, which shows the relationship between the center of the first virtual sphere and a pentagonal or hexagonal plane. Point O is the center of the first virtual sphere; the center of the pentagonal plane 111 is point A, and the line from the sphere center to the center of the pentagonal plane is segment OA, which is perpendicular to the pentagonal plane 111; the center of the hexagonal plane 112 is point B, and the line from the sphere center to the center of the hexagonal plane is segment OB, which is perpendicular to the hexagonal plane 112.
Through trigonometric relations, the angle between a pentagonal plane and a hexagonal plane of the tiled polyhedral target, and the angle between two hexagonal planes, can be calculated. Referring to the planar development of the inner surface in FIG. 4 and the first-viewing-angle target in FIG. 5, if the radius of the first virtual sphere is denoted R, the central angle corresponding to each pentagonal or hexagonal plane and line segment in FIG. 4 can be expressed by a trigonometric expression containing R. For example, the central angles corresponding to the base segments of the pentagonal and hexagonal planes numbered 1-10 in FIG. 4 can all be expressed by trigonometric expressions containing R, and FIG. 5 shows that these base segments of the planes numbered 1-10 form a closed decagon around the center of the first virtual sphere. The sum of the central angles corresponding to the ten sides must equal 360°, so the value of R can be calculated by solving inverse trigonometric functions. From this, the radii of the second and third virtual spheres and the angles between the pentagonal and hexagonal planes can be obtained.
Thus the position of each pentagonal and hexagonal plane can be expressed by its rotation matrix R j and translation vector T j relative to the world coordinate system, both of which can be obtained by calculation; these computed R j and T j serve as the initial values for the camera-parameter optimization.
A world coordinate system is established from the polyhedral target, with its origin at the center of the first virtual sphere. With respect to this world coordinate system, the position of every pentagonal and hexagonal plane tiling the polyhedral target can be expressed by the computed rotation matrix R j and translation matrix T j .
The marking layers are arranged on the polygonal planes, where the polygonal planes comprise the pentagonal and hexagonal planes. Since the center of a marking layer coincides with the center of its pentagonal or hexagonal plane, and one side of the marking layer is parallel to one side of that plane, the position of every marker point of each marking layer on the pentagonal or hexagonal plane can be determined, and hence the coordinates of the marker points relative to the world coordinate system. Because the marker points in a marking layer may differ in size and different marking layers may use different combination patterns, the correspondence between marker points in the target image and marker points on the actual polyhedral target can be determined from the combination patterns, which identifies the marking layer each marker point belongs to.
The coordinates of a marker point in a marking layer can then be written as X i =(X i ,Y i ,0) T , where X i is the position vector of the i-th marker point on the pentagonal or hexagonal plane, and X i and Y i denote the X and Y coordinates of that vector on the plane. Each marker point of the marking layer attached to a pentagonal or hexagonal plane (i.e., a polygonal plane) can then be expressed as X c (j,i)=R j X(j,i)+T j , where X c (j,i) is the world coordinate of the i-th marker point on the j-th polygonal plane, and j is no less than 20, i.e., there are no fewer than 20 polygonal planes. X c (j,i) can further be written in terms of its three components in the world coordinate system as X c (j,i)=(X c x ,X c y ,X c z ) T , from which the world coordinate of every marker point in the marking layers attached to the pentagonal and hexagonal planes can be obtained.
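The transformation X c (j,i)=R j X(j,i)+T j can be sketched as below (NumPy assumed; the rotation and translation shown are illustrative placeholders, not the R j , T j derived from the actual target geometry):

```python
import numpy as np

def marker_world_coords(planar_pts, R_j, T_j):
    """Map marker points given as (X_i, Y_i) on the j-th polygonal plane
    to world coordinates via X_c(j, i) = R_j @ X(j, i) + T_j."""
    # embed the planar points as (X_i, Y_i, 0) in the plane's local frame
    pts = np.column_stack([planar_pts, np.zeros(len(planar_pts))])
    return pts @ R_j.T + T_j   # row-wise application of R_j @ x + T_j

# illustrative example: a plane rotated 90 degrees about X and shifted along Z
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
T = np.array([0.0, 0.0, -0.5])
world = marker_world_coords(np.array([[0.1, 0.2]]), R, T)
```

The same function applies unchanged to every plane j once its computed R j and T j are substituted.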
Since the origin of the world coordinate system is at the center of the first virtual sphere and the XY plane of the world coordinate system coincides with the base plane of the inner surface of the polyhedral target, the longitude and latitude in the world coordinate system of the i-th marker point on the j-th polygonal plane are:
Figure PCTCN2019113441-appb-000002
so the longitude and latitude in the world coordinate system of every marker point on the polyhedral target can be obtained.
Step S260: optimize the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
After the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j are obtained, the Levenberg-Marquardt algorithm can be used to optimize the initial values of k 1 , k 2 , u 0 , v 0 , m u , m v , T j , R j together with the initial values of the higher-order distortion parameters k 3 , k 4 , k 5 that were set to 0, so as to determine the imaging model parameters of the fisheye camera.
Specifically, the sum of squared differences between the projection values of the marker points on the target and the measurement points of the marker points on the target is minimized. The projection value of a marker point is the pixel coordinate computed by the projection model from the marker point's world coordinate, obtained in the world coordinate system established from the polyhedral target; the measurement point is the coordinate in the image coordinate system of the corresponding marker point, measured in the target image after the polyhedral target is photographed with the fisheye camera.
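The minimization in step S260 can be sketched with `scipy.optimize.least_squares`, whose `method="lm"` is SciPy's Levenberg-Marquardt implementation. Everything below is a simplified illustration rather than the patent's full model: `project` assumes an equidistant camera looking along +Z with the radial polynomial above, and only the nine intrinsic parameters are optimized (the per-plane T j , R j would be appended to the parameter vector in the same way):

```python
import numpy as np
from scipy.optimize import least_squares

def project(params, pts_world):
    """Project world points with intrinsics (k1..k5, u0, v0, mu, mv)
    under the radial model r(theta); camera at the origin along +Z."""
    k = params[:5]
    u0, v0, mu, mv = params[5:]
    x, y, z = pts_world.T
    theta = np.arctan2(np.hypot(x, y), z)          # angle to the optical axis
    r = sum(ki * theta ** p for ki, p in zip(k, (1, 3, 5, 7, 9)))
    phi = np.arctan2(y, x)                         # azimuth in the image plane
    return np.column_stack([u0 + mu * r * np.cos(phi),
                            v0 + mv * r * np.sin(phi)])

def residuals(params, pts_world, pts_measured):
    # difference between projected marker points and measured image points
    return (project(params, pts_world) - pts_measured).ravel()

def calibrate(p0, pts_world, pts_measured):
    # Levenberg-Marquardt: minimizes the sum of squared residuals
    return least_squares(residuals, p0, method="lm",
                         args=(pts_world, pts_measured)).x
```

In use, `p0` is the parameter vector assembled from the initial values of steps S220 through S250, and the optimizer refines it until the reprojection error is minimal.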
In the fisheye camera calibration method proposed by this application, a target image of a polyhedral target is captured, the target image including the polyhedral target and a plurality of marking layers arranged on its inner surface; according to the target image, the selected radial distortion model is fitted to the equidistant projection model to obtain the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ; ellipse fitting is performed on the polyhedral target in the target image to obtain the initial values of u 0 , v 0 ; the initial values of m u , m v are obtained, where m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies are obtained from the polyhedral target; the initial values of the other higher-order distortion parameters k 3 , k 4 , k 5 are set to 0; and the Levenberg-Marquardt algorithm is used to optimize the initial values of these parameters (k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j ) to determine the imaging model parameters of the fisheye camera. By capturing an image of the polyhedral target, there is no need to move the target or the camera to capture multiple target images; capturing a single target image suffices for fast, high-precision calibration of the fisheye camera.
Referring to FIG. 11, a fisheye camera calibration apparatus 400 provided by an embodiment of this application is shown. The apparatus 400 includes an image acquisition module 410, a camera calibration module 420, and a numerical optimization module 430.
The image acquisition module 410 is used to acquire a target image, the target image including a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target. The camera calibration module 420 is used to fit the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 . The camera calibration module is further used to calculate the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; to perform ellipse fitting on the polyhedral target in the target image and obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; and to obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies. The numerical optimization module 430 optimizes the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
Further, the camera calibration module 420 is also used to select the radial distortion model r=k 1 θ+k 2 θ 3 +k 3 θ 5 +k 4 θ 7 +k 5 θ 9 and the equidistant projection model r=fθ, where r denotes the distance from a point in the target image to the distortion center, f denotes the focal length of the fisheye camera, and θ denotes the angle between the incident ray and the optical axis of the fisheye camera; and to fit the distortion model to the equidistant projection model, obtaining k 1 =f and k 2 =0, with k 3 , k 4 , k 5 all set to 0.
Further, the camera calibration module 420 is also used to calculate the radius r max =f*θ max from k 1 =f, k 2 =0, k 3 , k 4 , k 5 all set to 0, and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 .
Further, the camera calibration module 420 is also used to perform ellipse fitting on the polyhedral target in the target image, obtaining the ellipse fitting equation
Figure PCTCN2019113441-appb-000003
where a is the length of the major axis of the ellipse and b is the length of the minor axis; to obtain the boundary points of the polyhedral target in the target image and the values of a and b; and to obtain the initial values of m u and m v from the values of a and b, the value of r max , and the formulas m u =a/r max and m v =b/r max .
Further, the camera calibration module 420 is also used to obtain the translation matrix T j and rotation matrix R j of each pentagonal or hexagonal plane relative to the world coordinate system from the relations that the vertices of the polyhedral target lie on the first virtual sphere, that the distance from the center of the first virtual sphere to each vertex is equal, and that the line from the center of the first virtual sphere to the center of each pentagonal or hexagonal plane is perpendicular to that plane; and to obtain the coordinates of each marker point relative to the world coordinate system from the translation matrix T j and rotation matrix R j of each pentagonal or hexagonal plane relative to the world coordinate system, and from the relations that the center of the marking layer coincides with the center of the pentagonal or hexagonal plane and that one side of the marking layer is parallel to one side of that plane.
Further, the numerical optimization module 430 is also used to minimize, with the Levenberg-Marquardt algorithm, the sum of squared differences between the projection points of the points on the polyhedral target and the measurement points of the points on the polyhedral target.
It should be noted that those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the apparatus and modules described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In summary, the fisheye camera calibration method and apparatus provided by this application capture a target image of a polyhedral target, the target image including the polyhedral target and images of a plurality of marker points arranged on its inner surface; fit the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ; calculate the radius r max from these initial values and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera; perform ellipse fitting on the polyhedral target in the target image and obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively; obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies, with the initial values of the other higher-order distortion parameters k 3 , k 4 , k 5 set to 0; and optimize the initial values of these parameters (k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j ) using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera. With the polyhedral target there is no need to move the target or the camera to capture multiple target images; capturing a single target image suffices for fast, high-precision calibration of the fisheye camera.
In the several embodiments provided in this application, the coupling or direct coupling or communication connection between the modules shown or discussed may be through some interfaces; the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in other forms.
In addition, the functional modules in the embodiments of this application may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
Referring to FIG. 12, a structural block diagram of an electronic device provided by an embodiment of this application is shown. The electronic device 500 may be a mobile terminal capable of data processing, such as a mobile phone or a tablet computer.
The electronic device 500 in this application may include one or more of the following components: a processor 510, a memory 520, and one or more application programs, where the one or more application programs may be stored in the memory 520 and configured to be executed by the one or more processors 510, the one or more programs being configured to perform the methods described in the foregoing method embodiments.
The processor 510 may include one or more processing cores. The processor 510 uses various interfaces and lines to connect the various parts of the entire electronic device 500, and performs the various functions of the electronic device 500 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 520 and calling data stored in the memory 520. Optionally, the processor 510 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA). The processor 510 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU is responsible for rendering and drawing display content; the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 510 and may instead be implemented by a separate communication chip.
The memory 520 may include random access memory (RAM) or read-only memory (ROM). The memory 520 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 520 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), instructions for implementing the following method embodiments, and so on. The data storage area may also store data created by the electronic device 500 during use (such as a phone book, audio and video data, and chat records).
Referring to FIG. 13, a structural block diagram of a computer-readable storage medium provided by an embodiment of this application is shown. The computer-readable storage medium 600 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer-readable storage medium 600 may be an electronic memory such as flash memory, EEPROM (electrically erasable programmable read-only memory), EPROM, a hard disk, or ROM. Optionally, the computer-readable storage medium 600 includes a non-transitory computer-readable storage medium. The computer-readable storage medium 600 has storage space for program code 610 that performs any of the method steps of the above methods. The program code can be read from or written into one or more computer program products. The program code 610 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of this application and not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of the technical features therein, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application.

Claims (20)

  1. A polyhedral-target-based fisheye camera calibration system, comprising a polyhedral target, a fisheye camera, and an electronic device:
    the polyhedral target comprises an inner surface and a plurality of marker points arranged on the inner surface, the inner surface being composed of a plurality of hexagonal planes and pentagonal planes;
    the fisheye camera is used to photograph the polyhedral target and capture a target image, the target image comprising the polyhedral target and images of the plurality of marker points arranged on the inner surface of the polyhedral target;
    the electronic device is used to fit the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ;
    the electronic device is further used to calculate the radius r max from the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera;
    the electronic device is further used to perform ellipse fitting on the polyhedral target in the target image and to obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively;
    the electronic device is further used to obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies;
    the electronic device is further used to optimize the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
  2. The system according to claim 1, wherein the polyhedral target comprises a housing and a plurality of marker points, the housing has an inner surface, the inner surface is tiled from a plurality of pentagonal planes and hexagonal planes, the vertices formed by the tiling lie on a first virtual sphere, and the line from the center of the first virtual sphere to the center of each hexagonal or pentagonal plane is perpendicular to that hexagonal or pentagonal plane.
  3. The system according to claim 2, wherein all pentagonal planes of the inner surface are tangent to a second virtual sphere, all hexagonal planes of the inner surface are tangent to a third virtual sphere, and the centers of the second and third virtual spheres coincide with the center of the first virtual sphere.
  4. The system according to any one of claims 1 to 3, wherein the plurality of hexagonal and pentagonal planes comprises 8 complete regular hexagonal planes, 4 complete regular pentagonal planes, 4 partial regular pentagonal planes, and 4 partial regular hexagonal planes, and the side length of the pentagonal planes is equal to the side length of the hexagonal planes.
  5. The system according to claim 4, wherein each partial regular pentagonal plane is half of a regular pentagonal plane and each partial regular hexagonal plane is half of a regular hexagonal plane.
  6. The system according to any one of claims 1 to 5, wherein the polyhedral target further comprises a marking layer, the marking layer comprises a plurality of sub-marking layers, and each sub-marking layer is formed of a plurality of marker points of different sizes and is attached to the plurality of hexagonal and pentagonal planes.
  7. The system according to claim 6, wherein the center of the sub-marking layer coincides with the center of the hexagonal or pentagonal plane, and at least one side of the sub-marking layer is parallel to one of the sides of the hexagonal or pentagonal plane.
  8. The system according to any one of claims 1 to 5, wherein the marker points are a plurality of holes formed in the inner surface.
  9. The system according to any one of claims 1 to 8, wherein the inner surface is tiled from a plurality of pentagonal and hexagonal metal sheets.
  10. The system according to any one of claims 1 to 9, wherein the fisheye camera is placed at the center of the first virtual sphere to photograph the polyhedral target.
  11. A polyhedral-target-based fisheye camera calibration method, wherein a fisheye camera is calibrated based on the calibration system according to any one of claims 1 to 10, the method comprising:
    acquiring a target image, the target image comprising a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target;
    fitting the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ;
    calculating the radius r max from the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera;
    performing ellipse fitting on the polyhedral target in the target image, and obtaining the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively;
    obtaining, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies;
    optimizing the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
  12. The method according to claim 11, wherein fitting the selected radial distortion model to the equidistant projection model to obtain the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 comprises:
    selecting the radial distortion model r=k 1 θ+k 2 θ 3 +k 3 θ 5 +k 4 θ 7 +k 5 θ 9 and the equidistant projection model r=fθ, where r denotes the distance from a point in the target image to the distortion center, f denotes the focal length of the fisheye camera, and θ denotes the angle between the incident ray and the optical axis of the fisheye camera;
    fitting the distortion model to the equidistant projection model to obtain k 1 =f and k 2 =0, with k 3 , k 4 , k 5 all set to 0.
  13. The method according to claim 11 or 12, wherein calculating the radius r max from the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 comprises:
    calculating the radius r max =f*θ max from k 1 =f, k 2 =0, k 3 , k 4 , k 5 all set to 0, and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 .
  14. The method according to any one of claims 11 to 13, wherein performing ellipse fitting on the polyhedral target in the target image and obtaining the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max comprises:
    performing ellipse fitting on the polyhedral target in the target image, obtaining the ellipse fitting equation
    Figure PCTCN2019113441-appb-100001
    Figure PCTCN2019113441-appb-100002
    where a is the length of the major axis of the ellipse and b is the length of the minor axis;
    obtaining the boundary points of the polyhedral target in the target image and obtaining the values of a and b;
    obtaining the initial values of m u and m v from the values of a and b, the value of r max , and the formulas m u =a/r max and m v =b/r max .
  15. The method according to any one of claims 11 to 13, further comprising:
    obtaining the initial values of m u and m v from the lens parameters of the fisheye camera.
  16. The method according to any one of claims 11 to 15, wherein obtaining from the polyhedral target the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies comprises:
    obtaining the translation matrix T j and rotation matrix R j of each pentagonal or hexagonal plane relative to the world coordinate system from the relations that the vertices of the polyhedral target lie on the first virtual sphere, that the distance from the center of the first virtual sphere to each vertex is equal, and that the line from the center of the first virtual sphere to the center of each pentagonal or hexagonal plane is perpendicular to that plane;
    obtaining the coordinates of each marker point relative to the world coordinate system from the translation matrix T j and rotation matrix R j of each pentagonal or hexagonal plane relative to the world coordinate system, and from the relations that the center of the marking layer coincides with the center of the pentagonal or hexagonal plane and that one side of the marking layer is parallel to one side of that plane.
  17. The method according to any one of claims 11 to 16, wherein optimizing the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera comprises:
    minimizing, with the Levenberg-Marquardt algorithm, the sum of squared differences between the projection points of the points on the polyhedral target and the measurement points of the points on the polyhedral target.
  18. A polyhedral-target-based fisheye camera calibration apparatus, wherein the polyhedral target comprises an inner surface and a plurality of marker points arranged on the inner surface, the apparatus comprising:
    an image acquisition module for acquiring a target image, the target image comprising a polyhedral target and images of a plurality of marker points arranged on the inner surface of the polyhedral target;
    a camera calibration module for fitting the selected radial distortion model to the equidistant projection model according to the target image, obtaining the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 ;
    the camera calibration module being further configured to calculate the radius r max from the initial values of the distortion parameters k 1 , k 2 , k 3 , k 4 , k 5 and the formula r max =k 1 θ max +k 2 θ max 3 +k 3 θ max 5 +k 4 θ max 7 +k 5 θ max 9 , where θ max denotes the maximum field angle of the fisheye camera;
    the camera calibration module being further configured to perform ellipse fitting on the polyhedral target in the target image and to obtain the initial values of u 0 , v 0 , m u , m v from the fitted ellipse and the value of r max , where (u 0 , v 0 ) is the principal point, and m u and m v are the numbers of pixels per unit distance in the horizontal and vertical directions of the image coordinates, respectively;
    the camera calibration module being further configured to obtain, from the polyhedral target, the translation matrix T j and rotation matrix R j of the pentagonal or hexagonal plane on which each marker point lies; and
    a numerical optimization module for optimizing the initial values of k 1 , k 2 , k 3 , k 4 , k 5 , u 0 , v 0 , m u , m v , T j , R j using the Levenberg-Marquardt algorithm to determine the imaging model parameters of the fisheye camera.
  19. An electronic device, comprising:
    one or more processors;
    a memory electrically connected to the one or more processors;
    one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more application programs being configured to perform the method according to any one of claims 11 to 17.
  20. A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, the program code being invocable by a processor to perform the method according to any one of claims 11 to 17.
PCT/CN2019/113441 2019-05-22 2019-10-25 Fisheye camera calibration system, method and apparatus, electronic device, and storage medium WO2020232971A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19929895.1A EP3944194B1 (en) 2019-05-22 2019-10-25 Fisheye camera calibration system, method and apparatus, and electronic device and storage medium
US17/505,658 US11380016B2 (en) 2019-05-22 2021-10-20 Fisheye camera calibration system, method and electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910431510.7 2019-05-22
CN201920745752.9U CN209821888U (zh) 2019-05-22 2019-05-22 Target and system for fisheye camera calibration
CN201910431510.7A CN110163922B (zh) 2019-05-22 2019-05-22 Fisheye camera calibration system, method and apparatus, electronic device, and storage medium
CN201920745752.9 2019-05-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/505,658 Continuation-In-Part US11380016B2 (en) 2019-05-22 2021-10-20 Fisheye camera calibration system, method and electronic device

Publications (1)

Publication Number Publication Date
WO2020232971A1 true WO2020232971A1 (zh) 2020-11-26

Family

ID=73458954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/113441 WO2020232971A1 (zh) 2019-05-22 2019-10-25 鱼眼相机标定***、方法、装置、电子设备及存储介质

Country Status (3)

Country Link
US (1) US11380016B2 (zh)
EP (1) EP3944194B1 (zh)
WO (1) WO2020232971A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112858328A (zh) * 2020-12-31 2021-05-28 青岛大学 Method, apparatus, device and storage medium for detecting the exposed copper area of electrical connectors

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN111507924B (zh) * 2020-04-27 2023-09-29 北京百度网讯科技有限公司 Video frame processing method and device
CN115170665B (zh) * 2022-07-08 2023-08-01 北京航空航天大学 Image-based method and system for determining the pose of a spherical object
CN114897953B (zh) * 2022-07-14 2022-09-23 青岛环海海洋工程勘察研究院有限责任公司 Method for evaluating the consistency of above-water and underwater point clouds based on collinear connection of multiple targets
CN117911294B (zh) * 2024-03-18 2024-05-31 浙江托普云农科技股份有限公司 Vision-based method, system and device for correcting surface images of corn ears

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102096923A (zh) * 2011-01-20 2011-06-15 上海杰图软件技术有限公司 Fisheye calibration method and device
US20150346471A1 (en) * 2014-05-27 2015-12-03 Carl Zeiss Meditec Ag Method for the image-based calibration of multi-camera systems with adjustable focus and/or zoom
CN106548477A (zh) * 2017-01-24 2017-03-29 长沙全度影像科技有限公司 Multi-channel fisheye camera calibration device and method based on a stereoscopic calibration target
CN108592953A (zh) * 2018-06-29 2018-09-28 易思维(杭州)科技有限公司 Stereoscopic calibration target and method of applying it to locate a measured object in vision measurement
CN110163922A (zh) * 2019-05-22 2019-08-23 四川深瑞视科技有限公司 Fisheye camera calibration system, method and apparatus, electronic device, and storage medium

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
CN202736126U (zh) 2012-08-22 2013-02-13 云南大学 Regular dodecahedron calibration block for solving camera intrinsic parameters
CN103035007B (zh) 2012-12-14 2015-07-01 云南大学 Solving camera intrinsic parameters using a regular hexagonal frustum
CN102982550B (zh) 2012-12-14 2016-01-06 云南大学 Solving camera intrinsic parameters using a regular pentagonal frustum
EP2860699A1 (en) 2013-10-11 2015-04-15 Telefonaktiebolaget L M Ericsson (Publ) Technique for view synthesis
US9894350B2 (en) * 2015-02-24 2018-02-13 Nextvr Inc. Methods and apparatus related to capturing and/or rendering images
CN106815866B (zh) 2015-11-30 2020-04-03 宁波舜宇光电信息有限公司 Fisheye camera calibration method, calibration system, and target board
KR102506480B1 (ko) 2016-06-14 2023-03-07 삼성전자주식회사 Image processing apparatus and image processing method thereof
CN106846415B (zh) 2017-01-24 2019-09-20 长沙全度影像科技有限公司 Binocular calibration device and method for multi-channel fisheye cameras
CN207663479U (zh) 2017-12-28 2018-07-27 深圳市东一视讯科技有限公司 Calibration light box for dual-lens panoramic cameras
CN208313322U (zh) 2018-06-29 2019-01-01 易思维(杭州)科技有限公司 Stereoscopic calibration target
CN109242915A (zh) 2018-09-29 2019-01-18 合肥工业大学 Multi-camera system calibration method based on a multi-faceted stereoscopic target

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN102096923A (zh) * 2011-01-20 2011-06-15 上海杰图软件技术有限公司 Fisheye calibration method and device
US20150346471A1 (en) * 2014-05-27 2015-12-03 Carl Zeiss Meditec Ag Method for the image-based calibration of multi-camera systems with adjustable focus and/or zoom
CN106548477A (zh) * 2017-01-24 2017-03-29 长沙全度影像科技有限公司 一种基于立体标定靶的多路鱼眼相机标定装置及方法
CN108592953A (zh) * 2018-06-29 2018-09-28 易思维(杭州)科技有限公司 立体标定靶及将其应用于视觉测量中定位被测物的方法
CN110163922A (zh) * 2019-05-22 2019-08-23 四川深瑞视科技有限公司 鱼眼相机标定***、方法、装置、电子设备及存储介质

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GAO, Xiuli et al.: "Fast fisheye camera calibration method using stereoscopic calibration board", Journal of Harbin Engineering University, vol. 37, no. 11, 30 November 2016 (2016-11-30) *
See also references of EP3944194A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112858328A (zh) * 2020-12-31 2021-05-28 青岛大学 Method, device, equipment and storage medium for detecting the exposed copper area of an electrical connector
CN112858328B (zh) * 2020-12-31 2022-11-08 青岛大学 Method, device, equipment and storage medium for detecting the exposed copper area of an electrical connector

Also Published As

Publication number Publication date
EP3944194B1 (en) 2023-02-22
US11380016B2 (en) 2022-07-05
EP3944194A1 (en) 2022-01-26
EP3944194A4 (en) 2022-06-22
US20220044443A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
WO2020232971A1 (zh) Fisheye camera calibration system, method, device, electronic apparatus, and storage medium
CN111857329B (zh) Gaze point calculation method, device and apparatus
CN110809786B (zh) Calibration device, calibration chart, chart pattern generation device, and calibration method
CN108122191B (zh) Method and device for stitching fisheye images into panoramic images and panoramic videos
CN110136207B (zh) Fisheye camera calibration system, method, device, electronic apparatus, and storage medium
WO2018153374A1 (zh) Camera calibration
US11514608B2 (en) Fisheye camera calibration system, method and electronic device
CN108257183A (zh) Camera lens optical axis calibration method and device
CN106952219B (zh) Image generation method for correcting a fisheye camera based on extrinsic parameters
CN110490943B (zh) Fast and accurate calibration method, system and storage medium for a 4D holographic capture system
WO2023273108A1 (zh) Monocular distance measurement method and device, and intelligent device
CN106886976B (zh) Image generation method for correcting a fisheye camera based on intrinsic parameters
CN108846796A (zh) Image stitching method and electronic device
CN111383264A (zh) Positioning method, device, terminal and computer storage medium
CN110163922B (zh) Fisheye camera calibration system, method, device, electronic apparatus, and storage medium
CN114004890B (zh) Pose determination method, device, electronic apparatus and storage medium
CN113259642B (zh) Film viewing-angle adjustment method and system
CN114511447A (zh) Image processing method, device, equipment and computer storage medium
TWI705292B (zh) Method for determining the assembly quality of a camera module
CN210986289U (zh) Four-lens fisheye camera and two-lens fisheye camera
CN111353945B (zh) Fisheye image correction method, device and storage medium
TWM594322U (zh) Camera configuration system for omnidirectional stereo vision
CN111131689B (zh) Panoramic image inpainting method and system
JP2006338167A (ja) Image data creation method
CN111080713B (zh) Camera calibration system and method

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
  Ref document number: 19929895; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
  Ref document number: 2019929895; Country of ref document: EP; Effective date: 20211021
NENP Non-entry into the national phase
  Ref country code: DE