WO2014106303A1 - Panoramic lens calibration for panoramic image and/or video capture apparatus - Google Patents

Panoramic lens calibration for panoramic image and/or video capture apparatus

Info

Publication number
WO2014106303A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
panoramic
panoramic lens
calibration
image sensor
Prior art date
Application number
PCT/CA2014/050006
Other languages
French (fr)
Inventor
Jerome CARRETERO
Dongxu Li
Youcef RAHAL
Original Assignee
Tamaggo Inc.
Priority date
Filing date
Publication date
Application filed by Tamaggo Inc. filed Critical Tamaggo Inc.
Publication of WO2014106303A1 publication Critical patent/WO2014106303A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B43/00Testing correct operation of photographic apparatus or parts thereof
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe

Definitions

  • This invention relates to panoramic image capture apparatus, and in particular to systems and methods for calibrating a panoramic fish-eye lens during assembly into a panoramic imaging digital camera.
  • Panoramic image capture as used herein is intended to include both still and moving image capture of an image field having a field of view of more than about 2π steradians (a hemisphere), typically wider than about 200°, i.e. more than about 100° from the zenith (the optical axis).
  • An imaging apparatus with such a fish-eye lens is able to take so-called “360° pictures” about the optical axis.
  • the output image is round.
  • a 360° camera can provide typical panoramic pictures, but also an immersive experience such as but not limited to: virtual tourism, real estate sales virtual touring, surveillance, immersive conferencing, etc.
  • the proposed solution concerns panoramic image capture, camera and/or lens technology, irrespective of the eventual application of the technology.
  • a panomorph fish-eye (non-catadioptric) lens is designed to maximize object space resolution by control of distortion and optimization of the imaging sensor surface. Such a lens projects the incident field of view by design to cover the largest projected area on a rectangular image sensor, wherein the output image field is obround.
  • Figure 1 illustrates such a prior art panoramic lens provided by ImmerVision mapping the circular virtual image, while focusing, onto an ellipse on the non-unity aspect ratio image sensor. In this way the field of vision is projected as an ellipse on the image sensor array.
  • Panoramic lenses can be tuned to have a very large depth of field, meaning that imaged objects between infinity and a distance very close to the camera appear in focus. In this case the back-focus is very shallow. Because the back-focus is shallow (a few micrometers) and because the projected field of view is entirely contained within the imaging sensor area, aligning the lens focal plane with the imaging sensor surface is a non-trivial task, especially when it comes to mass production of panoramic lenses at competitive prices. Additional manufacturing imperfections, such as but not limited to a lack of symmetry about the optical axis, need to be addressed in order to obtain good image resolution.
  • Summary: Image quality in a panoramic camera employing a panoramic lens can be boosted by performing a lens-by-lens calibration on production lenses.
  • a method of calibrating a panoramic lens comprising: providing a panoramic lens; mounting the panoramic lens in a positioning stage; providing a printed circuit board having a non-unity aspect image sensor; generally aligning a longitudinal axis of the panoramic lens with a normal direction of an acquisition surface of the image sensor; and repeatedly moving the positioning stage in accordance with a search algorithm.
  • a calibration method wherein said positioning stage adjusts the distance between the panoramic lens and the imaging sensor, said search algorithm including a z-axis sweep.
  • a calibration method wherein said positioning stage adjusts six degrees of freedom between the panoramic lens and the imaging sensor, said search algorithm including an all degrees of freedom search.
  • a method for calibrating wide angle surround lens comprising: providing a wide angle surround lens; mounting the surround lens in a positioning stage; providing a printed circuit board having an image sensor; providing at least one target; generally aligning a longitudinal axis of the surround lens with a normal direction of an acquisition surface of the image sensor; and repeatedly moving the positioning stage in at least one position or orientation direction according to target assessment values.
  • said positioning stage adjusts the distance between the surround lens and the imaging sensor, said repeated moving including a z-axis sweep.
  • said positioning stage adjusts six degrees of freedom between the surround lens and the imaging sensor, said repeated moving including multiple degrees of freedom sweep.
  • said targets comprising at least three targets away from an optical axis of said surround lens.
  • said targets comprise a frequency sweep pattern, preferably a sinusoidal frequency pattern, preferably circular in shape.
  • said targets comprise warped target images that project onto a square image object on said image sensor.
  • a method wherein said surround lens is panomorphic.
  • a method of manufacturing a panoramic camera comprising: mounting an image sensor onto a circuit board; calibrating a wide angle surround lens; affixing the surround lens to the image sensor.
  • Figure 1 is a schematic diagram illustrating a prior art panoramic lens
  • Figure 2 is a schematic diagram illustrating an exploded side view of a panoramic imaging assembly of a panoramic camera, in accordance with an implementation of the proposed solution
  • Figure 3 is another schematic diagram illustrating an exploded perspective view of a panoramic imaging assembly, in accordance with the implementation of the proposed solution
  • Figure 4 is a schematic diagram showing a definition of six degrees of freedom employed in accordance with the proposed solution
  • Figure 5 is a schematic diagram illustrating a panoramic lens mounted for calibration during assembly into a panoramic camera, in accordance with an implementation of the proposed solution
  • Figure 6A is a schematic diagram illustrating a calibration enclosure and example locations of calibration targets, in accordance with an implementation of the proposed solution
  • Figure 6B is a schematic diagram illustrating a distortion of the field of view as imaged by the imaging sensor
  • Figure 7 is a schematic diagram illustrating a generic focus assessment process assessing lens resolution
  • Figure 8 is a schematic diagram illustrating a data processing flow of an iterative focus assessment process for a panoramic lens in accordance with the proposed solution
  • Figure 9 illustrates the USAF-1951 pattern typically employed in visual quality assessment
  • Figures 10A and 10B are schematic diagrams illustrating white noise and the Fourier transform of white noise
  • Figures 11A and 11B are schematic diagrams illustrating blurred white noise and the Fourier transform of blurred white noise
  • Figures 12A and 12B are schematic diagrams illustrating a Fourier transform of a pattern and 1D sampling among 16 averaged directions with regression results superposed;
  • Figure 13A is a schematic diagram illustrating a digitized calibration pattern showing Moire 2D pattern
  • Figure 13B is a schematic diagram illustrating digitized calibration patterns in the field of view of a calibrated panoramic lens showing Moire 2D patterns
  • Figures 14A, 14B and 14C are schematic diagrams illustrating examples of calibration targets respectively employed on the X, Y and Z axes, in accordance with an implementation of the proposed solution;
  • Figures 14D and 14E are schematic diagrams illustrating examples of auxiliary calibration targets employed to the left and right of the X calibration target in accordance with another implementation of the proposed solution;
  • Figure 15A is a schematic diagram illustrating an imaged calibration ovoid in accordance with an implementation of the proposed solution
  • Figure 15B is a schematic diagram illustrating an imaged calibration ovoid with MTF values at good focus, in accordance with the implementation of the proposed solution
  • Figure 16 is a schematic diagram illustrating an imaged calibration ovoid for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution, edge detection and the USAF-1951 pattern;
  • Figure 17 is a schematic diagram illustrating an imaged calibration ovoid for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution and the corresponding measured metric values;
  • Figures 18A and 18B are schematic diagrams graphically illustrating calibration pattern detection in accordance with an embodiment of the proposed solution
  • Figure 19 is a schematic diagram illustrating a calibration pattern detected inside a blob in accordance with the embodiment of the proposed solution
  • Figure 20 is a schematic diagram illustrating a Fourier transform of the detected calibration pattern of Figure 19 in accordance with the embodiment of the proposed solution
  • Figure 21A is a schematic diagram illustrating directions for frequency decay examination in accordance with an example of the implementation of the proposed solution
  • Figure 21B is a schematic diagram illustrating frequency decay examination directions in accordance with an example of the implementation of the proposed solution
  • Figure 22 is a schematic diagram illustrating an overview of a process of panoramic lens calibration in accordance with an embodiment of the proposed solution
  • Figure 23A is a schematic diagram illustrating an overview of an ellipse detection process for panoramic lens calibration in accordance with an implementation of the proposed solution
  • Figure 23B is a schematic plot illustrating binning of pixels in accordance with the proposed solution.
  • Figure 23C is a schematic diagram illustrating another overview of an ellipse detection process for panoramic lens calibration in accordance with an implementation of the proposed solution
  • Figure 24 is a schematic diagram illustrating a process of panoramic lens calibration adjustment in accordance with an implementation of the proposed solution
  • Figure 25 is a schematic diagram illustrating a process of panoramic lens calibration adjustment in accordance with another implementation of the proposed solution
  • Figure 26 is a schematic diagram illustrating a process of panoramic lens calibration in accordance with another embodiment of the proposed solution.
  • Figure 27 is a schematic diagram illustrating an iterative process of panoramic lens calibration in accordance with yet another aspect of the proposed solution.
  • Figures 2 and 3 respectively illustrate exploded side and perspective views of a panoramic imaging assembly.
  • components of the panoramic imaging assembly include a panoramic lens block, a sleeve, a printed circuit board having an imaging sensor, a heat dissipation/vibration dampening pad, a support bracket and tensioning/vibration dissipating elements mechanically biasing the Printed Circuit Board (PCB) against the support bracket.
  • the support bracket is metallic and can be further configured to further dissipate heat away from the imaging sensor PCB.
  • an Infra-Red (IR) filter can be used to attenuate incident IR radiation.
  • the panoramic lens has a wide depth of field, from infinity to a few inches away from the panoramic lens. Alignment with the imaging sensor plane has tight error tolerances, as the imaging sensor light sensitive elements are measured in nanometers. Because the camera back-focus is very shallow, and because the field of view is projected inside the rectangular area of the sensor, there is a need to fine-tune (calibrate) the imaging sensor position relative to the panoramic lens during manufacturing/assembly.
  • Imaged objects at a distance far from the panoramic lens will always be in focus if the optical focus plane of the panoramic lens and the physical plane of imaging sensor coincide.
  • the focal surface of the panoramic lens is non-planar (curved). Therefore the position and orientation of the imaging sensor with respect to the panoramic lens needs to be optimized, including but not limited to centering the field of view with respect to the imaging sensor.
  • the non-unity aspect ratio of the imaging sensor breaks axial symmetry, which brings additional challenges. Satisfactory lens resolution can be achieved within a few microns of adjustability between the image sensor and the panoramic lens.
  • an assembly method was elected that uses a cement/binder/adhesive to fix the relative position and orientation of the imaging sensor with respect to the panoramic lens block.
  • the imaging sensor attachment to the PCB can incur errors in all six degrees of freedom, x-y-z translational (position) errors and skew rotations about the x-y-z axes also known respectively as u-v-w displacement angles defining orientation.
  • the definitions of the six degrees of freedom are provided in Figure 4.
  • the optical axis is the z-axis, while w is the rotation about the z-axis.
  • x and y extend laterally, forward (and backwards) from the panoramic camera.
  • the panoramic lens block unit is manufactured within manufacturing tolerances not necessarily complementary to a particular imaging sensor installed on a particular PCB.
  • the sleeve is a priori fixedly attached to the PCB with weak xy alignment (±0.05 mm) using positioning holes. After assembly the PCB and sleeve do not move with respect to one another and can be considered a single mechanical entity.
  • An example of a panoramic lens employed is designed to be affixed via adhesive to the sleeve. Before assembly, there is no physical link between sleeve and panoramic lens, therefore there are 6 Degrees of Freedom (DoF), see Figure 4, to be fixed.
  • DoF Degrees of Freedom
  • the assembly of the panoramic lens into a panoramic camera includes: aligning the imaging sensor with respect to the plane of the panoramic lens using a positioning stage, applying a small quantity of adhesive while the imaging sensor is held in a determined position and orientation, waiting for the small quantity of adhesive to cure, and removing the assembled panoramic lens and image sensor combination from the positioning stage. After removal, sufficient adhesive can be applied for the necessary fixed engagement. Adhesive can be applied by a robotic arm, for example. Alignment includes a calibration process employing a quality control step which can accept or reject the panoramic lens block.
  • Figure 5 illustrates a COTS (Commercial Off-The-Shelf) six degrees of freedom positioning stage having an accuracy/repeatability in the 0.1 μm range.
  • the panoramic lens is illustrated at the center of a calibration cube enclosure of 200cm side length, atop the positioning stage pointing up in the vertical direction. Data cables for conveying an image acquired by the imaging sensor are also visible.
  • Figure 5 also illustrates one example of three calibration targets projected on the calibration enclosure.
  • the calibration enclosure and calibration targets provide a simulated, well-known surrounding allowing assessment of panoramic lens focus via imaging sensor output data.
  • the calibration procedure performs mechanical adjustment of the imaging sensor PCB with respect to the panoramic lens.
  • Figure 6A illustrates a schematic example of a calibration enclosure employed to display the calibration targets thereon.
  • the preferred calibration enclosure is substantially cubic, as it can be set up easily by ensuring equal diagonal measurements.
  • the invention is not limited to the illustrated cubic enclosure.
  • the calibration enclosure can be spherical, spheroidal, pill shape, cylindrical, etc.
  • the panoramic lens maps the incident solid angle onto an ellipse on the non-unity aspect imaging sensor
  • different x-y-z directions have different resolutions: the x and y directions map onto the longitudinal and transverse axes of the image sensor while the z direction maps in the middle of the image sensor.
  • the mapping incurs distortion for example as illustrated in Figure 6B.
  • Alignment consists of setting the relative placement of the imaging sensor PCB along the 6 DoF's with respect to the panoramic lens, the imaging sensor being placed in a jig on a calibration bench under the panoramic lens.
  • Figure 7 illustrates a generic focus assessment process performed by assessing a lens resolution via an acquired image in accordance with the proposed solution.
  • the lens resolution can be maximized by adjusting the relative placement (position and orientation) by maximizing minimum performance values obtained for each calibration target.
  • the lens resolution of a group of zones within the FOV projection of the acquired image is assessed.
  • Figure 8 illustrates a data processing flow of an iterative focus assessment process for a panoramic lens in accordance with the proposed solution. If the calibrated position and orientation does not allow for a certain quality above a threshold, the panoramic lens is rejected.
  • MTF Modulation Transfer Function
  • Figure 7 is presented to help visualize a one-dimensional MTF.
  • a pattern is provided having a progression of frequencies in a range, hereafter called a 1D frequency sweep.
  • the frequency sweep is a printed sinusoidal frequency sweep. It is noted that the printed pattern displays high frequency content.
  • the output of the lens has high contrast at low frequencies and low contrast at high frequencies.
  • the output of the lens can be captured and digitized by an imaging sensor of a certain resolution.
  • Assessment includes plotting pixel values of the input and output images.
  • the first plot (of the input) has constant amplitude, showing clean sinusoidal sampling at low frequencies and a moiré pattern at high frequencies due to limited sampling (the sampling resolution being lower than the pattern print resolution).
  • the amplitude of the output is modulated by the MTF.
  • the envelope of the output amplitude plot divided by the envelope of the first plot represents the MTF.
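The envelope-ratio computation can be sketched numerically as below; this is an illustrative simulation, not the patented implementation — the Gaussian point-spread function and all constants are assumptions:

```python
import numpy as np

def envelope(signal):
    """Amplitude envelope via an FFT-based Hilbert transform (zero-mean input)."""
    n = len(signal)
    spec = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

# 1D sinusoidal frequency sweep: instantaneous frequency rises from f0 to f1
n = 4000
f0, f1 = 0.002, 0.15                        # cycles per pixel
f_inst = f0 + (f1 - f0) * np.arange(n) / n
chirp = np.sin(2 * np.pi * np.cumsum(f_inst))

# Simulated lens blur: Gaussian point-spread function
sigma = 3.0
k = np.arange(-15, 16)
psf = np.exp(-k**2 / (2 * sigma**2))
psf /= psf.sum()
blurred = np.convolve(chirp, psf, mode="same")

# MTF estimate: envelope of the output divided by envelope of the input
mtf = envelope(blurred) / np.maximum(envelope(chirp), 1e-9)
```

The estimate stays near 1 at the low-frequency end of the sweep and decays toward 0 at the high-frequency end, mirroring the behavior described above.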
  • the invention is not limited to a constant amplitude input pattern.
  • the lens MTF decreases with increasing frequency.
  • the MTF is an indicator of the lens resolution and varies with the radial distance away from the optical axis of the lens.
  • the frequency at which the MTF is 50% of its maximum value is expressed in line pairs on the imaging sensor area per millimeter (MTF50).
  • the lines/mm value is representative of the smallest line visible at a certain distance away, for example 10in.
  • An approximation of the MTF50 was chosen as an indicator of lens resolution. There is little benefit in using a sensor with a resolution much higher than the lens resolution; the minimum imaging sensor resolution which matches the lens resolution is called the effective resolution of the camera.
  • the panoramic lens employed with a 14Mpx sensor has a magnification factor such that the smallest region 1 m away corresponding to an image sensor pixel is 2mm by 2mm. As such, a larger calibration enclosure would not improve the image sensor positioning.
  • the imaged field on the imaging sensor is two dimensional (2D). It was found that panoramic lenses often have local defects which are directional and it is important to measure the MTF (obtain resolution scores) in multiple directions. Therefore lens resolution is examined in many directions.
  • a template of resolution thresholds (for multiple directions) is used to accept or reject a lens block. (The calibration procedure and QA test provide a lens resolution template "MTF50 figure" so as to accept or reject a production lens block.)
  • the prior art USAF-1951 calibration target pattern illustrated in Figure 9 can be employed for horizontal and vertical resolution score measurements only, and its use relies on subjective assessment; interpretation of the USAF-1951 calibration target pattern was found to vary widely among individuals.
  • a measurement method and pattern using Fourier transform measurement is employed instead of a direct spatial measurement.
  • Figures 10A and 10B illustrate white noise and the Fourier transform of white noise
  • Figures 11A and 11B illustrate blurred white noise and the Fourier transform of blurred white noise.
  • the blur alters high frequencies of the Fourier transform, and the effect on the Fourier transform in Figure 11B is more apparent than on the spatial image of Figure 11A.
  • MTF50 ≈ α · ν_c, with α close to 1 and depending on the pattern.
  • The resolution score is an approximation to MTF50. It was found that a calibration target having a circular sine frequency sweep pattern is advantageous.
  • The pattern intensity can be written as P(r) = 0.5 + 0.5·sin(2π·(c/2)·r·(f₀ + r·(f₁ − f₀))), with c being the square side, r the normalized radius from image center to pixel position, and f₀ and f₁ bounding the swept frequency range.
  • Figures 14A, 14B and 14C illustrate examples of calibration targets respectively employed on the x, y and z axes, in accordance with the implementation of the proposed solution employing a cubic calibration enclosure.
  • the calibration targets are drawn so as to look like squares on the image sensor plane, wherein reverse optical mapping is employed. Employing a square captured shape greatly simplifies computation when performing a 2D Fast Fourier Transform (FFT) on the acquired pattern of pixels.
  • Figures 14D and 14E illustrate examples of auxiliary calibration targets employed on the left and right of the x calibration targets in accordance with another implementation of the proposed solution.
  • Figure 15A illustrates an acquired calibration image obtained through the panoramic lens. The calibration targets are found in big search zones by blob detection, and a wide square area is selected in each search zone.
  • Figure 15B illustrates a well focused panoramic lens. Focus measures/resolution scores for each calibration pattern are also illustrated in each blob.
  • the invention is not limited to employing the circular sinusoidal frequency sweep pattern exclusively. Square high contrast patterns can be employed with edge detection. Employing a radial pattern is also illustrated.
  • Figure 16 illustrates an imaged calibration ellipse for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution, edge detection and the USAF-1951 calibration target pattern.
  • Figure 17 illustrates another imaged calibration ovoid for a calibration enclosure employing a large number of circular sinusoidal frequency sweep patterns in concentric distributions and the corresponding measured resolution score metric values.
  • the invention does not require the use of an entire calibration box with a full complement of calibration targets; based on symmetry considerations, a quarter of the calibration box would suffice.
  • Generation of the calibration patterns to be printed out is performed from the desired shape on the elliptic image.
  • the (x, y) position of the imaged field is converted into an x, y position on the imaging sensor frame, itself converted to incidence angles into the panoramic lens using the lens mapping function provided by the manufacturer.
  • the incidence angle corresponds to a position on the calibration enclosure where the corresponding calibration target pixel is located.
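The conversion from a sensor position to a point on the calibration enclosure can be sketched under an assumed equidistant fisheye model r = f·θ standing in for the manufacturer's lens mapping function (a panomorph lens follows a different, proprietary mapping):

```python
import numpy as np

def sensor_to_cube(px, py, f=1.0, half=1.0):
    """Map a sensor position (px, py) to a point on a cubic calibration
    enclosure with faces at +/- half, assuming an equidistant fisheye
    model r = f * theta (a stand-in for the real lens mapping function)."""
    r = np.hypot(px, py)
    theta = r / f                      # incidence angle from the optical axis
    phi = np.arctan2(py, px)           # azimuth around the optical axis
    d = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])      # unit ray into the scene
    t = half / np.max(np.abs(d))       # scale so the ray hits a cube face
    return d * t
```

The returned point is where the pixel's ray intersects the enclosure, i.e. where the corresponding calibration target pixel should be located.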
  • Determination of a well acquired calibration pattern can be done by applying the same level of blur to the calibration pattern and to a linear frequency sweep pattern, followed by measuring the resolution score using the direct spatial measurement method and the alternate method. Ideally the measurements should coincide. Real-life applicability is important; that is, the metric algorithm needs to be applicable to experimental data as well as to simulated data.
  • the positioning stage for example the hexapod illustrated in Figure 5, is moved iteratively from its last position based on a metric improvement algorithm.
  • the motion can be prioritized in the following order: u, v, w, x, y, z.
  • the imaging sensor is configured to send raw data at native resolution.
  • the received data follows the sensor geometry and the data can be converted to luminance values as a preprocessing step, for example as the calibration patterns are gray scale.
  • a sheet of white paper at a known location in the calibration enclosure (for example on the ceiling) provides values necessary for white balance calibration employed for a linear correction to obtain luminance values from the raw data without loss of fidelity.
  • the imaging sensor employs a Bayer filter geometry with R/Gr/Gb/B 2x2 pixel matrix elements. The luminance value is obtained by converting each of the 4 components into luminance values.
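A minimal sketch of the raw-to-luminance preprocessing, assuming Rec.601-style luma weights and white-balance gains measured from the known white patch (the patent does not specify the exact weights):

```python
import numpy as np

def bayer_to_luma(raw, wb_gains):
    """Collapse an R/Gr/Gb/B Bayer mosaic into one luminance value per 2x2
    cell. wb_gains = (gR, gG, gB), e.g. measured from a known white patch.
    The 0.299/0.587/0.114 weights are an assumption (Rec.601 luma)."""
    R  = raw[0::2, 0::2] * wb_gains[0]
    Gr = raw[0::2, 1::2] * wb_gains[1]
    Gb = raw[1::2, 0::2] * wb_gains[1]
    B  = raw[1::2, 1::2] * wb_gains[2]
    # Average the two green samples of each 2x2 cell, then weight the channels
    return 0.299 * R + 0.587 * 0.5 * (Gr + Gb) + 0.114 * B
```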
  • the circular sinusoidal frequency sweep pattern is detected by blob detection employing the calibration computer executing blob detection executable logic instructions.
  • Figures 18A and 18B graphically illustrate calibration pattern detection. Then a square calibration pattern image is found as illustrated in Figure 19 inside the blob and the Fourier transform of the square pattern found is computed. The corresponding Fourier transform of the square calibration pattern illustrated in Figure 19 is illustrated in Figure 20.
  • the square image is normalized in intensity (before the Fourier transform computation). The frequency decay of the Fourier transform is then examined, in various directions.
  • Figure 21 A illustrates examples of specific directions examined.
  • a least-squares regression is performed with a function consisting of a linear part plus a noise threshold, as illustrated in Figure 21 B.
  • the resolution scores are extracted from the slope of the best fit linear function.
  • the worst resolution score is selected as the metric.
  • a very small Gaussian blur is applied to the square image.
  • the introduction of the Gaussian blur can be controlled so as not to affect the resolution score computation by employing a high cutoff frequency therefor.
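The directional frequency-decay scoring can be sketched as follows; the plain linear fit below omits the noise-floor term of the regression described above, and all names are illustrative:

```python
import numpy as np

def directional_decay_scores(patch, n_dirs=16):
    """Decay slope of the log FFT magnitude along radial directions;
    a more negative slope means faster high-frequency loss, i.e. a
    worse resolution score (simplified: no noise-floor term)."""
    patch = (patch - patch.mean()) / (patch.std() + 1e-9)   # normalize intensity
    mag = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
    cy, cx = mag.shape[0] // 2, mag.shape[1] // 2
    r = np.arange(1, min(cy, cx) - 1)
    scores = []
    for k in range(n_dirs):
        theta = np.pi * k / n_dirs                          # directions over 180 deg
        ys = np.round(cy + r * np.sin(theta)).astype(int)
        xs = np.round(cx + r * np.cos(theta)).astype(int)
        decay = np.log(mag[ys, xs] + 1e-9)
        scores.append(np.polyfit(r, decay, 1)[0])           # slope of linear fit
    return scores
```

The worst (most negative) slope over the examined directions serves as the metric, so directional lens defects are not averaged away.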
  • an imaging sensor positioning search algorithm can be employed for finding the optimum position of the imaging sensor with respect to the panoramic lens in all 6 axes.
  • the imaging sensor is aligned with the FOV projection (output ellipse), more specifically the axes of the imaging sensor are aligned with the axes of the FOV projection (output ellipse) projected on the imaging sensor.
  • the contours of the output projected ellipse are estimated and an elliptic regression can be performed.
  • Ellipse parameters, including center, orientation angle, and long- and short-axis lengths, are provided from the elliptic regression analysis.
  • the w-x-y parameters can be extracted from the elliptic regression parameters. Given a known starting position of the positioning stage, it is possible to determine whether to increase or decrease z based on the ellipse size.
  • an ellipse detection algorithm includes: 1. the image is filtered to eliminate high-frequency noise; 2. binarization of the image is performed in order to obtain contours from the filtered image; 3. an ellipse is fit to the contour pixels identified.
  • the binarization can be performed by histogram separation in the valley between the black and gray peaks, for example by fitting the histogram with 2 bell curves employing an EM (Expectation-Maximization) algorithm; histogram separation can also be performed using clustering algorithms such as k-means, or simply using a fixed threshold.
  • Fitzgibbon, Pilu & Fisher's method for direct least squares fitting of ellipses can be employed, without limiting the invention, for fitting the contour pixels identified.
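The fitting step can be sketched with a simplified algebraic conic fit; this is a stand-in for, not an implementation of, Fitzgibbon, Pilu & Fisher's constrained method:

```python
import numpy as np

def fit_ellipse_center(xs, ys):
    """Least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1
    (a simplified stand-in for the direct constrained ellipse fit);
    returns the center, where the conic gradient vanishes."""
    M = np.column_stack([xs**2, xs * ys, ys**2, xs, ys])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(xs), rcond=None)[0]
    # Gradient zero: [2A B; B 2C] [xc, yc]^T = [-D, -E]^T
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
```

From the fitted conic, the center yields the x-y offsets and the orientation yields w, as used by the adjustment algorithm below.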
  • a panoramic lens calibration adjustment algorithm then includes: 1. detecting the ellipse (without limiting the invention, as detailed herein immediately above); 2. estimating w-x-y parameters from the obtained ellipse parameters: x and y are extracted from the ellipse center, w is the ellipse angle.
  • the positioning stage is commanded to move along w-x-y-z;
  • another panoramic lens calibration adjustment algorithm then includes:
  • another panoramic lens calibration and adjustment algorithm includes locating the focus plane of the imaging sensor employing a constant-step sweep along the z axis.
  • Starting with a good-enough z position, for example one providing 70% of the expected resolution score value on the center target, the algorithm performs a sweep along the z axis, gathering resolution score values for the calibration targets. A regression is performed on the gathered values until sufficient data is collected to determine an outcome of the z traverse phase. The location, along the z axis, of the imaging sensor focal plane is estimated from the z axis positioning stage adjustment which maximizes the resolution score measurement value(s) of all targets. The imaging sensor is then moved to the location along the z axis corresponding to the focal plane.
  • a zuv control loop algorithm includes:
  • 2. Target detection and resolution score extraction; 3. Storage of the current z value and resolution scores; 4. If enough data is available, regression of each score(z) curve with a model curve; currently the model is a triangle with a background noise level, whose parameters are maximum, slope, and noise value.
  • 5. Decide whether to stop data collection or continue the search loop. The stop criteria are that the current position is sufficiently far from the maximum, that the maximum is within certain bounds, and that the slope is above a certain threshold. If the criteria are met, then the optimal position in z, u, v has been found, and step 7 is next.
  • the algorithm includes performing a regression to provide a curve with estimated panoramic lens alignment parameters. For example, such a curve resembles a triangle, with a background noise level.
  • the parameters are a maximum, a slope, and a noise value.
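The triangle-with-noise-floor regression can be sketched as a coarse grid search over apex position and flank slope (the parameterization and grid ranges are illustrative assumptions):

```python
import numpy as np

def triangle_peak(z, scores):
    """Fit score(z) with a symmetric triangle clipped to a noise floor;
    returns the apex z, i.e. the estimated best-focus position.
    Grid ranges are illustrative, not tuned for a real stage."""
    floor = scores.min()
    best_err, best_z0 = np.inf, z[np.argmax(scores)]
    for z0 in np.linspace(z.min(), z.max(), 201):      # candidate apex positions
        for slope in np.linspace(0.5, 50.0, 50):       # candidate flank slopes
            model = np.maximum(scores.max() - slope * np.abs(z - z0), floor)
            err = np.sum((model - scores) ** 2)
            if err < best_err:
                best_err, best_z0 = err, z0
    return best_z0
```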
  • the fact that a positioning stage adjustment is sufficiently far from the maximum, the fact that the maximum is within certain bounds, and the fact that the slope is above a certain threshold are considered part of the iterative process's stop conditions. It is noted that imaging sensor tilt does not impact the imaging sensor position when the positioning stage is moved; only x-y-z need to be found. More specifically and with reference to Figure 27:
  • the z position is determined (roughly): a. u or v movements are performed; b. a new ellipse position is determined; c. the z distance from ideal is adjusted so as to minimize the ellipse position difference (radius); 3. the x/y distances from ideal are determined (ideally 0): a. the ellipse position is checked;
  • another algorithm includes: 1. perform ellipse centering using auto-xy calibration
  • the x (resp. y) eccentricity of the sensor frame vs. the hexapod frame is in a linear relation with the shift in y (resp. x) of the center of the FOV ellipse
  • the invention is not limited to the described calibration targets implemented as plates with printed calibration targets posted on the inner walls of the calibration enclosure.
  • data projectors can be employed with the same enclosure, having projected resolutions better than one pixel per 4 mm².
  • the use of projectors, such as but not limited to pico projectors addressing both size and power dissipation, can enable the use of multiple patterns for an improved determination of the MTF.
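By way of illustration, the z-traverse regression described in the points above — fitting each score(z) curve with a triangle sitting on a background noise level, parameterized by a maximum, a slope and a noise value — might be sketched as follows. This is a hedged Python sketch: the function names, the grid-search fitting and all numeric values are illustrative assumptions, not the patent's implementation.

```python
def triangle_model(z, z_peak, peak, slope, noise):
    """Triangle-shaped score(z) curve sitting on a flat background noise level."""
    return max(peak - slope * abs(z - z_peak), noise)

def fit_triangle(samples, z_grid, slope_grid, noise=0.05):
    """Coarse grid-search regression of (z, score) samples against the
    triangle model; returns the (z_peak, peak, slope) minimizing squared error."""
    best = None
    peak = max(score for _, score in samples)  # use best observed score as apex
    for z_peak in z_grid:
        for slope in slope_grid:
            err = sum((triangle_model(z, z_peak, peak, slope, noise) - s) ** 2
                      for z, s in samples)
            if best is None or err < best[0]:
                best = (err, z_peak, peak, slope)
    return best[1], best[2], best[3]

# Synthetic z sweep with the true apex at z = 1.2 (arbitrary units)
data = [(z / 10.0, triangle_model(z / 10.0, 1.2, 1.0, 0.8, 0.05))
        for z in range(25)]
z_hat, peak_hat, slope_hat = fit_triangle(
    data,
    z_grid=[i / 100.0 for i in range(250)],
    slope_grid=[0.2, 0.4, 0.6, 0.8, 1.0])
# The sensor would then be moved to z_hat, the estimated focal-plane position.
```

The fitted apex z_hat plays the role of the "z axis positioning stage adjustment which maximizes the resolution score" in the text above.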

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

There is provided a method for calibrating a panoramic lens. The method includes: providing a panoramic lens; mounting the panoramic lens in a positioning stage; providing a printed circuit board having a non-unity aspect image sensor; generally aligning a longitudinal axis of the panoramic lens with a normal direction of an acquisition surface of the image sensor; and repeatedly moving the positioning stage in accordance with a search algorithm while acquiring calibration images. The search algorithm employs modulation transfer function derived values to reject or align the image sensor with respect to the panoramic lens.

Description

PANORAMIC LENS CALIBRATION FOR
PANORAMIC IMAGE AND/OR VIDEO CAPTURE APPARATUS
This application claims priority from US Provisional Patent Application 61/749,899, filed 07 January 2013.

Technical Field
This invention relates to panoramic image capture apparatus, and in particular to systems and methods for calibrating a panoramic fish-eye lens during assembly into a panoramic imaging digital camera.
Background
In the digital imaging field, there is a growing need for panoramic image capture.
"Panoramic image capture" as used herein is intended to include both still and moving image capture of an image field having a field of view of more than about 2π steradians (a hemisphere), typically wider than about 200° in total, i.e. about 100° from the zenith (the optical axis). An imaging apparatus with such a fish-eye lens is able to take so-called "360° pictures" about the optical axis. When projected on an imaging sensor, the output image is round. For example, a 360° camera can provide typical panoramic pictures, but also immersive experiences such as, but not limited to: virtual tourism, real estate sales virtual touring, surveillance, immersive conferencing, etc. For the remainder of the description, reference will be made to "panoramic image capture, camera and/or lens" irrespective of the eventual application of the technology.
Current image sensors have a non-unity aspect ratio, being also used for purposes other than panoramic image capture, such as: video capture, three dimensional image/video capture, etc. Employing a conventional fish-eye lens wastes a large number of pixels of an image sensor which has a non-unity aspect ratio. A panomorph fish-eye (non-catadioptric) lens is designed to maximize object space resolution by control of distortion and optimization of the imaging sensor surface. Such a lens projects the incident field of view, by design, to cover the largest projected area on a rectangular image sensor, wherein the output image field is obround. Figure 1 illustrates such a prior art panoramic lens, provided by ImmerVision, mapping the circular virtual image, while focusing, onto an ellipse on the non-unity aspect ratio image sensor. In this way the field of view is projected as an ellipse on the image sensor array.
Panoramic lenses can be tuned to have a very large depth of field, meaning that imaged objects between infinity and a distance very close to the camera appear in focus. In this case the back-focus is very shallow. Because the back-focus is shallow (a few micrometers) and because the projected field of view is entirely contained within the imaging sensor area, aligning the lens focal plane with the imaging sensor surface is a non-trivial task, especially when it comes to mass production of panoramic lenses at competitive prices. Additional manufacturing imperfections, such as but not limited to a lack of symmetry about the optical axis, need to be addressed in order to obtain good image resolution.
Summary

Image quality in a panoramic camera employing a panoramic lens can be boosted by performing a lens-by-lens calibration on production lenses.
In some embodiments of the proposed solution there is provided a method of calibrating a panoramic lens, the method comprising: providing a panoramic lens; mounting the panoramic lens in a positioning stage; providing a printed circuit board having a non-unity aspect image sensor; generally aligning a longitudinal axis of the panoramic lens with a normal direction of an acquisition surface of the image sensor; and repeatedly moving the positioning stage in accordance with a search algorithm. In other embodiments of the proposed solution there is provided a calibration method wherein said positioning stage adjusts the distance between the panoramic lens and the imaging sensor, said search algorithm including a z-axis sweep.
In other embodiments of the proposed solution there is provided a calibration method wherein said positioning stage adjusts six degrees of freedom between the panoramic lens and the imaging sensor, said search algorithm including an all-degrees-of-freedom search.
In accordance with an aspect of the proposed solution there is provided a method for calibrating a wide angle surround lens, the method comprising: providing a wide angle surround lens; mounting the surround lens in a positioning stage; providing a printed circuit board having an image sensor; providing at least one target; generally aligning a longitudinal axis of the surround lens with a normal direction of an acquisition surface of the image sensor; and repeatedly moving the positioning stage in at least one position or orientation direction according to target assessment values. In accordance with another aspect of the proposed solution there is provided a method wherein said positioning stage adjusts the distance between the surround lens and the imaging sensor, said repeated moving including a z-axis sweep.
In accordance with a further aspect of the proposed solution there is provided a method wherein said positioning stage adjusts six degrees of freedom between the surround lens and the imaging sensor, said repeated moving including multiple degrees of freedom sweep.
In accordance with a further aspect of the proposed solution there is provided a method wherein said targets comprise at least three targets away from an optical axis of said surround lens. In accordance with a further aspect of the proposed solution there is provided a method wherein said targets comprise a frequency sweep pattern, preferably a sinusoidal frequency pattern, preferably circular in shape. In accordance with a further aspect of the proposed solution there is provided a method wherein said targets comprise warped target images that project onto a square image object on said image sensor.
In accordance with a further aspect of the proposed solution there is provided a method wherein said image sensor has a non-unity aspect ratio.
In accordance with a further aspect of the proposed solution there is provided a method wherein said surround lens projects an oval image onto said image sensor.
In accordance with a further aspect of the proposed solution there is provided a method wherein said surround lens is panomorphic. In accordance with a further aspect of the proposed solution there is provided a method of manufacturing a panoramic camera comprising: mounting an image sensor onto a circuit board; calibrating a wide angle surround lens; affixing the surround lens to the image sensor.
In accordance with yet another aspect of the proposed solution there is provided a method wherein said image sensor is provided with a sleeve, said affixing comprising curing an adhesive to connect said surround lens to said sleeve.
Brief Description of the Drawings
The invention will be better understood by way of the following detailed description of embodiments of the invention with reference to the appended drawings, in which:
Figure 1 is a schematic diagram illustrating a prior art panoramic lens;
Figure 2 is a schematic diagram illustrating an exploded side view of a panoramic imaging assembly of a panoramic camera, in accordance with an implementation of the proposed solution; Figure 3 is another schematic diagram illustrating an exploded perspective view of a panoramic imaging assembly, in accordance with the implementation of the proposed solution;
Figure 4 is a schematic diagram showing a definition of six degrees of freedom employed in accordance with the proposed solution;
Figure 5 is a schematic diagram illustrating a panoramic lens mounted for calibration during assembly into a panoramic camera, in accordance with an implementation of the proposed solution;
Figure 6A is a schematic diagram illustrating a calibration enclosure and example locations of calibration targets, in accordance with an implementation of the proposed solution;
Figure 6B is a schematic diagram illustrating a distortion of the field of view as imaged by the imaging sensor;
Figure 7 is a schematic diagram illustrating a generic focus assessment process assessing lens resolution;
Figure 8 is a schematic diagram illustrating a data processing flow of an iterative focus assessment process for a panoramic lens in accordance with the proposed solution;
Figure 9 illustrates the USAF-1951 pattern typically employed in visual quality assessment;
Figures 10A and 10B are schematic diagrams illustrating white noise and the Fourier transform of white noise;
Figures 11A and 11B are schematic diagrams illustrating blurred white noise and the Fourier transform of blurred white noise; Figures 12A and 12B are schematic diagrams illustrating a Fourier transform of a pattern and 1D sampling among 16 averaged directions with regression results superposed;
Figure 13A is a schematic diagram illustrating a digitized calibration pattern showing Moire 2D pattern;
Figure 13B is a schematic diagram illustrating digitized calibration patterns in the field of view of a calibrated panoramic lens showing Moire 2D patterns;
Figures 14A, 14B and 14C are schematic diagrams illustrating examples of calibration targets respectively employed on the X, Y and Z axes, in accordance with an implementation of the proposed solution;
Figures 14D and 14E are schematic diagrams illustrating examples of auxiliary calibration targets employed to the left and right of the X calibration target in accordance with another implementation of the proposed solution;
Figure 15A is a schematic diagram illustrating an imaged calibration ovoid in accordance with an implementation of the proposed solution;
Figure 15B is a schematic diagram illustrating an imaged calibration ovoid with MTF values at good focus, in accordance with the implementation of the proposed solution;
Figure 16 is a schematic diagram illustrating an imaged calibration ovoid for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution, edge detection and the USAF-1951 pattern;
Figure 17 is a schematic diagram illustrating an imaged calibration ovoid for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution and the corresponding measured metric values;
Figures 18A and 18B are schematic diagrams graphically illustrating calibration pattern detection in accordance with an embodiment of the proposed solution; Figure 19 is a schematic diagram illustrating a calibration pattern detected inside a blob in accordance with the embodiment of the proposed solution;
Figure 20 is a schematic diagram illustrating a Fourier transform of the detected calibration pattern of Figure 19 in accordance with the embodiment of the proposed solution;
Figure 21A is a schematic diagram illustrating directions for frequency decay examination in accordance with an example of the implementation of the proposed solution;

Figure 21B is a schematic diagram illustrating frequency decay examination directions in accordance with an example of the implementation of the proposed solution;
Figure 22 is a schematic diagram illustrating an overview of a process of panoramic lens calibration in accordance with an embodiment of the proposed solution;
Figure 23A is a schematic diagram illustrating an overview of an ellipse detection process for panoramic lens calibration in accordance with an implementation of the proposed solution;
Figure 23B is a schematic plot illustrating binning of pixels in accordance with the proposed solution;
Figure 23C is a schematic diagram illustrating another overview of an ellipse detection process for panoramic lens calibration in accordance with an implementation of the proposed solution;
Figure 24 is a schematic diagram illustrating a process of panoramic lens calibration adjustment in accordance with an implementation of the proposed solution;
Figure 25 is a schematic diagram illustrating a process of panoramic lens calibration adjustment in accordance with another implementation of the proposed solution; Figure 26 is a schematic diagram illustrating a process of panoramic lens calibration in accordance with another embodiment of the proposed solution; and
Figure 27 is a schematic diagram illustrating an iterative process of panoramic lens calibration in accordance with yet another aspect of the proposed solution.
Detailed Description
In accordance with an implementation of the proposed solution, Figures 2 and 3 respectively illustrate exploded side and perspective views of a panoramic imaging assembly. From left to right in Figure 2, and from front to back in Figure 3, components of the panoramic imaging assembly include a panoramic lens block, a sleeve, a printed circuit board having an imaging sensor, a heat dissipation/vibration dampening pad, a support bracket and tensioning/vibration dissipating elements mechanically biasing the Printed Circuit Board (PCB) against the support bracket. In some implementations the support bracket is metallic and can be further configured to further dissipate heat away from the imaging sensor PCB. Depending on the intended use of the panoramic camera, an Infra-Red (IR) filter can be used to attenuate incident IR radiation.
A person of ordinary skill in the art would understand that component manufacturing and component assembly incur manufacturing and assembly errors respectively. Some manufacturing/assembly errors have loose tolerances, reducing manufacturing costs, while other manufacturing/assembly errors have tight tolerances in order to provide a usable panoramic lens.
As described hereinabove, the panoramic lens has a wide depth of field, from infinity to a few inches away from the panoramic lens. Alignment with the imaging sensor plane has tight error tolerances, as the imaging sensor light sensitive elements are measured in nanometers. Because the camera back-focus is very shallow, and because the field of view is projected inside the rectangular area of the sensor, there is a need to fine-tune (calibrate) the imaging sensor position relative to the panoramic lens during manufacturing/assembly.
Imaged objects at a distance far from the panoramic lens will always be in focus if the optical focal plane of the panoramic lens and the physical plane of the imaging sensor coincide. However, while the imaging sensor surface is planar, the focal surface of the panoramic lens is non-planar (curved). Therefore the position and orientation of the imaging sensor with respect to the panoramic lens need to be optimized, including but not limited to centering the field of view with respect to the imaging sensor. Also, the non-unity aspect ratio of the imaging sensor breaks axial symmetry, which brings additional challenges. Satisfactory lens resolution can be achieved within a few microns of adjustability between the image sensor and the panoramic lens. Once assembled, relative motion between the imaging sensor and the panoramic lens should be negligible (shock-resistant device); otherwise autofocus may be required, with a corresponding increase in manufacturing cost. Because of the difficulty of holding tight mechanical tolerances in the manufacturing of the panoramic lens barrel, the elected assembly method uses a cement/binder/adhesive to fix the relative position and orientation of the imaging sensor with respect to the panoramic lens block.

The imaging sensor attachment to the PCB can incur errors in all six degrees of freedom: x-y-z translational (position) errors, and skew rotations about the x-y-z axes, also known respectively as the u-v-w displacement angles defining orientation. The definitions of the six degrees of freedom are provided in Figure 4. The optical axis is the z-axis, while w is the rotation about the z-axis; x and y extend laterally, forward (and backwards) from the panoramic camera. The panoramic lens block unit is manufactured within manufacturing tolerances not necessarily complementary to a particular imaging sensor installed on a particular PCB. The sleeve is a priori fixedly attached to the PCB with weak x-y alignment (~0.05 mm) using positioning holes.
After assembly the PCB and sleeve do not move with respect to one another and can be considered a single mechanical entity. An example of a panoramic lens employed is designed to be affixed via adhesive to the sleeve. Before assembly, there is no physical link between sleeve and panoramic lens, therefore there are 6 Degrees of Freedom (DoF), see Figure 4, to be fixed.
In accordance with the proposed solution, the assembly of the panoramic lens into a panoramic camera includes: aligning the imaging sensor with respect to the plane of the panoramic lens using a positioning stage, applying a small quantity of adhesive while the imaging sensor is held in a determined position and orientation, waiting for small quantity of adhesive to cure and removing the assembled panoramic lens and image sensor combination from the positioning stage. After removal, sufficient adhesive can be applied for the necessary fixed engagement. Adhesive can be applied by a robotic arm for example. Alignment includes a calibration process employing a quality control step which can accept or reject the panoramic lens block.
Figure 5 illustrates a COTS six-degrees-of-freedom positioning stage having an accuracy/repeatability in the 0.1 μm range. The panoramic lens is illustrated at the center of a cubic calibration enclosure of 200 cm side length, atop the positioning stage and pointing up in the vertical direction. Data cables for conveying an image acquired by the imaging sensor are also visible. Figure 5 also illustrates one example of three calibration targets projected on the calibration enclosure. The calibration enclosure and calibration targets provide a simulated, well-known surrounding allowing assessment of panoramic lens focus via imaging sensor output data. The calibration procedure performs mechanical adjustment of the imaging sensor PCB with respect to the panoramic lens.
In accordance with the proposed solution, Figure 6A illustrates a schematic example of a calibration enclosure employed to display the calibration targets thereon. The preferred calibration enclosure is substantially cubic, as it can be set up easily by ensuring equal diagonal measurements. However, the invention is not limited to the illustrated cubic enclosure. For reasons presented hereinbelow, the calibration enclosure can be spherical, spheroidal, pill-shaped, cylindrical, etc.
Returning to the illustrated example cubic calibration enclosure, as the panoramic lens maps the incident solid angle onto an ellipse on the non-unity aspect imaging sensor, different x-y-z directions have different resolutions: the x and y directions map onto the longitudinal and transverse axes of the image sensor, while the z direction maps to the middle of the image sensor. The mapping incurs distortion, for example as illustrated in Figure 6B. Alignment consists of setting the relative placement of the imaging sensor PCB along the 6 DoFs with respect to the panoramic lens, the imaging sensor being placed in a jig on a calibration bench under the panoramic lens.
Figure 7 illustrates a generic focus assessment process performed by assessing a lens resolution via an acquired image in accordance with the proposed solution. The lens resolution can be maximized by adjusting the relative placement (position and orientation) so as to maximize the minimum performance values obtained for each calibration target. As described hereinbelow, for a panoramic lens, the lens resolution of a group of zones within the FOV projection of the acquired image is assessed. Figure 8 illustrates a data processing flow of an iterative focus assessment process for a panoramic lens in accordance with the proposed solution. If the calibrated position and orientation do not allow for a certain quality above a threshold, the panoramic lens is rejected.
Lens resolution is expressed using a Modulation Transfer Function (MTF) criterion for various incidence angles. In general, MTF criteria provide an assessment regarding the finest features which can be imaged with a lens, not necessarily the smallest features which can be discerned. MTF is usually measured using the 5° slanted edge measurement technique documented in ISO standard 12233, which is incorporated herein by reference.
By way of simple illustration, Figure 7 is presented to help visualize a one-dimensional MTF. A pattern is provided having a progression of frequencies in a range, hereafter called a 1D frequency sweep. Without limiting the invention, in the illustrated example the frequency sweep is a printed sinusoidal frequency sweep. It is noted that the printed pattern displays high frequency content. When the pattern is observed through a lens, the output of the lens has high contrast at low frequencies and low contrast at high frequencies. The output of the lens can be captured and digitized by an imaging sensor of a certain resolution. Assessment includes plotting pixel values of the input and output images: the input plot has constant amplitude, showing clean sinusoidal sampling at low frequencies and a moire pattern at high frequencies due to limited sampling (lower sampling resolution than the pattern print resolution). The amplitude of the output is modulated by the MTF. The envelope of the output amplitude plot divided by the envelope of the input plot (= 1 in this case) represents the MTF. The invention is not limited to a constant amplitude input pattern.
The lens MTF decreases with increasing frequency. The MTF is an indicator of the lens resolution and varies with the radial distance away from the optical axis of the lens. The frequency at which the MTF is 50% of its maximum value is expressed in line pairs per millimeter on the imaging sensor area (MTF50). The lines/mm value is representative of the smallest line visible at a certain distance away, for example 10 in. An approximation of the MTF50 was chosen as an indicator of lens resolution. It is useless to use a sensor with a resolution a lot higher than the lens resolution, and the minimum imaging sensor resolution which corresponds to the lens resolution is called the effective resolution of the camera. For example, the panoramic lens employed with a 14 Mpx sensor has a magnification factor such that the smallest region 1 m away corresponding to an image sensor pixel is 2 mm by 2 mm. As such, a larger calibration enclosure would not improve the image sensor positioning. Returning to the panoramic lens of the proposed solution, the imaged field on the imaging sensor is two-dimensional (2D). It was found that panoramic lenses often have local defects which are directional, and it is important to measure the MTF (obtain resolution scores) in multiple directions. Therefore lens resolution is examined in many directions. A template of resolution thresholds (for multiple directions) is used to accept or reject a lens block. (The calibration procedure and QA test provide a lens resolution template, an "MTF50 figure", so as to accept or reject a production lens block.)
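To make the MTF50 notion concrete, a toy 1D illustration may help (an assumption-laden sketch, not the patent's measurement): a sine grating is blurred by a simple box filter standing in for the lens, the contrast of the output is compared to the input, and MTF50 is located as the frequency where the contrast ratio falls to 50%.

```python
import math

def contrast_after_blur(freq, kernel=5, n=2000):
    """Modulation (max - min) / 2 of a unit sine grating (freq in cycles per
    sample) after an odd-width moving-average ("box") blur."""
    signal = [math.sin(2 * math.pi * freq * i) for i in range(n)]
    half = kernel // 2
    out = [sum(signal[j] for j in range(i - half, i + half + 1)) / kernel
           for i in range(half, n - half)]
    return (max(out) - min(out)) / 2.0

def mtf50(freqs, kernel=5):
    """First frequency at which the modulation drops below half of the
    modulation at the lowest frequency."""
    base = contrast_after_blur(freqs[0], kernel)
    for f in freqs:
        if contrast_after_blur(f, kernel) < 0.5 * base:
            return f
    return None

cutoff = mtf50([0.01 * k for k in range(1, 40)])  # cycles per sample
```

With a 5-sample box blur the contrast ratio crosses 50% at a frequency somewhat above a tenth of a cycle per sample; a real measurement would use the slanted-edge method or the sweep patterns described in this document.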
For example, the prior art USAF-1951 calibration target pattern illustrated in Figure 9 can be employed for horizontal and vertical resolution score measurements only and its use relies on individual subjective assessment. It was found that interpretation of the USAF-1951 calibration target pattern was very variable among individuals. In accordance with the proposed solution, a measurement method and pattern using Fourier transform measurement is employed instead of a direct spatial measurement.
Figures 10A and 10B illustrate white noise and the Fourier transform of white noise, while Figures 11A and 11B illustrate blurred white noise and the Fourier transform of blurred white noise. The blur alters the high frequencies of the Fourier transform, and the effect is more apparent on the Fourier transform in Figure 11B than on the spatial image in Figure 11A.
While performing simulations with more gradual blur, a way was found to automatically assess the shape of the Fourier transform, by regression against frequency along various angles, with the following function: f(nu) = max(a0 - nu * a0 / (2 * nu_c), noise_floor) with:
  • a0 being the amplitude at nu = 0, and
  • nu_c being the frequency at which the amplitude is a0/2.
This simple regression was motivated by the shape of the curves obtained by observing directional 1D sampling of the 2D Fourier transform, as illustrated in Figures 12A and 12B.
The Fourier transform of the image is obtained and the shape of its magnitude vs. frequency is examined to obtain a frequency cut-off value nu_c. Direct observation of the MTF using a 1D linear frequency sweep vs. observation of the FT cut-off frequency from 2D patterns revealed that a simple linear relation was enough to estimate MTF50 from nu_c:
MTF50 = alpha * nu_c, with alpha close to 1 and depending on the pattern. Hereafter, reference will be made to the resolution score as an approximation of MTF50. It was found that a calibration target having a circular sine frequency sweep pattern has the following advantages:
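The regression just stated can be sketched directly from the model f(nu) = max(a0 - nu * a0 / (2 * nu_c), noise_floor); the grid-search fit and the synthetic profile below are illustrative assumptions, not the patent's solver.

```python
def spectrum_model(nu, a0, nu_c, noise_floor):
    """Radial spectrum amplitude: linear decay clipped at a noise floor;
    nu_c is the frequency where the amplitude reaches a0 / 2."""
    return max(a0 - nu * a0 / (2.0 * nu_c), noise_floor)

def fit_nu_c(samples, a0, noise_floor, grid):
    """Least-squares grid search for the cut-off frequency nu_c."""
    def sse(nu_c):
        return sum((spectrum_model(nu, a0, nu_c, noise_floor) - a) ** 2
                   for nu, a in samples)
    return min(grid, key=sse)

# Synthetic radial profile generated with nu_c = 0.2
samples = [(k / 100.0, spectrum_model(k / 100.0, 1.0, 0.2, 0.08))
           for k in range(50)]
nu_c_hat = fit_nu_c(samples, a0=1.0, noise_floor=0.08,
                    grid=[k / 1000.0 for k in range(50, 400)])
mtf50_estimate = 1.0 * nu_c_hat  # MTF50 = alpha * nu_c, alpha close to 1
```

In practice this fit would be run once per examined direction, with the per-direction amplitudes sampled from the 2D Fourier transform.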
• provides a performance metric value whether focused or not,
• provides visual cues of good focus, the moire patterns illustrated in Figure 13A for a calibration target and illustrated in Figure 13B for a well focused panoramic lens, and
  • provides resolution measurements in all directions, which is important for high-FOV fish-eye lenses such as the panoramic lens employed in the proposed solution. Good focus visual cues are provided in the form of moire patterns, much like the digitized 1D pattern of Figure 7. The corresponding pixel equation is:
I(r) = 0.5 + 0.5 * sin(2*pi * c * r * (f0 + r * (f1 - f0)))

with c being the square side, r the normalized radius from the image center to the pixel position, and f0, f1 the start and end frequencies of the sweep.
Simulations have shown that the maximum frequency (on the calibration target) only needs to be higher than the maximum measured frequency, with a small safety factor, enabling the size of the square to be adapted (varied).
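Generation of such a circular sinusoidal frequency-sweep target can be sketched as follows. Since the pixel equation in the source is partially garbled, the exact chirp form, the values of f0 and f1, and the radius normalization used here are assumptions.

```python
import math

def sweep_pixel(x, y, c, f0, f1):
    """Intensity in [0, 1] at pixel (x, y) of a c-by-c circular sweep target
    whose radial frequency ramps from f0 at the center towards f1 (assumed
    chirp form; see the hedge in the lead-in)."""
    r = math.hypot(x - c / 2.0, y - c / 2.0) / (c / 2.0)  # normalized radius
    return 0.5 + 0.5 * math.sin(2 * math.pi * c * r * (f0 + r * (f1 - f0)))

c = 256  # square side in pixels (~256 is judged good enough elsewhere in the text)
target = [[sweep_pixel(x, y, c, f0=0.01, f1=0.25) for x in range(c)]
          for y in range(c)]
```

The center pixel sits at mid-gray and the rings tighten outward, so a digitization of the target exhibits the moire cues described above when sampled near the sensor's resolution limit.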
Figures 14A, 14B and 14C illustrate examples of calibration targets respectively employed on the x, y and z axes, in accordance with the implementation of the proposed solution employing a cubic calibration enclosure. The calibration targets are drawn so as to look like squares on the image sensor plane, wherein reverse optical mapping is employed. Employing a square captured shape greatly simplifies computation when performing a 2D Fast FT (FFT) on the acquired pattern of pixels. Figures 14D and 14E illustrate examples of auxiliary calibration targets employed to the left and right of the x calibration targets in accordance with another implementation of the proposed solution. Figure 15A illustrates an acquired calibration image obtained through the panoramic lens. The calibration targets are found in big search zones by blob detection, and a wide square area is selected in each search zone. Figure 15B illustrates a well focused panoramic lens; focus measures/resolution scores for each calibration pattern are also illustrated in each blob. The invention is not limited to employing the circular sinusoidal frequency sweep pattern exclusively: square high contrast patterns can be employed with edge detection, and employing a radial pattern is also illustrated. Figure 16 illustrates an imaged calibration ellipse for a calibration enclosure employing the circular sinusoidal frequency sweep patterns of the proposed solution, edge detection and the USAF-1951 calibration target pattern. Figure 17 illustrates another imaged calibration ovoid for a calibration enclosure employing a large number of circular sinusoidal frequency sweep patterns in concentric distributions and the corresponding measured resolution score metric values. For certainty, the invention does not require the use of an entire calibration box with a full complement of calibration targets; a quarter of the calibration box would suffice based on symmetry considerations.
For each calibration target, the bigger the square size (number of observed pixels) is, the more accurate the resolution score measurement is, but the longer the computation time is. A square size of ~256 pixels was judged good enough. Generation of the calibration patterns to be printed out is performed from the desired shape on the elliptic image. The (x, y) position on the imaged field is converted into an (x, y) position in the imaging sensor frame, itself converted to incidence angles into the panoramic lens using the lens mapping function provided by the manufacturer. The incidence angle corresponds to a position on the calibration enclosure where the corresponding calibration target pixel is located.
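The reverse optical mapping step can be sketched as below. The real lens mapping function is supplied by the manufacturer; here an equidistant fisheye model (r = f·theta) and the focal length value are placeholder assumptions, with the enclosure taken as the 200 cm cube centered on the lens.

```python
import math

def sensor_to_incidence(x_mm, y_mm, f_mm=1.05):
    """Sensor-plane point (mm from the optical center) -> (theta, phi)
    incidence angles, assuming an equidistant fisheye model r = f * theta."""
    r = math.hypot(x_mm, y_mm)
    theta = r / f_mm              # polar angle measured from the optical axis
    phi = math.atan2(y_mm, x_mm)  # azimuth around the optical axis
    return theta, phi

def incidence_to_wall(theta, phi, half_side_m=1.0):
    """Intersect the incidence ray with the ceiling plane z = half_side_m of
    the cubic enclosure (valid while the ray points upward, theta < ~54.7 deg)."""
    t = half_side_m / math.cos(theta)  # distance along the ray to the ceiling
    x = t * math.sin(theta) * math.cos(phi)
    y = t * math.sin(theta) * math.sin(phi)
    return x, y, half_side_m

theta, phi = sensor_to_incidence(0.5, 0.0)  # a point 0.5 mm off-center
wall_point = incidence_to_wall(theta, phi)  # where that pixel "looks" on the wall
```

Running this mapping over every pixel of the desired square image yields the warped target to print (or project) on the enclosure wall, which then images back to a square on the sensor.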
Determination of a well acquired calibration pattern can be done by applying the same level of blur to the calibration pattern and to a linear frequency sweep pattern, followed by measuring the resolution score using both the direct spatial measurement method and the alternate method. Ideally the measurements should coincide. Real-life applicability is important; that is, the metric algorithm needs to be applicable to experimental data as well as to simulated data.
With reference to Figure 8, the positioning stage, for example the hexapod illustrated in Figure 5, is moved iteratively from its last position based on a metric improvement algorithm. For example, the motion can be prioritized in the following order: u, v, w, x, y, z.
As used in the lens calibration jig, the imaging sensor is configured to send raw data at native resolution. The received data follows the sensor geometry, and the data can be converted to luminance values as a preprocessing step, for example because the calibration patterns are gray scale. A sheet of white paper at a known location in the calibration enclosure (for example on the ceiling) provides the values necessary for the white balance calibration employed in a linear correction to obtain luminance values from the raw data without loss of fidelity. In practice, the imaging sensor employs a Bayer filter geometry with R/Gr/Gb/B 2x2 pixel matrix elements. A luminance value is obtained by combining the luminance contributions of the 4 components.
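The Bayer-to-luminance preprocessing can be sketched as follows; the luma weights used here are a common RGB-to-luminance choice and an assumption, as the text does not specify the conversion coefficients.

```python
def bayer_to_luminance(raw, kr=0.299, kg=0.587, kb=0.114):
    """raw: H x W list of lists (even dims) in RGGB layout ->
    (H/2) x (W/2) luminance image; the Gr and Gb samples are averaged."""
    h, w = len(raw), len(raw[0])
    out = []
    for i in range(0, h, 2):
        row = []
        for j in range(0, w, 2):
            r = raw[i][j]
            g = (raw[i][j + 1] + raw[i + 1][j]) / 2.0
            b = raw[i + 1][j + 1]
            row.append(kr * r + kg * g + kb * b)
        out.append(row)
    return out

# A uniform gray raw frame stays uniform after conversion
luma = bayer_to_luminance([[100] * 4 for _ in range(4)])
```

The white-paper patch mentioned above would supply per-channel gains applied to r, g and b before this combination, so that a neutral surface yields equal channel contributions.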
Having acquired and preprocessed the raw data into luminance values, the circular sinusoidal frequency sweep pattern is detected by blob detection, employing the calibration computer executing blob detection executable logic instructions. Figures 18A and 18B graphically illustrate calibration pattern detection. A square calibration pattern image is then found inside the blob, as illustrated in Figure 19, and the Fourier transform of the square pattern found is computed. The Fourier transform corresponding to the square calibration pattern illustrated in Figure 19 is illustrated in Figure 20. In accordance with one implementation of the proposed solution, the square image is normalized in intensity before the Fourier transform computation. The frequency decay of the Fourier transform is then examined in various directions. Figure 21A illustrates examples of specific directions examined. A least-squares regression is performed with a function consisting of a linear part plus a noise threshold, as illustrated in Figure 21B. The resolution scores are extracted from the slope of the best fit linear function, and the worst resolution score is selected as the metric. In accordance with the proposed solution, it was found that in order to improve the regression by removing noise caused by the Bayer pattern values, and also to reduce the influence of bad pixels, a very small Gaussian blur is applied to the square image. The introduced Gaussian blur can be controlled, by employing a high cutoff frequency, so as not to affect the resolution score computation.
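A minimal sketch of the Fourier-decay resolution score follows. It omits the noise-threshold term of the regression and the small Gaussian pre-blur, and the choice of 8 sampling directions is arbitrary; it illustrates the principle rather than the exact metric of the proposed solution.

```python
import numpy as np

def resolution_score(square):
    """Resolution score of a square target patch from its Fourier decay.

    Normalize the patch in intensity, compute the 2-D FFT magnitude, sample
    the log-magnitude along several directions from the DC bin, and fit a
    line by least squares per direction. The slope is the per-direction
    score; the most negative slope (fastest decay, worst direction) is the
    returned metric, so sharper patches score higher.
    """
    img = square.astype(np.float64)
    img = (img - img.mean()) / (img.std() + 1e-12)       # intensity normalization
    mag = np.fft.fftshift(np.abs(np.fft.fft2(img)))
    n = img.shape[0]
    c = n // 2
    slopes = []
    for k in range(8):                                   # directions over [0, pi)
        ang = np.pi * k / 8
        radii = np.arange(1, c - 1)
        xs = np.clip(np.round(c + radii * np.cos(ang)).astype(int), 0, n - 1)
        ys = np.clip(np.round(c + radii * np.sin(ang)).astype(int), 0, n - 1)
        profile = np.log(mag[ys, xs] + 1e-12)
        slope, _ = np.polyfit(radii, profile, 1)         # linear frequency decay
        slopes.append(slope)
    return min(slopes)                                   # worst direction
```

On a well-focused patch the spectrum decays slowly in every direction, so the worst-direction slope stays close to zero; defocus or astigmatism steepens the decay in at least one direction and lowers the score.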
In accordance with the proposed solution, an imaging sensor positioning search algorithm can be employed for finding the optimum position of the imaging sensor with respect to the panoramic lens in all 6 axes.
In accordance with an embodiment of the proposed solution and with reference to Figure 22, the imaging sensor is aligned with the FOV projection (output ellipse); more specifically, the axes of the imaging sensor are aligned with the axes of the FOV projection (output ellipse) projected on the imaging sensor. The contours of the projected output ellipse are estimated and an elliptic regression can be performed. Ellipse parameters including center, orientation angle, and long and short axis lengths are provided by the elliptic regression analysis. The w-x-y parameters can be extracted from the elliptic regression parameters. Given a known starting position of the positioning stage, it is possible to determine whether to increase or decrease z based on the ellipse size.
In accordance with an implementation of the embodiment and with reference to Figure 23A of the proposed solution, an ellipse detection algorithm includes:
1. If the image is blurred as judged against a blur threshold, an image histogram is obtained;
2. The histogram is fit with 2 bell curves employing an EM algorithm;
3. Binning of pixels is performed based on histogram separation, wherein the histogram is separated between the bell curves (see Figure 23B);
4. Closed contour detection is performed on the binned pixels;
5. Bad contours are eliminated. Practice has found that contours located outside of the ellipse are due to flares; such contours have a centroid outside of an ellipse of predetermined size corresponding to the imaging sensor; and
6. Ellipse fitting is performed on the identified contour pixels. For example, the Fitzgibbon, Pilu & Fisher method for direct least squares fitting of ellipses can be employed, without limiting the invention.
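Step 2 above, fitting the histogram with two bell curves by EM, can be sketched as a minimal 1-D two-Gaussian expectation-maximization; the midpoint threshold at the end is a simplification of the exact separation point between the curves.

```python
import numpy as np

def em_two_gaussians(values, iters=60):
    """Minimal 1-D EM fit of a two-Gaussian mixture to pixel values, as in
    step 2 of the algorithm above. A sketch: fixed iteration count, no
    convergence test, spread-out initialization at the data extremes."""
    v = np.asarray(values, dtype=np.float64)
    mu = np.array([v.min(), v.max()])
    sig = np.array([v.std() + 1e-6, v.std() + 1e-6])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each Gaussian for each pixel value.
        pdf = w * np.exp(-0.5 * ((v[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))
        resp = pdf / (pdf.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: re-estimate means, sigmas and weights.
        nk = resp.sum(axis=0) + 1e-300
        mu = (resp * v[:, None]).sum(axis=0) / nk
        sig = np.sqrt((resp * (v[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
        w = nk / len(v)
    order = np.argsort(mu)
    return mu[order], sig[order], w[order]

def separation_threshold(mu):
    """Bin pixels at the midpoint between the two bell curves: a simple
    stand-in for the exact histogram-separation point of Figure 23B."""
    return 0.5 * (mu[0] + mu[1])
```

Pixels below the threshold are binned into the dark class and the rest into the bright class before contour detection.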
In accordance with another implementation of the embodiment of the proposed solution another ellipse detection algorithm includes:
1. The image is filtered to eliminate high-frequency noise.
2. Binarization of the image is performed in order to obtain contours from the FOV, resulting in 3 dominant shades in the picture: the black level, the white level, and the target gray level. The binarization can be performed by histogram separation in the valley between the black and gray peaks. Histogram separation can be performed using the EM algorithm, clustering algorithms such as k-means, or simply a fixed threshold.
3. Closed contour detection is performed on the binary image. Bad contours are eliminated; practice has found that contours located outside of the ellipse are due to flares. Such contours have a centroid outside of an ellipse of predetermined size.
4. Ellipse fitting is performed on the contours, and the ellipse orientation (w), center (x, y) and dimensions are extracted. For example, the Fitzgibbon, Pilu & Fisher method for direct least squares fitting of ellipses can be employed, without limiting the invention, to fit the identified contour pixels.
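The ellipse-fitting step can be illustrated with a generic least-squares conic fit, standing in for the Fitzgibbon, Pilu & Fisher method named above (which additionally enforces an ellipse-specific constraint); only the center extraction is shown here.

```python
import numpy as np

def fit_conic_center(xs, ys):
    """Least-squares conic fit to contour points; returns the center (x, y).

    Fits a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 by minimizing ||D a||
    subject to ||a|| = 1 (smallest right singular vector of the design
    matrix), then solves the gradient system for the conic center."""
    xs = np.asarray(xs, dtype=np.float64)
    ys = np.asarray(ys, dtype=np.float64)
    D = np.column_stack([xs**2, xs * ys, ys**2, xs, ys, np.ones_like(xs)])
    _, _, vt = np.linalg.svd(D)
    a, b, c, d, e, f = vt[-1]
    # Center: where both partial derivatives of the conic vanish.
    cx, cy = np.linalg.solve(np.array([[2 * a, b], [b, 2 * c]]),
                             np.array([-d, -e]))
    return cx, cy
```

The ellipse orientation and axis lengths follow from the same conic coefficients (eigen-decomposition of the quadratic part), which is what supplies the w-x-y parameters used by the adjustment loop.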
In accordance with an implementation of the proposed solution illustrated in Figure 24, a panoramic lens calibration adjustment algorithm then includes:
1. Detecting the ellipse (without limiting the invention, as detailed herein immediately above);
2. Estimating the w-x-y parameters from the obtained ellipse parameters: x and y are extracted from the ellipse center, and w is the ellipse angle;
3. Estimating the z displacement from the ellipse size. The manufactured panoramic lens specification lists a certain size of projected ellipsoid for a calibrated lens. From the uncalibrated ellipse size and the obtained w-x-y parameters, the direction in which the panoramic lens is to be moved along the optical axis in order to achieve the specified size can be determined;
4. The positioning stage is commanded to move along w-x-y-z; and
5. The alignment process resumes from step 1 unless further improvement is not achievable.
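The detect-estimate-move loop of steps 1 to 5 can be sketched as follows. The `detect_ellipse` and `move_stage` interfaces, the single proportional gain, and the area-driven z correction are assumptions for illustration; a real jig would use calibrated gains per axis.

```python
import numpy as np

def align_stage(detect_ellipse, move_stage, target_area,
                gain=1.0, tol=1e-3, max_iter=50):
    """Iterative w-x-y-z adjustment loop sketched from steps 1-5 above.

    `detect_ellipse` returns (cx, cy, angle, area) of the FOV ellipse in
    sensor coordinates (center offsets relative to the sensor center);
    `move_stage` applies a relative (w, x, y, z) move. z is driven from the
    area error toward the specified size; w, x, y are driven from the
    ellipse angle and center. Returns True once no further improvement is
    achievable (move below tolerance)."""
    for _ in range(max_iter):
        cx, cy, angle, area = detect_ellipse()
        dz = gain * (target_area - area)        # grow/shrink toward spec size
        move = (-gain * angle, -gain * cx, -gain * cy, dz)
        if np.linalg.norm(move) < tol:          # step 5: stop when converged
            return True
        move_stage(*move)                       # step 4: command the stage
    return False
```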
In accordance with another implementation of the proposed solution, another panoramic lens calibration adjustment algorithm then includes:
1. Acquisition of an image from the imaging sensor;
2. Ellipse detection from the acquired sensor image;
3. Computation of the relative movement in the sensor geometric frame:
• movement along z, yielding the rough z position, is performed by comparison of the ellipse area to an expected value;
• movement along x (resp. y) is performed by comparison of the ellipse center to the sensor center; and
• movement along w is performed from the ellipse rotation;
4. If the movement vector is smaller than a specified threshold, the algorithm terminates;
5. Computation of the relative movement in the hexapod frame by propagation of the wxyz coordinates into WXYZ coordinates; and
6. Movement of the hexapod.
In accordance with another implementation of the proposed solution and with reference to Figure 25, a search along the x-y-z axes can be employed to align the imaging sensor with respect to the panoramic lens:
1. Use a dichotomy-like algorithm to find the best resolution score for the center target; and
2. Perform a stochastic dichotomy-like blind search algorithm to adjust u (resp. v) in order to maximize the minimum resolution score on 2 targets on the long axis (resp. short axis).
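Step 1 above, a dichotomy-like search for the best resolution score along one axis, can be sketched as a golden-section interval search; it assumes the score is unimodal over the bracketed interval, which holds near focus.

```python
def dichotomy_search(score, lo, hi, tol=1e-4):
    """Dichotomy-like search for the position maximizing a unimodal
    resolution score along one axis. Golden-section interval shrinking:
    two interior probes per interval, reusing one evaluation per step so
    each iteration costs a single new score measurement."""
    phi = 0.5 * (5 ** 0.5 - 1)                  # golden ratio conjugate
    a, b = lo, hi
    x1 = b - phi * (b - a)
    x2 = a + phi * (b - a)
    f1, f2 = score(x1), score(x2)
    while b - a > tol:
        if f1 < f2:                             # maximum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = score(x2)
        else:                                   # maximum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = score(x1)
    return 0.5 * (a + b)
```

The stochastic variant of step 2 would perturb the probe positions to escape flat or noisy score plateaus; that refinement is not shown here.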
In accordance with yet another embodiment of the proposed solution and with reference to Figure 26, another panoramic lens calibration and adjustment algorithm includes locating the focus plane of the imaging sensor employing a constant-step sweep along the z axis.
Starting with a good enough z position, for example providing 70% of the expected resolution score value on the center target, the algorithm performs a sweep along the z axis, gathering resolution score values for the calibration targets. A regression is performed on the gathered values until sufficient data is collected to determine an outcome of the z traverse phase. Location, along the z axis, of the imaging sensor focal plane is estimated from the z axis positioning stage adjustment which maximizes the resolution score measurement value(s) of all targets. The imaging sensor is then moved to the location along the z axis corresponding to the focal plane.
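The regression over the gathered score(z) values can be sketched with the triangle-plus-noise-floor model used by the proposed solution. The coarse grid search, and pinning the peak and floor to the observed extremes, are simplifications for illustration rather than the actual solver.

```python
import numpy as np

def fit_triangle(z, scores):
    """Regress score(z) with a triangle-plus-noise-floor model:
    score = max(floor, peak - slope * |z - z0|).

    Returns (z0, peak, slope, floor); z0 estimates the focal-plane location
    along the z axis. Coarse two-parameter grid search over z0 and slope."""
    z = np.asarray(z, dtype=np.float64)
    s = np.asarray(scores, dtype=np.float64)
    peak, floor = s.max(), s.min()
    scale = (peak - floor) / (z.max() - z.min() + 1e-12)
    best_sse, best = np.inf, None
    for z0 in np.linspace(z.min(), z.max(), 201):
        for slope in np.linspace(0.1, 10.0, 100) * scale:
            model = np.maximum(floor, peak - slope * np.abs(z - z0))
            sse = float(((model - s) ** 2).sum())
            if sse < best_sse:
                best_sse, best = sse, (z0, peak, slope, floor)
    return best
```

The fitted maximum location gives the z move, while the slope and floor feed the stop criteria (maximum within bounds, slope above threshold) described below.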
In accordance with an implementation, a zuv control loop algorithm includes:
1. Acquisition of an image from the sensor;
2. Target detection and resolution score extraction;
3. Storage of the current z value and resolution scores;
4. If enough data is available, regression of each score(z) curve with a model curve. Currently the model is a triangle with a background noise level; the parameters are the maximum, the slope, and the noise value;
5. Deciding whether to stop data collection or continue the search loop. The stop criteria are that the current position is sufficiently far from the maximum, that the maximum is within certain bounds, and that the slope is above a certain threshold. If the criteria are met, then the optimal position in z, u, v has been found and step 7 is next;
6. Move to another z position, expanding the search zone, and go to step 1; and
7. Move to the optimal position.
In accordance with an implementation, during the collection of resolution score measurements of a z sweep, the algorithm includes performing a regression to provide a curve with estimated panoramic lens alignment parameters. For example, such a curve resembles a triangle with a background noise level; the parameters are a maximum, a slope, and a noise value. The facts that a positioning stage adjustment is sufficiently far from the maximum, that the maximum is within certain bounds, and that the slope is above a certain threshold are considered part of the iterative process stop conditions. It is noted that imaging sensor tilt does not impact the imaging sensor position when the positioning stage is moved; only x-y-z need to be found. More specifically and with reference to Figure 27:
1. Ellipse centering is performed employing auto-xy calibration;
2. The z position is determined (roughly):
a. u or v movements are performed;
b. a new ellipse position is determined; and
c. the z distance from ideal is adjusted so as to minimize the ellipse position difference (radius);
3. The x/y distances from ideal are determined (ideally 0):
a. check the ellipse position;
b. perform a w movement (w += 1°);
c. check the ellipse position;
d. return to w = 0; and
e. adjust x/y according to the ellipse movement; and
4. Resume from step 1 unless the xyz distance radius is small enough.
In accordance with another implementation, another algorithm includes:
1. Perform ellipse centering using auto-xy calibration;
2. Determine the z distance from an initial guess:
a. determine the initial ellipse position;
b. perform a significant u or v movement;
c. determine the new ellipse position; and
d. adjust z so as to minimize the ellipse position difference (radius);
3. Determine the x/y distances (ideally 0, but typically not 0):
a. determine the initial ellipse position at w = 0;
b. perform a w movement (w += 1°);
c. determine the new ellipse position;
d. return to w = 0; and
e. adjust x/y according to the ellipse movement: x is adjusted from the y movement, and y is adjusted from the x movement; and
4. Go to step 1 unless the xyz distance radius is small enough.
The algorithm takes advantage of the following:
1. For small angles of w rotation, the x (resp. y) eccentricity of the sensor frame vs. the hexapod frame is in linear relation with the shift on y (resp. x) of the center of the FOV ellipse;
2. The w rotation does not impact the z component of the eccentricity of the sensor frame vs. the hexapod frame when the lens is calibrated (the hexapod Z axis is coincident with the optical axis, and the normal to the focal plane is coincident with the sensor plane); and
3. The x/y eccentricity should be small, so the impact of the y eccentricity during a u movement is minor compared to that of z.
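Item 1 above, the small-angle linearity between a w rotation and the shift of the ellipse center, can be checked numerically: rotating a point eccentric by (ex, ey) through a small angle w shifts it by approximately w·(-ey, ex). The eccentricity values and sign conventions in this sketch are illustrative.

```python
import numpy as np

def center_shift_after_w(ex, ey, w_deg):
    """Shift of the FOV-ellipse center, in the sensor frame, after rotating
    the stage by w about its own axis when the sensor is eccentric by
    (ex, ey). Exact rotation, used to verify the small-angle linearization
    exploited by the x/y adjustment step."""
    w = np.radians(w_deg)
    rot = np.array([[np.cos(w), -np.sin(w)],
                    [np.sin(w),  np.cos(w)]])
    before = np.array([ex, ey], dtype=np.float64)
    return rot @ before - before
```

For w = 1° the exact shift and the linear prediction agree to well under a percent of the eccentricity, which is why a single small w excursion suffices to estimate x/y.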
The invention is not limited to the described calibration targets implemented as plates with printed calibration targets posted on the inner walls of the calibration enclosure. For example, data projectors can be employed with the same enclosure, having projected resolutions better than one pixel per 4 mm². The use of projectors, such as but not limited to pico projectors addressing both size and power dissipation, can enable the use of multiple patterns for an improved determination of the MTF.
While the invention has been shown and described with reference to preferred embodiments thereof, it will be recognized by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims

What is claimed is:
1. A method for calibrating a wide angle panoramic lens, the method comprising: providing a wide angle panoramic lens; providing an electronic board having an image sensor; mounting one of the panoramic lens and the electronic board in a positioning stage; providing at least one camera testing chart including at least one resolution measurement chart; generally aligning an optical axis of the panoramic lens with a normal direction of an acquisition surface of the imaging sensor; and repeatedly moving the positioning stage in at least one position or orientation direction according to target assessment values to align a projection of the field of view with the center of the image sensor area and to make the focal plane correspond to the imaging plane of the imaging sensor.
2. A method as claimed in claim 1, wherein said positioning stage adjusts the distance between the panoramic lens and the imaging sensor, in order to determine a plane of maximal resolution values of at least three resolution measurement test charts.
3. A method as claimed in claim 1 or 2, wherein said positioning stage adjusts six degrees of freedom between the panoramic lens and the imaging sensor, said repeated moving including multiple degrees of freedom sweep.
4. A method as claimed in claim 1, 2 or 3, wherein said resolution measurement charts comprise warped target images that project onto a square image object on said image sensor.
5. A method as claimed in any one of claims 1 to 4, wherein said resolution measurement charts as projected on the focal plane comprise a broad frequency content pattern including one of a circular sinusoidal frequency sweep pattern and a white noise pattern, preferably the circular sinusoidal frequency sweep pattern.
6. A method as claimed in any one of claims 1 to 5, further comprising obtaining resolution thresholds.
7. A method as claimed in any one of claims 1 to 6, wherein said image sensor has a non-unity aspect ratio.
8. A method as claimed in any one of claims 1 to 7, wherein said panoramic lens projects an obround image onto said image sensor.
9. A method as claimed in any one of claims 1 to 8, wherein said lens is panomorphic.
10. A method as claimed in any one of claims 1 to 9, wherein said lens is fixed in position with respect to said testing chart, and said electronic board is mounted to said positioning stage.
11. A method of manufacturing a panoramic camera comprising: mounting an image sensor onto a circuit board; calibrating a wide angle panoramic lens as claimed in any one of claims 1 to 10; and affixing the panoramic lens to the image sensor.
12. A method as claimed in claim 11, wherein said image sensor is provided with a sleeve, said affixing comprising curing an adhesive to connect said panoramic lens to said sleeve.
PCT/CA2014/050006 2013-01-07 2014-01-07 Panoramic lens calibration for panoramic image and/or video capture apparatus WO2014106303A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361749899P 2013-01-07 2013-01-07
US61/749,899 2013-01-07

Publications (1)

Publication Number Publication Date
WO2014106303A1 true WO2014106303A1 (en) 2014-07-10

Family

ID=51062116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2014/050006 WO2014106303A1 (en) 2013-01-07 2014-01-07 Panoramic lens calibration for panoramic image and/or video capture apparatus

Country Status (2)

Country Link
TW (1) TW201439665A (en)
WO (1) WO2014106303A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107941346A (en) * 2017-11-16 2018-04-20 中国电子科技集团公司第十三研究所 Spatial resolution calibrating installation and preparation method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252195A1 (en) * 2003-06-13 2004-12-16 Jih-Yung Lu Method of aligning lens and sensor of camera
WO2011018678A1 (en) * 2009-08-11 2011-02-17 Ether Precision, Inc. Method and device for aligning a lens with an optical system


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014202541A1 (en) * 2014-02-12 2015-08-13 Oliver Jenner Image acquisition device with parallel kinematic movement device
US9674433B1 (en) * 2014-07-24 2017-06-06 Hoyos Vsn Corp. Image center calibration for a quadric panoramic optical device
DE102014220519A1 (en) * 2014-10-09 2016-04-14 Robert Bosch Gmbh Method for positioning an image-receiving element to optical device and use of the method
DE102014220519B4 (en) 2014-10-09 2021-11-11 Robert Bosch Gmbh Method for positioning an image-receiving element in relation to an optical device and use of the method
WO2018094940A1 (en) * 2016-11-24 2018-05-31 深圳市圆周率软件科技有限责任公司 Mass production method and system for panorama camera
CN106803273B (en) * 2017-01-17 2019-11-22 湖南优象科技有限公司 A kind of panoramic camera scaling method
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
CN108198222B (en) * 2018-01-29 2021-09-03 大连东软信息学院 Wide-angle lens calibration and image correction method
CN108198222A (en) * 2018-01-29 2018-06-22 大连东软信息学院 A kind of wide-angle lens calibration and image correction method
CN109375470A (en) * 2018-12-07 2019-02-22 歌尔股份有限公司 A kind of test device, test macro and the test method of wide-angle mould group
CN109375470B (en) * 2018-12-07 2021-12-10 歌尔光学科技有限公司 Testing device, testing system and testing method of wide-angle module
CN111006772A (en) * 2019-12-31 2020-04-14 上海市计量测试技术研究院 Standard plate group and method for detecting minimum detectable size of thermal infrared imager
CN111669547A (en) * 2020-05-29 2020-09-15 成都易瞳科技有限公司 Panoramic video structuring method
CN112616009A (en) * 2020-12-31 2021-04-06 维沃移动通信有限公司 Electronic equipment and camera module thereof
CN112616009B (en) * 2020-12-31 2022-08-02 维沃移动通信有限公司 Electronic equipment and camera module thereof
CN113012032A (en) * 2021-03-03 2021-06-22 中国人民解放军战略支援部队信息工程大学 Aerial panoramic image display method capable of automatically labeling place names

Also Published As

Publication number Publication date
TW201439665A (en) 2014-10-16


Legal Events

Date Code Title Description

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 14735092; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 14735092; Country of ref document: EP; Kind code of ref document: A1)