CN113160393B - High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof - Google Patents


Info

Publication number: CN113160393B (granted publication); application publication: CN113160393A
Application number: CN202110529545.1A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: dimensional; imaging device; absolute phase; target; mapping coefficient
Legal status: Active (granted)
Inventors: 刘晓利, 郑振桐, 杨洋, 王猛, 张小杰, 李显业, 汤其剑, 彭翔
Applicant and assignee: Shenzhen University (application filed by Shenzhen University; priority to CN202110529545.1A)

Classifications

    • G06T17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85: Stereo camera calibration
    • G06T2207/10004: Still image; photographic image (indexing scheme for image acquisition modality)
    • G06T2207/10012: Stereo images
    • Y02A90/30: Assessment of water resources (technologies for adaptation to climate change)


Abstract

The invention discloses a high-precision three-dimensional reconstruction method and device based on a large depth of field and related components thereof. The method comprises the following steps: dividing the large depth-of-field measurement scene into regions; performing binocular vision three-dimensional calibration on each divided region with a three-dimensional measurement system to obtain calibration data; calculating absolute phase distribution diagrams and three-dimensional information of a planar plate at different positions; establishing a three-dimensional mapping coefficient table for the corresponding region; acquiring a target image of the measured object and calculating its absolute phase distribution diagram; and acquiring the absolute phase of each pixel point of the target image from the absolute phase distribution diagram, looking up the three-dimensional mapping coefficient in the three-dimensional mapping coefficient table of the corresponding region, and calculating the spatial three-dimensional point coordinates with that coefficient. Because the method divides the large depth-of-field scene into several regions for calculation, the whole calculation process is more rigorous and accurate, and the spatial three-dimensional point coordinates of the measured object are obtained once the three-dimensional mapping coefficient table has been built.

Description

High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof
Technical Field
The invention relates to the technical field of three-dimensional imaging, in particular to a high-precision three-dimensional reconstruction method and device based on a large depth of field and related components thereof.
Background
Fringe projection three-dimensional measurement, one of the structured-light methods, offers high speed, high precision, low cost and ease of operation, and is widely applied in industrial measurement, intelligent manufacturing, cultural-relic protection and other fields. Its principle is to project a standard sinusoidal fringe pattern onto the measured object; the object's height modulates the projected fringes, a camera acquires the deformed fringe pattern, and the object's three-dimensional information is then reconstructed through fringe phase demodulation and system calibration, combined with the phase-height mapping principle.
Typically, fringe projection generates the fringe pattern with a digital micromirror device (DMD) based on the principles of optical imaging, and a fixed-focal-length optical projection lens has a limited depth of field. In particular, to increase image brightness, commercial projectors use large-aperture designs, which further reduces the depth of field. In some large measurement scenes, such a projection system cannot meet the requirements, because the measured object is large or several objects span a large spatial range. A projector that scans a laser with a MEMS (micro-electro-mechanical system) galvanometer, by contrast, achieves a large imaging range with a large depth of field, owing to its laser light source and galvanometer-scanning projection mode. Camera lenses likewise face a limited depth of field, whereas an electronically controlled zoom lens can continuously change the focal length.
In recent years, the development of the electrically tunable lens (ETL) has provided more options for designing compact optical systems; thanks to its precision, speed, convenience and repeatability, it has found widespread use in display, microscopy, auto-focus imaging, laser machining and other fields. In three-dimensional measurement, there has also been research into using electrically tunable lenses to achieve certain functions. One technique uses an electrically tunable lens to quickly acquire multiple discretely focused images and obtains three-dimensional information of the entire scene by combining the data from the different focus settings. However, existing three-dimensional measurement systems still measure large depth-of-field scenes with low precision, and the problem of poor reconstruction quality remains unsolved.
Disclosure of Invention
The embodiment of the invention provides a high-precision three-dimensional reconstruction method and device based on a large depth of field and related components thereof, and aims to solve the prior-art problems of low three-dimensional measurement precision and poor three-dimensional reconstruction quality in scenes with a large depth of field.
In a first aspect, an embodiment of the present invention provides a high-precision three-dimensional reconstruction method based on a large depth of field, including:
dividing the area of the large depth of field measurement scene;
Performing binocular vision three-dimensional calibration on each divided region by using a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
projecting specified patterns onto planar plates at different positions with the projection device in the different regions, acquiring plate images of the planar plates at the different positions with the imaging device, calculating absolute phase distribution diagrams of the planar plates at the different positions, and calculating three-dimensional information of the planar plates at the different positions from the calibration data;
in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution diagram of the plane plate, and establishing a three-dimensional mapping coefficient table of the corresponding region according to the mapping relation between the three-dimensional information of each plane plate and the absolute phase of the corresponding pixel point;
projecting a target pattern onto the measured object with the projection device, acquiring a target focal-sweep image of the measured object with the imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating an absolute phase distribution diagram of the target image;
Acquiring an absolute phase of each pixel point of the target image in an absolute phase distribution diagram of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding area according to the area to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding space three-dimensional point coordinate.
In a second aspect, an embodiment of the present invention provides a high-precision three-dimensional reconstruction apparatus based on a large depth of field, including:
the area dividing unit is used for dividing the area of the large depth of field measurement scene;
the calibration data acquisition unit is used for carrying out binocular vision three-dimensional calibration on each divided area by utilizing the three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
the three-dimensional information acquisition unit is used for projecting specified patterns to the plane flat plates at different positions by using the projection device in different areas, collecting the flat plate images of the plane flat plates at different positions by using the imaging device, calculating to obtain absolute phase distribution diagrams of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
A three-dimensional mapping coefficient table obtaining unit, configured to obtain, in each region, an absolute phase of each pixel point of the imaging device in an absolute phase distribution diagram of the planar panel, and establish a three-dimensional mapping coefficient table of a corresponding region according to a mapping relationship between three-dimensional information of each planar panel and an absolute phase of a corresponding pixel point;
the target focal-sweep image acquisition unit is used for projecting a target pattern onto the measured object with the projection device, acquiring a target focal-sweep image of the measured object with the imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating an absolute phase distribution diagram of the target image;
the space three-dimensional point coordinate acquisition unit is used for acquiring the absolute phase of the absolute phase distribution diagram of each pixel point of the target image in the target image, searching the corresponding three-dimensional mapping coefficient in the three-dimensional mapping coefficient table of the corresponding area according to the area to which the absolute phase belongs, and calculating to obtain the corresponding space three-dimensional point coordinate by utilizing the three-dimensional mapping coefficient.
In a third aspect, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor executes the computer program to implement the high-precision three-dimensional reconstruction method based on a large depth of field according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium, where the computer readable storage medium stores a computer program, where the computer program when executed by a processor causes the processor to perform the high-precision three-dimensional reconstruction method based on a large depth of field according to the first aspect.
The embodiment of the invention provides a high-precision three-dimensional reconstruction method and device based on a large depth of field and related components. The method comprises the following steps: dividing the large depth-of-field measurement scene into regions; performing binocular vision three-dimensional calibration on each divided region with a three-dimensional measurement system to obtain calibration data, the three-dimensional measurement system comprising an imaging device and a projection device; in the different regions, projecting specified patterns onto planar plates at different positions with the projection device, acquiring plate images of the planar plates at the different positions with the imaging device, calculating absolute phase distribution diagrams of the planar plates at the different positions, and calculating three-dimensional information of the planar plates at the different positions from the calibration data; in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution diagram of the planar plate, and establishing a three-dimensional mapping coefficient table of the corresponding region from the mapping relation between the three-dimensional information of each planar plate and the absolute phase of the corresponding pixel point; projecting a target pattern onto the measured object with the projection device, acquiring a target focal-sweep image of the measured object with the imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating an absolute phase distribution diagram of the target image; and acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, looking up the corresponding three-dimensional mapping coefficient in the three-dimensional mapping coefficient table of the corresponding region according to the region to which the absolute phase belongs, and calculating the corresponding spatial three-dimensional point coordinates with the three-dimensional mapping coefficient. According to the embodiment of the invention, a three-dimensional mapping coefficient table is established for each region into which the large depth-of-field scene is divided, and during three-dimensional reconstruction the three-dimensional mapping coefficient of the region corresponding to the measured object is obtained directly, from which the spatial three-dimensional point coordinates are calculated.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention;
FIG. 2 is a simulation diagram of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of a high-precision three-dimensional reconstruction device based on a large depth of field according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Referring to fig. 1 and 2, fig. 1 is a flow chart of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention, and the method includes steps S101 to S106.
S101, carrying out region division on a large depth of field measurement scene;
In this step, because a large depth-of-field scene is large, measuring it directly during three-dimensional reconstruction reduces accuracy; the measurement space of the large depth-of-field scene is therefore divided along the depth direction into several smaller measurement regions.
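The depth-direction partition described above can be sketched as follows; the equal-width split and the helper name are illustrative assumptions, since the patent does not prescribe a particular partition rule:

```python
import numpy as np

def divide_depth_range(z_min_mm, z_max_mm, n_regions):
    """Split the depth range of a large depth-of-field measurement scene
    into equal sub-regions along the depth (Z) direction and return, for
    each region, its (near, far, center) depths in millimetres."""
    edges = np.linspace(z_min_mm, z_max_mm, n_regions + 1)
    return [(edges[i], edges[i + 1], 0.5 * (edges[i] + edges[i + 1]))
            for i in range(n_regions)]

# The 400 mm to 1000 mm scene of the embodiment below, split into three
# regions (the number of regions is an illustrative choice).
regions = divide_depth_range(400.0, 1000.0, 3)
```

Each region is then calibrated and given its own mapping coefficient table independently.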
In one embodiment, after the step S101, the method comprises
Calibrating the positions of the imaging device and the projection device by using a calibration algorithm;
and acquiring the maximum and minimum values of the total control current with which the zoom lens of the imaging device measures the large depth-of-field measurement scene, together with the region control current value at which the zoom lens of the imaging device focuses on the center position of each region, and recording the maximum and minimum total control current values and the region control current value of each region.
In this step, the positions of the imaging device and the projection device are calibrated first. Then, according to the measurement depth range of the imaging device when focusing and the diopter variation range of its zoom lens, a suitable depth range of the measurement scene is determined. The diopter of the zoom lens is changed so that it focuses on the center position of each region, and the corresponding control current values are recorded; the diopter is further adjusted so that the imaging device focuses at the maximum and minimum depths of the large depth-of-field measurement scene, and the control current values at those depths are recorded. In this embodiment, the minimum depth of the large depth-of-field measurement scene is 400 mm and the maximum depth is 1000 mm.
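As an illustration of reusing the recorded control currents, the sketch below linearly interpolates a region's control current between the values recorded at the minimum and maximum depths; the current values and the linear model are assumptions for illustration only, since a real electrically tunable lens is driven from its measured diopter-versus-current curve:

```python
# Illustrative recorded calibration: control current (mA) of the zoom
# lens when focused at the scene's minimum depth (400 mm) and maximum
# depth (1000 mm).  The current values are invented for this sketch;
# the patent only says such currents are measured and recorded.
I_MIN, Z_MIN = 120.0, 400.0
I_MAX, Z_MAX = 280.0, 1000.0

def region_control_current(center_depth_mm):
    """Control current focusing the zoom lens at a region's center
    depth, linearly interpolated between the two recorded extremes
    (a simplification: real tunable-lens response need not be linear)."""
    t = (center_depth_mm - Z_MIN) / (Z_MAX - Z_MIN)
    return I_MIN + t * (I_MAX - I_MIN)
```

In practice one would record one measured current per region center instead of interpolating.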
S102, performing binocular vision three-dimensional calibration on each divided area by utilizing a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
In this step, an imaging device with a zoom lens and a projection device with a MEMS galvanometer form the three-dimensional measurement system, and binocular vision three-dimensional calibration is performed on each region with the three-dimensional measurement system to obtain calibration data. The imaging device may be a camera with a zoom lens, and the projection device may be a projector with a MEMS galvanometer.
In an embodiment, the performing binocular vision stereo calibration on each divided area by using the three-dimensional measurement system to obtain calibration data includes:
establishing an imaging device coordinate system by taking an optical center of the imaging device as an origin and an optical axis of the imaging device as a Z axis; establishing a coordinate system of the projection device by taking an optical center of the projection device as an origin and taking an optical axis of the projection device as a Z axis;
and obtaining, with a binocular vision three-dimensional calibration algorithm, the conversion relation between the intrinsic parameters of the imaging device and the imaging device coordinate system and the conversion relation between the intrinsic parameters of the projection device and the projection device coordinate system, and calculating the conversion relation between the imaging device coordinate system and the projection device coordinate system to obtain the calibration data.
In this embodiment, the intrinsic parameters of the imaging device and the projection device are acquired, that is, the intrinsic parameter matrices comprising the focal length, the optical center position, the number of pixels per unit distance, and so on. The intrinsic parameters of the imaging device have a corresponding conversion relation with the imaging device coordinate system, and the intrinsic parameters of the projection device have a corresponding conversion relation with the projection device coordinate system; from these, the conversion relation between the imaging device coordinate system and the projection device coordinate system is calculated.
The calibration process of the imaging device is as follows:
Assume that a point P in a certain region has coordinates $(X_W, Y_W, Z_W)$ in the world coordinate system and $(X_C, Y_C, Z_C)$ in the imaging device coordinate system, and that its projection on the imaging plane of the imaging device is $(u, v)$. The perspective projection imaging process is then

$$s\,[u,\, v,\, 1]^T = K\,[R \;\; T]\,[X_W,\, Y_W,\, Z_W,\, 1]^T = M\tilde{P}, \qquad K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where $s_x, s_y$ are the pixel counts per unit distance (pixels/mm) of the image plane along the corresponding image coordinate axes; $(u_0, v_0)$ is the intersection of the optical axis of the imaging device with the image plane, i.e. the projection of the optical center onto the image plane, called the principal point; $f_x, f_y$ are the equivalent focal lengths along the corresponding image coordinate axes; $R$ is a $3 \times 3$ orthogonal matrix and $T$ a $3 \times 1$ vector, representing the rotation and translation from the world coordinate system to the imaging device coordinate system; $s$ is a scale factor; $[R \;\; T]$ is the extrinsic parameter matrix; $\tilde{P}$ and $\tilde{p}$ are the homogeneous coordinates of the three-dimensional point $P$ and of its image point; $M$ is the projection matrix and $K$ the intrinsic parameter matrix. The conversion from the world coordinate system to the imaging device coordinate system is

$$[X_C,\, Y_C,\, Z_C]^T = R\,[X_W,\, Y_W,\, Z_W]^T + T.$$

Since the imaging device deviates from this ideal model, radial and tangential distortion must be taken into account; they are expressed as

$$\delta_{Rx} = x(k_1 r^2 + k_2 r^4 + k_3 r^6), \qquad \delta_{Ry} = y(k_1 r^2 + k_2 r^4 + k_3 r^6),$$
$$\delta_{Tx} = 2p_1 xy + p_2(r^2 + 2x^2), \qquad \delta_{Ty} = p_1(r^2 + 2y^2) + 2p_2 xy,$$

where $\delta_{Rx}$ and $\delta_{Ry}$ are the radial distortions in the $x$ and $y$ directions; $\delta_{Tx}$ and $\delta_{Ty}$ are the tangential distortions in the $x$ and $y$ directions; $(x, y)$ are the ideal image coordinates; $r = \sqrt{x^2 + y^2}$ is the distance from the ideal image point to the principal point; $k_1$, $k_2$ and $k_3$ are the radial distortion parameters; and $p_1$ and $p_2$ are the tangential distortion parameters. Taking these two distortion errors into account, the conversion of an ideal image point $(x, y)$ into the distorted image point $(x', y')$ can be expressed as $x' = x + \delta_{Rx} + \delta_{Tx}$, $y' = y + \delta_{Ry} + \delta_{Ty}$.
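The pinhole model with radial and tangential distortion described above can be exercised with a short numerical sketch (NumPy and the function name are assumptions of this illustration):

```python
import numpy as np

def project_point(K, R, T, Pw, k=(0.0, 0.0, 0.0), p=(0.0, 0.0)):
    """Project a world point through the pinhole model and apply the
    radial (k1, k2, k3) and tangential (p1, p2) distortion terms given
    in the text, returning pixel coordinates (u, v)."""
    Xc = R @ np.asarray(Pw, dtype=float) + np.asarray(T, dtype=float)
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]         # ideal image coordinates
    r2 = x * x + y * y                          # r^2 = x^2 + y^2
    radial = k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    xd = x + x * radial + 2 * p[0] * x * y + p[1] * (r2 + 2 * x * x)
    yd = y + y * radial + p[0] * (r2 + 2 * y * y) + 2 * p[1] * x * y
    # pixel coordinates through the intrinsic parameter matrix K
    return K[0, 0] * xd + K[0, 2], K[1, 1] * yd + K[1, 2]
```

With all distortion parameters zero this reduces to the ideal perspective projection.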
For each region, target images of the planar target at different positions are first acquired with the three-dimensional measurement system, and the position of the imaging device is calculated from the target images with the Zhang Zhengyou calibration algorithm. To acquire the target images, the planar target is first placed in the middle of a region and a target image is acquired with the imaging device; the planar target is then moved to another position and another target image is acquired, and several groups of target images are obtained by moving the planar target several times. The overall flow of the Zhang Zhengyou calibration algorithm is as follows: first, several target images of the planar target at different positions are captured with the imaging device; the target images are then processed to detect their feature points; the intrinsic and extrinsic parameters of the imaging device are solved under ideal, distortion-free conditions and refined with maximum-likelihood estimation; the actual radial distortion coefficients of the imaging device are solved by least squares; and finally, from the intrinsic parameters, extrinsic parameters and radial distortion coefficients, the estimate is optimized and its accuracy improved with the maximum-likelihood method.
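A core step of the Zhang Zhengyou algorithm is estimating the homography between the planar target and its image by the direct linear transform; a simplified sketch without point normalization or maximum-likelihood refinement (NumPy assumed) is:

```python
import numpy as np

def estimate_homography(obj_xy, img_uv):
    """Direct linear transform (DLT) estimate of the homography H that
    maps planar-target points (X, Y) to image points (u, v); Zhang's
    method first estimates such a homography for every target pose and
    then solves the intrinsic and extrinsic parameters from them."""
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        A.append([-X, -Y, -1.0, 0.0, 0.0, 0.0, u * X, u * Y, u])
        A.append([0.0, 0.0, 0.0, -X, -Y, -1.0, v * X, v * Y, v])
    # the homography vector is the right singular vector of A with the
    # smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

At least four non-collinear correspondences are needed; real implementations normalize the coordinates first for numerical stability.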
The calibration process of the projection device is as follows:
During the calibration of the projection device, a phase-shift technique is used to obtain the correspondence between the imaging device image and the projection device image. The projection device projects a set of horizontal-fringe phase-shift patterns and Gray-code patterns onto the planar target, and the imaging device acquires the target images. Phase shifting and Gray-code decoding then give the absolute phase value $\phi_h$ in the horizontal direction at the target circle center $(u_C, v_C)$; the horizontal corresponding line of the projection device image is found through this absolute phase value, and its coordinate value $v_P$ is

$$v_P = \frac{\phi_h}{2\pi N_h}\,H$$

where $N_h$ is the total number of fringes of the horizontal phase-shift pattern and $H$ is the vertical resolution of the projection device image. Similarly, the projection device projects a set of vertical-fringe phase-shift and Gray-code patterns, giving the absolute phase value $\phi_v$ in the vertical direction at the same target point center $(u_C, v_C)$ and the corresponding coordinate value $u_P$ on the projection device image:

$$u_P = \frac{\phi_v}{2\pi N_v}\,W$$

where $N_v$ is the total number of fringes of the vertical phase-shift pattern and $W$ is the horizontal resolution of the projection device image. The parameters of the projection device are calibrated in the same way as those of the imaging device; the calibration of the projection device solves its intrinsic parameter matrix.
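The phase-to-projector-coordinate mapping above can be written as a small numerical sketch, assuming (as the formulas do) that the absolute phase grows linearly from 0 to $2\pi N$ across the projection device image:

```python
import numpy as np

def projector_coords(phi_h, phi_v, N_h, N_v, W, H):
    """Map the absolute horizontal/vertical phases measured at a camera
    pixel to projector image coordinates via v_P = phi_h * H / (2*pi*N_h)
    and u_P = phi_v * W / (2*pi*N_v)."""
    v_p = phi_h * H / (2.0 * np.pi * N_h)
    u_p = phi_v * W / (2.0 * np.pi * N_v)
    return u_p, v_p
```

The full phase range $2\pi N_h$ thus maps exactly onto the $H$ rows of the projector image.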
For each region, the projection device is calibrated after the imaging device. The projection device projects patterns onto the planar target, the imaging device acquires target images of the patterned planar target, the position of the planar target is then changed and projection and acquisition are repeated, and several groups of target images are obtained by changing the target position several times; the orthogonal absolute phase distribution diagram of each target image is then calculated from the acquired images. Specifically, phase decoding can be performed with the Gray-code-plus-phase-shift method to calculate the orthogonal absolute phase distribution diagram of each target image. Combining Gray codes with phase shifting reduces the number of Gray-code bits, speeds up decoding, and makes up for the difficulty that the pure phase-shift and pure Gray-code methods have in reconstructing discontinuous positions. The specific coding method is as follows: first, a series of black-and-white Gray-code fringe patterns are projected onto the measured object, each area sharing the same code serving as one coding period; then phase-shift patterns are projected in sequence, so that each coding area is further continuously subdivided. Through the orthogonal absolute phase distribution diagram of the target image, a pair of homonymous points on the imaging device image plane and the projection device image plane is found: the image point on the imaging device image plane is taken as the point to be matched, and the first sub-pixel point with the same absolute phase as the point to be matched is found on the projection device image plane.
And finally, calculating the position of the projection device by using a Zhang Zhengyou calibration algorithm according to the target image.
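The Gray-code-plus-phase-shift decoding described above amounts to converting each pixel's Gray-code bit sequence into its fringe order and adding the corresponding number of $2\pi$ multiples to the wrapped phase; a minimal sketch (function names are illustrative):

```python
import numpy as np

def gray_to_binary(bits):
    """Decode a Gray-code bit sequence (most significant bit first) into
    the integer fringe order k of the coding period a pixel falls in."""
    b = bits[0]
    value = b
    for g in bits[1:]:
        b ^= g                      # binary bit = previous binary XOR gray
        value = (value << 1) | b
    return value

def absolute_phase(phi_wrapped, gray_bits):
    """Unwrap: add 2*pi times the fringe order to the wrapped phase."""
    return phi_wrapped + 2.0 * np.pi * gray_to_binary(gray_bits)
```

Practical decoders also handle the half-period offset between the Gray-code edges and the phase jumps, which this sketch omits.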
S103, projecting specified patterns to plane plates at different positions by using a projection device in different areas, collecting plate images of the plane plates at different positions by using an imaging device, calculating to obtain absolute phase distribution diagrams of the plane plates at different positions, and calculating to obtain three-dimensional information of the plane plates at different positions according to the calibration data;
in this step, in each region, a specified pattern is projected to a planar plate at different positions by using the projection device having a MEMS galvanometer, and a plate image is acquired by the imaging device, then an absolute phase distribution map of the planar plate at different positions is calculated from the plate image, and three-dimensional information of the planar plate is calculated by using calibration data acquired in advance. The calibration data includes: intrinsic parameters of the imaging device and the projection device, distortion coefficients of a zoom lens of the imaging device, and a conversion relationship between two coordinate systems.
The plate images of the planar plate are collected as follows: the planar plate is placed in the measurement area, the projection device projects orthogonal sinusoidal phase shift fringe patterns and Gray code patterns onto it, and the imaging device captures plate images of the planar plate; the position of the planar plate is then changed and the projection and acquisition are repeated, yielding multiple groups of plate image data.
In an embodiment, the calculating, according to the calibration data, three-dimensional information of the planar plate at different positions includes:
calculating three-dimensional information according to the following formula:
s_C·[u_C, v_C, 1]^T = K_C·M_C·[X_W, Y_W, Z_W, 1]^T

s_P·[u_P, v_P, 1]^T = K_P·I·[X_W, Y_W, Z_W, 1]^T

wherein s_C and s_P are the scale factors of the imaging device and the projection device respectively; K_C and K_P are the intrinsic parameter matrices of the imaging device and the projection device; M_C is the extrinsic parameter matrix of the imaging device; I is the identity matrix; (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after the distortion parameters of the three-dimensional measurement system have been corrected; and ^T denotes the transpose of a matrix.
In this embodiment, the three-dimensional reconstruction of a point P by the three-dimensional measurement system can be expressed as

s_C·[u_C, v_C, 1]^T = K_C·M_C·[X_W, Y_W, Z_W, 1]^T

s_P·[u_P, v_P, 1]^T = K_P·M_P·[X_W, Y_W, Z_W, 1]^T

wherein s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are their intrinsic parameter matrices, and M_C and M_P are their extrinsic parameter matrices. The structural parameters of the three-dimensional measurement system are [R | t], where R is the rotation and t the translation from the projection device coordinate system to the imaging device coordinate system. The world coordinate system is established in the projection device coordinate system, so that R_P is the identity matrix, T_P is the zero vector and M_P = I, while R_C and T_C are the rotation and translation from the world coordinate system to the imaging device coordinate system; the reconstruction then reduces to

s_C·[u_C, v_C, 1]^T = K_C·M_C·[X_W, Y_W, Z_W, 1]^T

s_P·[u_P, v_P, 1]^T = K_P·I·[X_W, Y_W, Z_W, 1]^T

wherein I is the identity matrix, and (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after the system distortion parameters have been corrected. After the three-dimensional coordinates (X_W, Y_W, Z_W) of point P are solved, they are combined with the absolute phase value Φ of point P to form one sample of the phase-to-three-dimensional mapping coefficient table for that pixel of the imaging device in one region.
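The pair of projection equations above can be solved for (X_W, Y_W, Z_W) by linear (DLT) triangulation. A minimal sketch, assuming distortion-corrected image coordinates and M_P = I as in the text (names are illustrative):

```python
import numpy as np

def triangulate(P_C, P_P, uv_C, uv_P):
    """Linear triangulation of one point from camera and projector
    projection matrices P = K[R|t] (3x4) and distortion-corrected
    image coordinates (u, v) in each device."""
    A = []
    for P, (u, v) in ((P_C, uv_C), (P_P, uv_P)):
        # Each view contributes two linear equations on homogeneous X.
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    A = np.asarray(A)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]             # null-space vector = homogeneous solution
    return X[:3] / X[3]    # (X_W, Y_W, Z_W)
```

With the world frame fixed to the projector, P_P = K_P·[I | 0] and P_C = K_C·M_C, matching the equations in the text.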
S104, in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution diagram of the plane plate, and establishing a three-dimensional mapping coefficient table of the corresponding region according to the mapping relation between the three-dimensional information of each plane plate and the absolute phase of the corresponding pixel point;
in this step, according to the pre-calculated absolute phase distribution diagrams of the planar plate, the absolute phase corresponding to each pixel point of the imaging device in the different regions is obtained, and the mapping relation between each pixel point and the corresponding three-dimensional information is then obtained, so as to establish the three-dimensional mapping coefficient table of the corresponding region. In each region, each position of the planar plate provides one group of sampling data, that is, a pixel corresponds to one three-dimensional space point and one phase value. Placing the planar plate at multiple positions provides multiple groups of sampling data, and the mapping coefficient table of each pixel of a region is obtained by fitting the mapping coefficients.
In one embodiment, the step S104 includes:
according to the three-dimensional information corresponding to the pixel points, calculating a three-dimensional mapping coefficient according to the following formula, and establishing a three-dimensional mapping coefficient table:
X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
In the present embodiment, let a certain pixel point m_c have absolute phase Φ_C and three-dimensional space point (X, Y, Z); from the three-dimensional information of the pixel points it can be deduced that X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n and Z = Σ_{n=0}^{N} c_n·Φ^n, wherein a_n, b_n and c_n are the mapping coefficients corresponding to the three spatial dimensions. The mapping coefficients {a_n, b_n, c_n} corresponding to each pixel point can be calculated from this formula, and a three-dimensional mapping coefficient table for the absolute phase of each pixel point in each region is established.
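The per-pixel fitting of the mapping coefficients from several plate positions can be sketched as an ordinary least-squares polynomial fit (an illustrative sketch under our own naming, not the patent's exact procedure):

```python
import numpy as np

def fit_mapping(phis, xyzs, N=3):
    """Fit X(phi), Y(phi), Z(phi) as degree-N polynomials for one pixel.

    phis: (M,) absolute phases sampled from M plate positions.
    xyzs: (M, 3) reconstructed 3-D points for the same pixel.
    Returns a (3, N+1) array of coefficients {a_n}, {b_n}, {c_n},
    lowest order first.
    """
    V = np.vander(phis, N + 1, increasing=True)  # columns 1, phi, phi^2, ...
    coeffs, *_ = np.linalg.lstsq(V, xyzs, rcond=None)
    return coeffs.T  # rows: a_n (X), b_n (Y), c_n (Z)

def apply_mapping(coeffs, phi):
    """Evaluate the fitted polynomials at absolute phase phi -> (X, Y, Z)."""
    powers = phi ** np.arange(coeffs.shape[1])
    return coeffs @ powers
```

Each plate position contributes one (Φ, X, Y, Z) sample per pixel; with more positions than N+1, the fit is over-determined and solved in the least-squares sense.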
S105, projecting a target pattern onto the measured object using the projection device, acquiring a target focal-sweep image of the measured object with the imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating the absolute phase distribution diagram of the target image;
in this step, the projection device with the MEMS galvanometer projects a pattern onto the measured object, the imaging device acquires a target focal-sweep image of the measured object under a single-frame exposure condition, the target focal-sweep image is deblurred to obtain the target image, and the absolute phase distribution diagram of the target image is then calculated. The imaging device performs continuous focal-plane scanning within the single-frame exposure to obtain the target focal-sweep image, and the single-frame exposure time is one current period of the zoom lens of the imaging device. The control current varies with time as a triangular wave, whose maximum and minimum values are the extremes of the current range that allows the imaging device to focus continuously over the entire large depth-of-field measurement scene. Controlling the current value of the imaging device therefore controls the single-frame exposure during which the target focal-sweep image is acquired.
When the target focal-sweep image of each measured object is acquired, the control current of the zoom lens has period T, maximum value I_H and minimum value I_L, and varies with time t as a triangular wave:

I(t) = I_L + 2(I_H − I_L)(t − nT)/T, for nT ≤ t < nT + T/2;

I(t) = I_H − 2(I_H − I_L)(t − nT − T/2)/T, for nT + T/2 ≤ t < (n+1)T;

wherein n is a natural number.
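The triangular-wave control current can be sketched as follows (the phase at t = 0, i.e. starting from I_L, is our assumption; the patent only fixes the period and the extremes):

```python
def control_current(t, T, I_L, I_H):
    """Triangular-wave control current: rises linearly from I_L to I_H over
    the first half period, then falls back to I_L over the second half."""
    tau = t % T  # position within the current period
    if tau < T / 2:
        return I_L + 2 * (I_H - I_L) * tau / T
    return I_H - 2 * (I_H - I_L) * (tau - T / 2) / T
```

One single-frame exposure spanning a full period T sweeps the focal plane from nearest to farthest and back, so every depth is in focus at some instant of the sweep.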
In one embodiment, the step S105 includes:
inputting the target focal-sweep image into an integral point spread function for a deconvolution operation to obtain the deblurred target image;
the integral point spread function is calculated as follows:
wherein r is the distance from the center of the circle of confusion formed by an object point on the sensor plane of the imaging device; b_0 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time 0; b_1 is the diameter of the focused spot of the object point on the sensor plane of the imaging device; b_2 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time T/2, half a period; and C_1 and C_2 are two constants.
In this embodiment, the integral point spread function is constructed from the focal-sweep model of the imaging device, and a deconvolution operation is performed on the target focal-sweep image with the integral point spread function to obtain the deblurred target image. The integral point spread function is calculated as described above, and the deconvolution is carried out by Wiener filtering.
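The Wiener-filtered deconvolution can be sketched as follows, under the assumption of a known PSF sampled on the image grid and a chosen noise-to-signal ratio; the patent's integral point spread function itself is not reproduced here:

```python
import numpy as np

def wiener_deblur(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution of a focal-sweep image.

    blurred: 2-D image; psf: full-size kernel centered in the array;
    nsr: assumed noise-to-signal power ratio (regularizer).
    """
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener filter
    return np.real(np.fft.ifft2(W * G))
```

The nsr term keeps the filter bounded where the PSF spectrum is small, which is what distinguishes Wiener filtering from naive inverse filtering.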
S106, acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding area according to the area to which the absolute phase belongs, and calculating to obtain a corresponding space three-dimensional point coordinate by using the three-dimensional mapping coefficient.
In this step, the region to which each pixel point of the target image belongs is first determined from the absolute phase distribution diagram of the target image; the corresponding three-dimensional mapping coefficients are then looked up in the three-dimensional mapping coefficient table of that region, and the corresponding spatial three-dimensional point coordinates are calculated with the three-dimensional mapping coefficients.
In one embodiment, the step S106 includes:
the spatial three-dimensional point coordinates are calculated by the following formula:

X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding region, Φ is the absolute phase of the pixel point, and N is the polynomial order.
In this embodiment, the absolute phase of each pixel point of the imaging device in the target image is obtained as the target phase, the region to which the target phase belongs is determined, the corresponding three-dimensional mapping coefficients {a_n, b_n, c_n} are found in the three-dimensional mapping coefficient table of that region, and the spatial three-dimensional point coordinates are calculated from the absolute phase and the three-dimensional mapping coefficients.
As shown in fig. 2, suppose the absolute phase of a pixel is Φ, and the phase value range of the pixel in region 1 is Φ_1^1 ~ Φ_1^n, in region 2 is Φ_2^1 ~ Φ_2^n, in region 3 is Φ_3^1 ~ Φ_3^n, and in region n is Φ_n^1 ~ Φ_n^n. The region of the absolute phase is determined: if Φ_2^1 ≤ Φ ≤ Φ_2^n, the phase value of the pixel belongs to region 2; the three-dimensional mapping coefficients {a_n, b_n, c_n} corresponding to the pixel are then found in the three-dimensional mapping coefficient table of region 2, and the corresponding spatial three-dimensional point coordinates are calculated as X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n, wherein {a_n, b_n, c_n} are the three-dimensional mapping coefficients corresponding to region 2.
If both Φ_2^1 ≤ Φ ≤ Φ_2^n and Φ_1^1 ≤ Φ ≤ Φ_1^n hold, that is, the phase ranges of regions 1 and 2 overlap, a discriminant L is evaluated to decide which region Φ belongs to: when L ≥ 0, the phase value of the image point belongs to region 1; when L < 0, the phase value of the image point belongs to region 2.
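The region lookup plus polynomial evaluation of step S106 can be sketched as follows (the ranges and coefficient tables are illustrative placeholders, and the overlap discriminant L is omitted since its formula is not reproduced above):

```python
import numpy as np

def reconstruct_pixel(phi, region_ranges, coeff_tables):
    """Find the region whose phase interval contains phi, fetch that region's
    per-pixel coefficients {a_n, b_n, c_n}, and evaluate the mapping.

    region_ranges: list of (phi_min, phi_max) per region.
    coeff_tables: list of (3, N+1) coefficient arrays, lowest order first.
    Returns (X, Y, Z).
    """
    for (lo, hi), coeffs in zip(region_ranges, coeff_tables):
        if lo <= phi <= hi:
            powers = phi ** np.arange(coeffs.shape[1])
            return coeffs @ powers
    raise ValueError("phase outside all calibrated regions")
```

In practice the same lookup is applied per pixel over the whole absolute phase distribution diagram of the target image.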
Referring to fig. 3, fig. 3 is a schematic block diagram of a high-precision three-dimensional reconstruction device based on a large depth of field according to an embodiment of the present invention, where the high-precision three-dimensional reconstruction device 200 based on a large depth of field includes:
a region dividing unit 201, configured to divide a region of a large depth of field measurement scene;
a calibration data obtaining unit 202, configured to perform binocular vision stereo calibration on each divided area by using a three-dimensional measurement system, so as to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
A three-dimensional information obtaining unit 203, configured to project a specified pattern to a planar panel at different positions in different areas by using a projection device, collect panel images of the planar panel at different positions by using an imaging device, calculate an absolute phase distribution diagram of the planar panel at different positions, and calculate three-dimensional information of the planar panel at different positions according to the calibration data;
a three-dimensional mapping coefficient table obtaining unit 204, configured to obtain, in each region, an absolute phase of each pixel point of the imaging device in an absolute phase distribution diagram of the planar panel, and establish a three-dimensional mapping coefficient table of a corresponding region according to a mapping relationship between three-dimensional information of each planar panel and an absolute phase of a corresponding pixel point;
a target focal-sweep image obtaining unit 205, configured to project a target pattern onto the measured object using the projection device, acquire a target focal-sweep image of the measured object with the imaging device, deblur the target focal-sweep image to obtain a target image, and calculate the absolute phase distribution diagram of the target image;
a spatial three-dimensional point coordinate obtaining unit 206, configured to obtain the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, search the corresponding three-dimensional mapping coefficients in the three-dimensional mapping coefficient table of the corresponding region according to the region to which the absolute phase belongs, and calculate the corresponding spatial three-dimensional point coordinates using the three-dimensional mapping coefficients.
In an embodiment, the apparatus further comprises, after the region dividing unit 201:
the position calibration unit is used for calibrating the positions of the imaging device and the projection device by using a calibration algorithm;
and the control current value recording unit is used for acquiring the maximum value and the minimum value of the total control current of the zoom lens of the imaging device for measuring the large depth of field measurement scene and the corresponding area control current value when the zoom lens of the imaging device focuses on the central position of each area, and recording the maximum value and the minimum value of the total control current and the corresponding area control current value of each area.
In one embodiment, the calibration data acquisition unit 202 includes:
the coordinate system establishing unit is used for establishing an imaging device coordinate system by taking an optical center of the imaging device as an original point and an optical axis of the imaging device as a Z axis; establishing a coordinate system of the projection device by taking an optical center of the projection device as an origin and taking an optical axis of the projection device as a Z axis;
and the calibration data calculation unit is used for acquiring the conversion relation between the intrinsic parameters of the imaging device and the coordinate system of the imaging device and the conversion relation between the intrinsic parameters of the projection device and the projection coordinate system by using a binocular vision three-dimensional calibration algorithm, and calculating the conversion relation between the coordinate system of the imaging device and the projection coordinate system to obtain calibration data.
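The conversion relationship between the imaging device coordinate system and the projection device coordinate system can be computed from the two world-to-device extrinsics obtained in calibration; a minimal sketch (the function name is ours):

```python
import numpy as np

def relative_pose(R_C, t_C, R_P, t_P):
    """Given world->imaging-device extrinsics (R_C, t_C) and
    world->projection-device extrinsics (R_P, t_P), return the transform
    (R, t) from the projection device coordinate system to the imaging
    device coordinate system, i.e. x_C = R @ x_P + t."""
    R = R_C @ R_P.T       # compose rotations through the world frame
    t = t_C - R @ t_P     # eliminate the world-frame translation
    return R, t
```

This (R, t) pair is exactly the structural-parameter pair of the three-dimensional measurement system described in the method section.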
In an embodiment, the three-dimensional information acquisition unit 203 includes:
a three-dimensional information formula calculation unit for calculating three-dimensional information according to the following formula:
s_C·[u_C, v_C, 1]^T = K_C·M_C·[X_W, Y_W, Z_W, 1]^T

s_P·[u_P, v_P, 1]^T = K_P·I·[X_W, Y_W, Z_W, 1]^T

wherein s_C and s_P are the scale factors of the imaging device and the projection device respectively; K_C and K_P are the intrinsic parameter matrices of the imaging device and the projection device; M_C is the extrinsic parameter matrix of the imaging device; I is the identity matrix; (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after the distortion parameters of the three-dimensional measurement system have been corrected; and ^T denotes the transpose of a matrix.
In one embodiment, the three-dimensional mapping coefficient table obtaining unit 204 includes:
a three-dimensional mapping coefficient calculation unit, configured to calculate the three-dimensional mapping coefficients from the three-dimensional information corresponding to the pixel points according to the following formula, and to establish the three-dimensional mapping coefficient table:

X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
In an embodiment, the target focal-sweep image obtaining unit 205 includes:

a deblurring processing unit, configured to input the target focal-sweep image into an integral point spread function for a deconvolution operation to obtain the deblurred target image;
an integral point spread function calculation unit, which is used for calculating the integral point spread function as follows:
wherein r is the distance from the center of the circle of confusion formed by an object point on the sensor plane of the imaging device; b_0 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time 0; b_1 is the diameter of the focused spot of the object point on the sensor plane of the imaging device; b_2 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time T/2, half a period; and C_1 and C_2 are two constants.
In one embodiment, the spatial three-dimensional point coordinate acquiring unit 206 includes:
a spatial three-dimensional point coordinate calculation unit, configured to calculate the spatial three-dimensional point coordinates by the following formula:

X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding region, Φ is the absolute phase of the pixel point, and N is the polynomial order.
The embodiment of the invention also provides computer equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the high-precision three-dimensional reconstruction method based on the large depth of field when executing the computer program.
The embodiment of the invention also provides a computer readable storage medium, wherein the computer readable storage medium is stored with a computer program, and the computer program realizes the high-precision three-dimensional reconstruction method based on the large depth of field when being executed by a processor.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the parts that the embodiments have in common, reference may be made between them. For the system disclosed in the embodiments, the description is relatively brief because it corresponds to the method disclosed in the embodiments; for relevant details, refer to the description of the method. It should be noted that those skilled in the art can make various improvements and modifications to the invention without departing from its principles, and such improvements and modifications also fall within the scope of the claims of the invention.
It should also be noted that in this specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (9)

1. The high-precision three-dimensional reconstruction method based on the large depth of field is characterized by comprising the following steps of:
dividing the area of the large depth of field measurement scene;
performing binocular vision three-dimensional calibration on each divided region by using a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
projecting appointed patterns to plane plates at different positions by using a projection device in different areas, collecting plate images of the plane plates at different positions by using an imaging device, calculating to obtain absolute phase distribution diagrams of the plane plates at different positions, and calculating to obtain three-dimensional information of the plane plates at different positions according to the calibration data;
in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution diagram of the plane plate, and establishing a three-dimensional mapping coefficient table of the corresponding region according to the mapping relation between the three-dimensional information of each plane plate and the absolute phase of the corresponding pixel point;
projecting a target pattern onto a measured object using a projection device, acquiring a target focal-sweep image of the measured object with an imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating an absolute phase distribution diagram of the target image;
Acquiring an absolute phase of each pixel point of the target image in an absolute phase distribution diagram of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding area according to the area to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding space three-dimensional point coordinate;
wherein the projecting a target pattern onto a measured object using a projection device, acquiring a target focal-sweep image of the measured object with an imaging device, deblurring the target focal-sweep image to obtain a target image, and calculating an absolute phase distribution diagram of the target image comprises:

inputting the target focal-sweep image into an integral point spread function for a deconvolution operation to obtain the deblurred target image;
the integral point spread function is calculated as follows:
wherein r is the distance from the center of the circle of confusion formed by an object point on the sensor plane of the imaging device; b_0 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time 0; b_1 is the diameter of the focused spot of the object point on the sensor plane of the imaging device; b_2 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time T/2, half a period; and C_1 and C_2 are two constants.
2. The high-precision three-dimensional reconstruction method based on large depth of field according to claim 1, wherein after the region division of the large depth of field measurement scene, comprising:
calibrating the positions of the imaging device and the projection device by using a calibration algorithm;
and acquiring the maximum value and the minimum value of the total control current of the zoom lens of the imaging device for measuring the large depth of field measurement scene and the corresponding region control current value when the zoom lens of the imaging device focuses on the central position of each region, and recording the maximum value and the minimum value of the total control current and the corresponding region control current value of each region.
3. The high-precision three-dimensional reconstruction method based on the large depth of field according to claim 1, wherein the performing binocular vision stereoscopic calibration on each divided region by using the three-dimensional measurement system to obtain calibration data comprises:
establishing an imaging device coordinate system by taking an optical center of the imaging device as an origin and an optical axis of the imaging device as a Z axis; establishing a coordinate system of the projection device by taking an optical center of the projection device as an origin and taking an optical axis of the projection device as a Z axis;
And obtaining the conversion relation between the intrinsic parameters of the imaging device and the coordinate system of the imaging device and the conversion relation between the intrinsic parameters of the projection device and the coordinate system of the projection device by using a binocular vision three-dimensional calibration algorithm, and calculating the conversion relation between the coordinate system of the imaging device and the coordinate system of the projection device to obtain calibration data.
4. The high-precision three-dimensional reconstruction method based on the large depth of field according to claim 1, wherein the calculating the three-dimensional information of the planar plate at different positions according to the calibration data comprises:
calculating three-dimensional information according to the following formula:

s_C·[u_C, v_C, 1]^T = K_C·M_C·[X_W, Y_W, Z_W, 1]^T

s_P·[u_P, v_P, 1]^T = K_P·I·[X_W, Y_W, Z_W, 1]^T

wherein s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are the intrinsic parameter matrices of the imaging device and the projection device, M_C is the extrinsic parameter matrix of the imaging device, I is the identity matrix, (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after the distortion parameters of the three-dimensional measurement system have been corrected, and ^T denotes the transpose of a matrix.
5. The high-precision three-dimensional reconstruction method according to claim 1, wherein the obtaining, in each region, an absolute phase of each pixel point of the imaging device in an absolute phase distribution diagram of the planar plate, and establishing a three-dimensional mapping coefficient table of a corresponding region according to a mapping relationship between three-dimensional information of each planar plate and an absolute phase of a corresponding pixel point, comprises:
According to the three-dimensional information corresponding to the pixel points, calculating a three-dimensional mapping coefficient according to the following formula, and establishing a three-dimensional mapping coefficient table:
X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
6. The method of claim 1, wherein the obtaining the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, searching the corresponding three-dimensional mapping coefficients in the three-dimensional mapping coefficient table of the corresponding region according to the region to which the absolute phase belongs, and calculating the corresponding spatial three-dimensional point coordinates using the three-dimensional mapping coefficients comprises:
the spatial three-dimensional point coordinates are calculated by the following formula:

X = Σ_{n=0}^{N} a_n·Φ^n, Y = Σ_{n=0}^{N} b_n·Φ^n, Z = Σ_{n=0}^{N} c_n·Φ^n

wherein {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding region, Φ is the absolute phase of the pixel point, and N is the polynomial order.
7. A high-precision three-dimensional reconstruction device based on a large depth of field, comprising:
the area dividing unit is used for dividing the area of the large depth of field measurement scene;
the calibration data acquisition unit is used for carrying out binocular vision three-dimensional calibration on each divided area by utilizing the three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
The three-dimensional information acquisition unit is used for projecting specified patterns to the plane flat plates at different positions by using the projection device in different areas, collecting the flat plate images of the plane flat plates at different positions by using the imaging device, calculating to obtain absolute phase distribution diagrams of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
a three-dimensional mapping coefficient table obtaining unit, configured to obtain, in each region, an absolute phase of each pixel point of the imaging device in an absolute phase distribution diagram of the planar panel, and establish a three-dimensional mapping coefficient table of a corresponding region according to a mapping relationship between three-dimensional information of each planar panel and an absolute phase of a corresponding pixel point;
a target focal-sweep image acquisition unit, configured to project a target pattern onto the measured object using the projection device, acquire a target focal-sweep image of the measured object with the imaging device, deblur the target focal-sweep image to obtain a target image, and calculate an absolute phase distribution diagram of the target image;

a spatial three-dimensional point coordinate acquisition unit, configured to obtain the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, search the corresponding three-dimensional mapping coefficients in the three-dimensional mapping coefficient table of the corresponding region according to the region to which the absolute phase belongs, and calculate the corresponding spatial three-dimensional point coordinates using the three-dimensional mapping coefficients;
the target focal-sweep image acquisition unit includes:

a deblurring processing unit, configured to input the target focal-sweep image into an integral point spread function for a deconvolution operation to obtain the deblurred target image;
an integral point spread function calculation unit, which is used for calculating the integral point spread function as follows:
wherein r is the distance from the center of the circle of confusion formed by an object point on the sensor plane of the imaging device; b_0 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time 0; b_1 is the diameter of the focused spot of the object point on the sensor plane of the imaging device; b_2 is the diameter of the circle of confusion formed by the object point on the sensor plane when the control current of the electronic zoom lens is at time T/2, half a period; and C_1 and C_2 are two constants.
8. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that, when executing the computer program, the processor implements the high-precision three-dimensional reconstruction method based on large depth of field according to any one of claims 1 to 6.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the high-precision three-dimensional reconstruction method based on large depth of field according to any one of claims 1 to 6.
CN202110529545.1A 2021-05-14 2021-05-14 High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof Active CN113160393B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110529545.1A CN113160393B (en) 2021-05-14 2021-05-14 High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof

Publications (2)

Publication Number Publication Date
CN113160393A CN113160393A (en) 2021-07-23
CN113160393B true CN113160393B (en) 2023-08-04

Family

ID=76875231

Country Status (1)

Country Link
CN (1) CN113160393B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115546285B (en) * 2022-11-25 2023-06-02 南京理工大学 Large-depth-of-field stripe projection three-dimensional measurement method based on point spread function calculation

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767405A (en) * 2016-12-15 2017-05-31 深圳大学 The method and device of the quick corresponding point matching of phase mapping assist three-dimensional imaging system
CN107358631A (en) * 2017-06-27 2017-11-17 大连理工大学 A kind of binocular vision method for reconstructing for taking into account three-dimensional distortion
CN110477936A (en) * 2019-08-20 2019-11-22 新里程医用加速器(无锡)有限公司 Beam-defining clipper scaling method, device, equipment and the medium of radiation imaging system
CN111649691A (en) * 2020-03-06 2020-09-11 福州大学 Digital fringe projection three-dimensional imaging system and method based on single-pixel detector

Similar Documents

Publication Publication Date Title
CN104061879B (en) A kind of structural light three-dimensional face shape vertical survey method continuously scanned
CN103003665B (en) Stereo distance measurement apparatus
CN113205593B (en) High-light-reflection surface structure light field three-dimensional reconstruction method based on point cloud self-adaptive restoration
JP4873485B2 (en) Shape measuring method and shape measuring apparatus using a number of reference surfaces
CN112465912B (en) Stereo camera calibration method and device
CN108020175B (en) multi-grating projection binocular vision tongue surface three-dimensional integral imaging method
CN113205592B (en) Light field three-dimensional reconstruction method and system based on phase similarity
CN112967342B (en) High-precision three-dimensional reconstruction method and system, computer equipment and storage medium
CN111080705B (en) Calibration method and device for automatic focusing binocular camera
CN113160339A (en) Projector calibration method based on Samm&#39;s law
JP2013185832A (en) Information processing apparatus and information processing method
CN115775303B (en) Three-dimensional reconstruction method for high-reflection object based on deep learning and illumination model
CN115546285B (en) Large-depth-of-field stripe projection three-dimensional measurement method based on point spread function calculation
JP2010032260A (en) Apparatus and method for correcting distortion of optical system
CN115578296B (en) Stereo video processing method
CN114792345B (en) Calibration method based on monocular structured light system
CN113160393B (en) High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof
CN110108230B (en) Binary grating projection defocus degree evaluation method based on image difference and LM iteration
CN117450955B (en) Three-dimensional measurement method for thin object based on space annular feature
CN113251953B (en) Mirror included angle measuring device and method based on stereo deflection technology
CN114219866A (en) Binocular structured light three-dimensional reconstruction method, reconstruction system and reconstruction equipment
Chen et al. Finding optimal focusing distance and edge blur distribution for weakly calibrated 3-D vision
CN109859313B (en) 3D point cloud data acquisition method and device, and 3D data generation method and system
CN114993207B (en) Three-dimensional reconstruction method based on binocular measurement system
Mannan et al. Optimal camera parameters for depth from defocus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant