CN111524173A - Rapid large-range phase unwrapping method based on double reference planes - Google Patents

Rapid large-range phase unwrapping method based on double reference planes

Info

Publication number
CN111524173A
Authority
CN
China
Prior art keywords
phase
reference plane
unwrapping
phi
farthest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010248074.2A
Other languages
Chinese (zh)
Other versions
CN111524173B (en)
Inventor
金�一
段明辉
郑亚兵
孙正
陈恩红
吕盼稂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202010248074.2A priority Critical patent/CN111524173B/en
Publication of CN111524173A publication Critical patent/CN111524173A/en
Application granted granted Critical
Publication of CN111524173B publication Critical patent/CN111524173B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a rapid large-range phase unwrapping method based on double reference planes, comprising the following steps: (1) determining the spatial range to be measured, obtaining the wrapped phases of the nearest and farthest reference planes in turn by phase profilometry, and solving the absolute phase maps of the two reference planes by a temporal phase unwrapping method; (2) calculating the wrapped phase map and modulation map of the object; (3) unwrapping the object's wrapped phase with the absolute phase map of the farthest reference plane to obtain a candidate absolute phase map of the object; (4) performing edge detection on the candidate phase map and the object modulation map and classifying the unwrapping result; (5) for the unwrapping-failure class, generating a virtual adaptive reference plane phase to provide an unwrapping phase reference and converting the result to the unwrapping-ambiguity class; (6) correcting the absolute phase map to obtain the true absolute phase map. The invention is suitable for high-speed structured-light three-dimensional reconstruction of objects moving over a large range.

Description

Rapid large-range phase unwrapping method based on double reference planes
Technical Field
The invention relates to the field of high-speed three-dimensional reconstruction, in particular to a quick large-range phase unwrapping method based on double reference planes.
Background
High-speed three-dimensional reconstruction overcomes the limitations of two-dimensional image representation: it fully captures the three-dimensional shape and texture of a dynamic object in space, restores the object's true state, and conveys its spatial depth. High-speed three-dimensional reconstruction of dynamic objects therefore has important application value. Temporal phase unwrapping and spatial phase unwrapping are two of the most important unwrapping methods in the field of three-dimensional reconstruction at present. Temporal unwrapping requires a large number of additional projection patterns to determine the absolute phase map, which greatly reduces the effective frame rate of three-dimensional reconstruction; spatial unwrapping relies on gradient changes in the wrapped phase map, consumes substantial computational resources, and is prone to unwrapping errors.
The phase unwrapping method based on a single reference plane requires the difference between the absolute phase of the reference plane and the absolute phase of the measured object to stay within [0, 2π]. It needs no extra projected patterns during measurement and its unwrapping computation is simple, so it can be applied to measuring high-speed moving objects; however, it can only measure surface profiles whose position and height lie within that limited range, and unwrapping errors occur once part of the object exceeds it. The method is therefore unsuitable for three-dimensional reconstruction of objects moving over a large range. From the above analysis, the existing single-reference-plane phase unwrapping technique is limited for three-dimensional measurement of large-range moving objects.
Disclosure of Invention
In order to solve the above problems in the prior art, the invention provides a fast large-range phase unwrapping method based on double reference planes for high-speed three-dimensional reconstruction. A virtual adaptive reference plane is fitted from the two reference planes so that the object's absolute phase is solved without error; the ambiguity of the absolute phase map over a large range is removed using the difference between the edges of the phase map and the edges of the object modulation map, finally achieving fast, large-range phase unwrapping and a true absolute phase map of the object.
The technical scheme adopted by the invention is as follows: a fast large-range phase unwrapping method based on double reference planes comprises the following steps:
Step (1): determining the spatial range to be measured, obtaining the wrapped phases of the nearest and farthest reference planes in turn by phase profilometry, and solving the absolute phase maps of the nearest and farthest reference planes, Φ_F and Φ_B respectively, by a temporal phase unwrapping method; simultaneously processing the fringe images of the farthest reference plane to obtain its modulation map I_B;
Step (2): keeping the farthest reference plane in the camera's field of view and placing the object to be measured at any position within the measuring range; the projector projects N fringe patterns with different phase shifts, N ≥ 3, and the camera records the images I_1, I_2, ..., I_N formed after the fringe patterns are modulated by the object surface; using the object image sequence I_n, where 0 < n ≤ N, calculating the object's wrapped phase map φ_O and modulation map I_O;
Step (3): unwrapping the object's wrapped phase φ_O using the absolute phase map Φ_B of the farthest reference plane to obtain a candidate absolute phase map Φ_O^0 of the object;
Step (4): applying the Sobel operator to the candidate phase map Φ_O^0 and the modulation map I_O respectively for edge detection, subtracting the binarized edge image of the former from that of the latter, and counting the number of pixels greater than 0 in the result; if the counted number of pixels is greater than a set threshold T, classifying the unwrapping as an unwrapping failure, otherwise classifying it as an unwrapping ambiguity;
Step (5): for the unwrapping-failure class, providing an unwrapping phase reference by generating a virtual adaptive reference plane phase, the virtual adaptive reference plane being generated adaptively from Φ_F and Φ_B; counting the number of binarized edge pixels greater than 0 until the count falls below the threshold T, and converting the unwrapped phase map to the unwrapping-ambiguity class;
Step (6): for the unwrapping-ambiguity class, determining the distance of the object from the farthest reference plane according to the length of the shadow the object casts on the farthest reference plane, and adding the constant m × 2π to the candidate absolute phase map Φ_O^0 to correct it and obtain the true absolute phase map Φ_O, where m is an integer.
Further, in step (1), the fringe images of the farthest reference plane are processed to obtain its modulation map I_B by the following formula:

I_B(x, y) = (2/N) · √{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)]² }

where N (N ≥ 3) is the total number of captured images and I_n is the n-th captured fringe image of the farthest reference plane.
Further, in step (2), the object's wrapped phase map φ_O and modulation map I_O are calculated from the object image sequence I_n as follows:

φ_O(x, y) = arctan{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)] }

I_O(x, y) = (2/N) · √{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)]² }
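For concreteness, the following minimal Python/numpy sketch shows the standard N-step phase-shifting evaluation of the wrapped phase map and the modulation map used in steps (1) and (2); the function and variable names are illustrative and not taken from the patent.

```python
import numpy as np

def wrapped_phase_and_modulation(images):
    """Standard N-step phase-shift evaluation of N >= 3 equally shifted
    fringe images (one 2-D array per shift): returns the wrapped phase map
    and the fringe modulation map."""
    imgs = np.asarray(images, dtype=np.float64)           # shape (N, H, W)
    n_steps = imgs.shape[0]
    shifts = 2.0 * np.pi * np.arange(1, n_steps + 1) / n_steps
    num = np.tensordot(np.sin(shifts), imgs, axes=1)      # sum_n I_n * sin(2*pi*n/N)
    den = np.tensordot(np.cos(shifts), imgs, axes=1)      # sum_n I_n * cos(2*pi*n/N)
    wrapped = np.arctan2(num, den)                        # wrapped phase in (-pi, pi]
    modulation = (2.0 / n_steps) * np.hypot(num, den)     # modulation map
    return wrapped, modulation
```

The same routine can be applied to the fringe images of the farthest reference plane to obtain I_B, and to the object image sequence to obtain φ_O and I_O.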
Further, in step (3), the wrapped phase φ_O of the object is unwrapped using the absolute phase map Φ_B of the farthest reference plane as the phase reference: the fringe order k(x, y) of the wrapped phase map is obtained, and the candidate absolute phase map Φ_O^0 of the object is calculated by combining the fringe-order distribution with the wrapped phase map:

k(x, y) = floor{ [Φ_B(x, y) − φ_O(x, y)] / (2π) }

Φ_O^0(x, y) = φ_O(x, y) + 2π · k(x, y)

where floor(·) is the round-down (floor) function.
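A minimal sketch of this reference-plane unwrapping rule, assuming the floor-based fringe-order formula above; names are illustrative and not taken from the patent.

```python
import numpy as np

def unwrap_with_reference(wrapped_obj, abs_ref):
    """Unwrap a wrapped phase map against an absolute reference-plane phase map:
    fringe order k = floor((Phi_ref - phi) / (2*pi)),
    candidate absolute phase = phi + 2*pi*k."""
    k = np.floor((abs_ref - wrapped_obj) / (2.0 * np.pi))   # fringe order k(x, y)
    return wrapped_obj + 2.0 * np.pi * k                    # candidate absolute phase
```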
Further, in step (5), for the unwrapping-failure class, a virtual adaptive reference plane phase is generated to provide an unwrapping phase reference; the adaptive reference plane is generated as:

Φ_A = s × Φ_F + (1 − s) × Φ_B, s ∈ [0, 1]

The number of binarized edge pixels greater than 0 is counted; if it is below the threshold T, the unwrapped phase map is converted to the unwrapping-ambiguity class, otherwise the reference plane is adjusted continuously until the counted number of valid pixels falls below T.
The invention has the advantages and positive effects that:
(1) The phase unwrapping method of the invention uses two reference phase planes to generate a virtual adaptive reference plane and removes the ambiguity of the absolute phase map using the edge difference between the phase map and the modulation map, thereby achieving phase unwrapping for objects moving over a large range.
(2) The phase unwrapping based on double reference planes is performed pixel by pixel and achieves pixel-level unwrapping accuracy of the object phase.
(3) The absolute phase maps of the two reference planes are obtained before the measurement, so no extra projection patterns are needed during the measurement and the measurement speed is improved; a complex iterative phase unwrapping algorithm is also avoided, which improves the reconstruction speed.
Drawings
FIG. 1 is a flow chart of a method for unwrapping absolute phase over a large spatial range in accordance with the present invention;
FIG. 2(a) is a diagram of the reconstruction result when the statue is close to the farthest reference plane position;
FIG. 2(b) is a graph of the reconstruction result when the statue is at a position midway between the farthest and nearest reference planes;
FIG. 2(c) is a graph of the reconstruction result when the statue is near the nearest reference plane location;
FIG. 3 is a graph of the dynamic three-dimensional shape reconstruction of a palm as it moves away from the measurement system;
FIG. 4 is a graph of the dynamic three-dimensional shape reconstruction of a cup lid with complex rim features as it moves away from the measurement system.
Detailed Description
The technical approach of the invention is as follows: two reference planes are used, and the difference between the edges of the phase map and the edges of the object modulation map is exploited to avoid errors when unwrapping the phase over a large range; for the unwrapping-ambiguity class, the approximate distance of the object from the farthest reference plane is determined from the length of the shadow the object casts on that plane, yielding the true absolute phase map of the object and finally achieving fast, large-range phase unwrapping. The core of the invention has two parts: first, solving the object's wrapped phase map to obtain a complete relative phase map; second, estimating the absolute phase map from the shadow length.
Firstly, the specific steps of solving the wrapped phase diagram of the object to obtain a complete relative phase diagram are as follows:
step (1), determining a space range to be measured, respectively obtaining wrapping phases of a nearest reference plane and a farthest reference plane by utilizing a phase profilometry according to a sequence, and solving absolute phase diagrams of the two reference planes by utilizing a time unwrapping method, wherein the absolute phase diagrams are phi respectivelyFAnd phiB(ii) a Simultaneously processing the fringe projection of the farthest reference plane to obtain a modulated projection I about the farthest reference planeB
Figure BDA0002434483850000041
In the formula, N (N ≧ 3) represents the total number of captured pictures.
Step (2): removing the nearest reference plane while keeping the farthest reference plane in the camera's field of view, and placing the object to be measured at any position within the measuring range; the projector projects N (N ≥ 3) fringe patterns with different phase shifts, and the camera records the images I_1, I_2, ..., I_N formed after the fringe patterns are modulated by the object surface; the object's wrapped phase map φ_O and modulation map I_O are calculated from the object image sequence I_n (0 < n ≤ N):

φ_O(x, y) = arctan{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)] }

I_O(x, y) = (2/N) · √{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)]² }
Step (3): unwrapping the object's wrapped phase φ_O using the absolute phase map Φ_B of the farthest reference plane as the phase reference to obtain the candidate absolute phase map Φ_O^0 of the object:

k(x, y) = floor{ [Φ_B(x, y) − φ_O(x, y)] / (2π) }

Φ_O^0(x, y) = φ_O(x, y) + 2π · k(x, y)

where k(x, y) is the fringe order and floor(·) is the round-down (floor) function.
Step (4): applying the Sobel operator to the candidate phase map Φ_O^0 and the modulation map I_O respectively for edge detection, subtracting the binarized edge image of the former from that of the latter, and counting, via the histogram of the result, the number of pixels greater than 0; if the counted number is greater than a threshold T (the value of T is set by the user), classifying the unwrapping as an unwrapping failure, otherwise classifying it as an unwrapping ambiguity.
Step (5): for the unwrapping-failure class, providing an unwrapping phase reference by generating a virtual adaptive reference plane phase and repeating step (3); the adaptive reference plane is generated as:

Φ_A = s × Φ_F + (1 − s) × Φ_B, s ∈ [0, 1]

The number of binarized edge pixels greater than 0 is counted; if it is below the threshold T, the unwrapped phase map is converted to the unwrapping-ambiguity class, i.e. a complete relative phase map is obtained; otherwise the reference plane is adjusted continuously until the counted number of valid pixels falls below T.
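As an illustration of steps (4) and (5), the sketch below (reusing the helpers sketched earlier) classifies an unwrapping attempt by comparing binarized Sobel edges of the candidate phase map and the modulation map, then sweeps the mixing weight s of the virtual adaptive reference plane until the mismatch count falls below T. The gradient-magnitude binarization, the subtraction direction (phase-map edges minus modulation-map edges), and the uniform grid over s are assumptions, not fixed by the patent text.

```python
import numpy as np
from scipy.ndimage import sobel

def edge_mismatch_count(candidate_phase, modulation, grad_thresh):
    """Count pixels where the candidate absolute phase map shows a binarized
    Sobel edge that the modulation map does not (assumed direction)."""
    def binary_edges(img):
        gx = sobel(img, axis=1)                               # horizontal gradient
        gy = sobel(img, axis=0)                               # vertical gradient
        return (np.hypot(gx, gy) > grad_thresh).astype(np.int8)
    diff = binary_edges(candidate_phase) - binary_edges(modulation)
    return int(np.count_nonzero(diff > 0))

def adaptive_unwrap(wrapped_obj, modulation_obj, phi_front, phi_back,
                    T, grad_thresh, num_candidates=21):
    """Sweep Phi_A = s*Phi_F + (1 - s)*Phi_B over a grid of s values and
    return the first candidate absolute phase whose edge-mismatch count is
    below the threshold T (the unwrapping-ambiguity class)."""
    for s in np.linspace(0.0, 1.0, num_candidates):
        phi_ref = s * phi_front + (1.0 - s) * phi_back        # virtual adaptive plane
        candidate = unwrap_with_reference(wrapped_obj, phi_ref)
        if edge_mismatch_count(candidate, modulation_obj, grad_thresh) < T:
            return candidate, s
    raise RuntimeError("no adaptive reference plane brought the count below T")
```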
Second, the specific steps of estimating the absolute phase map from the shadow length are as follows:
Step (1): for the unwrapping-ambiguity class, determining the approximate distance of the object from the farthest reference plane according to the length of the shadow the object casts on that plane, and then adding the constant m × 2π to the candidate absolute phase map Φ_O^0 to correct it and obtain the true absolute phase map Φ_O.
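A correspondingly small sketch of this correction step; how the integer m is derived from the measured shadow length is not detailed here, and the function name is illustrative.

```python
import numpy as np

def correct_absolute_phase(candidate_phase, m):
    """Shift the candidate absolute phase map by the constant m*2*pi, where
    the integer m comes from the shadow-length estimate of the object's
    distance to the farthest reference plane."""
    return candidate_phase + 2.0 * np.pi * int(m)
```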
The effectiveness of the method of the invention is verified by three examples. Example 1 is static three-dimensional shape recovery of two independent statues; Example 2 is dynamic three-dimensional shape recovery of a freely moving palm receding from the measurement system; Example 3 is dynamic three-dimensional shape recovery of a cup lid with complex rim features receding from the measurement system. Each is described below:
example 1:
the two independent figures of merit move in the range of the farthest reference plane and the nearest reference plane, and in the experimental measurement, the figures of merit move from the farthest reference plane to the nearest reference plane. And reconstructing two independent statue three-dimensional shapes of three spatial positions in the motion process by utilizing the proposed self-adaptive pixel-by-pixel unwrapping algorithm. The algorithm flow is the same for each position, and the specific steps are as follows:
(1) Determining the spatial range to be measured, obtaining the wrapped phases of the nearest and farthest reference planes in turn by phase profilometry, and solving the absolute phase maps of the two reference planes, Φ_F and Φ_B respectively, by a temporal phase unwrapping method; simultaneously processing the fringe images of the farthest reference plane to obtain its modulation map I_B.
(2) Extracting the statues' wrapped phase map φ_O and modulation map I_O using the conventional three-step phase-shift method.
(3) Unwrapping the object's wrapped phase φ_O using the absolute phase map Φ_B of the farthest reference plane as the phase reference to obtain the candidate absolute phase map Φ_O^0 of the object.
(4) Applying the Sobel operator to the candidate phase map Φ_O^0 and the modulation map I_O respectively for edge detection, subtracting the binarized edge image of the former from that of the latter, and counting, via the histogram of the result, the number of pixels greater than 0; if the counted number is greater than a threshold T (the value of T is set by the user), classifying the unwrapping as an unwrapping failure, otherwise classifying it as an unwrapping ambiguity.
(5) For the unwrapping-failure class, providing an unwrapping phase reference by generating a virtual adaptive reference plane phase.
(6) For the unwrapping-ambiguity class, determining the approximate distance of the object from the farthest reference plane according to the length of the shadow the object casts on that plane, and then adding the constant m × 2π (m an integer) to the candidate absolute phase map Φ_O^0 to correct it and obtain the true absolute phase map Φ_O.
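For reference, a hypothetical end-to-end driver chaining the sketches given earlier in the description (steps (2) through (6)); all function and parameter names are illustrative and not taken from the patent.

```python
def reconstruct_absolute_phase(object_images, phi_front, phi_back,
                               T, grad_thresh, m):
    """Hypothetical pipeline: wrapped phase and modulation, reference-plane
    unwrapping, edge-based classification with an adaptive reference plane
    if needed, and the final m*2*pi correction."""
    wrapped_obj, modulation_obj = wrapped_phase_and_modulation(object_images)  # step (2)
    candidate = unwrap_with_reference(wrapped_obj, phi_back)                   # step (3)
    if edge_mismatch_count(candidate, modulation_obj, grad_thresh) > T:        # step (4)
        candidate, _ = adaptive_unwrap(wrapped_obj, modulation_obj,            # step (5)
                                       phi_front, phi_back, T, grad_thresh)
    return correct_absolute_phase(candidate, m)                                # step (6)
```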
The three-dimensional shapes of the independent statues are obtained through the above steps. Figs. 2(a), (b) and (c) show the three-dimensional reconstruction results of the independent statues at three positions during the motion. They show that the adaptive pixel-by-pixel unwrapping algorithm of the invention effectively measures three-dimensional shapes without being limited by the depth range.
Example 2:
The dynamic three-dimensional shape of a freely moving palm is recovered as it recedes from the measurement system. Processing with the proposed algorithm yields the reconstruction of the palm shown in Fig. 3. Although the hand moves over a large depth range, its three-dimensional surface contour is acquired accurately and clearly throughout the process.
Example 3:
the dynamic three-dimensional shape of the lid with the complex edge feature when away from the measurement system is restored. The algorithm provided by the invention is utilized for processing, a related three-dimensional reconstruction model is shown in figure 4, and a dynamic measurement result shows that the method provided by the invention is suitable for high-speed three-dimensional shape measurement of any free moving object in a large depth measurement range.
The above implementations are provided for the purpose of describing the present invention only and are not intended to limit the scope of the present invention. The scope of the invention is defined by the appended claims. Various equivalent substitutions and modifications can be made without departing from the spirit and principles of the invention, and are intended to be within the scope of the invention.

Claims (5)

1. A fast large-range phase unwrapping method based on double reference planes is characterized in that: the method comprises the following steps:
Step (1): determining the spatial range to be measured, obtaining the wrapped phases of the nearest and farthest reference planes in turn by phase profilometry, and solving the absolute phase maps of the nearest and farthest reference planes, Φ_F and Φ_B respectively, by a temporal phase unwrapping method; simultaneously processing the fringe images of the farthest reference plane to obtain its modulation map I_B;
Step (2): keeping the farthest reference plane in the camera's field of view and placing the object to be measured at any position within the measuring range; the projector projects N fringe patterns with different phase shifts, N ≥ 3, and the camera records the images I_1, I_2, ..., I_N formed after the fringe patterns are modulated by the object surface; using the object image sequence I_n, where 0 < n ≤ N, calculating the object's wrapped phase map φ_O and modulation map I_O;
Step (3): unwrapping the object's wrapped phase φ_O using the absolute phase map Φ_B of the farthest reference plane to obtain a candidate absolute phase map Φ_O^0 of the object;
Step (4): applying the Sobel operator to the candidate phase map Φ_O^0 and the modulation map I_O respectively for edge detection, subtracting the binarized edge image of the former from that of the latter, and counting the number of pixels greater than 0 in the result; if the counted number of pixels is greater than a set threshold T, classifying the unwrapping as an unwrapping failure, otherwise classifying it as an unwrapping ambiguity;
Step (5): for the unwrapping-failure class, providing an unwrapping phase reference by generating a virtual adaptive reference plane phase, the virtual adaptive reference plane being generated adaptively from Φ_F and Φ_B; counting the number of binarized edge pixels greater than 0 until the count falls below the threshold T, and converting the unwrapped phase map to the unwrapping-ambiguity class;
Step (6): for the unwrapping-ambiguity class, determining the distance of the object from the farthest reference plane according to the length of the shadow the object casts on the farthest reference plane, and adding the constant m × 2π to the candidate absolute phase map Φ_O^0 to correct it and obtain the true absolute phase map Φ_O, where m is an integer.
2. The fast large-range phase unwrapping method according to claim 1, wherein in step (1) the fringe images of the farthest reference plane are processed to obtain its modulation map I_B by the following formula:

I_B(x, y) = (2/N) · √{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)]² }

where N (N ≥ 3) is the total number of captured images.
3. The fast large-range phase unwrapping method according to claim 1, wherein in step (2) the object's wrapped phase map φ_O and modulation map I_O are calculated from the object image sequence I_n as follows:

φ_O(x, y) = arctan{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)] }

I_O(x, y) = (2/N) · √{ [Σ_{n=1}^{N} I_n(x, y) · sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y) · cos(2πn/N)]² }
4. The fast large-range phase unwrapping method according to claim 1, wherein in step (3) the wrapped phase φ_O of the object is unwrapped using the absolute phase map Φ_B of the farthest reference plane as the phase reference: the fringe order k(x, y) of the wrapped phase map is obtained, and the candidate absolute phase map Φ_O^0 of the object is calculated by combining the fringe-order distribution with the wrapped phase map:

k(x, y) = floor{ [Φ_B(x, y) − φ_O(x, y)] / (2π) }

Φ_O^0(x, y) = φ_O(x, y) + 2π · k(x, y)

where floor(·) is the round-down (floor) function.
5. The fast large-range phase unwrapping method according to claim 1, wherein in step (5), for the unwrapping-failure class, a virtual adaptive reference plane phase is generated to provide an unwrapping phase reference, and the adaptive reference plane is generated as:

Φ_A = s × Φ_F + (1 − s) × Φ_B, s ∈ [0, 1]

The number of binarized edge pixels greater than 0 is counted; if it is below the threshold T, the unwrapped phase map is converted to the unwrapping-ambiguity class, otherwise the reference plane is adjusted continuously until the counted number of valid pixels falls below T.
CN202010248074.2A 2020-04-01 2020-04-01 Rapid large-range phase unwrapping method based on double reference planes Active CN111524173B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010248074.2A CN111524173B (en) 2020-04-01 2020-04-01 Rapid large-range phase unwrapping method based on double reference planes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010248074.2A CN111524173B (en) 2020-04-01 2020-04-01 Rapid large-range phase unwrapping method based on double reference planes

Publications (2)

Publication Number Publication Date
CN111524173A true CN111524173A (en) 2020-08-11
CN111524173B CN111524173B (en) 2022-09-06

Family

ID=71901395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010248074.2A Active CN111524173B (en) 2020-04-01 2020-04-01 Rapid large-range phase unwrapping method based on double reference planes

Country Status (1)

Country Link
CN (1) CN111524173B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113048914A (en) * 2021-04-19 2021-06-29 中国科学技术大学 Phase unwrapping method and device
CN113393481A (en) * 2021-06-10 2021-09-14 湖南大学 Rapid phase unwrapping method, apparatus, device and medium based on edge detection
CN114964071A (en) * 2022-06-14 2022-08-30 广东工业大学 Concrete surface roughness test system, method, medium, equipment and terminal
CN117475172A (en) * 2023-12-28 2024-01-30 湖北工业大学 Deep learning-based high-noise environment phase diagram wrapping method and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160261851A1 (en) * 2015-03-05 2016-09-08 Shenzhen University Calbration method for telecentric imaging 3d shape measurement system
CN109141291A (en) * 2018-09-25 2019-01-04 南昌航空大学 A kind of fast phase unwrapping algorithm
US20190271540A1 (en) * 2016-12-15 2019-09-05 Southeast University Error correction method for fringe projection profilometry system
CN110345882A (en) * 2019-06-28 2019-10-18 浙江大学 A kind of adaptive structure light three-dimension measuring system and method based on geometrical constraint
CN110428459A (en) * 2019-06-04 2019-11-08 重庆大学 A method of the Phase- un- wrapping based on numerical order coding
CN110440714A (en) * 2019-09-05 2019-11-12 南昌航空大学 A kind of phase unwrapping package method based on multifrequency and binary system striped
CN110779467A (en) * 2019-10-24 2020-02-11 中国科学技术大学 Shadow phase error compensation method and device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160261851A1 (en) * 2015-03-05 2016-09-08 Shenzhen University Calbration method for telecentric imaging 3d shape measurement system
US20190271540A1 (en) * 2016-12-15 2019-09-05 Southeast University Error correction method for fringe projection profilometry system
CN109141291A (en) * 2018-09-25 2019-01-04 南昌航空大学 A kind of fast phase unwrapping algorithm
CN110428459A (en) * 2019-06-04 2019-11-08 重庆大学 A method of the Phase- un- wrapping based on numerical order coding
CN110345882A (en) * 2019-06-28 2019-10-18 浙江大学 A kind of adaptive structure light three-dimension measuring system and method based on geometrical constraint
CN110440714A (en) * 2019-09-05 2019-11-12 南昌航空大学 A kind of phase unwrapping package method based on multifrequency and binary system striped
CN110779467A (en) * 2019-10-24 2020-02-11 中国科学技术大学 Shadow phase error compensation method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A. SOTOMAYOR-OLMEDO et al.: "2010 20th International Conference on Electronics, Communications and Computers", 24 February 2010 *
戴美玲: "一种基于双参考平面的等相位坐标标定方法", 《光学学报》 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113048914A (en) * 2021-04-19 2021-06-29 中国科学技术大学 Phase unwrapping method and device
CN113048914B (en) * 2021-04-19 2022-04-19 中国科学技术大学 Phase unwrapping method and device
CN113393481A (en) * 2021-06-10 2021-09-14 湖南大学 Rapid phase unwrapping method, apparatus, device and medium based on edge detection
CN114964071A (en) * 2022-06-14 2022-08-30 广东工业大学 Concrete surface roughness test system, method, medium, equipment and terminal
CN117475172A (en) * 2023-12-28 2024-01-30 湖北工业大学 Deep learning-based high-noise environment phase diagram wrapping method and system
CN117475172B (en) * 2023-12-28 2024-03-26 湖北工业大学 Deep learning-based high-noise environment phase diagram wrapping method and system

Also Published As

Publication number Publication date
CN111524173B (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN111524173B (en) Rapid large-range phase unwrapping method based on double reference planes
CN109658515B (en) Point cloud meshing method, device, equipment and computer storage medium
CN107833270B (en) Real-time object three-dimensional reconstruction method based on depth camera
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN108734776B (en) Speckle-based three-dimensional face reconstruction method and equipment
KR101554241B1 (en) A method for depth map quality enhancement of defective pixel depth data values in a three-dimensional image
Kim et al. Scene reconstruction from high spatio-angular resolution light fields.
US8947677B2 (en) Dual-frequency phase multiplexing (DFPM) and period coded phase measuring (PCPM) pattern strategies in 3-D structured light systems, and lookup table (LUT) based data processing
CN110555908B (en) Three-dimensional reconstruction method based on indoor moving target background restoration
CN107622480B (en) Kinect depth image enhancement method
CN111028295A (en) 3D imaging method based on coded structured light and dual purposes
Lun Robust fringe projection profilometry via sparse representation
CN104318552B (en) The Model registration method matched based on convex closure perspective view
Lun et al. Robust single-shot fringe projection profilometry based on morphological component analysis
CN104933754A (en) Linear shadow mapping method of de-pixeldined contour line reconstruction
CN111899326A (en) Three-dimensional reconstruction method based on GPU parallel acceleration
CN116753863A (en) Three-dimensional measurement method, three-dimensional measurement device, electronic equipment and storage medium
CN113551617B (en) Binocular double-frequency complementary three-dimensional surface type measuring method based on fringe projection
CN112637582B (en) Three-dimensional fuzzy surface synthesis method for monocular video virtual view driven by fuzzy edge
CN110988876B (en) Closed robust double-baseline InSAR phase unwrapping method and system and readable storage medium
CN110763156B (en) Three-dimensional imaging method and system based on light field
CN113450460A (en) Phase-expansion-free three-dimensional face reconstruction method and system based on face shape space distribution
Tokieda et al. High-frequency shape recovery from shading by cnn and domain adaptation
Guðmundsson et al. TOF-CCD image fusion using complex wavelets
Noguchi et al. High-resolution surface reconstruction based on multi-level implicit surface from multiple range images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant