CN112815867A - Phase unwrapping method for adaptively acquiring virtual plane - Google Patents

Phase unwrapping method for adaptively acquiring virtual plane

Info

Publication number
CN112815867A
CN112815867A
Authority
CN
China
Prior art keywords
phase
virtual plane
unwrapping
linear array
plane
Prior art date
Legal status
Pending
Application number
CN202011635485.3A
Other languages
Chinese (zh)
Inventor
林斌
邢生平
Current Assignee
Suzhou Jiang'ao Optoelectronics Technology Co ltd
Original Assignee
Suzhou Jiang'ao Optoelectronics Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Jiang'ao Optoelectronics Technology Co ltd
Priority to CN202011635485.3A
Publication of CN112815867A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/2433 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures for measuring outlines by shadow casting
    • G01B 11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/06 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness, e.g. of sheet material
    • G01B 11/0608 Height gauges
    • G01B 11/22 Measuring arrangements characterised by the use of optical techniques for measuring depth
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/254 Projection of a pattern, viewing through a pattern, e.g. moiré

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a phase unwrapping method that adaptively acquires a virtual plane. The method is simple in structure: only one linear-array structured-light image is acquired to determine the virtual reference plane, so the position of the virtual plane does not have to be determined by manual measurement. A projector projects a set of linear-array structured light and a camera captures a single image to determine the depth of the measured object; because the linear-array structured light consists of several high-brightness lines, line-matching errors are eliminated by image preprocessing and a nearest-neighbor search algorithm. Phase unwrapping is then performed with the minimum absolute phase map defined on the virtual plane, so the fringe order is determined with high accuracy, the noise immunity is strong, and both the measurement speed and the unwrapping quality are improved.

Description

Phase unwrapping method for adaptively acquiring virtual plane
Technical Field
The invention belongs to the field of three-dimensional measurement, and particularly relates to a phase unwrapping method for adaptively acquiring a virtual plane.
Background
How to perform phase unwrapping efficiently has long been a research focus in the field of digital fringe projection. Traditional temporal phase unwrapping requires additional fringe patterns to determine the fringe order, so its unwrapping efficiency is low and it is not suited to high-speed three-dimensional measurement. Some researchers add an extra camera and match points of equal phase seen from different angles; this reduces the stereo-matching difficulty of a conventional binocular setup, but because the fringes are periodic the candidate points must be checked globally, forwards and backwards, before the correct match is found. The computation is therefore slow, the calibration requirements on two cameras and one projector are extremely high, objects with rapidly varying surfaces are difficult to measure, the range of applicable surfaces is narrow, and the cost and complexity of the system increase. Later researchers proposed unwrapping the phase using the geometric constraint of the structured-light system, which enables fast three-dimensional measurement by referring to an absolute phase map established at the position of a virtual plane. However, this method requires the position of the virtual plane to be determined in advance by manual measurement; it adapts poorly to different measurement objects, the virtual plane must be re-measured manually before every measurement, and it is therefore unfavorable for rapid three-dimensional topography measurement.
Summary of the Invention
To overcome the defect that the position of the virtual plane cannot be determined adaptively, the invention provides a phase unwrapping method that adaptively acquires the virtual plane, realized by the following technical scheme:
the invention discloses a phase unwrapping method for adaptively acquiring a virtual plane, which comprises the following steps:
1) projecting linear-array structured light onto the object surface with a projector, obtaining the approximate contour of the object through image processing and skeletonization, obtaining the position of the highest point of the object surface, and quickly and adaptively determining the position of the virtual plane;
2) establishing, by means of the calibrated parameters, the minimum absolute phase at the position of the virtual plane according to the height-phase mapping relation;
3) projecting digital fringes onto the object surface with the projector to obtain the wrapped phase;
4) unwrapping the wrapped phase with the aid of the minimum phase established on the virtual plane;
5) realizing the three-dimensional reconstruction from the unwrapped phase using the phase-height mapping relation.
As a further improvement, the virtual plane of the invention is infinitely close to and does not coincide with the surface of the object.
As a further improvement, step 1) specifically comprises projecting a set of linear-array structured light with the projector and capturing one picture with the charge-coupled device in the camera to determine the depth information of the measured object, the depth information being obtained from the following formula:
[Equation (1), reproduced as an image in the original publication]
where S is the depth information of the measured object surface, S' is the pixel offset on the charge-coupled device in the camera, f is the focal length of the charge-coupled device in the camera, α is the angle between the central axis OA of the imaging lens and the object surface normal, β is the angle between the charge-coupled device in the camera and the central axis OA of the imaging lens, γ is the angle between the central axis of the structured light generator and the object surface normal, b is the distance from the principal plane of the imaging lens to the light spot on the object surface, and a is the distance from the principal plane of the imaging lens to the charge-coupled device;
the depth information S obtained in this way yields the height Zmin of the highest point of the surface.
As a further improvement, in step 1) the position Zw of the virtual plane is:
Zw=Zmin+ΔZ (2)
where ΔZ is a margin determined from the error between the measured height and the actual height of the object.
As a further improvement, the value of ΔZ is determined from the error between the object height measured with the linear-array structured light and the actual object height.
As a further improvement, the minimum absolute phase Φmin in step 2) is a one-to-one absolute phase, so the phase obtained by unwrapping the wrapped phase with Φmin is likewise an absolute phase.
The invention has the following beneficial effects:
the technical scheme of the invention is not influenced by the appearance of the measured object, only one linear array structured light image is acquired to determine the virtual reference plane, the position of the virtual plane is not required to be determined in a manual measurement mode, and the structure is simple; the minimum absolute phase diagram defined on the virtual plane is utilized to carry out unwrapping operation to carry out phase unwrapping, the judgment precision of the fringe series is high, the anti-noise capability is strong, and the measurement speed and the unwrapping quality are improved.
The determination of the virtual plane has direct influence on phase unwrapping, when the object is farther away from the structured light system than the virtual plane, the projection of the object on the projector electronic micro-mirror plane can generate offset, and the phase value is larger than that on the virtual plane, so that the virtual plane is set at a position which is infinitely close to the surface of the object and is not coincident with the surface of the object.
The projector projects a set of linear-array structured light and the camera captures one picture to determine the depth information of the measured object; because the linear-array structured light consists of several high-brightness lines, line-matching errors are eliminated by image preprocessing and a nearest-neighbor search algorithm.
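As one possible realization of this preprocessing step, the sketch below binarizes and skeletonizes the captured line image and assigns each skeleton pixel to its projected line by a nearest-neighbor search against samples of the undeformed reference lines. It is a minimal illustration only; the threshold, the helper names and the use of SciPy and scikit-image are assumptions, not the patented implementation.

    import numpy as np
    from scipy.spatial import cKDTree
    from skimage.morphology import skeletonize

    def match_lines(captured, ref_points, ref_line_ids, thresh=0.5):
        """Skeletonize a linear-array structured-light image and label each
        skeleton pixel with the index of the nearest reference line."""
        # Keep only the high-brightness lines and thin them to 1-pixel skeletons.
        skeleton = skeletonize(captured > thresh * captured.max())
        ys, xs = np.nonzero(skeleton)

        # ref_points: (N, 2) array of (y, x) samples on the undeformed lines;
        # ref_line_ids: (N,) array giving the line index of each sample.
        _, nearest = cKDTree(ref_points).query(np.column_stack([ys, xs]))
        return ys, xs, ref_line_ids[nearest]

The labelled skeleton pixels can then be converted to depths with Equation (1) and the smallest depth taken as the highest surface point.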
When the linear-array structured light illuminates the object surface, the distance between the linear-array structured light and the object is small, and only the approximate depth information of the highest point of the surface is taken, so there is no limit on the measurable height range of the object.
The minimum absolute phase Φmin is a one-to-one absolute phase, so the phase obtained by unwrapping the wrapped phase with Φmin is likewise an absolute phase. The minimum absolute phase is established from the virtual-plane depth information obtained by a single measurement with the linear-array structured light and can then assist the phase unwrapping; few images need to be acquired, the fringe order is determined with high accuracy, the measurement speed is not affected by the shape of the measured object, and the noise immunity is strong.
Drawings
FIG. 1 is a schematic contour view of a surface of an object to be measured;
FIG. 2 is a minimum absolute phase diagram;
FIG. 3 is a projected fringe pattern and wrapped phase diagram;
FIG. 4 is a phase unwrapping flow diagram;
FIG. 5 is a reconstruction flow diagram;
FIG. 6 is a comparison of the mapping of the object plane and the virtual plane in the projector;
in the figure, 1 is the object plane, 2 is the virtual plane, 3 is the collimating lens, 4 is the charge coupled device in the camera, 5 is the digital micromirror device in the projector, and 6 is the mapping region of the object plane.
FIG. 7 is a schematic diagram of oblique incidence triangulation;
in the figure, 7 is a structured light generator, 4 is a charge coupled device in the camera, 3 is a collimating lens, 8 is an imaging system, 9 is a reference plane, and 10 is a real plane.
Detailed Description
The technical scheme of the invention is further explained below with reference to specific embodiments and the accompanying drawings:
the invention discloses a phase unwrapping method for adaptively acquiring a virtual plane, which comprises the following specific steps of:
1) FIG. 1 is a schematic contour view of the measured object surface. A projector projects linear-array structured light onto the object surface and a camera captures the line pattern formed on the surface; after image processing and skeletonization, the approximate contour of the object is restored by transforming the model viewpoint, the position of the highest point of the object surface is obtained, and the position of the virtual plane is determined quickly and adaptively. The depth of the highest surface point is obtained from this single image and fixes the spatial position of the virtual plane. Since the virtual plane does not need to be located very precisely, rough contour information is sufficient.
2) FIG. 2 is the minimum absolute phase map; it is established on the virtual plane from the height-phase mapping relation using the calibrated parameters.
3) FIG. 3 shows the projected fringe pattern and the wrapped phase map; the projector projects digital fringes onto the object surface, three fringe images are captured with a three-step phase-shift method, and the wrapped phase map is computed from them.
4) FIG. 4 is the phase unwrapping flow chart; the minimum absolute phase established on the virtual plane assists the unwrapping of the wrapped phase, and the fringe order is determined with high accuracy (a hedged code sketch of steps 3) and 4) is given after step 5)).
5) FIG. 5 is the reconstruction flow chart; the three-dimensional shape is recovered from the unwrapped absolute phase map using the phase-height relation.
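The sketch below illustrates one way steps 3) and 4) can be carried out numerically, assuming an ideal three-step phase shift (phase steps of 2π/3) and a precomputed minimum absolute phase map; the fringe-order rule is the standard geometric-constraint formulation and the function names are illustrative, not the patent's reference implementation.

    import numpy as np

    def wrapped_phase_three_step(i1, i2, i3):
        """Wrapped phase from three fringe images shifted by -2*pi/3, 0, +2*pi/3 (step 3)."""
        return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

    def unwrap_with_min_phase(phi_wrapped, phi_min):
        """Unwrap the wrapped phase with the minimum absolute phase map defined
        on the virtual plane (step 4). Because phi_min never exceeds the true
        absolute phase, the fringe order k is the smallest integer that lifts
        the wrapped phase above phi_min."""
        k = np.ceil((phi_min - phi_wrapped) / (2.0 * np.pi))
        return phi_wrapped + 2.0 * np.pi * k

The unwrapped absolute phase map is then converted to height with the calibrated phase-height mapping of step 5).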
The determination of the virtual plane has a direct impact on the phase unwrapping. FIG. 6 compares the mapping of the object plane and of the virtual plane in the projector; it shows the mapping of the object plane 1 and of the virtual plane 2 onto the digital micromirror plane 5 of the projector. When the object pattern on the charge-coupled device 4 in the camera is projected through the collimating lens 3 and the object plane 1 is farther from the structured-light system than the virtual plane 2, the projection of the object onto the digital micromirror plane 5 is shifted into the object-plane mapping region 6, and its phase value is larger than the phase value on the virtual plane 2. The virtual plane 2 is therefore set at a position infinitely close to, but not coincident with, the object plane 1.
FIG. 7 is a schematic diagram of oblique-incidence triangulation. A set of linear-array structured light is projected by the structured light generator 7, passes through the collimating lens 3 and is reflected from the actual plane 10 of the object; after passing through the imaging system 8 it is imaged on the charge-coupled device 4 in the camera. The charge-coupled device 4 captures the deformed pattern, which is compared with the light spots on the reference plane 9 to determine the depth information of the measured object.
The relationship between the depth information S and the pixel offset S' in the camera is derived from the model of FIG. 7:
[Equation (1), reproduced as an image in the original publication]
where S is the depth information of the measured object surface, S' is the pixel offset in the camera, f is the focal length of the camera, α is the angle between the central axis OA of the imaging lens and the object surface normal, β is the angle between the camera and the central axis OA of the imaging lens, γ is the angle between the central axis of the structured light generator and the object surface normal, b is the distance from the principal plane of the imaging lens to the light spot on the object surface, and a is the distance from the principal plane of the imaging lens to the charge-coupled device. In the denominator, the sign '-' corresponds to the actual plane lying below the reference plane and '+' to the actual plane lying above it. The approximate contour of the object surface is recovered by transforming the model viewpoint, and the height of the highest surface point is then obtained.
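Since Equation (1) itself appears only as an image in the original filing, the following display gives, purely as an assumption, the standard oblique-incidence laser-triangulation relation that is consistent with the variable definitions above; the exact expression in the patent may differ in detail:

    S = \frac{b \, S' \cos\gamma \, \sin\beta}{a \, \sin(\alpha + \gamma) \mp S' \, \sin(\alpha + \beta + \gamma)}

with the sign in the denominator chosen according to the convention stated above.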
To ensure that the virtual plane lies in front of the highest point of the object surface, a margin ΔZ should be reserved in the Z direction; it is determined from the error between the measured height and the actual height of the object. The position of the virtual plane is then:
Zw=Zmin+ΔZ (9)
where the value of ΔZ is determined from the error between the object height measured with the linear-array structured light and the actual object height.
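A minimal sketch of this step, assuming a per-pixel depth map already computed from the line image with Equation (1); the sign convention for the highest point and the margin value are assumptions tied to the particular system calibration:

    import numpy as np

    def virtual_plane_position(depth, delta_z):
        """Position of the virtual plane, Zw = Zmin + dZ (Equation (9)).
        depth: per-pixel depth of the object surface measured with the
        linear-array structured light; delta_z: margin chosen from the error
        between the measured and the actual object height."""
        z_min = depth.min()   # depth of the highest surface point (convention-dependent)
        return z_min + delta_z

The minimum absolute phase map is then computed at this Zw from the calibrated height-phase mapping, as described in step 2).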
Because the minimum absolute phase Φmin is a one-to-one absolute phase, the phase obtained by unwrapping the wrapped phase with Φmin is also an absolute phase. A single measurement with the linear-array structured light yields the virtual-plane depth information from which the minimum absolute phase is established, and this phase can then assist the unwrapping; few images need to be acquired, the fringe order is determined with high accuracy, the measurement speed is not affected by the shape of the measured object, and the noise immunity is strong.
The foregoing description is not intended to be limiting, and it will be appreciated by those skilled in the art that various changes, modifications, additions or substitutions may be made without departing from the true scope of the invention, and these improvements and modifications are intended to be within the scope of the invention.

Claims (6)

1. A phase unwrapping method for adaptively acquiring a virtual plane is characterized by comprising the following steps:
1) projecting linear-array structured light onto the object surface with a projector, obtaining the approximate contour of the object through image processing and skeletonization, obtaining the position of the highest point of the object surface, and quickly and adaptively determining the position of the virtual plane (2);
2) establishing, by means of the calibrated parameters, the minimum absolute phase at the position of the virtual plane (2) according to the height-phase mapping relation;
3) projecting digital fringes onto the object surface with the projector to obtain the wrapped phase;
4) unwrapping the wrapped phase with the aid of the minimum phase established on the virtual plane (2);
5) realizing the three-dimensional reconstruction from the unwrapped phase using the phase-height mapping relation.
2. The phase unwrapping method for adaptively acquiring a virtual plane as defined in claim 1, wherein the virtual plane (2) is infinitely close to and does not coincide with the surface of the object.
3. The phase unwrapping method for adaptively acquiring the virtual plane according to claim 1, wherein step 1) specifically comprises projecting a set of linear-array structured light with the projector and capturing one picture with the charge-coupled device (4) in the camera to determine the depth information of the measured object, the depth information being obtained from the following formula:
[Equation (1), reproduced as an image in the original publication]
where S is the depth information of the measured object surface, S' is the pixel offset on the charge-coupled device (4) in the camera, f is the focal length of the charge-coupled device (4) in the camera, α is the angle between the central axis OA of the imaging lens and the object surface normal, β is the angle between the charge-coupled device (4) in the camera and the central axis OA of the imaging lens, γ is the angle between the central axis of the structured light generator (7) and the object surface normal, b is the distance from the principal plane of the imaging lens to the light spot on the object surface, and a is the distance from the principal plane of the imaging lens to the charge-coupled device (4);
the depth information S obtained in this way yields the height Zmin of the highest point of the surface.
4. The phase unwrapping method for adaptively acquiring the virtual plane according to claim 3, wherein in step 1) the position Zw of the virtual plane (2) is:
Zw=Zmin+ΔZ (2)
where ΔZ is a margin determined from the error between the measured height and the actual height of the object.
5. The phase unwrapping method for adaptively acquiring the virtual plane according to claim 4, wherein the value of ΔZ is determined from the error between the object height measured with the linear-array structured light and the actual object height.
6. The phase unwrapping method for adaptively acquiring the virtual plane according to claim 1, wherein the minimum absolute phase Φmin in step 2) is a one-to-one absolute phase, so the phase obtained by unwrapping the wrapped phase with Φmin is likewise an absolute phase.
CN202011635485.3A 2020-12-31 2020-12-31 Phase unwrapping method for adaptively acquiring virtual plane Pending CN112815867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011635485.3A CN112815867A (en) 2020-12-31 2020-12-31 Phase unwrapping method for adaptively acquiring virtual plane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011635485.3A CN112815867A (en) 2020-12-31 2020-12-31 Phase unwrapping method for adaptively acquiring virtual plane

Publications (1)

Publication Number Publication Date
CN112815867A (en) 2021-05-18

Family

ID=75857052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011635485.3A Pending CN112815867A (en) 2020-12-31 2020-12-31 Phase unwrapping method for adaptively acquiring virtual plane

Country Status (1)

Country Link
CN (1) CN112815867A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103900489A (en) * 2014-03-11 2014-07-02 苏州江奥光电科技有限公司 Linear laser scanning three-dimensional contour measuring method and device
CN109307483A (en) * 2018-11-20 2019-02-05 西南石油大学 A kind of phase developing method based on structured-light system geometrical constraint
CN110345882A (en) * 2019-06-28 2019-10-18 浙江大学 A kind of adaptive structure light three-dimension measuring system and method based on geometrical constraint
CN110849268A (en) * 2019-12-10 2020-02-28 南昌航空大学 Quick phase-height mapping calibration method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Su Han: "Research on key technologies of a laser-triangulation scanning sensor for object topography measurement", China Master's Theses Full-text Database (Electronic Journals), Information Science and Technology *

Similar Documents

Publication Publication Date Title
CN110296667B (en) High-reflection surface three-dimensional measurement method based on line structured light multi-angle projection
CN111207693A (en) Three-dimensional measurement method of turbine blade ceramic core based on binocular structured light
CN106500627A (en) 3-D scanning method and scanner containing multiple different wave length laser instrument
CN111879235A (en) Three-dimensional scanning detection method and system for bent pipe and computer equipment
WO2013076605A1 (en) Method and system for alignment of a pattern on a spatial coded slide image
US20130272600A1 (en) Range image pixel matching method
CN113205592B (en) Light field three-dimensional reconstruction method and system based on phase similarity
CN109307483A (en) A kind of phase developing method based on structured-light system geometrical constraint
CN110345882A (en) A kind of adaptive structure light three-dimension measuring system and method based on geometrical constraint
WO2018107427A1 (en) Rapid corresponding point matching method and device for phase-mapping assisted three-dimensional imaging system
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN114972538A (en) Thickness measuring device and method of flattened ultrathin heat pipe based on binocular structured light
CN114219866A (en) Binocular structured light three-dimensional reconstruction method, reconstruction system and reconstruction equipment
CN116817796B (en) Method and device for measuring precision parameters of curved surface workpiece based on double telecentric lenses
CN109708612A (en) A kind of blind scaling method of light-field camera
CN109636859A (en) A kind of scaling method of the 3D vision detection based on one camera
JP7300895B2 (en) Image processing device, image processing method, program, and storage medium
CN109741384B (en) Multi-distance detection device and method for depth camera
CN112815867A (en) Phase unwrapping method for adaptively acquiring virtual plane
CN116518869A (en) Metal surface measurement method and system based on photometric stereo and binocular structured light
KR101465996B1 (en) Method for measurement of high speed 3d shape using selective long period
CN114143426B (en) Three-dimensional reconstruction system and method based on panoramic structured light
FR2869983A1 (en) METHOD FOR MEASURING THE POSITION AND / OR THE ORIENTATION OF OBJECTS USING IMAGE PROCESSING METHODS
Arold et al. Hand-guided 3D surface acquisition by combining simple light sectioning with real-time algorithms
CN113551617B (en) Binocular double-frequency complementary three-dimensional surface type measuring method based on fringe projection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210518