CN114943755B - Processing method for three-dimensional reconstruction of phase image based on binocular structured light - Google Patents

Info

Publication number
CN114943755B
CN114943755B (application CN202210875682.5A)
Authority
CN
China
Prior art keywords
phase
pixel
image
slope
matrix
Prior art date
Legal status
Active
Application number
CN202210875682.5A
Other languages
Chinese (zh)
Other versions
CN114943755A (en)
Inventor
郑晓军
唐笑虎
胡子阳
Current Assignee
SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Original Assignee
SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Priority date
Filing date
Publication date
Application filed by SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Priority to CN202210875682.5A
Publication of CN114943755A
Application granted
Publication of CN114943755B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38 Registration of image sequences
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of three-dimensional measurement and discloses a processing method for three-dimensional reconstruction of a phase image based on binocular structured light. The method comprises the following steps: acquiring a pose transformation matrix between the left camera and the right camera and normalizing it; setting a minimum pixel change, calculating the maximum pixel change generated in the horizontal direction, and rounding it up as the number of horizontal segments to obtain segmented images; calculating the epipolar slope of the central pixel of each segmented image as the non-differential epipolar slope of all pixels in that segmented image; projecting, by phase measurement profilometry, a plurality of grating images with specific phase shifts onto the surface of the object to be measured to form a sinusoidal fringe image sequence, solving for the wrapped phase, and generating a phase image that varies monotonically along the horizontal direction; and acquiring the segmented regions of the phase image and performing continuous matching of the segmented regions, with each preceding point's matching end point serving as the following point's matching start point. The technical scheme of the invention improves three-dimensional measurement precision and measurement efficiency.

Description

Processing method for three-dimensional reconstruction of phase image based on binocular structured light
Technical Field
The invention relates to the technical field of three-dimensional measurement of objects, in particular to a processing method for three-dimensional reconstruction of a phase image based on binocular structured light.
Background
Three-dimensional morphological features are among the most important characteristics of an object. Three-dimensional surface shape measurement (i.e., measurement of the three-dimensional surface profile of an object) is an important means of acquiring these morphological features, and the basis for recording, comparing and reproducing them.
Among existing methods for measuring the three-dimensional surface shape of an object, optical three-dimensional sensing has attracted increasing attention and research because it is non-contact and offers high measurement precision. Phase measurement profilometry based on structured light is an important three-dimensional sensing method and an existing technique. It adopts sinusoidal grating projection and phase-shift technology, and reconstructs the three-dimensional information of the object surface by acquiring the spatial information of the full-field fringes and the time-sequence information of the phase-shifted fringes within a fringe period. It offers high precision, is unaffected by the surface reflectivity of the object, and lends itself to computer-aided automatic measurement, so it has broad application prospects in industrial inspection, physical profiling, biomedicine, machine vision and other fields.
Phase measurement profilometry based on structured light, combined with binocular stereo vision, enables fast, accurate and simple three-dimensional reconstruction. Compared with monocular reconstruction, a binocular camera pair reduces the blind area of the field of view and is therefore more robust to complex scenes. At present, to realize real-time binocular three-dimensional reconstruction, various kinds of structured light are used for projection, including surface structured light, bidirectional structured light, colored structured light and binary-coded structured light, and their characteristics are exploited to optimize the phase unwrapping process, improving both its speed and its precision.
However, no matter which structured light is adopted, it is the phase acquisition process that is optimized; no customized optimization is performed for the stereo matching that follows phase unwrapping. Such methods apply traditional stereo vision optimization schemes such as epipolar rectification to the stereo phase matching by exploiting epipolar geometry: after the images to be matched are reprojected and rectified, pixels can be matched one to one. However, once an image has been rectified, its phase information is changed by interpolation and no longer corresponds to the original pixels during matching, so accurately locating pixel points requires recovering a fitted surface, for example by least squares, to select sub-pixel matching points and maintain matching precision. Although methods such as epipolar rectification reduce the matching time per pixel, the preprocessing is often greatly prolonged and the fitting introduces errors, making fast, accurate and simple measurement difficult.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to address the limited measurement precision and measurement efficiency of existing binocular structured-light three-dimensional reconstruction, the invention provides a processing method for three-dimensional reconstruction of a phase image based on binocular structured light.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a processing method for three-dimensional reconstruction of a phase image based on binocular structured light comprises the following steps:
step 1, acquiring a pose transformation matrix between a left camera and a right camera and normalizing the pose transformation matrix;
step 2, setting minimum pixel change, calculating the maximum pixel change generated in the horizontal direction according to the polar line slope, rounding up to be used as the number of horizontal segments, and averagely dividing based on the number of the horizontal segments to obtain a segmented image;
step 3, calculating the polar slope of the central pixel of each segmented image, and taking the polar slope of the central pixel as the non-difference polar slope of all pixels in each segmented image;
step 4, projecting a plurality of grating images with specific phase shifts to the surface of an object to be measured by adopting a phase measurement profilometry to form a sine stripe image sequence, solving to obtain a wrapping phase, and generating a phase image which changes monotonously along the horizontal direction;
and 5, acquiring a segmented region of the phase image, and performing segmented region continuous matching by taking the front point matching end point as a rear point matching start point.
Further, in step 1, the pose transformation matrix includes a rotation matrix and a translation matrix.
Further, in step 2, the minimum pixel change is 1 pixel.
Further, in step 2, the maximum pixel change S is calculated by:

S = ⌈Δk′ · W⌉

where Δk′ is the head-to-tail epipolar slope difference in the segmented image region and W is the camera imaging width.
Further, in step 3, the epipolar slope k of the central pixel is calculated from the left-camera pixel point (x_l, y_l) as:

[equation image: epipolar slope k expressed in the parameters p_i, q_i, t_i and the left-camera pixel point]

where p_i, q_i, t_i are parameterized matrices, calculated as:

[equation image: p_i, q_i, t_i expressed through the matrix C]

where C is the algebraic cofactor matrix of the left-camera reprojection matrix, and a_i, b_i, c_i are the reprojection matrix parameters:

[equation image: a_i, b_i, c_i expressed through the elements m_pq of the right-camera reprojection matrix]

where m_pq is the element in the p-th row and q-th column of the right-camera reprojection matrix, and e′ is the imaging pole of the right camera.
Further, all pixel point positions corresponding to each non-differential epipolar slope are calculated, and a mapping table of the correspondence between pixel positions and non-differential epipolar slopes is established.
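As an illustration, the mapping table of pixel positions to non-differential epipolar slopes might be precomputed once per stereo rig as below; `center_slope` is a stand-in for the patent's slope formula (not reproduced here), and the array layout is an assumption:

```python
import numpy as np

def build_slope_table(height, width, segments, center_slope):
    """Precompute the non-differential epipolar slope for every left-camera
    pixel: all pixels in a horizontal segment share the slope of that
    segment's central pixel. `center_slope(row, col)` is a hypothetical
    stand-in for the actual slope formula."""
    table = np.empty((height, width), dtype=np.float64)
    seg_w = width / segments
    for s in range(segments):
        x0, x1 = int(s * seg_w), int((s + 1) * seg_w)
        cx = (x0 + x1) // 2               # central column of this segment
        for row in range(height):
            table[row, x0:x1] = center_slope(row, cx)
    return table

# toy slope model, for illustration only
tbl = build_slope_table(4, 8, 2, lambda r, c: 0.001 * r + 0.0001 * c)
```

At match time a slope is then a single table lookup per pixel, which is the point of the mapping table.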
Compared with the prior art, the invention has the following beneficial effects:
in the technical scheme of the invention, the epipolar line slope of a central pixel is taken as the non-differential epipolar line slope of all pixels in each segmented image, and isolated epipolar lines in the segmented images are combined into the non-differential epipolar line of the pixel, so that the continuous matching of the phase along the non-differential epipolar line of the pixel is realized, and the precision height is measured. In addition, the epipolar line slope of all pixels in each segmented image is replaced by the epipolar line slope of the central pixel in the adjacent region, and a mapping table is established for searching all the pixel point positions and the epipolar line slopes, so that the purpose of simplifying epipolar line calculation is achieved, calculation time consumption is greatly reduced while the accuracy is kept, and the measurement efficiency is improved.
Drawings
Fig. 1 is a schematic flow diagram of a processing method for three-dimensionally reconstructing a phase image based on binocular structured light.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a processing method for three-dimensionally reconstructing a phase image based on binocular structured light includes the following steps:
Step 1, acquiring a pose transformation matrix between a left camera and a right camera and normalizing the pose transformation matrix.
Step 2, setting a minimum pixel change, calculating the maximum pixel change generated in the horizontal direction according to the epipolar slope, rounding up to obtain the number of horizontal segments, and dividing evenly based on that number to obtain segmented images.
Step 3, calculating the epipolar slope of the central pixel of each segmented image, taking it as the non-differential epipolar slope of all pixels in that segmented image, and establishing a mapping table of the correspondence between all pixel positions and the non-differential epipolar slopes.
Step 4, projecting, by phase measurement profilometry, a plurality of grating images with specific phase shifts onto the surface of the object to be measured to form a sinusoidal fringe image sequence, solving for the wrapped phase, and generating a phase image that varies monotonically along the horizontal direction.
Step 5, acquiring the segmented regions of the phase image and performing continuous matching of the segmented regions, with the preceding point's matching end point serving as the following point's matching start point.
In this embodiment, in the phase measurement profilometry, a plurality of grating images with specific phase shifts are projected onto the surface of an object to form a sequence of sinusoidal fringe images with depth differences, and a phase depth image is obtained by unwrapping. Phase profilometry is a prior art technique.
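The patent treats the wrapped-phase solve as prior art and does not spell it out. A generic N-step phase-shifting solve, which is one standard formulation and not necessarily the inventors' exact one, looks like:

```python
import numpy as np

def wrapped_phase(frames):
    """Standard N-step phase-shifting solve for the wrapped phase in
    (-pi, pi]; a generic sketch, since the patent only says 'a plurality
    of grating images with specific phase shifts'."""
    n = len(frames)
    shifts = 2 * np.pi * np.arange(n) / n        # evenly spaced phase shifts
    num = sum(f * np.sin(d) for f, d in zip(frames, shifts))
    den = sum(f * np.cos(d) for f, d in zip(frames, shifts))
    return -np.arctan2(num, den)

# synthetic check: four shifted fringe samples of a known phase phi = 1.2
phi = 1.2
frames = [128 + 100 * np.cos(phi + 2 * np.pi * k / 4) for k in range(4)]
# wrapped_phase(frames) recovers phi = 1.2
```

Applied per pixel over full images, `frames` would be the captured fringe image sequence and the result the wrapped-phase map that is subsequently unwrapped into the monotonic phase image.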
In the first aspect, this embodiment combines the monotonicity of epipolar change over the global step length in epipolar geometry with the pixel-level change of the phase image, and integrates similar regions along the epipolar (horizontal) direction, which is also the direction of increasing phase; that is, the data of each row in the horizontal direction are integrated to obtain segmented images. Pixels whose corresponding positional difference on the global-step epipolar line is smaller than the minimum pixel change are integrated into one similar region, so the cumbersome computation of an epipolar line for every pixel is simplified to computing the epipolar line of the central pixel of each segmented image region, thereby simplifying the epipolar computation.
In the second aspect, this embodiment provides a phase continuous search algorithm based on the segmented-image strategy of the first aspect. Because the phase change within a close region (usually a horizontal line segment) is monotonic, searching along the same epipolar line does not need to restart from step zero up to the image width for each pixel; instead, the search end position of the previous pixel is continued as the search start position of the current pixel, achieving continuous search and reducing redundant computation between different pixels. To counter occasional sudden disturbances, a certain buffer interval can be set before each matching.
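The continuous search strategy can be sketched as follows. The monotonic row-wise matching and the buffer back-off come from the text above, while the data layout (one phase value per column) and the threshold test are illustrative assumptions of this sketch, which also ignores the non-differential epipolar slope for simplicity:

```python
def match_row(left_phase, right_phase, buffer=2):
    """Continuous phase matching along one row: since phase is monotonic
    along the epipolar (horizontal) direction, each pixel's search resumes
    from the previous pixel's match instead of restarting at column zero.
    A small buffer guards against local disturbances. Illustrative only."""
    matches = []
    start = 0
    for lp in left_phase:
        j = max(0, start - buffer)          # back off a little, then resume
        while j + 1 < len(right_phase) and right_phase[j] < lp:
            j += 1                          # advance to the first phase >= lp
        matches.append(j)
        start = j                           # end point becomes next start point
    return matches

left = [0.5, 1.0, 1.5]
right = [0.0, 0.4, 0.9, 1.4, 1.9]
```

Across a whole row the inner scan advances at most once over the right image plus the buffer back-offs, instead of once per pixel, which is where the claimed efficiency gain comes from.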
In some embodiments, in step 1, the reprojection matrix of the camera is the pose transformation matrix between the left camera and the right camera, which comprises a rotation matrix and a translation matrix. The pose transformation matrix between the two cameras is normalized: the imaging planes of the left and right cameras are treated as two-dimensional coordinates, and when these two-dimensional coordinates are brought into the three-dimensional coordinate system for calculation, the imaging plane is placed on the plane whose z-axis value is 1, so as to enforce the constraint z = 1 and link the pixel coordinate system with the camera coordinate system.
In some embodiments, in step 2, the minimum pixel change is 1 pixel. Pixel points whose corresponding positional difference on the global-step epipolar line is less than 1 pixel are unified into one similar region, and each similar region is a segmented image, so the cumbersome process of computing an epipolar line for every pixel is simplified to computing the epipolar line of the central pixel in each segmented image region, thereby simplifying the epipolar computation. Furthermore, based on this simplification strategy, a phase continuous search algorithm is provided, which reduces redundant computation between different pixels and realizes three-dimensional reconstruction quickly, accurately and simply.
In some embodiments, for epipolar geometry, all epipolar lines pass through the fixed pole of the imaging plane, so an epipolar line can be represented by the point-slope equation:

y′ − e_y = k(x′ − e_x)

where k is the epipolar slope sought from the left-camera point, (x′, y′) is the imaging point of the right camera, and (e_x, e_y) is the imaging pole of the right camera. If the depth of the reconstructed object is set to infinity, its position in the left-camera image does not change, so its image in the right camera still lies on the same epipolar line, and its projection on the right imaging plane can be located by the homography of the plane at infinity. Taking that projected point as a phase pole, the epipolar slope can be expressed as:

k = (y′_∞ − e_y) / (x′_∞ − e_x)

where (x′_∞, y′_∞) is the phase imaging point of the right camera and (e_x, e_y) is the imaging pole of the right camera. The normal vector n of the right-camera plane is then calculated by the Hartley algorithm, and the epipolar slope is expressed in the parameters of the right-camera plane normal vector and the reprojection matrix (the left-right camera pose transformation matrix):

[equation image: epipolar slope expressed in the parameters a_i, b_i, c_i]
where a_i, b_i, c_i are parameters obtained from the reprojection matrix and the left-camera pixel point by the following formula:

[equation image: a_i, b_i, c_i expressed through the elements m_pq of the right-camera reprojection matrix]

where m_pq is the element in the p-th row and q-th column of the right-camera reprojection matrix.
The imaging of a pixel at every position in the world coordinate system satisfies the homography principle in epipolar geometry, i.e., it obeys the epipolar constraint

m′ᵀ F m = 0

where m and m′ are the homogeneous coordinates of corresponding pixel points and F is the fundamental matrix. Parameterizing the intermediate process then yields the calculation equation of the epipolar slope directly in terms of the left-camera pixel point:
[equation image: epipolar slope k expressed in the parameters p_i, q_i, t_i and the left-camera pixel point]

where p_i, q_i, t_i are parameterized matrices that can be calculated by:

[equation image: p_i, q_i, t_i expressed through the matrix C]

where C is the algebraic cofactor matrix of the left-camera reprojection matrix M_L. Since the values of parameters such as p_i, q_i, t_i are fixed for a given pair of binocular cameras, only a single computation is needed for the whole image to establish a mapping table from every left-camera pixel position (x_l, y_l) to its non-differential epipolar slope; thereafter the slope is obtained by a table lookup based on pixel position, which greatly reduces the time consumed by matching.
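The epipolar constraint m′ᵀ F m = 0 used in this derivation can be checked numerically. The fundamental matrix below is the idealized one for a purely horizontal baseline (corresponding points then share a row), an illustrative assumption rather than the patent's actual camera configuration:

```python
import numpy as np

def epipolar_residual(F, m, m_prime):
    """Residual of the epipolar constraint m'^T F m = 0 for homogeneous
    pixel coordinates m (left) and m' (right). A zero residual means the
    pair lies on corresponding epipolar lines."""
    return float(m_prime @ F @ m)

# Fundamental matrix of an ideal rectified pair (pure horizontal baseline):
# epipolar lines are image rows, so m'^T F m reduces to y - y'.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
m  = np.array([150.0, 80.0, 1.0])   # left pixel (x, y, 1)
m2 = np.array([120.0, 80.0, 1.0])   # right pixel on the same row
```

For a general stereo rig F is dense and the residual is only approximately zero for noisy correspondences, which is why the patent works with slopes through the pole rather than the raw constraint.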
In some embodiments, in binocular stereo vision, epipolar lines typically appear as straight lines nearly parallel to the imaging plane, with small slope values. Therefore, for a higher-resolution image, when the change in epipolar slope between adjacent pixels of the left camera produces no difference in the pixel position searched in the right image over the same step length, those pixels can be regarded as lying on the same epipolar line, called here a pixel non-differential epipolar line; that is, the following condition is satisfied:

|Δk| · W_s < 1

where W_s is the synchronized search length, i.e. the same step length traversed on different epipolar lines. Under a specific step length, the epipolar lines of different pixels in the region can therefore be regarded as the same straight line, with no difference at pixel-level search. Meanwhile, by the monotonicity of the phase image, continuous search can be realized for same-row phases within a certain region: the search start of a subsequent point continues from the search end of the preceding point, which greatly reduces search time and improves matching efficiency. To realize continuous search over the image region, the search step length W_s is set to the camera imaging width Width. From the epipolar slope formula it follows that, for each row of pixel points, the vertical coordinate is fixed, so the epipolar slope varies as a monotonic inverse-proportion function, and the maximum pixel change can be obtained directly from the head-to-tail slope difference as the maximum difference in the horizontal direction:

S = ⌈Δk′ · W⌉

where Δk′ is the head-to-tail epipolar slope difference in the segmented image region, S is the maximum pixel change (rounded up, since changes are at pixel level), and W is the camera imaging width Width. The pose of the binocular camera can further be adjusted so that its baseline is parallel to the horizontal plane; combining the camera intrinsics with the camera pose gives:

[equation image: pole position under a horizontal baseline, from the camera intrinsics and pose]

Thus, when the epipolar slope is calculated within the same row by the epipolar slope formula, the calculation degenerates into a linear function, so the slope change in the horizontal (epipolar) direction can be regarded as linear. The left-camera pixels can then be equally divided in the horizontal direction based on S, simplifying the original Width epipolar lines of Width step length into S pixel non-differential epipolar lines of Width step length on which continuous phase matching is performed, where the non-differential epipolar slope of each segment is the epipolar slope of the segment center.
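The pixel non-differential condition and the equal division by S described above can be sanity-checked as follows; the linear slope model across the row is an illustrative assumption consistent with the degenerate linear case:

```python
import math

def verify_segments(slopes, search_len, segments):
    """Check the pixel non-differential condition inside each horizontal
    segment: head-to-tail slope difference times the search length must
    stay below one pixel. `slopes[x]` is the per-column epipolar slope of
    one row; purely illustrative."""
    seg_w = len(slopes) / segments
    for s in range(segments):
        x0, x1 = int(s * seg_w), int((s + 1) * seg_w) - 1
        if abs(slopes[x1] - slopes[x0]) * search_len >= 1.0:
            return False                     # segment still differentiable
    return True

# linear slope change across a 1000-column row, searched over Width = 1000
slopes = [0.010 + 4e-6 * x for x in range(1000)]
S = math.ceil(abs(slopes[-1] - slopes[0]) * 1000)   # whole-row rule gives S = 4
```

With S = 4 every segment satisfies the condition, while a coarser split (e.g. 2 segments) would violate it, matching the role of S as the minimum segment count.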
Finally, it should be noted that the above embodiments are only preferred embodiments intended to illustrate, not limit, the technical solutions of the invention, and certainly not to limit its scope of protection. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, or some or all of their technical features equivalently replaced, without departing in essence from the scope of the technical solutions of the embodiments of the invention. Such modifications and substitutions, which do not substantially change the spirit and concept of the invention, remain consistent with the invention and fall within its scope of protection; likewise, direct or indirect application of the technical scheme of the invention to other related technical fields is included in the patent protection scope of the invention.

Claims (3)

1. A processing method for three-dimensional reconstruction of a phase image based on binocular structured light is characterized by comprising the following steps:
step 1, acquiring a pose transformation matrix between a left camera and a right camera and normalizing the pose transformation matrix;
step 2, setting minimum pixel change, calculating the maximum pixel change generated in the horizontal direction according to the polar line slope, rounding up to be used as the number of horizontal segments, and averagely dividing based on the number of the horizontal segments to obtain a segmented image;
step 3, calculating the polar line slope of the central pixel of each segmented image, and taking the polar line slope of the central pixel as the non-differential polar line slope of all pixels in each segmented image;
step 4, projecting a plurality of grating images with specific phase shifts to the surface of an object to be measured by adopting a phase measurement profilometry to form a sine stripe image sequence, solving to obtain a wrapping phase, and generating a phase image which changes monotonously along the horizontal direction;
step 5, obtaining a segmentation region of the phase image, and performing segmentation region continuous matching by taking the front point matching end point as a rear point matching start point;
in step 2, the maximum pixel variation S is calculated by:
Figure DEST_PATH_IMAGE001
wherein, the delta k' is the slope difference of the head and tail polar lines in the segmented image area,
Figure DEST_PATH_IMAGE002
imaging width for the camera;
in the step 3, the center pixel polar line slope k is related to the pixel point of the left camera
Figure DEST_PATH_IMAGE003
The calculation method comprises the following steps:
Figure DEST_PATH_IMAGE004
whereinp i q i t i For parameterized matrices, the calculation process is as follows:
Figure DEST_PATH_IMAGE005
wherein C is an algebraic residue matrix of the left camera reprojection matrix;
whereina i b i c i For the reprojection matrix parameters, the expression is:
Figure DEST_PATH_IMAGE006
wherein, the first and the second end of the pipe are connected with each other,
Figure DEST_PATH_IMAGE007
is the pth row and the qth column of the reprojection matrix of the right camera,
Figure DEST_PATH_IMAGE008
is a right cameraThe imaging pole of (a);
and calculating all pixel point positions corresponding to each non-differential polar line slope, and establishing a mapping table of the corresponding relation between the pixel point positions and the non-differential polar line slopes.
2. The binocular-structured-light-based phase image three-dimensional reconstruction processing method according to claim 1, wherein in step 1, the pose transformation matrix comprises a rotation matrix and a translation matrix.
3. The binocular-structured-light-based processing method for three-dimensional reconstruction of the phase image according to claim 1, wherein in step 2, the minimum pixel change is 1 pixel.
CN202210875682.5A 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light Active CN114943755B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210875682.5A CN114943755B (en) 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light


Publications (2)

Publication Number Publication Date
CN114943755A (en) 2022-08-26
CN114943755B (en) 2022-10-04

Family

ID=82910194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210875682.5A Active CN114943755B (en) 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light

Country Status (1)

Country Link
CN (1) CN114943755B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115880448B (en) * 2022-12-06 2024-05-14 西安工大天成科技有限公司 Three-dimensional measurement method and device based on binocular imaging

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109166149A (en) * 2018-08-13 2019-01-08 武汉大学 A kind of positioning and three-dimensional wire-frame method for reconstructing and system of fusion binocular camera and IMU

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN101936761B (en) * 2009-06-30 2012-02-15 宝山钢铁股份有限公司 Visual measuring method of stockpile in large-scale stock ground
CN113574406A (en) * 2019-03-15 2021-10-29 特里纳米克斯股份有限公司 Detector for identifying at least one material property
CN112102491B (en) * 2020-08-12 2022-12-06 西安交通大学 Skin damage surface three-dimensional reconstruction method based on surface structured light
CN114332349B (en) * 2021-11-17 2023-11-03 浙江视觉智能创新中心有限公司 Binocular structured light edge reconstruction method, system and storage medium
CN114152217B (en) * 2022-02-10 2022-04-12 南京南暄励和信息技术研发有限公司 Binocular phase expansion method based on supervised learning
CN114723828B (en) * 2022-06-07 2022-11-01 杭州灵西机器人智能科技有限公司 Multi-line laser scanning method and system based on binocular vision

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN109166149A (en) * 2018-08-13 2019-01-08 武汉大学 A kind of positioning and three-dimensional wire-frame method for reconstructing and system of fusion binocular camera and IMU

Also Published As

Publication number Publication date
CN114943755A (en) 2022-08-26

Similar Documents

Publication Publication Date Title
Furukawa et al. Dense 3d motion capture from synchronized video streams
Young et al. Coded structured light
CN108335352B (en) Texture mapping method for multi-view large-scale three-dimensional reconstruction scene
Liu et al. Real-time 3D surface-shape measurement using background-modulated modified Fourier transform profilometry with geometry-constraint
Nguyen et al. Three-dimensional shape reconstruction from single-shot speckle image using deep convolutional neural networks
Jiang et al. Pixel-by-pixel absolute phase retrieval using three phase-shifted fringe patterns without markers
WO2012096747A1 (en) Forming range maps using periodic illumination patterns
CN101658347B (en) Method for obtaining dynamic shape of foot model
US9147279B1 (en) Systems and methods for merging textures
CN111563952B (en) Method and system for realizing stereo matching based on phase information and spatial texture characteristics
CN114943755B (en) Processing method for three-dimensional reconstruction of phase image based on binocular structured light
CN104236479A (en) Line structured optical three-dimensional measurement system and three-dimensional texture image construction algorithm
CN103826032A (en) Depth map post-processing method
CN113379818A (en) Phase analysis method based on multi-scale attention mechanism network
CN112833818B (en) Single-frame fringe projection three-dimensional surface type measuring method
He et al. 3D Surface reconstruction of transparent objects using laser scanning with LTFtF method
WO2013012054A1 (en) Image processing method and apparatus
Hu et al. High-speed and accurate 3D shape measurement using DIC-assisted phase matching and triple-scanning
Song et al. Super-resolution phase retrieval network for single-pattern structured light 3D imaging
Liu et al. The applications and summary of three dimensional reconstruction based on stereo vision
Xi et al. Research on the algorithm of noisy laser stripe center extraction
CN113551617B (en) Binocular double-frequency complementary three-dimensional surface type measuring method based on fringe projection
CN114943761A (en) Method and device for extracting center of light stripe of central line structure of FPGA (field programmable Gate array)
FR2916529A1 (en) OPTICAL METROLOGY METHOD FOR MEASURING THE CONTOUR OF A MANUFACTURED PART
CN113450460A (en) Phase-expansion-free three-dimensional face reconstruction method and system based on face shape space distribution

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant