CN108036740B - High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles - Google Patents


Info

Publication number
CN108036740B
CN108036740B (grant of application CN201711270604.8A)
Authority
CN
China
Legal status: Active
Application number
CN201711270604.8A
Other languages: Chinese (zh)
Other versions: CN108036740A
Inventor
陈钱
陶天阳
左超
冯世杰
胡岩
尹维
顾国华
张玉珍
Current Assignee
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Application filed by Nanjing University of Science and Technology
Priority to CN201711270604.8A
Publication of CN108036740A
Application granted
Publication of CN108036740B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/254 — Projection of a pattern, viewing through a pattern, e.g. moiré


Abstract

The invention discloses a high-precision real-time three-dimensional color measurement system and method based on multiple viewing angles. The system comprises four black-and-white cameras, one color camera, a projector and a computer. The four black-and-white cameras are placed symmetrically about the central axis of the projector, two on each side; the color camera is arranged on either side, outside the black-and-white cameras; and all cameras are connected to the computer through data lines and to the projector through trigger lines. The projector projects three-step phase-shift fringes and simultaneously generates a trigger signal at the same rate as the projection, which is transmitted through the trigger lines to the four black-and-white cameras and the color camera. Driven by the trigger signal, all cameras synchronously capture the phase-shift images projected by the projector and modulated by the measured object, and transmit the captured phase-shift images to the computer through the data lines; the computer processes the phase-shift images to obtain color-textured three-dimensional data. The invention achieves high-precision real-time measurement while preserving the highest measurement efficiency of phase-shift profilometry.

Description

High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles
Technical Field
The invention belongs to the technical field of three-dimensional imaging, and particularly relates to a high-precision real-time three-dimensional color measurement system and method based on multiple viewing angles.
Background
In the field of three-dimensional imaging, fast acquisition of high-precision three-dimensional data of a target object has long been an important technical challenge. Early on, three-dimensional images were acquired by probing the target object point by point with a mechanical coordinate measuring machine, but this point-by-point contact technique is extremely inefficient and may damage the measured object; these drawbacks make it difficult to apply in fields such as human-body measurement and cultural-relic protection. Compared with such mechanical acquisition, optical three-dimensional image acquisition is non-contact and efficient, and is widely used in scientific research, industrial inspection and other fields. In recent years, owing to the development of digital projection devices, fringe projection, the optical three-dimensional imaging method capable of full-field imaging, has become a research hotspot (S. S. Gorthi and P. Rastogi, "Fringe projection techniques: Whither we are?", Optics and Lasers in Engineering 48, 133-140 (2010)). Two techniques currently in use in the field of fringe projection are Fourier-transform profilometry (M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Applied Optics 22, 3977-3982 (1983)) and phase-shift profilometry (V. Srinivasan, H.-C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects," Applied Optics 23, 3105-3108 (1984)).
Compared with Fourier-transform profilometry, phase-shift profilometry is better suited to automatic high-precision three-dimensional image acquisition because it is insensitive to ambient light and noise and is computationally simple. One difficulty of phase-shift profilometry is how to unwrap the wrapped phase obtained by the phase-shift method. The mainstream phase unwrapping approaches are temporal phase unwrapping and spatial phase unwrapping. Temporal phase unwrapping can stably unwrap the wrapped phase of an arbitrary surface, but the additional grating patterns it requires greatly reduce the phase acquisition efficiency and hence the three-dimensional acquisition speed (Chen Qian et al., "Time phase unwrapping method based on dual-frequency tri-gray-scale sinusoidal grating fringe projection," Chinese patent application 201410027275.4), which lowers the measurement efficiency of phase-shift profilometry and thus its ability to measure moving objects. Spatial phase unwrapping, on the other hand, preserves efficiency but is very unstable when measuring discontinuous or complex surfaces, where phase unwrapping errors readily occur.
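For illustration only, the efficiency cost that temporal unwrapping pays (extra projected patterns) buys an unambiguous fringe order; a minimal dual-frequency sketch in Python shows the idea. This is a generic textbook scheme, not the method of the cited patent application; all names are ours:

```python
import numpy as np

def unwrap_dual_frequency(phi_high, phi_unit, n_periods):
    """Temporal phase unwrapping sketch: recover the absolute phase of a
    high-frequency fringe (n_periods periods across the field) from its
    wrapped phase phi_high in [0, 2*pi) plus the phase phi_unit of a
    single-period fringe, which is unambiguous but low-precision."""
    # Estimate the fringe order k from the low-frequency phase, then
    # keep the high-frequency precision.
    k = np.round((n_periods * phi_unit - phi_high) / (2 * np.pi))
    return phi_high + 2 * np.pi * k

# Example: an absolute phase of 25.0 rad observed with 16-period fringes
phi_abs = 25.0
phi_h = np.mod(phi_abs, 2 * np.pi)   # wrapped, ambiguous
phi_u = phi_abs / 16.0               # unit-frequency phase, unambiguous
```

The extra unit-frequency pattern is exactly the overhead the present invention avoids by resolving the ambiguity geometrically instead.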
Compared with other methods, three-dimensional imaging based on phase-shift profilometry has great advantages in imaging efficiency and accuracy and is widely applied in cultural-relic protection, human-body measurement and other fields. However, most existing phase-shift profilometry focuses on measuring static scenes: the measurement accuracy is high, but the measurement efficiency is extremely low. Dynamic phase-shift profilometry systems do exist, but relative to static measurement they generally gain efficiency or speed at the expense of measurement accuracy. As industrial measurement develops and requirements keep rising, developing a three-dimensional scanning system that combines measurement efficiency, speed and accuracy is imperative.
Disclosure of Invention
The invention aims to provide a multi-view-angle-based high-precision real-time three-dimensional color measurement system and a method thereof.
The technical solution realizing the purpose of the invention is as follows: a high-precision real-time three-dimensional color measurement system based on multiple viewing angles comprises four black-and-white cameras, one color camera, a projector and a computer. The four black-and-white cameras are placed symmetrically about the central axis of the projector, two on each side; the color camera is arranged on either side, outside the black-and-white cameras; and all cameras are connected to the computer through data lines and to the projector through trigger lines. The projector projects three-step phase-shift fringes and simultaneously generates a trigger signal at the same rate as the projection, which is transmitted through the trigger lines to the four black-and-white cameras and the color camera. Driven by the trigger signal, all cameras synchronously capture the phase-shift images projected by the projector and modulated by the measured object, and transmit them to the computer through the data lines. The computer performs absolute phase unwrapping on the captured phase-shift images to obtain the absolute phase and thereby preliminary three-dimensional data, then re-searches the corresponding points from the preliminary three-dimensional data and sub-pixelates them to achieve high-precision three-dimensional measurement, and finally maps the high-precision three-dimensional coordinate points to the color camera to achieve color texture mapping.
Compared with the prior art, the invention has the following remarkable advantages: (1) the relative spatial positions of the multiple views are optimized: the shorter the baseline between views and the smaller the angle, the easier it is to eliminate the ambiguity of the wrapped phase (the fewer the candidate points), but the larger the error of the three-dimensional coordinates of the remaining candidate points. To suppress the coordinate error when the spatial coordinates are projected onto the imaging plane, the baseline and angle between the auxiliary-view camera and the main-view camera are reduced, and most wrong candidate points are excluded by combining homologous-point phase-consistency checks with a strict phase-difference threshold. The remaining phase ambiguity is then removed by phase-consistency checks against the two cameras placed symmetrically on the other side of the projector. Finally, using the absolute phase of the main-view camera, the homologous point is searched on the auxiliary camera with the longest baseline and three-dimensional reconstruction is performed, achieving high-precision real-time measurement while preserving the highest measurement efficiency of phase-shift profilometry. (2) The three-dimensional reconstruction accuracy reaches 50 μm while phase unwrapping remains stable. (3) The algorithm is simple, the reconstruction computation is small, and real-time online measurement can be realized.
The present invention is described in further detail below with reference to the attached drawing figures.
Drawings
FIG. 1 is a schematic flow chart of a multi-view-angle-based high-precision real-time three-dimensional color measurement method according to the present invention.
FIG. 2 is a schematic view of the entire multi-view system, where C1, C2, C3 and C4 represent the black-and-white cameras and C5 represents the color camera.
FIG. 3 shows the system accuracy evaluation results: (a) the measured standard ceramic spheres, whose accuracy is ±1 μm; (b) and (c) the system measurement results; (d) and (e) the differences (errors) between the measurement results and the fitted standard spheres; (f) and (g) the error histograms corresponding to (d) and (e).
FIG. 4 shows measurements of objects whose surfaces have different topographies: (a), (d) and (g) are measured objects with a steep surface, a complex surface and separated surfaces, respectively; (b), (e) and (h) are the corresponding measurement results; (c), (f) and (i) are the corresponding local-area magnifications.
FIG. 5 shows measurements of dynamic objects: (a) and (b) the measurement results of a rotating David plaster statue; (c) and (d) the measurement results of a moving palm.
Detailed Description
With reference to fig. 2, the high-precision real-time three-dimensional color measurement system based on multiple viewing angles includes four black-and-white cameras, one color camera, a projector and a computer. The four black-and-white cameras are placed symmetrically about the central axis of the projector, two on each side; the color camera is arranged on either side, outside the black-and-white cameras. All cameras are connected to the computer through data lines and to the projector through trigger lines. The projector projects three-step phase-shift fringes and simultaneously generates trigger signals at the same rate as the projection, which are transmitted through the trigger lines to the four black-and-white cameras and the color camera. Driven by the trigger signals, all cameras synchronously capture the phase-shift images projected by the projector and modulated by the measured object, and transmit them to the computer through the data lines. The computer performs absolute phase unwrapping (wrapped-phase solving, depth constraint, epipolar constraint and phase-consistency checking) on the captured phase-shift images to obtain preliminary three-dimensional data, then re-searches the corresponding points from the preliminary three-dimensional data and sub-pixelates them to achieve high-precision three-dimensional measurement, and finally maps the high-precision three-dimensional coordinate points to the color camera to achieve color texture mapping.
With reference to fig. 1, the high-precision real-time three-dimensional color measurement method based on multiple viewing angles of the present invention comprises the following steps:
Step one: optimize the projector and camera positions and calibrate the measurement system.
Adjust the spatial positions of the four cameras and the projector in the measurement system according to the projector-camera position optimization method, and complete the calibration of the measurement system. The specific process is as follows: the projector projects fringe patterns and generates trigger signals, which are transmitted to the four black-and-white cameras and the color camera through signal lines; after receiving a trigger signal, all cameras synchronously acquire images and transmit them to the computer for processing to obtain three-dimensional data. The spatial positions among the multiple views affect their spatial geometric constraint relationships (each of the cameras and the projector is called a view), which can be expressed as follows: the closer two views are, the shorter the projected line segment, on one view, corresponding to a point on the other view over a given depth range; on the other hand, the larger the distance between two views, the more accurate the three-dimensional reconstruction between them.
This is the projector-camera position optimization principle, according to which the cameras and projector of the measurement system are positioned as follows. The first black-and-white camera C1 is kept at a relatively large distance from the projector (more than 10 cm); whether to adjust this distance (to 5-10 cm) is decided from the feedback of step two. The second black-and-white camera C2 is placed between the first camera C1 and the projector, close to C1. The third black-and-white camera C3 and the second camera C2 are placed symmetrically about the central axis of the projector, as are the fourth black-and-white camera C4 and the first camera C1. The color camera C5 is placed outside the first camera C1 or outside the fourth camera C4; for example, with C5 outside C1, the color camera C5 and the second camera C2 are placed symmetrically about the first camera C1. After the spatial-position optimization is completed, the whole measurement system is calibrated to a unified world coordinate system with a calibration algorithm (see Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330-1334 (2000)) to obtain the calibration parameters of the five cameras (four black-and-white and one color) and the projector in the world coordinate system, and these parameters are converted into two-dimensional to three-dimensional and three-dimensional to two-dimensional mapping parameters (see e.g. K. Liu, Y. Wang, D. L. Lau, et al., "Dual-frequency pattern scheme for high-speed 3-D shape measurement," Optics Express 18(5), 5229-5244 (2010)), which are used later.
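For illustration, the calibration parameters of each view can be collected into a 3×4 projection matrix, from which the two-dimensional/three-dimensional mappings follow. A minimal Python sketch, assuming a pinhole model with intrinsics K and extrinsics R, t (the function name and parameterization are ours, not the patent's):

```python
import numpy as np

def projection_matrix(K, R, t):
    """World-to-pixel projection matrix P = K [R | t] for one view.
    The patent's 2D-to-3D / 3D-to-2D mapping parameters can be derived
    from such matrices for each camera/projector pair."""
    return K @ np.hstack([R, t.reshape(3, 1)])
```

With calibrated K, R, t per view, projecting a homogeneous world point through P gives its pixel coordinates, and pairs of such matrices support the triangulation used later.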
Step two: capture images with the measurement system calibrated in step one, and perform wrapped-phase calculation, the depth constraint, the first epipolar constraint and the first phase-consistency check.
Measure a flat plate with the measurement system calibrated in step one, and count the number of candidate points that survive the check against the second camera C2. If, for an arbitrary point on the first camera C1, the number of candidate points is less than or equal to 3, proceed to the next step; otherwise return to step one. Concretely, for an arbitrary point (u1, v1) on the first camera C1, the three-step phase-shifted images acquired by C1 can be represented as

I1(u1, v1) = A(u1, v1) + B(u1, v1)·cos(Φ(u1, v1) − 2π/3)
I2(u1, v1) = A(u1, v1) + B(u1, v1)·cos(Φ(u1, v1))
I3(u1, v1) = A(u1, v1) + B(u1, v1)·cos(Φ(u1, v1) + 2π/3)

where Ii is the gray level of the fringe image captured by the camera, A and B are respectively the average intensity and the modulation of the fringes, and Φ is the absolute phase characterizing the spatial position. The phase can be obtained by the three-step phase-shift algorithm, specifically

φ = arctan[√3·(I1 − I3) / (2·I2 − I1 − I3)].

Note that, owing to the truncation effect of the arctangent function, the phase φ obtained here is a wrapped phase with value range [0, 2π); its relation to the absolute phase Φ is

Φ = φ + 2kπ

where the period order k is an integer in the range [0, N−1] and N is the total number of fringe periods. The key to phase-shift profilometry is how to solve for k. The system solves for k through the depth constraint, the epipolar constraints between the multiple views, and the phase-consistency checks. First, for an arbitrary point on C1, assume in turn that the period order k takes each integer in [0, N−1]. Assuming the projected fringes are vertical, the abscissa xp of the corresponding point on the projector can be calculated from the fringe phase; for fringes of N periods across a pattern of width W this standard mapping is xp = (φ + 2kπ)·W/(2πN) (the original formula is given as an image). Then, using the two-dimensional to three-dimensional mapping parameters between the first camera C1 and the projector obtained in step one, the N candidate three-dimensional coordinates (Xk, Yk, Zk) of the point can be computed; these are called three-dimensional candidate points. The first camera C1 is taken as the example here; the image processing of the other black-and-white cameras is similar and is not repeated, and the corresponding calculation results for the second camera C2, the third camera C3 and the fourth camera C4 are used directly later. Owing to the limitations of the projection equipment of a fringe projection system, the effective projection range (the range within which the contrast of the projected fringes is large enough) is not infinite; the effective projection range of the system in the depth direction is [−150 mm, 250 mm], so three-dimensional candidate points whose depth falls outside this range can be excluded, reducing the number of three-dimensional candidate points from N to N1 (N1 is a positive integer less than N). This process of removing invalid three-dimensional candidate points by the depth range is the depth constraint. The remaining N1 three-dimensional candidate points are projected onto the second camera C2 through the first epipolar mapping, yielding N1 two-dimensional projection points (two-dimensional candidate points), each of which has a wrapped phase. The phases of the same object point appear similar on the images captured by the different cameras, so points whose phases behave dissimilarly can be excluded by the criterion |Δφ12| < T12, where Δφ12 is the phase difference between the wrapped phase of the point on C1 and that of its two-dimensional candidate point on C2 (characterizing the phase similarity of the two candidate points), and T12 is a phase threshold preset a priori, in radians (rad). This is the first phase-consistency check; through this step the number of two-dimensional candidate points (candidates mapped from the point on C1 onto the other cameras) or three-dimensional candidate points (candidates mapped from the point on C1 into three-dimensional space) is reduced from N1 to N2 (N2 is a positive integer less than N1). Now examine whether N2 ≤ 3: if so, the camera and projector positions have been optimized to the condition of stable phase unwrapping; if not, return to step one.
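The wrapped-phase computation above can be sketched in Python/NumPy (a minimal illustration of the standard three-step algorithm; function and variable names are ours, not the patent's):

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three fringe images with phase shifts of
    -2*pi/3, 0 and +2*pi/3 (the standard three-step algorithm)."""
    # arctan2 resolves all quadrants of arctan(sqrt(3)(I1-I3)/(2I2-I1-I3))
    phi = np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
    return np.mod(phi, 2.0 * np.pi)  # map into [0, 2*pi)
```

Applied pixel-wise to the three captured images, this yields the wrapped phase map that the depth and epipolar constraints then disambiguate.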
Step three: sequentially perform the second epipolar constraint, the second phase-consistency check, the third epipolar constraint and the third phase-consistency check on the candidate points remaining from step two, and determine the absolute phase of any point on the first camera by the weighted sum of the phase differences over the four cameras.
Project the remaining N2 three-dimensional candidate points onto the third camera C3 using the second epipolar mapping and perform the second phase-consistency check, |Δφ13| < T13, where Δφ13 is the phase difference between the wrapped phase of the point on C1 and that of its two-dimensional candidate point on C3, and T13 is a phase threshold preset a priori. Setting this threshold must take the relative spatial positions of the views into account: since the first camera C1 and the third camera C3 are farther apart than the first camera C1 and the second camera C2, the accuracy of the projected points is worse, so the threshold here must satisfy T13 > T12 (this point is not repeated later). The number of two-dimensional or three-dimensional candidate points is thereby reduced from N2 to N3 (N3 is a positive integer less than N2). Then project the N3 three-dimensional candidate points onto the fourth camera C4 and perform the third phase-consistency check, |Δφ14| < T14, where T14 is again a phase threshold set a priori. The number of candidates is thereby reduced from N3 to N4 (N4 is a positive integer less than N3). Up to this point, for an arbitrary point on C1, the N4 remaining candidates do not necessarily determine its corresponding point uniquely, i.e. the value of k cannot yet be uniquely determined. To determine it uniquely among the remaining N4 candidate points, the invention adopts a weighted phase-difference-sum algorithm. For each of the N4 candidate points, compute the weighted sum of phase differences

Δφ = ω12·|Δφ12| + ω13·|Δφ13| + ω14·|Δφ14|

which makes full use of the phase differences obtained in the three phase-consistency checks; here Δφ is the weighted phase-difference sum and ω12, ω13, ω14 are the weighting factors of the corresponding phase differences. The higher the accuracy of the projected point and the more reliable the phase difference, the larger the weighting factor assigned, so the weighting factors of the measurement system satisfy ω12 > ω13 > ω14 (the specific values are given as images in the original). It should be noted that the thresholds may be fine-tuned according to the actual situation of the measurement system, but in general must satisfy T12 < T13 < T14. Finally, the candidate point with the smallest Δφ is selected as the corresponding point of the point on C1, and its k is the period order of that point. With this, for any point on the first camera C1, the phase can be unwrapped by the above process to obtain the absolute phase Φ1.
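The weighted phase-difference-sum selection can be sketched as follows (the weight values are illustrative assumptions consistent only with the stated ordering ω12 > ω13 > ω14, not the patent's exact values):

```python
import numpy as np

def select_order(dphi12, dphi13, dphi14, w=(0.5, 0.3, 0.2)):
    """Pick the index of the candidate fringe order with the smallest
    weighted sum of absolute phase differences from the three
    phase-consistency checks. dphi1x are arrays with one entry per
    surviving candidate; the closer auxiliary view gets the larger weight."""
    total = (w[0] * np.abs(dphi12)
             + w[1] * np.abs(dphi13)
             + w[2] * np.abs(dphi14))
    return int(np.argmin(total))
```

The returned index identifies the unique candidate, and hence the period order k, for the point on C1.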
Step four: according to the absolute phase Φ1, calculate the corresponding point on the fourth camera C4 of each point on the first camera C1, and sub-pixelate the corresponding point.
For an arbitrary point on the first camera C1, the spatial three-dimensional coordinates (X, Y, Z) can be obtained from the absolute phase Φ1 found in step three. However, because the first camera C1 is close to the projector, the coordinates solved directly from Φ1 have poor precision. To increase the three-dimensional reconstruction precision, reconstruction is performed between two views with a larger separation; in the measurement system, the first camera C1 and the fourth camera C4 are selected for three-dimensional reconstruction. The three-dimensional point (X, Y, Z) is projected onto the fourth camera C4 to obtain the corresponding point (u4, v4), but at this stage its accuracy is still poor. The measurement system improves the accuracy of (u4, v4) through two steps, integer-pixel optimization and sub-pixel optimization, laying the groundwork for high-precision three-dimensional reconstruction. Integer-pixel optimization is realized mainly by optimizing the phase difference: for integers i in [−3, 3], search within the range (u4 + i, v4) for the i that minimizes the difference between the wrapped phase of the point on C1 and the wrapped phase at (u4 + i, v4) on C4, and take (u4 + i, v4) as the new corresponding-point coordinates. The phase can then be considered to vary linearly between adjacent pixels; on this basis, sub-pixel optimization is performed by linear interpolation of the wrapped phase between (u4, v4) and (u4 + 1, v4) (the exact interpolation formula is given as an image in the original). This yields the high-precision sub-pixel coordinates of the corresponding point on the fourth camera C4.
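Assuming the phase varies linearly between adjacent pixels, the sub-pixel step admits a one-line sketch (names are ours; the patent's exact formula is given as an image in the original):

```python
def subpixel_u(phi_target, phi_u, phi_u1, u):
    """Sub-pixel abscissa on C4 at which the wrapped phase equals
    phi_target, given the wrapped phases phi_u and phi_u1 at integer
    columns u and u+1, under a linear phase model between the two."""
    return u + (phi_target - phi_u) / (phi_u1 - phi_u)
```

For example, if the phase at column 10 is 1.0 rad, at column 11 is 1.2 rad, and the target phase from C1 is 1.1 rad, the refined abscissa is 10.5.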
Step five: from each point on the first camera C1 and its sub-pixel corresponding point, calculate the spatial three-dimensional point to complete the three-dimensional data acquisition, as follows. For an arbitrary point on the first camera C1, once its high-precision corresponding-point coordinates on C4 are obtained, the high-precision three-dimensional coordinates (X, Y, Z) can be calculated through the two-dimensional to three-dimensional mapping parameters between the first camera C1 and the fourth camera C4 obtained in step one (the three reconstruction formulas are given as images in the original).
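The two-view reconstruction between C1 and C4 can be illustrated with a standard linear (DLT) triangulation sketch, a generic stand-in for the patent's mapping-parameter formulas, which are given as images in the original:

```python
import numpy as np

def triangulate(P1, P4, pt1, pt4):
    """Linear (DLT) triangulation: given the 3x4 projection matrices of
    two views and one matched pixel per view, solve the homogeneous
    system for the 3D point and return it in world coordinates."""
    u1, v1 = pt1
    u4, v4 = pt4
    A = np.array([
        u1 * P1[2] - P1[0],   # each row: one linear constraint on X
        v1 * P1[2] - P1[1],
        u4 * P4[2] - P4[0],
        v4 * P4[2] - P4[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null vector = homogeneous 3D point
    return X[:3] / X[3]
```

Because C1 and C4 have the longest baseline in the system, triangulating between them gives the most accurate depth, which is the rationale stated above.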
Step six: from the obtained three-dimensional data, solve the corresponding point on the color camera C5 of each point on the first camera C1, and acquire the color texture at the corresponding point to complete the true-color display of the three-dimensional data.
For an arbitrary point on the first camera C1, its corresponding point (u5, v5) on the color camera C5 can be found by projecting the reconstructed three-dimensional point with the calibration parameter matrix of C5 obtained in step one (the projection formula is given as an image in the original). Mapping the color information at (u5, v5) onto the three-dimensional point completes the true-color display of the three-dimensional data.
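The texture-mapping step can be sketched as projecting each reconstructed point into C5 and sampling the nearest pixel (a hedged illustration with our own names; the patent's projection formula is given as an image):

```python
import numpy as np

def sample_color(P5, image, Xw):
    """Project world point Xw into the color camera via its 3x4
    projection matrix P5 and return the nearest pixel's color
    (image is indexed [row = v, column = u])."""
    x = P5 @ np.append(Xw, 1.0)
    u, v = x[0] / x[2], x[1] / x[2]
    return image[int(np.rint(v)), int(np.rint(u))]
```

Attaching the sampled RGB value to each reconstructed point yields the true-color point cloud described above.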
Through the above steps, the measurement system of the invention enhances the stability of phase unwrapping by adjusting the relative spatial positions of the cameras and the projector of the multi-view system, realizes stable phase unwrapping while preserving the highest measurement efficiency of phase-shift profilometry, and achieves high-precision three-dimensional reconstruction. In addition, the method is simple, is suitable for GPU parallel processing, and enables real-time online three-dimensional reconstruction. To test the three-dimensional reconstruction performance of the multi-view high-precision real-time three-dimensional color measurement system, the measurement system shown in FIG. 2 was set up. First, to verify the measurement accuracy, a group of standard ceramic spheres was measured with the system of FIG. 2, as shown in FIG. 3(a); FIG. 3(b) and FIG. 3(c) are two measurement results of the ceramic spheres, and standard spheres fitted to the measurement results serve as reference values; FIG. 3(d) and FIG. 3(e) show the differences between the reference values and the measured values, and the corresponding error statistics are shown in FIG. 3(f) and FIG. 3(g), from which it can be seen that the measurement accuracy of the system reaches 50 μm. To verify that the measurement system can stably measure a surface of arbitrary topography, a prism with abrupt edges, a David plaster statue with a complex surface, and two separated objects were measured, as shown in FIG. 4(a), 4(d) and 4(g); FIG. 4(b), 4(e) and 4(h) are the measurement results, and FIG. 4(c), 4(f) and 4(i) are local-area magnifications of the results, showing that the measurement system is quite stable when measuring objects of high surface complexity, which verifies its stability. Finally, to verify the real-time measurement capability, a rotating David plaster statue and a moving palm were measured; the results are shown in FIG. 5, where FIG. 5(a) and 5(b) are the measurement results of the rotating David plaster statue, and FIG. 5(c) and 5(d) those of the moving palm. The results show that the measurement system of the invention measures moving objects well.

Claims (4)

1. A high-precision real-time three-dimensional color measurement method based on multiple viewing angles is characterized by comprising the following steps:
step one, projector and camera position optimization and measurement system calibration: the measurement system consists of four black-and-white cameras, a color camera, a projector and a computer; the projector projects fringe patterns and generates trigger signals, which are transmitted to the four black-and-white cameras and the color camera through signal lines; driven by the trigger signals, all cameras synchronously capture the phase-shift images projected by the projector and modulated by the measured object, and transmit the captured phase-shift images to the computer through data lines; after the positions of the projector and the cameras are optimized, the whole measuring system is calibrated to a unified world coordinate system using a calibration algorithm, yielding the calibration parameter matrices of the five cameras and the projector in the world coordinate system, and these parameters are converted into two-dimensional-to-three-dimensional mapping parameters;
step two, capturing images with the measurement system calibrated in step one, and performing wrapped-phase computation, the depth constraint, the first epipolar constraint and the first phase consistency check;
step three, sequentially carrying out the second epipolar constraint, the second phase consistency check, the third epipolar constraint and the third phase consistency check on the two-dimensional or three-dimensional candidate points remaining from step two, and determining the absolute phase of any point on the first black-and-white camera from the weighted sum of the phase differences of the four cameras;
step four, according to the absolute phase Φ, calculating for any point on the first black-and-white camera C1 its corresponding point on the fourth black-and-white camera C4, and refining the corresponding point to sub-pixel accuracy;
step five, according to the points on the first black-and-white camera C1 and their sub-pixel corresponding points, calculating the spatial three-dimensional points to complete the three-dimensional data acquisition, specifically: for any point (u1, v1) on the first black-and-white camera C1, after obtaining the coordinates (u4, v4) of its high-precision corresponding point, the high-precision three-dimensional coordinates (Xw, Yw, Zw) are calculated from the triangulation formulas (given in the original as image equations) built on the two-dimensional-to-three-dimensional mapping parameters between the first black-and-white camera C1 and the fourth black-and-white camera C4 obtained in step one;
step six, solving, from the obtained three-dimensional data, the corresponding point of the first black-and-white camera C1 in the color camera C5 and acquiring its color texture to complete the true color display of the three-dimensional data; that is, for any point (u1, v1) on the first black-and-white camera C1 with three-dimensional coordinates (Xw, Yw, Zw), its corresponding point (u5, v5) on the color camera C5 is obtained through the projection formula

s5 · [u5, v5, 1]^T = M5 · [Xw, Yw, Zw, 1]^T

where s5 is a scale factor and M5 is the calibration parameter matrix of the color camera C5 obtained in step one; mapping the color information of (u5, v5) onto (Xw, Yw, Zw) completes the true color display of the three-dimensional data;
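The step-six texture mapping can be sketched as a pinhole projection of the reconstructed point through the color camera's 3x4 calibration matrix; the matrix and the point below are illustrative toy values, not calibration data from the patent.

```python
import numpy as np

def project_to_color_camera(M5, Xw):
    """s*[u5, v5, 1]^T = M5 @ [Xw, Yw, Zw, 1]^T  ->  return (u5, v5)."""
    p = M5 @ np.append(Xw, 1.0)
    return p[0] / p[2], p[1] / p[2]

# Toy calibration matrix: focal length 800 px, principal point (320, 240),
# color camera at the world origin looking down +Z (illustrative only).
M5 = np.array([[800.0, 0.0, 320.0, 0.0],
               [0.0, 800.0, 240.0, 0.0],
               [0.0, 0.0, 1.0, 0.0]])

# A reconstructed 3D point lands on the color pixel whose RGB value it takes.
u5, v5 = project_to_color_camera(M5, np.array([0.5, -0.25, 2.0]))
print(u5, v5)   # -> 520.0 140.0
```

The RGB value at (u5, v5) is then attached to the 3D point, which is all "true color display" requires once the geometry is known.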
the specific process of step two is as follows:
measuring a flat plate with the measurement system calibrated in step one, and counting the number of candidate points on the second black-and-white camera C2; if the number of candidate points for an arbitrary point on the first black-and-white camera C1 is less than or equal to 3, proceeding to the next step, otherwise returning to step one; specifically: for any point (u1, v1) on the first black-and-white camera C1, the three-step phase-shift images acquired by C1 can be expressed as:

I1(u1, v1) = A(u1, v1) + B(u1, v1) cos(Φ(u1, v1) − 2π/3)
I2(u1, v1) = A(u1, v1) + B(u1, v1) cos(Φ(u1, v1))
I3(u1, v1) = A(u1, v1) + B(u1, v1) cos(Φ(u1, v1) + 2π/3)

where In(u1, v1) (n = 1, 2, 3) is the gray level of the fringe image captured by the camera, A(u1, v1) and B(u1, v1) are respectively the mean value and the modulation of the fringes, and Φ(u1, v1) is the absolute phase characterizing the spatial position;
the wrapped phase φ(u1, v1) is obtained by the three-step phase-shift algorithm:

φ(u1, v1) = arctan[√3 (I1 − I3) / (2I2 − I1 − I3)]

where φ(u1, v1) is the wrapped phase with value range [0, 2π]; its relationship with the absolute phase Φ(u1, v1) is:

Φ(u1, v1) = φ(u1, v1) + 2kπ

where k is the period order, an integer in the range [0, N−1], and N is the total number of fringe periods;
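The three-step phase-shift recovery described above can be sketched as follows; the phase-shift convention (−2π/3, 0, +2π/3) is a common three-step choice assumed here, matched to the arctangent formula it implies.

```python
import numpy as np

# Three-step phase-shifted fringes (assumed shifts: -2*pi/3, 0, +2*pi/3):
#   I_n = A + B * cos(Phi + (n - 2) * 2*pi/3),  n = 1, 2, 3
A, B = 128.0, 100.0                       # fringe mean value and modulation
Phi = np.linspace(0.0, 4.0 * np.pi, 500)  # ground-truth absolute phase, N = 2 periods
I1 = A + B * np.cos(Phi - 2.0 * np.pi / 3.0)
I2 = A + B * np.cos(Phi)
I3 = A + B * np.cos(Phi + 2.0 * np.pi / 3.0)

# Three-step phase-shift algorithm: wrapped phase phi in [0, 2*pi]
phi = np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3) % (2.0 * np.pi)

# Absolute phase relation: Phi = phi + 2*pi*k, with integer period order k
k = np.round((Phi - phi) / (2.0 * np.pi)).astype(int)
assert np.allclose(phi + 2.0 * np.pi * k, Phi)
```

With N fringe periods, k takes values in {0, …, N−1}; here N = 2, and the recovered k is 0 on the first period and 1 on the second. The whole point of steps two and three is that k is unknown for a real scene and must be pinned down by the depth, epipolar and phase-consistency constraints.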
k is solved through the depth constraint, the epipolar constraints between the multiple views, and the phase consistency tests; first, for any point on the first black-and-white camera C1, each period order k = 0, 1, …, N−1 is assumed in turn and, assuming the projected fringes are vertical, the abscissa xp of the corresponding point on the projector is computed from the candidate absolute phase φ + 2kπ (the conversion formula is given in the original as an image equation); then, through the two-dimensional-to-three-dimensional mapping parameters between the first black-and-white camera C1 and the projector obtained in step one, the N three-dimensional coordinates of the point, one per hypothesized k, are computed (the three mapping formulas are given in the original as image equations); these three-dimensional points are referred to as three-dimensional candidate points;
taking the first black-and-white camera C1 as an example (the image processing for the other black-and-white cameras is the same): the effective projection range in the depth direction is [−150 mm, 250 mm]; three-dimensional candidate points whose depth coordinate falls outside this range are excluded, reducing the number of three-dimensional candidate points to N1; this is the depth constraint; the remaining N1 three-dimensional candidate points are projected through the first epipolar constraint mapping onto the second black-and-white camera C2, giving N1 two-dimensional projection points, i.e. two-dimensional candidate points; each of the N1 two-dimensional projection points on the second black-and-white camera C2 has a wrapped phase φ2, and since the phases of the same object point appear similar on the images captured by the different cameras, points whose phase behaves dissimilarly are excluded by the criterion

Δφ2 = |φ1 − φ2| < T2

where Δφ2 is the phase difference between the point on C1 and its two-dimensional candidate point on the second black-and-white camera C2, characterizing the phase similarity of the two candidate points, and T2 is a phase threshold preset from prior knowledge, in units of rad; this is the first phase consistency test, which reduces the number of two-dimensional or three-dimensional candidate points from N1 to N2; at this point, check whether N2 ≤ 3: if so, the positions of the cameras and the projector have been optimized to the condition of stable phase unwrapping; if not, return to step one.
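A minimal sketch of the candidate pruning described in this step, assuming the candidates' depth coordinates and the wrapped phases of their projections on C2 have already been computed; the function name, the example numbers and the 0.8 rad threshold are illustrative, not values from the patent.

```python
import numpy as np

def prune_candidates(cands_z, cand_phases, phi_ref,
                     z_range=(-150.0, 250.0), phase_tol=0.8):
    """Apply the depth constraint, then the phase-consistency test.

    cands_z     : depth (Z) of each of the N three-dimensional candidates
    cand_phases : wrapped phase of each candidate's projection on camera C2
    phi_ref     : wrapped phase of the query point on camera C1
    z_range     : effective projection range in the depth direction (mm)
    phase_tol   : preset phase threshold in rad (illustrative value)
    Returns the indices of surviving candidates.
    """
    cands_z = np.asarray(cands_z, float)
    cand_phases = np.asarray(cand_phases, float)
    idx = np.arange(len(cands_z))

    # Depth constraint: discard candidates outside [-150 mm, 250 mm].
    keep = (cands_z >= z_range[0]) & (cands_z <= z_range[1])

    # Phase consistency: the same object point shows a similar wrapped phase
    # on every camera; discard candidates whose phase differs too much,
    # measuring the difference with wrap-around on the [0, 2*pi) circle.
    dphi = np.abs(cand_phases - phi_ref)
    dphi = np.minimum(dphi, 2.0 * np.pi - dphi)
    keep &= dphi < phase_tol
    return idx[keep]

# Example: 6 period-order hypotheses; only one is depth- and phase-consistent.
surviving = prune_candidates(
    cands_z=[-400.0, -120.0, 30.0, 180.0, 300.0, 500.0],
    cand_phases=[0.1, 3.0, 1.25, 5.9, 1.2, 1.2],
    phi_ref=1.2)
print(surviving)   # -> [2]
```

Each surviving index corresponds to one still-plausible period order k; the later epipolar and phase checks on C3 and C4 shrink this set further.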
2. The multi-view based high-precision real-time three-dimensional color measurement method according to claim 1, wherein in step one the positions of the cameras and the projector in the measurement system are optimized as follows: the first black-and-white camera C1 keeps a relatively long distance from the projector, whether this distance is adjusted being determined by the number of remaining candidate points in step two; the second black-and-white camera C2 is placed between the first black-and-white camera C1 and the projector, close to the first black-and-white camera C1; the third black-and-white camera C3 and the second black-and-white camera C2 are arranged symmetrically about the central axis of the projector; the fourth black-and-white camera C4 and the first black-and-white camera C1 are arranged symmetrically about the central axis of the projector; the color camera C5 is placed outside the first black-and-white camera C1 or outside the fourth black-and-white camera C4; if the color camera C5 is outside the first black-and-white camera C1, the color camera C5 and the second black-and-white camera C2 are arranged symmetrically about the first black-and-white camera C1.
3. The multi-view based high-precision real-time three-dimensional color measurement method according to claim 1, wherein the specific process of step three is as follows:
the remaining N2 three-dimensional candidate points are projected through the second epipolar constraint mapping onto the third black-and-white camera C3 and subjected to a second phase consistency check, Δφ3 = |φ1 − φ3| < T3, where Δφ3 is the phase difference between the point on C1 and its two-dimensional candidate point on the third black-and-white camera C3, and T3 is a phase threshold preset from prior knowledge; the number of two-dimensional or three-dimensional candidate points is thereby reduced from N2 to N3; the N3 three-dimensional candidate points are then projected onto the fourth black-and-white camera C4, where a third phase consistency test, Δφ4 = |φ1 − φ4| < T4, is performed, T4 being a phase threshold set from prior knowledge; the number of candidate points is thereby reduced from N3 to N4;
at this point, for an arbitrary point on C1, the N4 remaining candidates are not necessarily unique, i.e. the value of k cannot yet be uniquely determined; to determine it uniquely among the remaining N4 candidate points, the weighted sum of the phase differences is used: for each of the N4 candidate points,

Δφ = w2·Δφ2 + w3·Δφ3 + w4·Δφ4

which makes full use of the phase differences obtained in the three phase consistency tests, where Δφ is the weighted sum of the phase differences and w2, w3, w4 are the weight factors of Δφ2, Δφ3 and Δφ4; finally, the candidate point with the smallest Δφ is selected, its period order k is taken, and for any point on the first black-and-white camera C1 the absolute phase Φ = φ + 2kπ is obtained through this process, completing the phase unwrapping.
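The weighted phase-difference selection at the end of step three can be sketched as follows; equal weights and the example phase differences are illustrative assumptions, not the patent's values.

```python
import numpy as np

def select_period_order(k_values, dphi_234, weights=(1.0, 1.0, 1.0)):
    """Pick the period order k of the candidate minimising the weighted sum
    Delta_phi = w2*dphi2 + w3*dphi3 + w4*dphi4, where dphi_234 holds, per
    candidate, the phase differences from the C2, C3 and C4 consistency tests."""
    dphi_234 = np.asarray(dphi_234, float)         # shape (N4, 3)
    total = dphi_234 @ np.asarray(weights, float)  # weighted sum per candidate
    return k_values[int(np.argmin(total))]

# Two surviving candidates: k = 3 is clearly the more phase-consistent one.
k = select_period_order(k_values=[3, 7],
                        dphi_234=[[0.05, 0.08, 0.04],
                                  [0.60, 0.55, 0.70]])
print(k)   # -> 3
```

Using all three phase differences rather than the last test alone makes the decision robust when a single camera's phase measurement is noisy.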
4. The multi-view-based high-precision real-time three-dimensional color measurement method according to claim 3, wherein the specific process of step four is as follows:
for any point (u1, v1) on the first black-and-white camera C1, its spatial three-dimensional coordinates can be obtained from the absolute phase Φ obtained in step three; however, because the first black-and-white camera C1 is close to the projector, the coordinates obtained by direct solution from Φ have poor precision, so the first black-and-white camera C1 and the fourth black-and-white camera C4 are selected for the three-dimensional reconstruction: the three-dimensional point is projected onto the fourth black-and-white camera C4 to obtain the corresponding point (u4, v4), whose precision is then improved through two steps, integer-pixel optimization and sub-pixel optimization, paving the way for high-precision three-dimensional reconstruction;
integer-pixel optimization is achieved by optimizing the phase difference: for an integer i ∈ [−3, 3], search for the i that minimizes the phase difference |φ1(u1, v1) − φ4(u4 + i, v4)|, and take the resulting (u4 + i, v4) in place of (u4, v4) as the new corresponding-point coordinates; sub-pixel optimization is then carried out by interpolating between the wrapped phases φ4 of the neighboring pixels (the interpolation formula is given in the original as an image equation), where φ1 and φ4 are respectively the wrapped phases on C1 and C4, thereby obtaining the high-precision sub-pixel coordinates of (u1, v1) on the fourth black-and-white camera C4.
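A sketch of the step-four correspondence refinement: an integer-pixel search over i ∈ [−3, 3] followed by a sub-pixel step. The linear interpolation used here stands in for the patent's sub-pixel formula, which appears only as an image equation, and all numbers are illustrative.

```python
import numpy as np

def refine_correspondence(phi_row_c4, u4, phi_ref):
    """Refine the column of a corresponding point on C4 along one image row.

    phi_row_c4 : wrapped-phase samples along the epipolar row on camera C4
    u4         : initial integer column of the corresponding point
    phi_ref    : wrapped phase of the query point on camera C1
    """
    # Integer-pixel optimisation: minimise the phase difference over i in [-3, 3].
    offsets = np.arange(-3, 4)
    cols = np.clip(u4 + offsets, 0, len(phi_row_c4) - 2)
    best = cols[np.argmin(np.abs(phi_row_c4[cols] - phi_ref))]
    # Sub-pixel optimisation: linear interpolation between columns best, best+1
    # (assumed here; the patent's exact interpolation formula is not given).
    p0, p1 = phi_row_c4[best], phi_row_c4[best + 1]
    return best + (phi_ref - p0) / (p1 - p0)

# Monotone phase row, 0.10 rad per pixel: a query phase of 2.345 rad should
# land at column 23.45.
row = 0.10 * np.arange(60)
u_sub = refine_correspondence(row, u4=21, phi_ref=2.345)
print(round(u_sub, 2))   # -> 23.45
```

The refined sub-pixel column is what feeds the C1–C4 triangulation of step five; without it, the reconstruction would be quantized to whole pixels.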
CN201711270604.8A 2017-12-05 2017-12-05 High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles Active CN108036740B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711270604.8A CN108036740B (en) 2017-12-05 2017-12-05 High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles


Publications (2)

Publication Number Publication Date
CN108036740A CN108036740A (en) 2018-05-15
CN108036740B true CN108036740B (en) 2020-04-10

Family

ID=62095747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711270604.8A Active CN108036740B (en) 2017-12-05 2017-12-05 High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles

Country Status (1)

Country Link
CN (1) CN108036740B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242895B (en) * 2018-07-20 2022-03-25 南京理工大学 Self-adaptive depth constraint method based on real-time three-dimensional measurement of multi-camera system
CN109579741B (en) * 2018-11-01 2020-06-26 南京理工大学 Full-automatic multi-mode three-dimensional color measurement method based on multiple visual angles
CN109490312B (en) * 2018-11-01 2021-07-09 昆山市泽荀自动化科技有限公司 Detection debugging method applied to inductance detection equipment
CN110207614B (en) * 2019-05-28 2020-12-04 南京理工大学 High-resolution high-precision measurement system and method based on double telecentric camera matching
JP2021148608A (en) * 2020-03-19 2021-09-27 株式会社リコー Imaging device, range-finding device, and imaging program
CN113916152B (en) * 2021-09-09 2023-04-07 湖南长步道光学科技有限公司 Sample detection device and method based on phase deflection technology
CN114279360B (en) * 2021-12-27 2023-08-11 天津大学 Method and device for measuring multi-order phase deflection based on telecentric imaging system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0631148A1 (en) * 1993-06-25 1994-12-28 Siemens Aktiengesellschaft Method for contactless testing of soldering
CN101466998A (en) * 2005-11-09 2009-06-24 几何信息学股份有限公司 Method and apparatus for absolute-coordinate three-dimensional surface imaging
CN102750698A (en) * 2012-06-11 2012-10-24 上海大学 Texture camera calibration device, texture camera calibration method and geometry correction method of texture image of texture camera
CN102945565A (en) * 2012-10-18 2013-02-27 深圳大学 Three-dimensional photorealistic reconstruction method and system for objects and electronic device
CN205942802U (en) * 2016-06-07 2017-02-08 西安新拓三维光测科技有限公司 Scattered spot of human digit is thrown and is adopted device
CN107063129A (en) * 2017-05-25 2017-08-18 西安知象光电科技有限公司 A kind of array parallel laser projection three-dimensional scan method
CN107131848A (en) * 2016-02-26 2017-09-05 福禄理昂·伟洛米泽 The optical triangle method device of quick and fine and close SHAPE DETECTION can be realized


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Song Zhang et al., "High-resolution, real-time three-dimensional shape measurement," Optical Engineering, vol. 45, no. 12, pp. 123601-1 to 123601-8, Dec. 2006. *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant