SUMMARY OF THE UTILITY MODEL
Therefore, it is necessary to provide a three-dimensional profile measurement system based on point spread function engineering to solve the problems of high hardware cost and complex data processing flow of the conventional three-dimensional measurement system.
A three-dimensional profile measurement system based on point spread function engineering, comprising:
a projection unit configured to modulate emitted light into periodically distributed projection light and project the projection light onto the surface of an object to be measured;
a phase modulation unit configured to receive the light reflected by the object to be measured and modulate the reflected light into single-helix light;
a rotation unit configured to rotate the single-helix light by 180 degrees;
a detection unit configured to convert the single-helix light before rotation into first image data and convert the single-helix light after the 180-degree rotation into second image data; and
an image reconstruction unit electrically connected with the detection unit and configured to reconstruct the three-dimensional profile of the object to be measured from the first image data and the second image data.
In the three-dimensional profile measurement system based on point spread function engineering, the projection unit modulates the emitted light into periodically distributed projection light and projects it onto the surface of the object to be measured; the phase modulation unit receives the light reflected by the object and modulates it into single-helix light; the rotation unit rotates the single-helix light by 180 degrees; the detection unit converts the single-helix light before rotation into first image data and the single-helix light after the 180-degree rotation into second image data; and the image reconstruction unit processes the first image data and the second image data to reconstruct the three-dimensional profile of the object. By projecting a periodically distributed dot matrix onto the object to be measured, high-density three-dimensional point cloud data can be obtained in a single acquisition on the receiving light path, which increases the measurement speed; modulating the reflected light into single-helix light simplifies the image data processing flow, and the structure is simple.
In one embodiment, the image reconstruction unit includes:
a superposition unit configured to superimpose the pixel gray values of the first image data and the second image data to obtain third image data;
an interception unit configured to perform noise reduction processing on the third image data and cut out each pair of single-spiral points in the third image to form a plurality of sub-region image stacks, wherein each sub-region comprises a pair of single-spiral points and the pixels around them;
a parallel processing unit configured to process the plurality of sub-regions in parallel;
a calculation unit configured to calculate the center coordinate and the rotation angle between the two single-spiral points of each pair of single-spiral points, and further configured to calculate the object plane depth of the projection point corresponding to each pair of single-spiral points according to the rotation angle; and
a reconstruction unit configured to reconstruct the three-dimensional profile of the object to be measured according to the center coordinate between the two single-spiral points of each pair and the object plane depth of the corresponding projection point.
In one embodiment, the system further comprises:
a light source;
the projection unit comprises a first lens, a target, a second lens, a reflector, a half mirror (half-transmitting, half-reflecting) and a third lens;
the light source, the first lens, the target, the second lens, the reflector, the half mirror and the third lens are arranged in sequence along the emission light path. The target is attached to the side of the first lens facing away from the light source and is located on the focal plane of the second lens on its side close to the first lens; the focal plane of the third lens on its side facing away from the half mirror coincides with the surface to be measured of the object to be measured;
the light emitted by the light source passes through the first lens and irradiates the target. The target modulates the emitted light into periodically distributed projection light, which passes through the second lens to the reflector, is reflected by the reflector to the half mirror, and is projected by the half mirror through the third lens onto the object to be measured.
In one embodiment, the target is provided with a plurality of uniformly spaced through holes.
In one embodiment, the through hole is a circular hole.
In one embodiment, the phase modulation unit includes a phase assembly and a fourth lens. The third lens, the half mirror, the phase assembly, the fourth lens and the detection unit are sequentially disposed along the receiving optical path, and the focal plane of the fourth lens on its side close to the phase assembly coincides with the focal plane of the third lens on its side close to the half mirror;
the reflected light of the object to be measured enters the half mirror through the third lens, and the light reflected by the half mirror enters the phase modulation unit; the phase assembly modulates the reflected light into single-helix light, which exits the phase assembly and enters the detection unit through the fourth lens.
In one embodiment, the phase assembly includes a plurality of annular phase plates sleeved in sequence.
In one embodiment, the light source comprises an LED light source.
In one embodiment, the projected light is blue light.
In one embodiment, the detection unit comprises a photodetector.
Detailed Description
To facilitate an understanding of the present application, the present application will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present application are given in the accompanying drawings. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
Referring to fig. 1, an embodiment of the present application provides a three-dimensional profile measurement method based on point spread function engineering, including the following steps.
Step S01, modulating the emitted light into projection light with periodic distribution and projecting the projection light onto the surface of the object.
The light intensity distribution of the emitted light from the light source is modulated so that the emitted light becomes a periodically distributed dot matrix, i.e. periodically distributed projection light. The projection light is then projected onto the surface of the object to be measured, where it forms a periodically distributed dot matrix pattern.
Step S02, the reflected light of the object to be measured is received and modulated into single-helix light.
The signal light reflected by the object to be measured, i.e. the reflected light, is received by the phase modulation unit, which modulates it from the form of an ordinary point spread function into the form of a single-helix point spread function, i.e. modulates the Gaussian-distributed reflected light into single-helix light. As shown in fig. 2, fig. 2(a) is a schematic diagram of the single-helix point spread function, and fig. 2(b) is a schematic diagram of the ordinary, non-single-helix point spread function. After phase modulation, the cross-sectional intensity distribution of the point spread function is a Gaussian spot offset from the center. When a projection point deviates axially from the surface to be measured of the object, its Gaussian spot rotates around the center, and the offset distance z is directly proportional to the rotation angle θ, so the axial position of each projection point on the surface to be measured can be determined from the rotation angle of its single-spiral spot.
Step S03, the single-helix light is rotated 180 degrees.
The rotation of the single-helix light may be achieved by rotating the phase modulation unit: after the phase modulation unit is rotated by 180 degrees, the single-helix light exiting the phase modulation unit is rotated by 180 degrees with respect to the single-helix light entering it. Rotating the single-helix light by 180 degrees means that each single-spiral spot rotates 180 degrees around the position of its projection point on the object to be measured.
In step S04, the single-spiral light before rotation is converted into first image data and the single-spiral light after rotation by 180 degrees is converted into second image data.
Before rotation, the rotation angle of the single-helix light is 0 degrees; that is, before the phase modulation unit is rotated, the single-helix light exiting the unit has a rotation angle of 0 degrees with respect to the single-helix light entering it. The same single-spiral point therefore appears in the first image data and the second image data in a centrosymmetric relationship about the projection position.
Step S05, the first image data and the second image data are processed to reconstruct the three-dimensional contour of the object.
In the three-dimensional profile measurement method based on point spread function engineering, the processing of the first image data and the second image data specifically includes:
referring to fig. 3, in step S051, the pixel gray scale values of the first image data and the second image data are overlapped to obtain the third image data.
Referring to fig. 4 to 8, in this step, the gray-level values of the corresponding pixel positions in the first image data and the second image data are added to obtain the third image data. The size of the image corresponding to the third image data coincides with the size of the image corresponding to the first image data or coincides with the size of the image corresponding to the second image data. The size of the image corresponding to the first image data is equal to the size of the image corresponding to the second image data. Each projection location includes a pair of single helical points in the image to which the third image data corresponds. Fig. 4 is a projection intensity distribution of the projection light on the surface of the object to be measured. Fig. 5 is a schematic diagram of the first image data when the defocus distance z is 0 cm. Fig. 6 is a schematic diagram of the second image data when the defocus distance z is 0 cm. Fig. 7 is a schematic diagram of third image data when the defocus distance z is 0 cm. Fig. 8 is a schematic diagram of the third image data when the defocus distance z is 0.5 cm. The defocus distance is an axial distance of a projection point in a three-dimensional xyz coordinate system in the image corresponding to the third image data, that is, the above-described offset distance z.
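As an illustrative, non-limiting sketch of the superposition in step S051, the per-pixel addition of the two frames can be written as follows (the array sizes, the dtype and the helper name `superimpose` are assumptions introduced for illustration, not part of the original disclosure):

```python
import numpy as np

def superimpose(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Add the gray values of two equally sized frames pixel by pixel.

    Widening to a larger integer type avoids 8-bit overflow when two
    bright single-spiral points overlap after the 180-degree rotation.
    """
    if first.shape != second.shape:
        raise ValueError("first and second image data must be the same size")
    return first.astype(np.uint16) + second.astype(np.uint16)

# Toy 4x4 frames standing in for the first and second image data.
first = np.full((4, 4), 100, dtype=np.uint8)
second = np.full((4, 4), 200, dtype=np.uint8)
third = superimpose(first, second)  # every pixel of the third image is 300
```

The widened dtype is a design choice: with 8-bit frames, a plain `first + second` would wrap around at 255 and corrupt the bright spiral points.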
Step S052, performing noise reduction processing on the third image data, and cutting out each pair of single-spiral points in the third image to form a plurality of sub-region image stacks, wherein each sub-region comprises a pair of single-spiral points and the pixels around them.
In this step, the third image data is subjected to noise reduction processing. Specifically, the third image data is first high-pass filtered to remove most of the background noise, and discrete noise is then removed by erosion and dilation. Each pair of single-spiral points in the third image is then cut out, i.e. each pair of single-spiral points together with the pixels around it is extracted from the image corresponding to the third image data. The size of the sub-regions can be set according to actual needs; a reasonable sub-region size avoids redundant single-spiral points within a sub-region. In one embodiment, the size of each sub-region is 25 pixels × 25 pixels.
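The interception of sub-regions described above can be sketched as follows. This is a simplified illustration that omits the high-pass filtering and erosion/dilation steps; the function name `crop_subregion` and the toy image are assumptions introduced for illustration:

```python
import numpy as np

def crop_subregion(image: np.ndarray, center: tuple[int, int], size: int = 25) -> np.ndarray:
    """Cut a size x size window around a pair of single-spiral points.

    `center` is the (row, col) midpoint of the pair; the window size of
    25 x 25 pixels follows the embodiment described in the text.
    """
    half = size // 2
    r, c = center
    return image[r - half: r + half + 1, c - half: c + half + 1]

# Toy denoised third image with one pair of single-spiral points.
image = np.zeros((100, 100), dtype=np.uint16)
image[50, 48] = 300   # first spiral point of the pair
image[50, 52] = 300   # second spiral point of the pair
sub = crop_subregion(image, (50, 50))   # one entry of the sub-region image stack
```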
Step S053, performing parallel processing on the plurality of sub-regions.
The plurality of sub-regions are denoted S_1, S_2, …, S_n, where n is an integer greater than 1.
Step S054, calculating the center coordinate and the rotation angle between the two single-spiral points of each pair of single-spiral points.
Specifically, an xyz coordinate system is established in the image corresponding to the third image data, and a double Gaussian fit is performed on each pair of single-spiral points in each sub-region to obtain their transverse coordinates (x_{n,1}, y_{n,1}) and (x_{n,2}, y_{n,2}). The center coordinate of each pair, ((x_{n,1} + x_{n,2})/2, (y_{n,1} + y_{n,2})/2), is then calculated from these transverse coordinates; the center coordinate is the transverse coordinate of the projection point on the object to be measured. Finally, the angle between the line connecting the two single-spiral points of each pair and the y axis is calculated; this angle is the rotation angle θ_n of the pair of single-spiral points.
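The center-coordinate and rotation-angle computation of step S054 can be sketched as follows, assuming the double Gaussian fit has already produced the two transverse coordinates; the function name and the sample values are illustrative assumptions:

```python
import math

def center_and_angle(p1, p2):
    """Midpoint of a pair of single-spiral points, and the angle between
    their connecting line and the y axis, as in step S054.

    p1, p2 are the fitted transverse coordinates (x_{n,1}, y_{n,1}) and
    (x_{n,2}, y_{n,2}) of the two single-spiral points of one pair.
    """
    (x1, y1), (x2, y2) = p1, p2
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    # atan2 of the x-difference over the y-difference measures the angle
    # from the y axis rather than the x axis.
    theta = math.atan2(x2 - x1, y2 - y1)
    return (cx, cy), theta

center, theta = center_and_angle((10.0, 20.0), (14.0, 24.0))
# center is the transverse coordinate of the projection point
```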
Step S055, calculating the object plane depth of the projection point corresponding to each pair of single-spiral points according to the rotation angle.
The object plane depth of the projection point corresponding to each pair of single-spiral points is the axial coordinate z_n of that projection point, so that the three-dimensional coordinate of each projection point is (x_n, y_n, z_n), where (x_n, y_n) is the center coordinate obtained in step S054. The object plane depth of the projection point corresponding to each pair of single-spiral points is also the defocus distance.
In this step, the axial coordinate z_n of each projection point is calculated from the rotation angle θ_n of the single-spiral points through the linear relation k, i.e. z_n = k·θ_n.
Referring to FIG. 9, the linear relation k between the rotation angle θ_n of a single-spiral point and the axial coordinate z_n of the corresponding projection point is calibrated as follows:
and S101, vertically fixing the white acrylic plate serving as a measured object on a displacement table.
The displacement table moves by a preset distance at each step.
Step S102, modulating the emitted light into projection light with periodic distribution and projecting the projection light on the surface of the object to be measured.
And step S103, receiving the reflected light of the object to be detected and modulating the reflected light into a single helical rotation.
Step S104, rotating the single-spiral light by 180 degrees.
Step S105, converting the single-spiral light before rotation into first image data and converting the single-spiral light after the 180-degree rotation into second image data.
Step S106, judging whether the number of movements of the displacement table has reached a preset number.
Step S107, if the number of movements of the displacement table has reached the preset number, calculating the rotation angle of the single-spiral points and the axial coordinate of the corresponding projection point from the first image data and the second image data recorded at each position of the displacement table, and performing a linear fit of the rotation angles θ_n against the axial coordinates z_n of the corresponding projection points to obtain the linear relation k. The axial coordinate is the object plane depth.
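The linear fit of step S107 can be sketched with a least-squares fit. The angle and stage-position values below are made up for illustration, and the code assumes the proportional relation z_n = k·θ_n described earlier:

```python
import numpy as np

# Rotation angles (radians) and displacement-stage positions (cm) recorded
# at each preset step of the calibration; the values are illustrative.
theta = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
z = np.array([0.0, 0.1, 0.2, 0.3, 0.4])

# Degree-1 least-squares fit z_n = k * theta_n + b; k is the calibrated
# proportionality between rotation angle and object plane depth.
k, b = np.polyfit(theta, z, 1)

def depth_from_angle(theta_n: float) -> float:
    """Object plane depth of a projection point from its rotation angle."""
    return k * theta_n + b
```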
Step S056, reconstructing the three-dimensional profile of the object to be measured according to the center coordinate between the two single-spiral points of each pair and the object plane depth of the corresponding projection point.
In this step, the center coordinate between the two single-spiral points of each pair and the object plane depth of the corresponding projection point together give the three-dimensional coordinate (x_n, y_n, z_n) of each projection point. The three-dimensional coordinates of all projection points are imported into three-dimensional point cloud processing software for processing, so that the three-dimensional profile of the object in the projection area can be reconstructed.
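As a hypothetical illustration of handing the reconstructed coordinates to point cloud software, the three-dimensional coordinates can be written out as a plain XYZ text file, a format many point cloud tools import; the file name and sample values are assumptions:

```python
# Assemble (x_n, y_n, z_n) for every projection point and format a plain
# XYZ text body: one "x y z" line per point.
points = [
    (12.0, 22.0, 0.15),   # center coordinate plus object plane depth
    (40.5, 22.1, 0.18),
]
lines = ["%f %f %f" % p for p in points]
xyz_text = "\n".join(lines)
# open("profile.xyz", "w").write(xyz_text)  # uncomment to save to disk
```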
In the three-dimensional profile measurement method based on point spread function engineering, the emitted light is modulated into periodically distributed projection light and projected onto the surface of the object to be measured; the light reflected by the object is received and modulated into single-helix light; the single-helix light is rotated by 180 degrees; the single-helix light before rotation is converted into first image data and the single-helix light after the 180-degree rotation into second image data; and the first image data and the second image data are processed to reconstruct the three-dimensional profile of the object. By projecting a periodically distributed dot matrix onto the object to be measured, high-density three-dimensional point cloud data can be obtained in a single acquisition on the receiving light path, which increases the measurement speed; modulating the reflected light into single-helix light simplifies the image data processing flow, and the structure is simple.
Referring to fig. 10, an embodiment of the present application further provides a three-dimensional profile measurement system based on a point spread function engineering, including a projection unit 10, a phase modulation unit 20, a rotation unit 30, a detection unit 40, and an image reconstruction unit 50, where the detection unit 40 is electrically connected to the image reconstruction unit 50.
The projection unit 10 is configured to modulate the emitted light into periodically distributed projection light and project it onto the surface of the object 100 to be measured. The phase modulation unit 20 is configured to receive the light reflected by the object 100 and modulate it into single-helix light. The rotation unit 30 is configured to rotate the single-helix light by 180 degrees. The detection unit 40 is configured to convert the single-helix light before rotation into first image data and the single-helix light after the 180-degree rotation into second image data. The image reconstruction unit 50 is configured to process the first image data and the second image data to reconstruct the three-dimensional profile of the object 100.
The rotation unit 30 is connected to the phase modulation unit 20 and rotates the single-helix light by rotating the phase modulation unit 20: when the phase modulation unit 20 is rotated by 180 degrees, the single-helix light exiting the phase modulation unit 20 is rotated by 180 degrees with respect to the single-helix light entering it. Rotating the single-helix light by 180 degrees means that each single-spiral point rotates 180 degrees around the position of its projection point on the object 100. Before rotation, the rotation angle of the single-helix light is 0 degrees; that is, before the phase modulation unit 20 is rotated, the single-helix light exiting the unit has a rotation angle of 0 degrees with respect to the single-helix light entering it. The same single-spiral point therefore appears in the first image data and the second image data in a centrosymmetric relationship about the projection position.
The detection unit 40 may be a photodetector and the image reconstruction unit 50 may be a computer.
Referring to fig. 11, the three-dimensional profile measuring system based on the point spread function engineering further includes a light source 60. The projection unit 10 includes a first lens 11, a target 12, a second lens 13, a mirror 14, a half mirror 15, and a third lens 16. The light source 60, the first lens 11, the target 12, the second lens 13, the reflector 14, the half-mirror 15 and the third lens 16 are sequentially arranged along the emission light path, the target 12 is attached to one side of the first lens 11 far away from the light source 60 and is located on a focal plane of one side of the second lens 13 close to the first lens 11, and a focal plane of one side of the third lens 16 far away from the half-mirror 15 coincides with a surface to be measured of the object 100 to be measured. The light source 60, the first lens 11, the target 12, the second lens 13, the reflector 14, the half-mirror 15 and the third lens 16 form an emission light path.
The light emitted by the light source 60 is irradiated on the target 12 through the first lens 11, the target 12 is configured to modulate the emitted light into projection light with periodic distribution, the projection light is incident on the reflector 14 through the second lens 13 and is reflected to the half-mirror 15 through the reflector 14, the half-mirror 15 is configured to project the projection light on the object 100 to be measured through the third lens 16, and the projection light forms a periodic dot matrix pattern on the surface of the object 100 to be measured.
In one embodiment, the light source 60 is an LED light source, and the LED light source emits blue light, i.e., the projected light is blue light.
Referring to fig. 12, the target 12 is formed with a plurality of uniformly spaced through holes 121, that is, the through holes 121 are periodically distributed. The target 12 transmits the emitted light only through the through holes 121 and is opaque elsewhere. After passing through the target 12, the uniformly distributed emitted light is modulated into periodically distributed projection light. The size and spacing of the through holes 121 determine the spacing and brightness of the projection points on the object 100 to be measured. In one embodiment, the through holes 121 are circular holes.
The phase modulation unit 20 includes a phase assembly 21 and a fourth lens 22. The third lens 16, the half mirror 15, the phase assembly 21, the fourth lens 22 and the detection unit 40 are sequentially disposed along the receiving optical path, and the focal plane of the fourth lens 22 on its side close to the phase assembly 21 coincides with the focal plane of the third lens 16 on its side close to the half mirror 15.
The reflected light of the object 100 to be measured enters the half mirror 15 through the third lens 16, and the light reflected by the half mirror 15 enters the phase modulation unit 20; the phase assembly 21 modulates the reflected light into single-helix light, which exits the phase assembly 21 and enters the detection unit 40 through the fourth lens 22.
Referring to fig. 13, the phase assembly 21 includes a plurality of annular phase plates 211 nested in sequence. If the number of annular phase plates 211 in the phase assembly 21 is N, the phase plate 211 closest to the center of the phase assembly 21 is the 1st phase plate 211, the 2nd phase plate 211 is sleeved outside the 1st phase plate 211, the 3rd phase plate 211 is sleeved outside the 2nd phase plate 211, and so on, until the Nth phase plate 211 is sleeved outside the (N−1)th phase plate 211. The inner diameter of the 2nd phase plate 211 is equal to the outer diameter of the 1st phase plate 211, the inner diameter of the 3rd phase plate 211 is equal to the outer diameter of the 2nd phase plate 211, and so on, until the inner diameter of the Nth phase plate 211 is equal to the outer diameter of the (N−1)th phase plate 211. When the number of annular phase plates 211 in the phase assembly 21 is N, the transmittance function of the phase assembly 21 is as follows:
t(ρ, φ) = exp(i·n·φ), for √((n−1)/N)·R ≤ ρ < √(n/N)·R,
where (ρ, φ) are polar coordinates, n = 1, 2, 3, …, N, and R is the radius of the phase plate. The number N of annular phase plates 211 in the phase assembly 21 therefore determines the axial range corresponding to a given rotation angle of a single-spiral point; as N increases, the distance from the single-spiral point to the rotation center increases, and the axial range is correspondingly enlarged.
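A sketch of such an N-zone phase assembly can be generated numerically. The zone boundaries √((n−1)/N)·R used below are an assumption borrowed from published rotating-PSF phase mask designs and may differ from the actual phase plates 211:

```python
import numpy as np

def spiral_phase_mask(size: int = 256, N: int = 5, R: float = 1.0) -> np.ndarray:
    """Complex transmittance of an assumed N-zone annular phase assembly.

    The n-th annulus (n = 1..N) applies the vortex phase exp(i*n*phi);
    the zone boundaries sqrt((n-1)/N)*R .. sqrt(n/N)*R are an assumption
    from published rotating-PSF designs, not from the original text.
    Points outside radius R transmit nothing (zero).
    """
    y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    rho = np.hypot(x, y) * R
    phi = np.arctan2(y, x)
    t = np.zeros((size, size), dtype=complex)
    for n in range(1, N + 1):
        zone = (rho >= np.sqrt((n - 1) / N) * R) & (rho < np.sqrt(n / N) * R)
        t[zone] = np.exp(1j * n * phi[zone])
    return t

mask = spiral_phase_mask()
```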
Referring to fig. 14, the image reconstructing unit 50 includes a superimposing unit 51, a clipping unit 52, a parallel processing unit 53, a calculating unit 54, and a reconstructing unit 55. The superimposing unit 51 is configured to superimpose the pixel grayscale values of the first image data and the second image data to obtain third image data. The clipping unit 52 is configured to perform noise reduction processing on the third image data, and clip each pair of single-spiral points in the third image to form a plurality of sub-region image stacks, where each sub-region includes a pair of single-spiral points and its surrounding pixels. The parallel processing unit 53 is configured to perform parallel processing on the plurality of sub-regions. The calculation unit 54 is used to calculate the center coordinates and the rotation angle between the two single-spiral points of each pair of single-spiral points. The calculating unit 54 is further configured to calculate the object plane depth of the projection point corresponding to each pair of single-spiral points according to the rotation angle. The reconstruction unit 55 is configured to reconstruct a three-dimensional profile of the object 100 according to the center coordinates between the two single-spiral points of each pair of single-spiral points and the object plane depth of the projection point corresponding to each pair of single-spiral points.
In the three-dimensional profile measurement system based on point spread function engineering, the projection unit modulates the emitted light into periodically distributed projection light and projects it onto the surface of the object to be measured; the phase modulation unit receives the light reflected by the object and modulates it into single-helix light; the rotation unit rotates the single-helix light by 180 degrees; the detection unit converts the single-helix light before rotation into first image data and the single-helix light after the 180-degree rotation into second image data; and the image reconstruction unit processes the first image data and the second image data to reconstruct the three-dimensional profile of the object. By projecting a periodically distributed dot matrix onto the object to be measured, high-density three-dimensional point cloud data can be obtained in a single acquisition on the receiving light path, which increases the measurement speed; modulating the reflected light into single-helix light simplifies the image data processing flow, and the structure is simple.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only represent some embodiments of the present invention, and the description thereof is specific and detailed, but not to be construed as limiting the scope of the present invention. It should be noted that, for those skilled in the art, without departing from the spirit of the present invention, several variations and modifications can be made, which are within the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the appended claims.