CN111862086A - Method, apparatus, medium, and system for detecting surface topography - Google Patents


Info

Publication number
CN111862086A
CN111862086A (application CN202010766085.XA)
Authority
CN
China
Prior art keywords
gray
image
color multi-scale
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010766085.XA
Other languages
Chinese (zh)
Inventor
孙源
巴钟灵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yinguan Semiconductor Technology Co Ltd
Original Assignee
Shanghai Yinguan Semiconductor Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yinguan Semiconductor Technology Co Ltd
Priority to CN202010766085.XA
Publication of CN111862086A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a method, an apparatus, a computer-readable storage medium, and a system for detecting surface topography. The method comprises the following steps: generating at least one color multi-gray-level coded image based on a gray response curve of the system, and projecting the at least one color multi-gray-level coded image onto a surface to be detected through a projection device; acquiring a reflection image of each color multi-gray-level coded image through at least one image acquisition device, and calculating the three-dimensional coordinates of each reflection image in the reference coordinate system of the corresponding image acquisition device to generate a set of point cloud data representing the at least one color multi-gray-level coded image; and determining the topography of the surface to be detected based on the set of point cloud data of the at least one color multi-gray-level coded image.

Description

Method, apparatus, medium, and system for detecting surface topography
Technical Field
The present invention relates to the field of industrial measurements, and more particularly, to a method, apparatus, computer-readable storage medium and system comprising the same for detecting surface topography.
Background
With the continuous development of manufacturing technology, the surface structures and processing requirements of industrial products are becoming increasingly complex and diverse. In actual production, the quality of the manufacturing means and processing technology is assessed by detecting characteristic parameters of the workpiece surface, including roughness, waviness, shape errors, and dimensional errors.
Surface morphology describes the overall microscopic geometry of a workpiece surface and is an important factor affecting its performance properties, such as paintability, solvency, wear resistance, and corrosion resistance. In product design and production, accurate digital analysis and evaluation of the surface morphology is a prerequisite for guaranteeing product function and controlling product quality, and is an important link in quality control; the accuracy of topography detection on the workpiece surface therefore strongly influences product quality and, in turn, the product's performance and effectiveness in use.
Conventional surface topography detection typically measures and analyzes two-dimensional topography information. With the continuous advance of high-end manufacturing, two-dimensional surface morphology information can no longer meet requirements; only precise detection and rapid analysis of the three-dimensional surface topography can satisfy the research, design, and manufacturing needs of current high-end industrial products. However, three-dimensional topography measurement produces a huge volume of data, so measuring or calculating surface topographical features and performing subsequent surface-quality assessment requires substantial computation time. This is one of the key problems to be solved urgently for on-site surface inspection in industrial environments with strict real-time requirements.
Disclosure of Invention
In order to solve the above problems, the invention provides a scheme for detecting surface topography that combines optical technology with digital processing technology to achieve non-contact, rapid, and accurate surface topography detection, meeting the real-time, large-field-of-view, and high-precision requirements of industrial on-site measurement.
According to one aspect of the invention, a method for detecting surface topography is provided. The method comprises the following steps: generating at least one color multi-gray-level coded image based on a gray response curve of the system, and projecting the at least one color multi-gray-level coded image onto a surface to be detected through a projection device; acquiring a reflection image of each color multi-gray-level coded image through at least one image acquisition device, and calculating the three-dimensional coordinates of each reflection image in the reference coordinate system of the corresponding image acquisition device to generate a set of point cloud data representing the at least one color multi-gray-level coded image; and determining the topography of the surface to be detected based on the set of point cloud data of the at least one color multi-gray-level coded image.
According to another aspect of the present invention, there is provided an apparatus for detecting surface topography, comprising: a memory having computer program code stored thereon; and a processor configured to execute the computer program code to perform the method as described above.
According to another aspect of the invention, an apparatus for detecting surface topography is provided. The device comprises a circuit unit configured to perform the method as described above at power-up.
According to another aspect of the invention, a computer-readable storage medium is provided, having stored thereon computer program code, which when executed performs the method as described above.
According to another aspect of the invention, a system for detecting surface topography is provided. The system comprises: the apparatus as described above; a projection device configured to project a reference gray-scale gradient image or a color multi-gray-level coded image onto a reference plane or the surface to be detected under control of the apparatus; and at least one image acquisition device configured, in response to the projection device projecting the reference gray-scale gradient image or the color multi-gray-level coded image onto the reference plane or the surface to be detected, to receive the corresponding reflection image and send it to the apparatus.
Drawings
FIG. 1 shows a schematic structural diagram of a system for detecting surface topography in accordance with an embodiment of the present invention;
FIG. 2 shows a flow diagram of a method for detecting surface topography in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart illustrating one embodiment of the step of determining a gray scale response curve for the system in the method of FIG. 2;
FIG. 4 is a flow chart illustrating one embodiment of the step of determining a set of point cloud data for at least one color multi-gray scale encoded image in the method of FIG. 2;
FIG. 5 is a flow chart illustrating one embodiment of the step of determining the topographical features of the surface to be inspected in the method illustrated in FIG. 2; and
fig. 6 shows a schematic structural diagram of an apparatus for detecting surface topography according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings in order to more clearly understand the objects, features and advantages of the present invention. It should be understood that the embodiments shown in the drawings are not intended to limit the scope of the present invention, but are merely intended to illustrate the spirit of the technical solution of the present invention.
In the following description, for the purposes of illustrating various inventive embodiments, certain specific details are set forth in order to provide a thorough understanding of the various inventive embodiments. One skilled in the relevant art will recognize, however, that the embodiments may be practiced without one or more of the specific details. In other instances, well-known devices, structures and techniques associated with this application may not be shown or described in detail to avoid unnecessarily obscuring the description of the embodiments.
Throughout the specification and claims, the word "comprise" and variations thereof, such as "comprises" and "comprising," are to be understood as an open, inclusive meaning, i.e., as being interpreted to mean "including, but not limited to," unless the context requires otherwise.
Reference throughout this specification to "one embodiment" or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment. Thus, the appearances of the phrases "in one embodiment" or "in some embodiments" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Fig. 1 shows a schematic structural diagram of a system 1 for detecting surface topography according to an embodiment of the present invention. As shown in fig. 1, system 1 may include an optical subsystem 10, which may include, for example, a projection device 12 and at least one image acquisition device 14a and 14b (hereinafter collectively referred to as image acquisition device 14). Projection device 12, which may be any of a variety of currently known or future-developed projection devices such as a digital projector, may project a predetermined image 42 onto surface 40 (shown by the solid lines in fig. 1). Image acquisition device 14, which may be any of various currently known or future-developed image capturing devices such as a camera, may receive a reflected image corresponding to predetermined image 42 from surface 40 (shown in phantom in fig. 1) after projection device 12 projects predetermined image 42 onto surface 40. Note that while two image acquisition devices 14 are shown in fig. 1, those skilled in the art will appreciate that in actual practice, optical subsystem 10 may contain more or fewer image acquisition devices 14.
The system 1 further comprises a digital processing device 20 for generating the predetermined image 42, controlling the projection device 12 to project the predetermined image 42 onto the surface 40, and processing the reflected image after receiving it from the image acquisition device 14 to determine topographical features of the surface 40. In some embodiments, the digital processing device 20 may be a device 600 as described below in connection with fig. 6, including a processor 22 and a memory 24, wherein the memory 24 stores computer program code that is executed by the processor 22 to perform the various functions described herein. In other embodiments, digital processing device 20 may also be implemented as chip circuitry (e.g., a Field Programmable Gate Array (FPGA)) comprising circuit elements that perform the various functions described herein at power-up.
The optical subsystem 10 and the digital processing device 20 may communicate over a communication link 30. The communication link 30 may be any wired or wireless communication link that enables data transfer between the two.
The surface 40 is a surface for receiving an image projected by the projection device 12, which may be a reference surface or a surface to be detected as described below, depending on the context.
FIG. 2 shows a flow diagram of a method 100 for detecting surface topography in accordance with an embodiment of the present invention. The method 100 shown in fig. 2 may be implemented, for example, by the digital processing device 20 or a portion thereof.
As shown in fig. 2, the method 100 comprises a step 110 in which a gray response curve of the system 1 is determined based on at least four reference gray-scale gradient images, where the at least four reference gray-scale gradient images have the same number of gray levels but different gray-scale gradient directions. Here, the gray response curve of system 1 refers to the response curve between the input gray level of system 1 (i.e., the gray level of the gradient image generated by digital processing device 20 and projected by projection device 12 onto surface 40) and the output gray level (i.e., the gray level of the image captured by image acquisition device 14 from surface 40).
In some embodiments, the gray scale response curve of system 1 is a fixed response curve, for example, where the physical structure of system 1 (e.g., the relative positions of projection device 12 and image acquisition device 14, physical parameters, etc.) is unchanged. In this case, the gray response curve may be preset in the digital processing device 20 without performing step 110 to determine in advance.
In other embodiments, step 110 may be performed to determine the gray scale response curve of the system 1 in advance, for example, at initialization of the system 1 or in the event of a change in the physical structure of the system 1.
Fig. 3 shows a flow chart of one embodiment of the step 110 of determining the gray scale response curve of the system 1 in the method 100 shown in fig. 2.
As shown in fig. 3, step 110 may include a sub-step 111 in which the at least four reference grayscale gradation images are acquired based on the pixel resolution of projection device 12.
In some embodiments, at least four initial reference gray scale gradient images are preset in the digital processing device 20, and are adaptively adjusted according to the resolution of different projection devices 12 when the method 100 is performed to generate the at least four reference gray scale gradient images.
For example, suppose four initial reference gray-scale gradient images with 256 gray levels (8 bits) are built into the digital processing device 20 in advance: the first graduates from gray level 0 to 255 from left to right, the second from right to left, the third from top to bottom, and the fourth from bottom to top. If the pixel resolution of projection device 12 is 256 × 256 pixels, digital processing device 20 generates four reference gray-scale gradient images identical to the four initial images, each containing 256 gray levels with each gray level occupying one pixel along the gradient direction. If instead the pixel resolution of projection device 12 is 1024 × 768 pixels (1024 pixels in the length direction and 768 pixels in the height direction), digital processing device 20 works from the smaller dimension, computing 768/256 = 3, and stretches the initial reference gradient image by a factor of 3 in both directions (setting every 3 consecutive pixels to the same gray level) so that the 256 gray levels fill the 768 pixels of height; in the larger dimension, i.e., the 1024-pixel dimension, the central 768 pixels form a square with the height, and the remaining pixels on both sides are filled with gray level 0 or 255.
In other embodiments, digital processing device 20 may generate the at least four reference grayscale gradation images directly according to the resolution of projection device 12 each time method 100 is performed.
For example, in the case of a pixel resolution of 256 × 256 pixels for projection device 12, digital processing device 20 directly generates four reference gray-scale gradient images containing 256 gray levels, each gray level occupying one pixel along the gradient direction, where the first image may graduate from gray level 0 to 255 from left to right, the second from right to left, the third from top to bottom, and the fourth from bottom to top. For another example, in the case where the pixel resolution of projection device 12 is 1024 × 768 pixels (1024 pixels in the length direction and 768 pixels in the height direction), digital processing device 20 works from the smaller dimension, computing 768/256 = 3, and generates four reference gradient images containing 256 gray levels in that dimension with each gray level occupying 3 pixels; the gradient directions may be as described above. In the larger dimension, i.e., the 1024-pixel dimension, digital processing device 20 fills the remaining pixels on both sides with gray level 0 or 255, apart from the central 768 pixels that form a square with the height.
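The resolution adaptation described in the two examples above can be sketched as follows. This is a hypothetical helper (the function name and the zero-padding choice for the larger dimension are illustrative, not from the patent), assuming numpy and a projector whose smaller dimension is a multiple of the number of gray levels:

```python
import numpy as np

def make_reference_gradients(width, height, levels=256):
    """Generate the four reference gray-scale gradient images for a projector
    of the given pixel resolution: left-to-right, right-to-left,
    top-to-bottom, and bottom-to-top ramps from gray level 0 to levels-1."""
    # Work from the smaller dimension: each gray level occupies `scale` pixels.
    small = min(width, height)
    scale = small // levels                     # e.g. 768 // 256 == 3
    side = levels * scale                       # central square, e.g. 768
    ramp = np.repeat(np.arange(levels, dtype=np.uint8), scale)

    # Central square: a left-to-right gradient replicated over all rows.
    square = np.tile(ramp, (side, 1))

    def embed(sq):
        # Pad the larger dimension with gray level 0 (255 is equally valid
        # per the text above); the gradient square sits in the center.
        img = np.zeros((height, width), dtype=np.uint8)
        y0 = (height - side) // 2
        x0 = (width - side) // 2
        img[y0:y0 + side, x0:x0 + side] = sq
        return img

    return [embed(square),            # gray level increases left to right
            embed(square[:, ::-1]),   # right to left
            embed(square.T),          # top to bottom
            embed(square.T[::-1, :])] # bottom to top
```

For a 1024 × 768 projector this yields four 768 × 1024 frames whose central 768 × 768 square carries the 256-level ramp with 3 pixels per level, as in the example above.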
In sub-step 112, the at least four reference grayscale gradient images are projected onto the reference plane 40 in sequence. Here, the reference plane 40 may be any plane.
In sub-step 113, a reference reflection image of each reference gray-scale gradation image is acquired by one image acquisition device 14.
In sub-step 114, the reference reflection images of the at least four reference gray-scale gradient images are subjected to consistency processing so that they all have the same size and gray-scale gradient direction.
In one embodiment, the consistency processing may include selecting one of the at least four reference gray scale gradation images as a reference image, and then adjusting the other reference gray scale gradation images with reference to a gray scale gradation direction of the reference image such that the size and the gray scale gradation direction of the other reference gray scale gradation images are the same as the reference image. For example, an image in which the gray level is gradually changed from 0 to 255 from left to right may be selected as a reference image, the other reference gray-level gradually-changed images are rotated and scaled so as to be the same size as the reference image, and the gray level gradually-changing direction is also the gray level gradually-changed from 0 to 255 from left to right.
Next, in sub-step 115, for each reference reflection image of a reference gray-scale gradient image, the gray values of each pixel column are averaged in sequence along the gray-scale gradient direction to generate a gray mean vector for that reference reflection image. Due to surface roughness of the reference surface 40 and the like, the gray value of each pixel in a given pixel column of a received reference reflection image is typically not uniform; in this case, the average of the gray values of the pixels in each column can be used to reflect the overall gray condition of that column. Specifically, in one embodiment, the gray mean vector of each reference reflection image may be determined by the following equation (1):
R_t(n) = \frac{1}{256} \sum_{m=0}^{255} P_t(m, n), \quad n = 0, 1, \ldots, 255 \qquad (1)
where P is the pixel matrix of the received reference reflection image; m and n index the rows and columns of P (in the case of 256 gray levels, m and n range from 0 to 255); t is the serial number of the received reference reflection image (t runs from 1 to 4 when four reference gray-scale gradient images are used); and R_t is the gray mean vector of the t-th reference reflection image.
In sub-step 116, the gray mean vectors of all reference reflectance images are averaged to obtain a total output gray mean vector. Specifically, in one embodiment, the total output gray level mean vector R may be determined by the following equation (2):
R = \frac{1}{4} \sum_{t=1}^{4} R_t \qquad (2)
in sub-step 117, a gray response curve of the system 1 is generated based on the input gray mean vectors of the at least four reference gray gradient images and the resulting total output gray mean vector of sub-step 116.
In one embodiment, the input gray average vector may be obtained with reference to the method of sub-steps 114 to 116.
In another embodiment, the gray value vector of the selected reference image may be used as the total input gray mean vector of the at least four reference gray gradient images. That is, since the reference gradation images other than the reference image are subjected to the consistency processing based on the reference image, the consistency processing process can be omitted, and the gradation value vector of the reference image is directly used as the total input gradation average value vector. Here, the reference image is an ideal input image generated by the projection device 12 under the control of the digital processing device 20, and thus it is simple and accurate to use its gray value vector as the total input gray average vector.
In some embodiments, the gray response curve of system 1 is a response curve obtained with the total input gray mean vector obtained as above as the abscissa and the total output gray mean vector as the ordinate.
In other embodiments, the gray scale response curve of system 1 is a fit curve of a plurality of response curves obtained by repeating step 110 above under different reference surfaces 40.
Returning to fig. 2, method 100 continues to step 120, wherein at least one color multi-gray-level coded image is generated based on the gray response curve of system 1 and projected by projection device 12 onto the surface 40 to be inspected. Here, the gray response curve of the system 1 may be built into the digital processing device 20, or may be obtained by performing step 110 above. More specifically, the complete gray response curve of the system 1 may be inaccurate near its two ends (the highest and lowest gray values) due to edge effects and the like, so the non-saturated linear region of the complete curve may be selected to determine the light intensity of the color multi-gray-level coded image to be generated in step 120, compensating and correcting the projection error between the projected image and the reflected image.
In one embodiment, in step 120, the at least one color multi-gray scale encoded image may be generated based on a gray response curve of system 1 and a pixel resolution of projection device 12.
In some embodiments, the color multi-gray level encoded image may be a color multi-gray level gray code image. Gray code is a reliable code, and only one bit of the adjacent two codes has different values, so that instantaneous error coding generated during analog-to-digital conversion of a circuit can be avoided. It will be appreciated by those skilled in the art that the present invention is not so limited and that various other image encoding methods are suitable for use with the present invention.
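The single-bit-change property of Gray code mentioned above comes from the binary-reflected Gray code, which can be sketched in one line (the patent does not give a construction; this is the standard one):

```python
def gray_code(i):
    """Binary-reflected Gray code of integer i.  Adjacent codes differ in
    exactly one bit, which is what makes the coding robust against the
    instantaneous errors described above."""
    return i ^ (i >> 1)
```

For example, 0, 1, 2, 3 map to 0b00, 0b01, 0b11, 0b10: each step flips a single bit.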
Taking the example that the color multi-gray-scale coded image includes three colors of RGB (red, green, and blue), the number of gray scales of each color multi-gray-scale coded image and the number of color multi-gray-scale coded images satisfy the following relationship (3):
n^{3m} ≥ S, (3)
where n represents the number of gray levels of each color multi-gray-level coded image, m represents the number of color multi-gray-level coded images (in practice, the minimum such m is typically used), and S represents the pixel resolution of projection device 12. In this case, projecting m color multi-gray-level coded images represents n^{3m} distinct codes.
For example, assuming that projection device 12 has a pixel resolution of 1024 and each color multi-gray-level coded image uses 4 gray levels, then from n^{3m} ≥ S it can be deduced that m must be at least 2 (since 4^{3·1} = 64 < 1024 while 4^{3·2} = 4096 ≥ 1024), so 2 color multi-gray-level coded images are needed to meet the requirement.
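The pattern count implied by relationship (3) can be computed directly (a hypothetical helper; the name and signature are illustrative):

```python
def min_num_patterns(n, S, channels=3):
    """Smallest m with n**(channels*m) >= S, i.e. the minimum number of
    color multi-gray-level coded images needed to uniquely address S
    projector pixels, with `channels` color channels (RGB -> 3) of n
    gray levels each, per relationship (3)."""
    m = 1
    while n ** (channels * m) < S:
        m += 1
    return m
```

With n = 4 and S = 1024 this returns 2, matching the worked example above.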
Next, at step 130, one reflection image of each color multi-gray level encoded image is acquired by at least one image capturing device 14, respectively, and three-dimensional coordinates of each reflection image in the reference coordinate system of the corresponding image capturing device 14 are calculated to generate a set of point cloud data indicating the at least one color multi-gray level encoded image.
FIG. 4 illustrates a flowchart of one embodiment of the step 130 of determining a set of point cloud data for at least one color multi-gray scale encoded image in the method 100 illustrated in FIG. 2.
As shown in fig. 4, step 130 may include a sub-step 132 in which a first reflected image and a second reflected image of each color multi-gray level encoded image are acquired by at least a first image acquisition device 14a and a second image acquisition device 14b, respectively.
In sub-step 134, for each color multi-gray-level coded image, the two-dimensional coordinates of each pixel point of the first reflection image acquired in sub-step 132 in the first image capturing device 14a, together with its abscissa in projection device 12, are obtained; a matrix transformation based on the principle of triangulation then converts these into the three-dimensional coordinates of each pixel point of the first reflection image in the reference coordinate system of the first image capturing device 14a.
In sub-step 136, the same is done for the second reflection image: for each color multi-gray-level coded image, the two-dimensional coordinates of each pixel point of the second reflection image acquired in sub-step 132 in the second image capturing device 14b and its abscissa in projection device 12 are obtained, and the triangulation-based matrix transformation yields the three-dimensional coordinates of each pixel point of the second reflection image in the reference coordinate system of the second image capturing device 14b.
Note that while sub-steps 134 and 136 are depicted in fig. 4 as being in the illustrated sequential order, in actual practice, sub-steps 134 and 136 may be performed in a different order, e.g., sub-steps 134 and 136 may be performed in parallel, sub-step 136 may be performed before sub-step 134, etc.
Next, in sub-step 138, a set of point cloud data of the at least one color multi-gray scale coded image is obtained based on the three-dimensional coordinates of each pixel point of the first reflected image obtained in sub-step 134 in the reference coordinate system of the first image capturing device 14a and the three-dimensional coordinates of each pixel point of the second reflected image obtained in sub-step 136 in the reference coordinate system of the second image capturing device 14 b.
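The per-pixel triangulation of sub-steps 134 and 136 can be sketched as follows. This is a hypothetical sketch, not the patent's exact matrix transformation: it assumes calibrated 3×4 projection matrices Pc (camera) and Pp (projector), and solves the resulting linear system for the 3-D point in the least-squares sense:

```python
import numpy as np

def triangulate_point(Pc, Pp, uv_cam, u_proj):
    """Recover the 3-D point seen at camera pixel (u, v) whose decoded
    projector abscissa is u_proj.  Pc and Pp are 3x4 projection matrices
    (assumed known from a prior calibration, which the patent does not
    detail).  Each row of A constrains the homogeneous point (x, y, z, 1)."""
    u, v = uv_cam
    A = np.vstack([
        u * Pc[2] - Pc[0],        # camera x-constraint
        v * Pc[2] - Pc[1],        # camera y-constraint
        u_proj * Pp[2] - Pp[0],   # projector abscissa (column) constraint
    ])
    # Solve A @ (x, y, z, 1) = 0, i.e. A[:, :3] @ X = -A[:, 3].
    X, *_ = np.linalg.lstsq(A[:, :3], -A[:, 3], rcond=None)
    return X
```

Note that only the projector abscissa is needed: the camera pixel supplies two constraints and the decoded projector column supplies the third, which is why the text above speaks only of the abscissa in projection device 12.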
Returning to fig. 2, the method 100 proceeds to step 140, wherein the topographical features of the surface 40 to be inspected are determined based on a set of point cloud data of the at least one color multi-grayscale encoded image.
In some cases, such as where the area of the surface 40 to be inspected is small, a single execution of step 120 results in a reflection image of the entire surface 40 to be inspected. In this case, in step 140, the topographical features of the surface 40 to be inspected can be determined from the point cloud data obtained in step 130.
In other cases, such as where the surface 40 to be inspected is large in area or is an arc surface, the step 120 may be performed once to obtain a reflection image of only a portion of the surface 40 to be inspected, such that the topographical features of the entire surface 40 to be inspected may not be obtained through the steps 130 and 140 described above. In this case, in step 140, the above steps 120 and 130 may also be repeated to acquire reflection images of different portions of the entire surface 40 to be inspected and process the reflection images to obtain topographical features of the entire surface 40 to be inspected.
FIG. 5 illustrates a flow chart of one embodiment of the step 140 of determining the topographical features of the surface 40 to be inspected in the method 100 illustrated in FIG. 2.
As shown in fig. 5, step 140 may comprise a sub-step 141 in which a plurality of viewing angles of the projection device 12 and the at least one image acquisition device 14 with respect to the surface 40 to be inspected is determined. For example, a workpiece having a surface 40 to be inspected may be fixed in a fixed position, and multiple viewing angles for projecting and capturing different surface portions of the workpiece may be obtained by moving or rotating the position of the optical subsystem 10; or vice versa, the optical subsystem 10 may be fixed in a fixed position, and multiple viewing angles for projecting and photographing different surface portions of the workpiece may be obtained by moving or rotating the workpiece to be inspected.
In sub-step 142, for each of a plurality of views, a set of point cloud data for at least one color multi-grayscale encoded image is acquired. Here, the method of acquiring the point cloud data in step 142 may use the methods described in steps 120 and 130 above.
In order to obtain complete information about the surface 40 to be inspected, the images projected from the multiple viewing angles usually partially overlap, and the point cloud data obtained at each viewing angle has its own independent coordinate system, so the point clouds cannot be stitched directly. The point cloud data from each viewing angle therefore needs a coordinate transformation to bring it into a single common coordinate system, a process also called registration. To this end, in sub-step 143, the point cloud data at each viewing angle may be transformed into point cloud data in a globally uniform coordinate system. Specifically, for the reflection images obtained from two adjacent viewing angles, their respective coordinates and coordinate systems can be obtained, and the point cloud coordinates at each viewing angle are then brought into the same coordinate system by a rotation and a translation. For example, the coordinate relationship between corresponding points in the point cloud data of the reflection images of two adjacent viewing angles can be expressed as the following formula (4):
q = R*p + T, (4)
where q and p are the coordinates of corresponding points in the reflection images at the two viewing angles, respectively, R is a rotation matrix, and T is a translation matrix.
In one implementation, the rotation matrix R and the translation matrix T between the coordinates of the two reflection images can be found by an iterative optimization method with minimized errors, i.e.
min_{R,T} Σ_i ||q_i - (R*p_i + T)||^2,
This optimization objective is iterated to obtain the parameters R and T at which the error is minimized. In one example, R is a 3×3 matrix, T is a 3×1 matrix, p_i denotes the point cloud coordinates in the p coordinate system, and q_i denotes the point cloud coordinates in the q coordinate system.
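For a concrete form of this minimization: with correspondences held fixed, the inner solve of each iteration has a closed-form SVD (Kabsch) solution. The sketch below is a minimal numpy illustration, not the patent's implementation; in a full iterative registration it would alternate with nearest-neighbor re-matching of correspondences until the error converges:

```python
import numpy as np

def best_fit_transform(p_pts, q_pts):
    """Closed-form (SVD/Kabsch) solution for the R, T minimizing
    sum_i ||q_i - (R*p_i + T)||^2, given corresponded (N, 3) point sets."""
    p_mean, q_mean = p_pts.mean(axis=0), q_pts.mean(axis=0)
    H = (p_pts - p_mean).T @ (q_pts - q_mean)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = q_mean - R @ p_mean
    return R, T

# Hypothetical check: recover a known rigid transform from synthetic data.
rng = np.random.default_rng(0)
p_pts = rng.normal(size=(50, 3))
a = 0.3  # rotation angle about z
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
T_true = np.array([0.5, -1.0, 2.0])
q_pts = p_pts @ R_true.T + T_true
R_est, T_est = best_fit_transform(p_pts, q_pts)
```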
Next, at sub-step 146, the converted point cloud data at the multiple perspectives are stitched to form complete point cloud data for the surface 40 to be inspected.
That is, in such an embodiment, point cloud data of different portions of the surface 40 to be inspected may be acquired by adjusting the viewing angles of the projection device 12 and the image acquisition device 14 relative to those portions, and the point cloud data may then be stitched to obtain global point cloud data.
However, in order to obtain reflection images that completely cover the entire surface 40 to be inspected, the reflection images obtained at the various viewing angles generally overlap in part, so the point cloud data registered in sub-step 143 may contain overlapping points, and the point density in the overlapped regions may be much higher than in other regions. To improve precision and reduce the error caused by this point stacking, the coordinate-unified point cloud data needs to be decimated, that is, the point cloud data of the overlapping portions is simplified. In this case, step 140 may further include a sub-step 144, in which the point cloud data converted to the uniform coordinate system at the multiple viewing angles is reduced to remove overlapping data between the point clouds of the different viewing angles.
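One common way to perform the decimation of sub-step 144 is voxel-grid downsampling: partition space into small cubes and keep one centroid per occupied cube, which flattens the density spike in overlapped regions. The following is a minimal numpy sketch, not taken from the patent; the voxel size is a hypothetical tuning parameter:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Reduce an (N, 3) point cloud by keeping the centroid of each
    occupied voxel of side length voxel_size."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.ravel()
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)  # accumulate points per voxel
    return sums / counts[:, None]     # centroid of each voxel

# Two nearly coincident points collapse into one representative.
pts = np.array([[0.00, 0.0, 0.0],
                [0.01, 0.0, 0.0],
                [1.00, 1.0, 1.0]])
reduced = voxel_downsample(pts, voxel_size=0.1)  # 3 points -> 2
```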
In addition, because the hardware and its operation may introduce noise into the point cloud data, affecting the data precision and the smoothness of the reconstructed three-dimensional surface, a smoothing filter can be applied to the point cloud data. In this case, step 140 may further include a sub-step 145, in which the simplified point cloud data is filtered to obtain smooth point cloud data for the surface 40 to be inspected.
Here, filtering the point cloud data may include median filtering, mean filtering, Gaussian filtering, and the like, which are not described in detail here.
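As a rough illustration of such filtering (a numpy sketch, not the patent's implementation; the neighborhood size k and the brute-force neighbor search are hypothetical choices, and a KD-tree would replace the O(N^2) search for large clouds), a k-nearest-neighbor mean or median filter can look like:

```python
import numpy as np

def knn_smooth(points, k=8, mode="mean"):
    """Replace each point of an (N, 3) cloud by the mean (or median) of its
    k nearest neighbors (including itself), suppressing acquisition noise.
    Brute-force O(N^2) distance matrix; suitable for small clouds only."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]  # indices of the k nearest per point
    neigh = points[idx]                 # (N, k, 3)
    if mode == "median":
        return np.median(neigh, axis=1)
    return neigh.mean(axis=1)

# Noisy, nominally flat patch: smoothing shrinks the out-of-plane noise.
rng = np.random.default_rng(1)
pts = rng.uniform(size=(200, 3))
pts[:, 2] = 0.01 * rng.normal(size=200)  # small z-noise on a flat surface
smoothed = knn_smooth(pts, k=10)
```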
In some cases, the described aspects of the invention may be implemented as a system 1 comprising an optical subsystem 10 and a digital processing device 20, as shown in FIG. 1. In this case, the projection device 12 is configured to project a reference gray-scale gradient image or a color multi-gray-scale coded image onto the reference plane or the surface 40 to be inspected under the control of the digital processing device 20. The at least one image acquisition device 14 is configured to, in response to that projection, receive the corresponding reflection image and send it to the digital processing device 20.
In other cases, aspects described herein may be implemented as a digital processing device 20 as shown in FIG. 1, which includes a processor 22 and a memory 24, with computer program code stored in the memory 24, which is executed by the processor 22 to perform various aspects described herein. The complete structure of digital processing device 20 is described below, for example, in connection with device 600 of fig. 6. In this case, the user may reuse some existing hardware devices, such as the projection device 12, the image acquisition device 14, and the like.
In still other cases, the described aspects of the invention may be implemented as a digital processing device 20 comprising chip circuitry, which may include circuit elements for performing various aspects of the invention. For example, digital processing device 20 may include a general-purpose graphics processor (GPU) and dedicated chip circuitry (e.g., an FPGA or ASIC). In this case, the user may also reuse some existing hardware devices, such as the projection device 12, the image acquisition device 14, the GPU, and so on. By combining dedicated chip circuitry with a general-purpose GPU, the strong parallel processing and efficient data transfer capabilities of the GPU can be exploited for image post-processing, while dedicated circuitry such as an FPGA performs pipelined operations to hardware-accelerate the image preprocessing algorithms, providing flexibility in both the overall algorithm design and the device control and management.
In other cases, aspects described herein may be implemented as computer program code for performing the functions described herein and/or a computer-readable storage medium or the like containing such computer program code. In this case, the user may fully utilize the existing hardware, including the optical subsystem 10 and the digital processing device 20, etc.
By using the scheme of some aspects of the invention, the structured light is color-coded, replacing traditional gray-scale fringe projection with color projection, which improves the projection frame rate and depth resolution; moreover, the projection coding can be adapted dynamically to suit different measurement scenarios.
Fig. 6 shows a schematic structural diagram of an apparatus 600 for detecting surface topography according to an embodiment of the present invention. The device 600 may be, for example, a desktop or laptop computer, etc. As shown, device 600 may include one or more Central Processing Units (CPUs) 610 (only one shown schematically) that may perform various appropriate actions and processes according to computer program instructions stored in a Read Only Memory (ROM) 620 or loaded from a storage unit 680 into a Random Access Memory (RAM) 630. In the RAM 630, various programs and data required for the operation of the device 600 can also be stored. The CPU 610, ROM 620, and RAM 630 are connected to each other via a bus 640. An input/output (I/O) interface 650 is also connected to bus 640.
Various components in device 600 are connected to I/O interface 650, including: an input unit 660 such as a keyboard, a mouse, etc.; an output unit 670 such as various types of displays, speakers, and the like; a storage unit 680, such as a magnetic disk, optical disk, or the like; and a communication unit 690 such as a network card, modem, wireless communication transceiver, etc. The communication unit 690 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The method 100 described above may be performed, for example, by the processing unit 610 of the apparatus 600. For example, in some embodiments, the method 100 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 680. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 620 and/or the communication unit 690. When the computer program is loaded into RAM 630 and executed by CPU 610, one or more operations of method 100 described above may be performed. Further, the communication unit 690 may support wired or wireless communication functions.
The method 100 and apparatus 600 for detecting surface topography in accordance with the present invention are described above with reference to the accompanying drawings. Those skilled in the art will appreciate, however, that the device 600 need not contain all of the components shown in fig. 6; it may contain only the components necessary to perform the functions described in the present invention, and the manner in which these components are connected is not limited to the form shown in the drawings. For example, where the device 600 is a portable device such as a cellular phone, it may have a structure different from that of fig. 6.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (16)

1. A method for detecting surface topography, comprising:
generating at least one color multi-gray-level coded image based on a gray response curve of a system, and projecting the at least one color multi-gray-level coded image to a surface to be detected through projection equipment;
respectively acquiring one reflection image of each color multi-gray-scale coded image through at least one image acquisition device, and calculating three-dimensional coordinates of each reflection image under a reference coordinate system of the corresponding image acquisition device to generate a group of point cloud data for indicating the at least one color multi-gray-scale coded image; and
determining the topography of the surface to be detected based on a set of point cloud data of the at least one color multi-gray level encoded image.
2. The method of claim 1, wherein generating at least one color multi-gray scale encoded image based on a gray response curve of a system further comprises:
generating the at least one color multi-gray scale encoded image based on the gray response curve and a pixel resolution of the projection device.
3. The method of claim 2, wherein each color multi-gray level encoded image comprises three colors of RGB, and the number of gray levels of each color multi-gray level encoded image and the number of the at least one color multi-gray level encoded image satisfy the following relationship:
n^(3m) ≥ S,
wherein n represents the number of gray levels of each color multi-gray level encoded image, m represents the number of the at least one color multi-gray level encoded image, and S represents the pixel resolution of the projection device.
4. The method of claim 1, wherein generating a set of point cloud data indicative of the at least one color multi-grayscale encoded image further comprises:
respectively acquiring a first reflection image and a second reflection image of each color multi-gray-scale coding image through a first image acquisition device and a second image acquisition device;
for each color multi-gray-level coded image, acquiring a two-dimensional coordinate of each pixel point of the first reflection image in the first image acquisition device and an abscissa in the projection device, and performing matrix conversion on the two-dimensional coordinate of the first image acquisition device and the abscissa by using a triangulation distance measurement principle to acquire a three-dimensional coordinate of each pixel point of the first reflection image in a reference coordinate system of the first image acquisition device;
for each color multi-gray-level coded image, acquiring a two-dimensional coordinate of each pixel point of the second reflection image in the second image acquisition device and an abscissa in the projection device, and performing matrix conversion on the two-dimensional coordinate of the second image acquisition device and the abscissa by using a triangulation distance measurement principle to acquire a three-dimensional coordinate of each pixel point of the second reflection image in a reference coordinate system of the second image acquisition device; and
and acquiring the set of point cloud data of the at least one color multi-gray-level coded image based on the three-dimensional coordinates of each pixel point of the first reflected image of each color multi-gray-level coded image in the reference coordinate system of the first image acquisition device and the three-dimensional coordinates of each pixel point of the second reflected image in the reference coordinate system of the second image acquisition device.
5. The method of claim 1, wherein determining a topographical feature of the surface to be detected based on a set of point cloud data of the at least one color multi-grayscale encoded image further comprises:
determining a plurality of viewing angles of the projection device and the at least one image acquisition device relative to the surface to be inspected;
for each view of the plurality of views, obtaining a set of point cloud data for the at least one color multi-grayscale encoded image;
performing coordinate conversion on the point cloud data under each visual angle to convert the point cloud data into point cloud data under a globally uniform coordinate system; and
and splicing the converted point cloud data under the multiple viewing angles to form complete point cloud data for the surface to be detected.
6. The method of claim 5, wherein determining the topographical features of the surface to be detected based on a set of point cloud data of the at least one color multi-grayscale encoded image further comprises:
simplifying the converted point cloud data under the multiple viewing angles to remove overlapped data among the point cloud data under the multiple viewing angles; and
and filtering the simplified point cloud data to obtain smooth point cloud data aiming at the surface to be detected.
7. The method of claim 1, further comprising:
determining the gray response curve based on at least four reference gray scale gradient images, wherein the at least four reference gray scale gradient images have the same number of gray scale levels and different gray scale gradient directions.
8. The method of claim 7, wherein determining the gray response curve based on at least four reference gray scale gradient images further comprises:
acquiring the at least four reference gray-scale gradient images based on the pixel resolution of the projection equipment;
projecting the at least four reference gray level gradient images to a reference plane in sequence;
acquiring a reference reflection image of each reference gray-scale gradient image through one image acquisition device of the at least one image acquisition device;
performing consistency processing on the reference reflection images of the at least four reference gray level gradient images to enable the reference reflection images of the at least four reference gray level gradient images to have the same size and gray level gradient direction;
for the reference reflection image of each reference gray level gradient image, sequentially solving the gray level mean value of each pixel column according to the gray level gradient direction to generate a gray level mean value vector of each reference reflection image;
averaging the gray level mean value vectors of all the reference reflection images to obtain a total output gray level mean value vector; and
generating the gray response curve based on a total input gray mean vector and the total output gray mean vector of the at least four reference gray gradient images.
9. The method of claim 8, wherein coherently processing the reference reflection images of the at least four reference grayscale gradation images such that the reference reflection images of the at least four reference grayscale gradation images have the same size and grayscale gradation direction comprises:
selecting one of the at least four reference gray gradient images as a reference image; and
and taking the gray level gradient direction of the reference image as a reference, and adjusting other reference gray level gradient images in the at least four reference gray level gradient images to enable the size and the gray level gradient direction of the other reference gray level gradient images to be the same as those of the reference image.
10. The method of claim 9, wherein the vector of gray values of the reference image is taken as a total input gray mean vector of the at least four reference gray gradient images.
11. The method of claim 1, wherein the color multi-gray level encoded image comprises a color multi-gray level gray code image.
12. An apparatus for detecting surface topography, comprising:
a memory having computer program code stored thereon; and
a processor configured to execute the computer program code to perform the method of any of claims 1 to 11.
13. An apparatus for detecting surface topography comprising a circuit unit configured to perform the method of any of claims 1 to 11 when powered up.
14. A computer readable storage medium having stored thereon computer program code which, when executed, performs the method of any of claims 1 to 11.
15. A system for detecting surface topography, comprising:
the apparatus of claim 12 or 13;
the projection device configured to project the reference gray-scale gradation image or the color multi-gray-scale encoded image onto the reference plane or the surface to be inspected under control of the device; and
the at least one image acquisition device is configured to receive a reflection image corresponding to the reference gray-scale gradation image or the color multi-gray-scale code image and to send the reflection image to the device in response to the projection device projecting the reference gray-scale gradation image or the color multi-gray-scale code image to the reference plane or the surface to be detected.
16. The system of claim 15, wherein the projection device comprises a digital projector and/or the image acquisition device comprises a camera.
CN202010766085.XA 2020-08-03 2020-08-03 Method, apparatus, medium, and system for detecting surface topography Withdrawn CN111862086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010766085.XA CN111862086A (en) 2020-08-03 2020-08-03 Method, apparatus, medium, and system for detecting surface topography


Publications (1)

Publication Number Publication Date
CN111862086A true CN111862086A (en) 2020-10-30

Family

ID=72952691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010766085.XA Withdrawn CN111862086A (en) 2020-08-03 2020-08-03 Method, apparatus, medium, and system for detecting surface topography

Country Status (1)

Country Link
CN (1) CN111862086A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114166150A (en) * 2021-12-07 2022-03-11 海伯森技术(深圳)有限公司 Stripe reflection three-dimensional measurement method, system and storage medium
CN114166150B (en) * 2021-12-07 2022-06-21 海伯森技术(深圳)有限公司 Stripe reflection three-dimensional measurement method, system and storage medium
CN114543707A (en) * 2022-04-25 2022-05-27 南京南暄禾雅科技有限公司 Phase expansion method in scene with large depth of field


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20201030