CN114373007A - Depth data measuring apparatus, depth data measuring method, and image matching method - Google Patents


Info

Publication number
CN114373007A
Authority
CN
China
Prior art keywords
structured light
image
images
coded
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011101854.0A
Other languages
Chinese (zh)
Inventor
王敏捷
梁雨时
Current Assignee
Shanghai Tuyang Information Technology Co ltd
Original Assignee
Shanghai Tuyang Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Tuyang Information Technology Co ltd filed Critical Shanghai Tuyang Information Technology Co ltd
Priority to CN202011101854.0A priority Critical patent/CN114373007A/en
Publication of CN114373007A publication Critical patent/CN114373007A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with several lines being projected in more than one direction, e.g. grids, patterns
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with one projection direction and several detection directions, e.g. stereo
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a depth data measuring apparatus and method, and an image matching method. The apparatus comprises: first and second structured light projection devices for projecting first and second coded structured light into a shooting space; an imaging unit for capturing images; and a processor for controlling the first and second structured light projection devices to project their coded structured light into the shooting space both separately and simultaneously, controlling the imaging unit to capture a first-pattern two-dimensional image under the first structured light, a second-pattern two-dimensional image under the second structured light, and a composite-pattern two-dimensional image under the joint illumination of both, and determining depth data of the photographed object from the captured images. By having a plurality of structured light projection devices project, in turn, their individual structured light and a common combined structured light, more patterns can be projected with the fewest projection devices, improving image-acquisition precision and, for example, enabling matching with a smaller window.

Description

Depth data measuring apparatus, depth data measuring method, and image matching method
Technical Field
The present invention relates to depth imaging, and in particular, to a depth data measuring apparatus and method, and an image matching method.
Background
In recent years, three-dimensional imaging techniques have developed vigorously. Structured-light-based depth measurement schemes can currently measure the surface of an object in three dimensions in real time. Briefly, such a scheme first projects a two-dimensional laser texture pattern carrying encoded information, such as a discretized speckle pattern, onto the surface of an object. In a binocular implementation, two image acquisition devices in fixed relative positions continuously capture the laser texture; a processing unit samples the two simultaneously captured images with a sampling window, determines the laser texture patterns that match within the sampling window, computes from the offset between the matched texture patterns the depth of each laser texture segment projected onto the object surface, and thereby measures the three-dimensional data of the surface under test.
In the matching process, a larger sampling window contains more pattern information per sample, so matching is easier to perform, but the resulting depth image is coarser. Conversely, a smaller sampling window yields a finer-grained image but a higher mismatch rate.
With the advent of the consumer-grade three-dimensional camera era, how to acquire three-dimensional information, and depth information in particular, accurately, at lower cost, with higher precision, and in a smaller form factor has become a pressing problem for the industry.
Disclosure of Invention
In view of the above, the present invention provides a depth data measuring apparatus, a depth data measuring method, and an image matching method. By having a plurality of structured light projection devices project, in turn, their respective structured light and a common combined structured light, more structured light patterns can be acquired with the fewest structured light projection devices, thereby improving the accuracy of image acquisition.
According to a first aspect of the present invention, there is provided a depth data measuring apparatus comprising: a first structured light projection device for projecting first coded structured light into a shooting space; a second structured light projection device for projecting second coded structured light into the shooting space; an imaging unit, fixed relative to the first and second structured light projection devices, for photographing the shooting space to obtain two-dimensional images of a photographed object under structured light illumination; and a processor for controlling the first and second structured light projection devices to project their coded structured light into the shooting space separately and also simultaneously, controlling the imaging unit to capture a first-pattern two-dimensional image of the object under the first coded structured light, a second-pattern two-dimensional image under the second coded structured light, and a composite-pattern two-dimensional image under the structured light composited from the first and second coded structured light, and determining depth data of the object based on the captured first-pattern, second-pattern, and composite-pattern two-dimensional images.
According to a second aspect of the present invention, there is provided an image matching method comprising: controlling a first structured light projection device and a second structured light projection device to project, at different moments, the first coded structured light, the second coded structured light, and the first and second coded structured light together into a shooting space; acquiring three groups of images obtained by a first imaging device and a second imaging device photographing the three kinds of coded structured light projected into the same shooting space, the first and second imaging devices having a predetermined relative positional relationship, the image captured by the first imaging device in each group being the first image and the image captured by the second imaging device being the second image; for each group of images, computing, at the same matching-window size, the confidence of each window match between the first and second images of that group; and determining the mutually matched windows between the first and second images in the groups based on the window-matching confidences across all groups.
According to a third aspect of the present invention, there is provided a depth data measuring method comprising: projecting at least three kinds of coded structured light into the same shooting space, at least one of which is composited from the structured light projected simultaneously by two or more structured light projection devices; photographing each of the at least three kinds of coded structured light with a first imaging device and a second imaging device having a predetermined relative positional relationship, to obtain at least three groups of images, each comprising a first image and a second image; determining, by the method according to the second aspect of the invention, the mutually matched windows between the first and second images in each group; and determining depth data of a first-image pixel from the positional difference between the first-image pixel and the second-image pixel within the mutually matched windows and the predetermined relative positional relationship.
According to a fourth aspect of the present invention, there is provided a depth data measuring method comprising: repeatedly projecting at least three kinds of coded structured light into the same shooting space according to a predetermined duty cycle, at least one of which is composited from the structured light projected by two or more structured light projection devices; within each duty cycle, photographing each of the at least three kinds of coded structured light with a first imaging device and a second imaging device having a predetermined relative positional relationship, to obtain at least three groups of images for that duty cycle, each group comprising a first image and a second image; determining, by the method according to the second aspect of the invention, the mutually matched windows between the first and second images in each group within the duty cycle; and determining depth data of a first-image pixel for that duty cycle from the positional difference between the first-image pixel and the second-image pixel within the mutually matched windows and the predetermined relative positional relationship.
The method projects different coded structured light onto the same object and performs binocular imaging for each, computes window-matching confidences at the same window size for each coded structured light, and determines the final matched window by jointly weighing the confidences of the window at the same position under each coded structured light. Introducing multiple image groups in this way raises the confidence of a small window and yields more accurate and sharper depth information.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 is a schematic diagram of a depth data measuring device of the present disclosure.
Fig. 2 shows an example of a measuring head of a depth data measuring apparatus according to an embodiment of the present invention.
Fig. 3 shows an example of two structured light projection devices projecting three different kinds of coded structured light to an object to be measured in the same photographing space.
Fig. 4 shows another example in which two structured light projection devices project three different kinds of coded structured light toward an object to be measured in the same photographing space.
Fig. 5 shows an exemplary flow chart of an image matching method according to the present invention.
Fig. 6 shows an exemplary flow chart of a depth data measurement method according to the present invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Image matching is a crucial step in extracting depth of field (hereinafter also called "depth data") for binocular stereo vision. Its goal is to find, for each pixel of one image, the corresponding pixel in the image from the other viewpoint, compute the disparity map, and estimate the depth map, i.e. label each pixel with its depth data.
Local stereo matching algorithms can be used for image matching. The basic principle is: given a point on one image, select a sub-window in the neighborhood of that pixel, then search, at all candidate positions within a region of the other image, for the sub-image most similar to the sub-window according to some similarity criterion; the corresponding pixel in the matched sub-image is the matching point of the given pixel. The algorithm is computationally light and suitable for real-time operation.
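As a concrete illustration of this local matching principle (a sketch, not the patent's own implementation), the following function searches one scan line of a rectified image pair with a fixed-size window and a sum-of-absolute-differences (SAD) similarity criterion; the function name and default parameters are illustrative choices:

```python
import numpy as np

def block_match(left, right, y, x, win=6, max_disp=64):
    """Find the horizontal disparity of pixel (y, x) in `left` by searching
    `right` along the same scan line with a (2*win+1)-square SAD window.
    Returns (best_disparity, best_cost); lower cost = better match."""
    patch = left[y - win:y + win + 1, x - win:x + win + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    # d <= x - win keeps the candidate window inside the right image
    for d in range(0, min(max_disp, x - win) + 1):
        cand = right[y - win:y + win + 1,
                     x - d - win:x - d + win + 1].astype(np.float32)
        cost = np.abs(patch - cand).sum()  # sum of absolute differences
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d, best_cost
```

With a synthetic speckle image and its copy shifted by 5 pixels, the function recovers the disparity 5 with zero cost, mirroring how the matched sub-window locates the corresponding texture segment.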
In the actual window selection, a large window contains much information and thus matches easily, but at coarse granularity; a small window gives fine granularity but easily leads to mismatches.
In view of the above, the present invention provides a depth data measuring apparatus, a depth data measuring method, and an image matching method. By having a plurality of structured light projection devices project, in turn, their respective structured light and a common combined structured light, more structured light patterns can be acquired with the fewest structured light projection devices, improving the accuracy of image acquisition. Further, different coded structured light is projected onto the same object and binocularly imaged in each case; window-matching confidences are computed at the same window size for each coded structured light, and the final matched window is determined by jointly weighing the confidences of the window at the same position under each coded structured light. Introducing multiple image groups in this way raises the confidence of a small window and yields more accurate and sharper depth information.
Fig. 1 is a schematic diagram of a depth data measuring device of the present disclosure. As shown in fig. 1, the depth data measurement system includes structured light projection devices 101 and 102, first and second imaging devices 11 and 12, and a processor 13.
The structured light projection devices comprise a first structured light projection device 101, configured to project the first coded structured light into the shooting space, and a second structured light projection device 102, configured to project the second coded structured light into the shooting space. Here, the first and second coded structured light need only be distinguishable in the shooting space: the two may have entirely different patterns, or the same pattern subject to an angular deflection or displacement.
The structured light projection device is used for projecting coded structured light to the shooting space, and the projected light can be infrared light so as to be distinguished from natural light. The structured light projection device may comprise a laser light source and a coded structured light generation device. The laser light source may be implemented by a laser diode for generating a single beam of laser light. In other embodiments, a Vertical Cavity Surface Emitting Laser (VCSEL) or the like may also be used as the laser light source. The structured light generating device may be used to diffract the generated laser light into specifically encoded structured light and may be implemented by a Diffractive Optical Element (DOE). It will be appreciated by those skilled in the art that the structured light projection device may also employ, for example, a holographic microlens array, an optical mask, and/or other types of gratings as the structured light generating device to provide the desired structured light pattern.
In one embodiment, the coded structured light pattern projected by the structured light projection device can be a two-dimensional discrete speckle pattern. In a preferred embodiment, the coded structured light is unique: within a predetermined range and at a predetermined window size, the position-code values computed on the image acquired by the imaging device differ at every position. Such position codes can be generated, for example, from a De Bruijn sequence or an M-array, and a specific projection pattern can be realized by a DOE of corresponding structure. In particular, the first and second structured light projection devices 101 and 102 may be fitted with different DOEs to produce different projection patterns, or with the same DOE, with the DOE of one device angularly deflected or displaced so that its projected pattern is distinguishable from the other's.
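The De Bruijn sequence mentioned above can be produced with the classic recursive (Lyndon-word) construction. The sketch below is a general-purpose generator, not the patent's specific encoding; its output has the uniqueness property the text relies on: every length-n window occurs exactly once per cycle.

```python
def de_bruijn(k, n):
    """De Bruijn sequence B(k, n): a cyclic sequence over alphabet {0..k-1}
    in which every word of length n appears exactly once."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])  # emit the Lyndon word
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq
```

For example, `de_bruijn(2, 3)` has length 2³ = 8, and all eight cyclic 3-windows are distinct, which is exactly the "unique code value at every position" property a coded speckle pattern needs.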
In the present invention, the first structured light projection device 101 and the second structured light projection device 102 can together project three kinds of coded structured light into the same shooting space: each device can project its own coded structured light (the first and the second coded structured light), and the two devices can project simultaneously to obtain a composite coded structured light.
As shown in the drawing, in addition to the structured light projection devices, the depth data measuring apparatus of the present invention may further include an imaging unit, fixed relative to the first and second structured light projection devices, for photographing the shooting space to obtain a two-dimensional image of the photographed object under structured light illumination.
In one embodiment, a monocular scheme may be used: the imaging unit is a single imaging device that continuously captures the projected laser texture, and the processor compares the captured structured light pattern with a reference-plane pattern of known depth pre-stored in a register, computes the depth of each discrete segment projected on the object surface, and thereby measures the three-dimensional data of the surface under test. This structured-light-based three-dimensional measurement technique uses parallel image processing and can detect the depth information of the object under test to a certain extent.
However, since the monocular scheme requires repeated calibration and suits only limited scenarios, the binocular scheme shown in the figure is preferred. The imaging unit then comprises a first imaging device and a second imaging device, in a predetermined relative positional relationship, which photograph the coded structured light projected into the same shooting space to obtain, respectively, a first image and a second image of the photographed object under the projected structured light.
The processor 13 is connected to the structured light projection devices 101 and 102 and to the first and second imaging devices 11 and 12, and controls the projection and imaging devices as described above. Specifically, the processor 13 may control the first and second structured light projection devices 101 and 102 to project their coded structured light into the shooting space separately and also simultaneously; control the imaging unit to capture a first-pattern two-dimensional image of the object under the first coded structured light, a second-pattern two-dimensional image under the second coded structured light, and a composite-pattern two-dimensional image under the structured light composited from the first and second coded structured light; and determine the depth data of the object based on the captured first-pattern, second-pattern, and composite-pattern two-dimensional images and the predetermined relative positional relationship.
Fig. 2 shows an example of a measuring head of a depth data measuring apparatus according to an embodiment of the present invention.
As shown in fig. 2, the measuring head may include a first imaging device 11 and a second imaging device 12 disposed on the left and right sides, respectively, and the structured light projection devices may include first and second structured light projection devices 101 and 102, each comprising a laser and a structured light generating device so as to provide different coded structured light in a predetermined order. Because of the displacement between the projection devices 101 and 102, the coded structured light they project onto the object under test differs even when both contain the same diffraction pattern. The projection devices and the imaging devices may be fixed, for example, by a connection mechanism 20, and the measuring head may be connected to a processor and a memory, for example, by wiring. It will be appreciated that fig. 2 is merely one example of a measuring head; in practical applications, the measuring head may have other configurations that provide at least three kinds of coded structured light.
The first imaging device 11 and the second imaging device 12 photograph the shooting space to obtain the first image and the second image, respectively. They have a predetermined relative positional relationship: they are generally mounted horizontally, separated by a fixed distance, and their relative position generally does not change during shooting (see, e.g., fig. 2). Moreover, because the imaging unit is fixed relative to the first and second structured light projection devices, the imaging devices 11 and 12 and the projection devices 101 and 102 hold fixed relative positions (e.g., via the housing 20), and depth data of the photographed discrete light spots can be computed from these relative positions (chiefly the distance d between the two imaging devices).
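For a rectified binocular pair, the standard triangulation relation Z = f·B / disp converts the pixel disparity of matched windows into depth, where f is the focal length in pixels and B the baseline (the inter-camera distance d above). The patent does not spell this formula out; the sketch below, with illustrative parameter names, shows the conventional computation:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth of a point from its pixel disparity in a rectified stereo pair:
    Z = f * B / d  (f in pixels, B in metres, d in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For instance, with a 500-pixel focal length and a 10 cm baseline, a 50-pixel disparity corresponds to a depth of 1 m; larger disparities mean nearer points.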
In the case where the first and second structured light projection devices 101 and 102 project infrared light, the first and second imaging devices 11 and 12 may be infrared imaging apparatuses.
Correspondingly, when the first structured light projection device 101 and the second structured light projection device 102 project three kinds of coded structured light into the same shooting space, the first imaging device 11 and the second imaging device 12 photograph each of the three kinds, yielding three groups of images, each group comprising a first image and a second image.
Fig. 3 shows an example of two structured light projection devices projecting three different kinds of coded structured light to an object to be measured in the same photographing space. For example, the first structured light projecting device 101 may first project the first coded structured light as shown in the left side of fig. 3, and the first imaging device 11 and the second imaging device 12 may then respectively photograph a first image and a second image for a photographic subject (e.g., a human face) on which the first coded structured light is projected. The first image and the second image form a first set of images. Subsequently, the second structured light projecting device 102 may project the second coded structured light as shown in the middle of fig. 3, and the first imaging device 11 and the second imaging device 12 then respectively capture the first image and the second image for the subject on which the second coded structured light is projected. The first image and the second image constitute a second set of images. Subsequently, the first structured light projection device 101 and the second structured light projection device 102 may project the first and second coded structured light simultaneously as shown on the right side of fig. 3. The first and second structured light together result in a composite pattern. The first imaging device 11 and the second imaging device 12 then take a first image and a second image, respectively, for the subject on which the composite pattern is projected. The first image and the second image constitute a third set of images.
As shown in fig. 3, the first structured light projection device 101 may project a pattern A while the second structured light projection device 102 projects the same pattern A', e.g. rotated by 5°; the pattern projected when both devices are on is A + A'. In other words, in this example the first and second structured light projection devices 101 and 102 may use the same DOE to produce the same pattern, except that the DOE in the second structured light projection device 102 is rotated by 5° at fabrication or installation.
Here, it should be understood that although A, A', and A + A' are shown in the figures as being projected in that order, any other order may be used in other embodiments, such as A + A', A, A'. During continuous depth data acquisition, a fixed shooting order may be repeated, for example looping A → A' → A + A', or the order of the three shots may change from round to round, as long as each round of projection covers all three patterns.
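The looping projection order described above can be sketched as a simple cyclic schedule. The pattern labels and function name are illustrative only; a real controller would trigger the lasers and camera exposures at each step.

```python
import itertools

# The three per-round states described in the text: projector 1 alone (A),
# projector 2 alone (A'), both simultaneously (A + A').
PATTERNS = ("A", "A'", "A+A'")

def projection_cycle(order=PATTERNS):
    """Endlessly yield the next pattern to project; every round of three
    frames covers all three coded patterns, as the text requires."""
    return itertools.cycle(order)

gen = projection_cycle()
first_round = [next(gen) for _ in range(3)]
```

Passing a different `order`, e.g. `("A+A'", "A", "A'")`, realizes the alternative orderings the paragraph mentions without changing the per-round coverage.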
Fig. 4 shows another example in which two structured light projection devices project three different kinds of coded structured light toward an object to be measured in the same photographing space. It is also possible that the first structured light projection device 101 projects alone, the second structured light projection device 102 projects alone, and the first structured light projection device 101 and the second structured light projection device 102 project in common, and at each projection, photographing of the corresponding image group is performed. Unlike the example of fig. 3, in fig. 4 the first structured light projection device 101 may project a pattern a and the second structured light projection device 102 may project a different pattern B, the patterns projected by both the first and second structured light projection devices being a + B. In other words, in this example, the first and second structured light projection devices 101 and 102 may use different DOEs to produce different patterns.
In addition, although the patterns are labeled A, A' and A + A' in the example of fig. 3 and A, B and A + B in that of fig. 4, in both cases the invention, by merging pattern projections, obtains three distinct projected patterns from only the two projection patterns of two structured light projection devices. Moreover, each device rests for one of the three projections per round, which helps extend the service life of the lasers.
Preferably, the depth data measuring apparatus of the present invention may further comprise a memory 14 for storing each group of images, comprising its first and second images, for subsequent processing by the processor 13.
The processor 13 retrieves the first and second images of each group from the memory 14. For each group, it computes, at the same matching-window size, the confidence of each window match between the first and second images; it then determines the mutually matched windows between the first and second images based on the window-matching confidences across all groups; and it determines the depth data of a first-image pixel from the positional difference between that pixel and the corresponding second-image pixel within the matched windows, together with the predetermined relative positional relationship.
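A minimal sketch of this joint decision, assuming negative SAD as the per-group window confidence (the patent does not fix a particular confidence measure): for each candidate disparity, the confidences from all image groups, one per coded structured light, are summed, and the disparity with the highest combined confidence wins.

```python
import numpy as np

def combined_match(left_stack, right_stack, y, x, win=6, max_disp=64):
    """Pick the disparity whose summed window confidence over all image
    groups (e.g. the three structured-light patterns) is highest.
    left_stack / right_stack: lists of same-sized grayscale images.
    Confidence here is negative SAD, an illustrative choice."""
    best_d, best_score = 0, -np.inf
    for d in range(0, min(max_disp, x - win) + 1):
        score = 0.0
        for L, R in zip(left_stack, right_stack):
            pL = L[y - win:y + win + 1, x - win:x + win + 1].astype(np.float32)
            pR = R[y - win:y + win + 1,
                   x - d - win:x - d + win + 1].astype(np.float32)
            score -= np.abs(pL - pR).sum()  # add this group's confidence
        if score > best_score:
            best_d, best_score = d, score
    return best_d
```

Because the three groups vote on every candidate position, a window small enough to be ambiguous under one pattern can still be matched reliably, which is the stated benefit of introducing multiple image groups.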
As mentioned above, a local stereo matching algorithm is used for image matching: given a point on one image (the first image), a sub-window is selected in the neighborhood of that pixel, and at all possible positions within a region of the other image (the second image) the sub-image most similar to that sub-window is searched for according to a predetermined similarity criterion. The pixel corresponding to the matched sub-image is the matching point of the given pixel.
Specifically, for each first image pixel in the first image, the processor 13 selects its neighboring region, for example a sub-window of 13x13 pixels, according to a predetermined neighborhood rule; it then searches, at all possible positions within a region of the second image and according to a predetermined similarity criterion, for the sub-window (also 13x13 pixels) most similar to the sub-window in the first image. The corresponding pixel in the matched sub-window is the matching point of that pixel.
Selecting the actual window size involves a trade-off: a large window contains more information, so matching is easier, but the granularity is coarse; a small window gives fine granularity but easily leads to mismatches.
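The window trade-off above can be seen in a minimal local-matching sketch. The following Python sketch is an illustration only, not the patent's implementation: the function name, the SAD (sum of absolute differences) cost, and the window and search sizes are all assumptions. It searches along a row of a rectified second image for the window most similar to a reference window in the first image:

```python
import numpy as np

def match_window(left, right, y, x, win=13, max_disp=64):
    """Find the disparity of the best-matching window in `right` for the
    window centered at (y, x) in `left`, searching along the same row.
    Uses sum of absolute differences (SAD); a smaller cost is better.
    Window size and search range are illustrative values only."""
    h = win // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.float64)
    best_d, best_cost = 0, np.inf
    for d in range(0, max_disp + 1):
        if x - d - h < 0:  # candidate window would leave the image
            break
        cand = right[y - h:y + h + 1,
                     x - d - h:x - d + h + 1].astype(np.float64)
        cost = np.abs(ref - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d, best_cost
```

With a 13x13 window the SAD cost over 169 pixels is usually distinctive; shrinking `win` to 3 makes several candidate positions score similarly, which is exactly the ambiguity the multi-pattern scheme below resolves.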
In the prior art, the projected coded structured light may be unique within a predetermined range and with a given window size. To ensure a correct match, a sub-window size that is compatible with the range of uniqueness of the coded structured light is usually required. In the case of using the first or second coded structured light alone, the prior art requires, for example, a 13x13 pixel size sub-window to ensure an accurate match with high confidence for each pixel in the first and second images.
Unlike the prior art, in which a single set of first and second images is taken and sub-window local matching is performed for one coded structured light alone, the image matching scheme of the invention obtains at least three sets of corresponding images for at least three kinds of coded structured light. For each set of images, the confidence of each window match between the first image and the second image of that set is determined, and the mutually matched windows between the first image and the second image of each set are then determined based on the window matching confidences across all sets.
As described above, fig. 3 and 4 show examples in which two structured light projection devices project three different kinds of coded structured light onto a photographic subject in the same shooting space. Fig. 3 and 4 can also be read as the different first images obtained by the first imaging device 11 shooting the same object under the different coded structured lights. In one embodiment, the left images of fig. 3 and 4 may be regarded as the first image of the first set of images, the middle images as the first image of the second set, and the right images as the first image of the third set, all captured by the first imaging device 11. Although not shown, it is understood that the second imaging device 12 likewise captures the second image of each of the three sets for the corresponding projections, from a slightly different viewing angle.
After acquiring three sets of images under the different patterns shown in fig. 3 or fig. 4, sub-windows can be selected on the images of the photographic subject under each coded structured light. In matching the first and second images of the first set, a smaller window, e.g. 3x3 pixels, may be selected. With the range of code uniqueness of the projected coded structured light unchanged, because the window has become smaller (e.g., from 13x13 pixels to 3x3 pixels), for a given 3x3 pixel window X in the first image there may be 3x3 pixel windows at multiple locations in the second image (e.g., windows Y1, Y2, Y3 and Y4) that match window X with high confidence. Matching is then performed for the first and second images of the second set, again with a 3x3 window. In this second set, for the pixel window X' at the same location as window X of the first set's first image, there may again be 3x3 pixel windows at multiple locations in the second image (e.g., Y'1, Y'2 and Y'3) that match X' with high confidence. Finally, matching is performed for the first and second images of the third set, again with a 3x3 window. In this third set, for the pixel window X'' at the same location as window X, there may again be 3x3 pixel windows at multiple locations in the second image (e.g., Y''1, Y''2 and Y''3) that match X'' with high confidence.
However, if the matching confidences of the three sets of images are considered together, only the candidate that is co-located in all three sets (i.e., Y1, Y'1 and Y''1) has high confidence in all three matchings. It can therefore be judged that, in the first set of images, window X of the first image matches window Y1 of the second image; in the second set, window X' matches Y'1; and in the third set, window X'' matches Y''1. High-accuracy matching can thus be achieved with a smaller window, reducing granularity and improving the definition of the acquired depth image.
In one embodiment, the confidence sums of the windows in the same position in each set of images may be found, and the window with the highest confidence sum may be determined as the window matching each other. In other embodiments, other methods may be used to determine the final matched window according to the window matching confidence in each set of images, for example, calculating a cost value as a quantitative indicator of the window matching confidence. While an embodiment of projecting three coded structured lights and performing three sets of image processing accordingly has been employed for ease of discussion, it will be understood by those skilled in the art that four or more coded structured lights may be projected and subsequently processed, thereby enabling smaller size window matching.
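The confidence-sum variant just described can be sketched in a few lines of Python. The data structures and names here are my own assumptions, not the patent's: each image set contributes a per-candidate confidence, and the candidate with the highest total across all sets is taken as the mutually matched window.

```python
def fuse_matches(per_set_confidences):
    """Pick the candidate window with the highest confidence sum.
    per_set_confidences: one dict per image set, mapping a candidate
    window position in the second image to its match confidence."""
    totals = {}
    for conf in per_set_confidences:
        for pos, c in conf.items():
            totals[pos] = totals.get(pos, 0.0) + c
    # The mutually matched window is the one with the highest sum.
    return max(totals, key=totals.get)
```

In the Y1/Y'1/Y''1 example above, several candidates score well in any single set, but only the co-located candidate scores well in every set, so the summed confidence singles it out.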
Specifically, the depth data measuring apparatus of the present invention may further include a third structured light projecting device for projecting a third coded structured light to the shooting space, with the processor configured to control the third structured light projecting device to project the coded structured light alone and together with the first and/or second structured light projecting device. For example, when the first, second, and third structured light projection devices project patterns A, B, and C respectively, enabling combined projection yields seven patterns (A, B, C, AB, BC, AC, and ABC), far more than the three patterns available without combination. Further, the depth data measuring apparatus of the present invention may include more structured light projection devices as needed to obtain more pattern combinations, and such variations fall within the scope of the present invention.
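The pattern count grows as the number of non-empty subsets of projectors, i.e. 2^n - 1 for n independently switchable devices. A short sketch (the helper function is hypothetical, introduced only to illustrate the count):

```python
from itertools import combinations

def pattern_combinations(projectors):
    """All distinct patterns obtainable from independently switchable
    projectors: every non-empty subset projected together, 2**n - 1 total."""
    combos = []
    for r in range(1, len(projectors) + 1):
        for subset in combinations(projectors, r):
            combos.append("+".join(subset))
    return combos
```

For projectors A, B and C this enumerates the same seven patterns listed above (the pairwise order differs, but the set is identical); two projectors give 3 patterns, four would give 15.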
In a specific implementation, the projection and imaging of the at least three coded structured lights are performed sequentially. For example, the processor may control the first and second imaging devices 11 and 12 to capture an image frame while the structured light projecting device 101 projects the first coded structured light, to capture the next frame while the structured light projecting device 102 projects the second coded structured light, and to capture a further frame while the two projecting devices 101 and 102 project together. The at least three sets of images may thus be images taken continuously at the respective frame rates of the first imaging device and the second imaging device. In a preferred embodiment, the first and second imaging devices have the same frame rate, and the first and second images in each set are taken by the two devices simultaneously.
Preferably, the above shooting and the derivation of depth data may run continuously, enabling the depth data measuring system of the present invention to measure a dynamic target in real time. Assuming the first and second imaging devices each capture at 30 frames/second, and three different coded structured lights are projected in turn, the depth image of the object under measurement can be updated continuously at 10 frames/second, provided the processing speed of the processor is sufficient. In other words, the capturing and derivation of the depth image proceed with a predetermined duty cycle of 0.1 second. In practice, matching with smaller windows can be achieved at the expense of a slower update. For example, if six different coded structured lights are projected in turn (which requires three structured light projection devices), matching can be done with a window of 1x1 pixel, but the depth image update period lengthens to 0.2 seconds. The trade-off can be made flexibly according to the actual application.
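The rates quoted above follow from simple arithmetic: one camera frame is captured per projected pattern, so the depth update rate is the camera frame rate divided by the number of patterns per round. A sketch (the function name is my own):

```python
def depth_update(camera_fps, patterns_per_round):
    """Depth-map update rate (Hz) and period (s), assuming one camera
    frame is captured per projected pattern in each round."""
    rate = camera_fps / patterns_per_round
    return rate, 1.0 / rate
```

At 30 fps with three patterns this gives 10 updates/second (0.1 s period); with six patterns it drops to 5 updates/second (0.2 s period), matching the figures in the text.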
The depth data may be derived from the first set of images alone (or from any other single set); because the matching window is smaller, depth image data with finer granularity and better clarity can already be acquired. In one embodiment, the depth data may also be derived from every set of images. Because different coded structured lights are projected, more positions on the object under measurement can contribute depth information: in the same region of the subject, the denser combined pattern of discrete spots yields finer depth detail. In other words, the depth data obtained under the different coded structured lights include depth information for more positions on the object, so new depth data obtained by fusing the depth data determined under the at least three coded structured lights can serve as the depth data of the object, reflecting its depth more accurately and providing a clearer outline.
The depth data measuring method and the image matching method therein according to the present invention will be described in more detail with reference to fig. 5 and 6.
The image matching algorithm according to the present invention is described in detail below with reference to fig. 5.
Fig. 5 shows an exemplary flow chart of an image matching method according to the present invention.
As shown in fig. 5, in step S510, the first structured light projecting device and the second structured light projecting device are controlled to project, at different times, the first coded structured light, the second coded structured light, and the first and second coded structured lights together to the shooting space. The first and second coded structured lights may carry the same code but be projected from different spatial positions or at different angles, or they may carry different codes. Either way, the two individual projections and the one common projection yield three different coding patterns.
In step S520, three sets of images are acquired, obtained by a first imaging device and a second imaging device respectively shooting the three kinds of coded structured light projected in the same shooting space, where the first imaging device and the second imaging device have a predetermined relative positional relationship. In each set of images, the image shot by the first imaging device is the first image and the image shot by the second imaging device is the second image.
The three sets of images may be images continuously captured at respective frame rates of the first imaging device and the second imaging device. Preferably, the first imaging device and the second imaging device have the same frame rate, and the first image and the second image in each set of images are captured by the first imaging device and the second imaging device simultaneously.
In step S530, for each group of images, the confidence of each window matching between the first image and the second image in the same group is obtained by using the same matching window size.
In step S540, a window in which the first image and the second image in each group of images match with each other is determined based on the confidence of each window matching between the first image and the second image in each group of images.
For example, the sums of the confidences of windows at the same position across the sets of images may be computed, and the window with the highest confidence sum determined as the mutually matched window. Introducing multiple sets of images thus raises the reliability of small-window matching and yields more accurate and clearer depth information.
In other embodiments, multiple structured light projection devices may be included as previously described to achieve more projection combinations and smaller windows.
Fig. 6 shows an exemplary flow chart of a depth data measurement method according to the present invention.
As shown in fig. 6, at least three kinds of coded structured light are projected to the same photographing space at step S610. As mentioned above, at least one of the at least three kinds of coded structured light referred to herein is a coded structured light that is projected simultaneously by two or more structured light projection devices.
Then, in step S620, the at least three kinds of coded structured light projected in the same shooting space are shot using the first and second imaging devices, yielding at least three sets of images, each set including a first image and a second image.
The first imaging device and the second imaging device may have a predetermined relative positional relationship; in each set of images, the image captured by the first imaging device is the first image and the image captured by the second imaging device is the second image.
In step S630, a window in which the first image and the second image in each set of images match each other is determined by the image matching method as described in detail above with reference to fig. 5.
Then, in step S640, depth data of the first image pixel is determined according to a position difference between the first image pixel and the second image pixel in the windows matched with each other and the predetermined relative positional relationship.
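The patent does not spell out the depth formula, but for rectified stereo the standard relation between the pixel position difference (the disparity d), the focal length f, and the baseline B given by the predetermined relative positional relationship is Z = f * B / d. A hedged sketch under that standard assumption:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard triangulation for rectified stereo: Z = f * B / d.
    This is the usual reading of "position difference plus known
    relative camera positions"; the patent itself does not state
    an explicit formula."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 500-pixel focal length and a 6 cm baseline, a 10-pixel disparity corresponds to a depth of 3 m; larger disparities mean nearer points.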
In a preferred embodiment, the image matching may be performed continuously, for example as part of a real-time depth data acquisition scheme. In step S610, at least three kinds of coded structured light may be repeatedly projected to the same photographing space with a predetermined duty cycle. In step S620, each of at least three kinds of coded structured light projected in the same shooting space in each working period is shot by using a first imaging device and a second imaging device, and at least three groups of images respectively including a first image and a second image corresponding to the corresponding working periods are obtained, wherein the first imaging device and the second imaging device have a predetermined relative position relationship. In step S630, a window in which the first image and the second image match each other in each set of images in the corresponding duty cycle is determined by the image matching method as described in detail above with reference to fig. 5. Then, in step S640, depth data of the first image pixel in the corresponding duty cycle is determined according to the position difference between the first image pixel and the second image pixel in the windows matched with each other in the corresponding duty cycle and the predetermined relative position relationship.
In a preferred embodiment, the projection and imaging of the at least three coded structured lights may be performed sequentially. The shooting and the obtaining of the depth data can be continuous, so that the depth data measuring method can measure the dynamic target in real time.
The depth data measuring apparatus, depth data measuring method, and image matching method according to the present invention have been described in detail above with reference to the accompanying drawings. By having the structured light projection devices project their individual patterns and a combined pattern in turn, more structured light patterns can be obtained from the fewest structured light projection devices, improving image acquisition precision and enabling matching with smaller windows.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the above-mentioned steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (13)

1. A depth data measuring apparatus comprising:
a first structured light projecting device for projecting a first coded structured light to the shooting space;
a second structured light projection device for projecting a second coded structured light to the shooting space;
an imaging unit fixed relative to the first and second structured light projecting devices, for photographing the photographing space to obtain a two-dimensional image of the photographing object under the structured light irradiation; and
a processor for
Controlling the first and second structured light projecting devices to project the coded structured light toward the photographing space, respectively, and controlling the first and second structured light projecting devices to project the coded structured light toward the photographing space simultaneously,
controlling the imaging unit to take a first pattern two-dimensional image and a second pattern two-dimensional image of the photographic subject under the irradiation of the first and second coded structure lights and a composite pattern two-dimensional image of the photographic subject under the irradiation of the coded structure light composed of the first and second coded structure lights, respectively, and
determining depth data of the photographic subject based on the photographed first pattern two-dimensional image, second pattern two-dimensional image, and the synthetic pattern two-dimensional image.
2. The depth data measuring apparatus of claim 1, wherein the imaging unit comprises:
a first imaging device and a second imaging device for shooting the coded structured light projected in the same shooting space to respectively obtain a first image and a second image of a shooting object under the irradiation of the projected structured light, wherein the first imaging device and the second imaging device have a predetermined relative position relationship, and
the processor is configured to control the first imaging device and the second imaging device to capture three sets of images each including the first and second images when the first structured light projection device projects alone, the second structured light projection device projects alone, and the first and second structured light projection devices project structured light together.
3. The depth data measurement device of claim 2, wherein the processor is to:
the method comprises the steps of obtaining a first image and a second image in each group of images, obtaining the matching confidence degree of each window between the first image and the second image in the same group of images according to the same size of matching windows for the images, determining windows matched with each other between the first image and the second image in each group of images based on the matching confidence degree of each window between the first image and the second image in each group of images, and determining the depth data of the first image pixel according to the position difference between the first image pixel and the second image pixel in the windows matched with each other and the preset relative position relation.
4. The depth data measurement apparatus of claim 1, wherein the depth data measurement apparatus repeats the first structured light projecting device projecting structured light alone, the second structured light projecting device projecting structured light alone, and the first and second structured light projecting devices projecting structured light together at a predetermined duty cycle.
5. The depth data measurement device of claim 1, further comprising:
a third structured light projecting device for projecting the third coded structured light to the shooting space, and
the processor is configured to:
controlling the third structured light projecting device to project the coded structured light separately, together with the first and/or second structured light projecting device.
6. The depth data measurement device of claim 1, wherein the first and second structured light projecting means project first and second coded structured light having different patterns.
7. The depth data measuring apparatus of claim 1, wherein the first and second structured light projecting devices project first and second coded structured light having the same pattern and angular deflection.
8. An image matching method, comprising:
controlling the first structured light projecting device and the second structured light projecting device to project the first coded structured light, the second coded structured light and the first and second coded structured light to the shooting space at different moments;
acquiring three groups of images which are obtained by shooting three types of coded structured light projected in the same shooting space by a first imaging device and a second imaging device respectively, wherein the first imaging device and the second imaging device have a preset relative position relationship, the image shot by the first imaging device in each group of images is a first image, and the image shot by the second imaging device is a second image;
for each group of images, respectively solving the confidence coefficient of each window matching between the first image and the second image in the same group according to the same size of the matching window; and
and determining the windows matched with each other between the first image and the second image in each group of images based on the confidence degree of the matching of the windows between the first image and the second image in each group of images.
9. The method of claim 8, wherein the first and second imaging devices have the same frame rate, and the first and second images in each set of images are taken simultaneously by the first and second imaging devices.
10. The method of claim 8, wherein the first and second coded structured light is at least one of:
coded structured light with the same code but different projected spatial positions or angles; or
coded structured light with different codes.
11. The method of claim 8, wherein determining windows in each set of images that match each other between the first image and the second image based on a confidence in the respective window matches between the first image and the second image in each set of images comprises:
solving the sum of the confidence degrees of the windows at the same position in each group of images;
the window with the highest sum of confidence is determined as the window that matches each other.
12. A depth data measurement method, comprising:
projecting at least three kinds of coded structured light to the same shooting space, wherein at least one kind of coded structured light is the coded structured light projected and synthesized by two or more structured light projection devices simultaneously;
shooting at least three kinds of coded structured light projected in the same shooting space by using a first imaging device and a second imaging device to respectively obtain at least three groups of images respectively comprising a first image and a second image, wherein the first imaging device and the second imaging device have a preset relative position relationship;
determining, by the method according to any one of claims 8-11, windows in each set of images that match each other between the first image and the second image;
determining depth data of the first image pixel according to a position difference between the first image pixel and the second image pixel in the windows matched with each other and the predetermined relative position relation.
13. A depth data measurement method, comprising:
repeatedly projecting at least three kinds of coded structured light to the same shooting space according to a preset work period, wherein at least one kind of coded structured light is the coded structured light projected and synthesized by two or more structured light projection devices;
shooting each of at least three kinds of coded structured light projected in the same shooting space in each work period by using a first imaging device and a second imaging device, and respectively obtaining at least three groups of images which respectively comprise a first image and a second image and correspond to the corresponding work periods, wherein the first imaging device and the second imaging device have a preset relative position relationship;
determining, by the method according to any of claims 8-11, windows in each set of images within a respective duty cycle that match each other between the first image and the second image;
and determining the depth data of the first image pixel in the corresponding working period according to the position difference between the first image pixel and the second image pixel in the windows matched with each other in the corresponding working period and the preset relative position relation.
CN202011101854.0A 2020-10-15 2020-10-15 Depth data measuring apparatus, depth data measuring method, and image matching method Pending CN114373007A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011101854.0A CN114373007A (en) 2020-10-15 2020-10-15 Depth data measuring apparatus, depth data measuring method, and image matching method


Publications (1)

Publication Number Publication Date
CN114373007A true CN114373007A (en) 2022-04-19

Family

ID=81137822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011101854.0A Pending CN114373007A (en) 2020-10-15 2020-10-15 Depth data measuring apparatus, depth data measuring method, and image matching method

Country Status (1)

Country Link
CN (1) CN114373007A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115421349A (en) * 2022-11-02 2022-12-02 四川川大智胜软件股份有限公司 Non-digital optical machine structure light projection module, acquisition device and three-dimensional measurement system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination