WO2000070303A1 - Color structured light 3d-imaging system - Google Patents
- Publication number
- WO2000070303A1 (PCT/US1999/010756)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- color
- data
- image data
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2509—Color coding
Definitions
- the present invention relates to a method and apparatus for three-dimensional surface profile imaging and measurement and, more particularly, to distance profile measurement of objects based upon two-dimensional imaging of the objects reflecting structured illumination.
- Three-dimensional (hereinafter also referred to as either “3D” or “3-D”) imaging and measurement systems are known. In general, the purpose is to determine the shape of an object in three dimensions, ideally with actual dimensions.
- imaging and measurement systems fall into two basic categories: 1) Surface Contact Systems and 2) Optical Systems.
- Optical Systems are further categorized as using Laser Triangulation, Structured Illumination, Optical Moire Interferometry, Stereoscopic Imaging, and Time-of-Flight Measurement.
- Optical Moire Interferometry is accurate, but expensive and time-consuming.
- Time-of-Flight Measurement calculates the time for a laser beam to reflect from an object at each point of interest, and requires an expensive scanning laser transmitter and receiver.
- the present invention is an Optical System based upon Structured Illumination, which means that it determines the three dimensional profile of an object which is illuminated with light having a known structure, or pattern.
- the structured illumination is projected onto an object from a point laterally separated from a camera.
- the camera captures an image of the structured light pattern, as reflected by the object.
- the object can be profiled in three dimensions where the structured light pattern reflected by the object can be discerned clearly.
- the shift in the reflected pattern as compared to that which would be expected from projection of the same pattern onto a reference plane, may be triangulated to calculate the "z" distance, or depth.
- Color-encoded structured light has been proposed to achieve fast active 3D imaging, as for example by K. L. Boyer and A. C. Kak, "Color-encoded structured light for rapid active ranging," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. PAMI-9, pp. 14-28, 1987.
- the color of the projected structured light is used to identify the locations of stripes and thereby reduce ambiguity when interpreting the data.
- the present invention provides a three dimensional (3D) imaging system requiring only a single image capture by substantially any presently-manufactured single camera using a single structured light source.
- it provides a 3D imaging system which is inexpensive to manufacture and easy to use.
- it permits 3D imaging and measurement using a structured source of visible, infrared, or ultraviolet light.
- the invention provides a 3D imaging and measurement system which reduces crosstalk between reflections of a color-encoded structured light source.
- the invention provides 3D imaging using any combination of enhancing a color-encoded structured light source, algorithmically enhancing the accuracy of raw data from an image reflecting a structured light source, comparing the image data to a measured reference image, and algorithmically enhancing the calculated depth profile.
- the invention permits 3D imaging and measurement using either a pulsed light source, such as a photographic flash, or a continuous light source.
- the invention provides a light structuring optical device accepting light from a standard commercial flash unit synchronized to a standard commercial digital camera to provide data from which a 3D image may be obtained.
- the invention permits 3D imaging and measurement using a structured light source which modulates the intensity and/or the spectrum of light to provide a black-and-white or multi-color structured lighting pattern.
- the invention provides a method to project a structured illumination pattern onto an object.
- Another aspect provides 3D imaging using an improved color grating.
- the invention provides a 3D imaging system that can be used to project an image onto an object.
- the invention provides a 3D imaging system that can be used with a moving object, and with a living object.
- the invention provides a 3D imaging and measurement system having a camera and light source which can be integrated into a single body.
- the invention permits accurate 3D imaging and measurement of objects having a surface color texture, by using two images reflecting different lighting.
- the present invention enables accurate three dimensional imaging using any off-the-shelf digital camera, or indeed substantially any decent camera, taking a single image of an object under structured lighting.
- the structured lighting may be provided by adding a fairly simple pattern-projecting device in front of an off-the-shelf photographic flash unit.
- the improvements encompassed by the present invention work together to make full realization of the advantages of the invention possible; however, one may in some cases omit individual improvements and yet still obtain good quality 3D profile information.
- the structured light source is improved by separating color images to reduce color crosstalk, and by adaptation for use with off-the-shelf flash units.
- the image data interpretive algorithms reduce the effects of color cross-talk, improve detection of light intensity peaks, enhance system calibration, and improve the precision of identifying the location of adjacent lines.
- a three dimensional imaging system for use in obtaining 3D information about an object, constructed in accordance with the principles of the present invention, has a structured light source including a source of illumination which is transmitted through a black and white or color grating and then projected onto the object.
- the grating includes a predetermined pattern of light transmitting areas or apertures, which are typically parallel transmissive bars disposed a predetermined distance apart from each other. In some embodiments the grating will include an opaque area intermediate each of a plurality of differently colored bars.
- the imaging system also includes a camera, or other image capturing device, for capturing an image of an object reflecting light from the structured light source.
- the camera may take short duration exposures, and/or the light source may be a short duration flash synchronized to the exposure, to enable capture of clear 3D images of even moving and living objects.
- the system may include a means for digitizing the captured image into computer-manipulable data, if the camera does not provide digital data directly.
- a bias adjusted centroid light peak detection algorithm aspect of the present invention may be employed to enhance accuracy of the detected image, and system calibration methods are an aspect of the invention which can reduce errors by comparing the detected object image to an actual reference image taken using the same system setup.
- color cross-talk compensation aspects of the invention include employing either or both using opaque areas between different colors in the grating, and a color compensation algorithm which is the inverse of a determined color cross-talk matrix.
- a center weighted line average algorithm aspect of the present invention is also particularly useful for plural color gratings.
- a three-dimensional imaging system constructed and operated in accordance with the principles of the present invention would employ a combination of these mechanical and algorithmic aspects, and, in conjunction with well known calculations performed upon the derived image data, would determine information about an imaged object along three dimensional planes x,y, and z.
- Fig. 1 shows structured lighting for a 3D imaging system using a CCD video camera.
- Fig. 2 shows some details of another 3D imaging system.
- Fig. 3 is an improved grating for use with a 3D imaging system.
- Fig. 4 shows image contours of an object reflecting structured light.
- Fig. 5 is a perspective view of a three-dimensional profile obtained from Fig. 4 data.
- Fig. 6 is a graph showing determination, by threshold, of regions as being of a color or not.
- Fig. 7a is a portion of an image reflecting a three-color structured light source.
- Fig. 7b is a graph of color intensities measured from Fig. 7a image.
- Fig. 8 is a flowchart of a color cross-talk compensation image processing procedure.
- Fig. 9 is a graph of color-compensated color intensities from Fig. 7a image.
- Fig. 10 graphically shows bias-adjusted centroid peak detection.
- Fig. 11 is a flowchart of a system calibration procedure.
- Fig. 12 shows details of 3D image system using system calibration.
- Figs. 13a-e show measured profiles after progressively enhancing accuracy.
- Fig. 14a shows a human face.
- Fig. 14b shows the human face illuminated by structured light.
- Fig. 14c shows a reconstruction of the 3D profile of the human face.
- Fig. 14d shows a cross-section of the 3D profile of the human face.
- A three dimensional imaging system is shown in Fig. 1, identified in general by the reference numeral 10.
- a modified 3D imaging system, identified in general by the reference numeral 12, is shown with somewhat more detail in Fig. 2, and a grating, identified in general by the reference numeral 14, is shown in Fig. 3.
- the 3-D imaging system of Fig. 1 shows structured illumination source 16 projecting patterned light (structured illumination) toward object 18.
- the pattern of light may be color encoded or not, and may be patterned in any way which is known and can be readily recognized in an image.
- a simple and preferred pattern consists of parallel bars of light.
- Light pattern 20 displays a preferred pattern of light projected from structured light source 16, seen where the light passes through plane O-X (perpendicular to the plane of the paper) before reaching object 18.
- light pattern 20 would continue on to object 18, from whence it would be reflected according to the contours of object 18.
- An image so reflected, indicated generally by reference numeral 32, will be captured by camera 30.
- Object 18 is only a cross-section, but the entire object is the face of a statue of the goddess Venus.
- a representation of image 32, as seen by camera 30, is shown in Fig. 4. In the image of Fig. 4, it can be seen that the parallel bars of light from the structured light source are shifted according to the contours of the statue.
- a representation of the statue as determined using the present invention is shown in Fig. 5.
- each light area of light pattern 20 is a bar of light which is oriented perpendicular to the page and thus is seen in cross-section.
- Dark areas 22 are disposed between each light area 24a, 26a, 28a, 24, 26 and 28.
- Distance 21 is the spacing P between the centers of like colors in a three-color embodiment, and distance 23 is the spacing P' between adjacent light bars. In the preferred embodiment, the same distance 23 is the distance between the centers of adjacent dark areas 22.
- Bars of light 24, 26 and 28 may all be white, or all of the same color or mixture of colors, or they may be arranged as a pattern of different colors, preferably repeating.
- Dark areas 22 are preferred intermediate each of light bars 24, 26, 28, etc.
- the preferred proportion of dark area to light area depends upon whether or not a plurality of distinct colors are used. In embodiments not using a plurality of distinct colors, it is preferred that dark areas 22 are about equal to light areas. In embodiments using distinct colors, dark areas 22 are preferably as small as possible without permitting actual intermixing of adjacent colors (which will occur as a result of inevitable defocusing and other blurring of the reflected structured illumination). Dark areas 22 greatly reduce crosstalk which would otherwise corrupt reflections from object 18 of projected light pattern 20.
- light color used with the present invention need not be in the visible spectrum, as long as a structured light source can accurately provide a pattern in that color, and the image-capturing device can detect the color.
- the use of light from infrared at least through ultraviolet is well within the scope of the present invention.
- System 10 includes structured light source 16.
- Grating 14 although not shown in Fig. 1, is contained within the optical system of structured light source 16, and determines the pattern to be projected onto object 18.
- Fig. 2 shows details of a light source 16.
- the light from light source 34 is collimated by collimating lens 32.
- the collimated light passes through grating 14, which imposes structure (or patterning) on the light.
- the structured light from grating 14 is focussed by projecting lens 38, so that it may be accurately detected by camera 44 when the structured light reflects from object 40 to form image 42.
- Data 48 representing image 42 as captured by camera 44 is conveyed to processor 46 so that calculations may be performed to extract the 3D information from image data 48.
- any predetermined pattern may be used for grating 14.
- the requirements for the pattern are that it be distinct enough and recognizable enough that it can be identified after reflection from object 40.
- Parallel bars are preferred for the structured light pattern, and are primarily described herein.
- grating 14 includes a repetitive pattern of parallel apertures 4, 6, 8 etc. disposed a predetermined center to center distance 5 apart from each other.
- aperture as used herein means a portion of the grating which transmits light, in contrast to opaque areas which block light transmission.
- the apertures of grating 14 may transmit a plurality of distinct colors of light.
- a structured light pattern using a plurality of distinct colors is considered to be color-encoded.
- the grating modulates the color of the light.
- the apertures of grating 14 may each transmit the same color of light.
- Any particular mix of light frequencies may be understood to be a "color" as used here (so that if all apertures transmit white light they are considered not to transmit a plurality of distinct colors).
- if the grating does not vary (or "modulate") the color of the transmitted light, then it must at least modulate the intensity of the single color of light transmitted so that a recognizable pattern of light is projected.
- opaque areas 2 having width 7 are disposed between adjacent apertures 4 and 6, 6 and 8, 4a and 6a, etc.
- Each aperture preferably has width 9.
- a preferred embodiment employs three different colors 4 (e.g. red), 6 (e.g. green), and 8 (e.g. blue), repeating as 4a, 6a, 8a and again as 4b, 6b, 8b, etc.
- interval distance 1 is the spacing between centers of like-colored apertures.
- Opaque area interval distance 3 between the centers of opaque areas 2 is equal to adjacent aperture center spacing distance 5.
- the size of the periods P (21) and P' (23) on projected image 20 of Fig. 1, and of interval distances 3 and 1 in grating 14 in FIG. 3, has been exaggerated to provide improved clarity.
- the size of these periods is a design variable, as are the use of different colors, the number of colors used, the actual colors used, the dimensions of the colored bars, and the size of grating 14.
- the actual spacings of grating 14 may be readily determined when the focussing characteristics of projector lens 38 are known, in order to produce a desired pattern of structured light, as explained below.
- width 7 of opaque areas 2 is preferably about 2/3 of aperture interval distance 5, while for gratings not using a plurality of distinct colors it is preferred that width 7 be about 4/5 of aperture interval distance 5.
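A toy generator for such a grating can make the proportions concrete. This is an illustrative sketch only, not the patent's design: the image dimensions are arbitrary, and the opaque fraction of 2/3 of the aperture interval follows the preference stated above for color gratings.

```python
# Sketch: a repeating red/green/blue bar grating (apertures 4, 6, 8) with
# opaque areas 2 between adjacent apertures.  All dimensions are
# hypothetical illustration values.
import numpy as np

COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]  # red, green, blue

def make_grating(n_periods=4, spacing=30, opaque_frac=2 / 3, height=8):
    """Return an RGB array; spacing is aperture interval distance 5."""
    width = n_periods * 3 * spacing
    img = np.zeros((height, width, 3), dtype=np.uint8)  # opaque = black
    aperture_w = int(round(spacing * (1 - opaque_frac)))  # transmissive width 9
    for k in range(n_periods * 3):
        x0 = k * spacing                     # start of the k-th aperture
        img[:, x0:x0 + aperture_w] = COLORS[k % 3]
    return img

g = make_grating()
```

With these numbers each 30-pixel period carries a 10-pixel colored aperture followed by a 20-pixel opaque gap, the 2/3 proportion described above.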
- the receptor characteristics are such as to best distinguish red, green and blue.
- the ability to distinguish the colors used to encode a plural-color structured light pattern determines how closely spaced the pattern may be and still be clearly resolved. Accordingly, red, green and blue are generally preferred for the encoding colors of plural color structured light sources. However, it should be understood that more or less distinct colors could be used, and could well be adjusted to correspond to the best color selectivity of a particular camera.
- the film image must be "digitized" to provide data which can be processed by a computer - that is, processable data. Digitizers are also typically best able to distinguish red, green and blue.
- a preferred method is to make one central line identifiable, and then to count lines from there.
- a central line can be made identifiable by making it white, as opposed to the other distinct colors.
- another marking should be used, such as periodically adding "ladder steps," perpendicular apertures across what would be an opaque area in the rest of the pattern.
- a single different-color line could be used for registration. From certain identified lines, the remainder of the lines are registered by counting. When the 3D profile is steep, the projected lines reflected by the object may become very close together, and become lost or indistinguishable. This can interfere with registration.
- Color-encoded parallel-bar patterns have an advantage in registration, because it is easier to count lines. Given three colors, two adjacent colored lines can be indistinguishable and still not interfere with the ability to count lines (and thus keep track of registration). In general, for n colors, n-1 adjacent lines can be missing without impairing registration. Thus, color-encoded systems generally have more robust registration.
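The counting advantage can be sketched as follows. This is a toy illustration, not the patent's registration algorithm: with a repeating R, G, B projection order, the color of the next visible line reveals how many lines (up to n-1 = 2) were skipped. The anchoring of the first visible line is an assumed simplification.

```python
# Sketch: recover absolute line indices from detected colors, tolerating
# up to two missing adjacent lines in a three-color repeating pattern.
SEQ = ["R", "G", "B"]  # repeating projection order

def register_lines(detected):
    """Map detected colors (in scan order) to absolute line indices."""
    indices, idx = [], None
    for color in detected:
        step = SEQ.index(color)
        if idx is None:
            idx = step                    # anchor on the first line seen
        else:
            gap = (step - (idx + 1)) % 3  # 0, 1, or 2 lines were missed
            idx += 1 + gap
        indices.append(idx)
    return indices

# Lines 2 ("B") and 3 ("R") were lost on a steep slope, yet counting survives:
print(register_lines(["R", "G", "G", "B"]))  # → [0, 1, 4, 5]
```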
- a structured light source constructed according to the present invention with opaque areas 2 intermediate each of color bars 4, 6 and 8 in grating 14 to create projected dark areas 22, enables a single exposure of object 18, recorded by essentially any commercially available image recorder (e.g. camera), to provide the necessary information from reflected image 32, sufficiently free from unwanted crosstalk, to permit proper detection and registration of the light bar lines, and hence to enable practical 3-D profilometry.
- structured light sources will vary in the accuracy of the projected image, in the difference in intensity between intended light and dark areas, in the colors projected and their spectral purity.
- Cameras may be either film-type photographic units, requiring scanning of the film image to obtain digital data, or digital cameras which provide digital information more directly.
- some cameras have a high ability to distinguish different colors with minimal interference; this is most often true of cameras employing three separate monochromatic CCD receptors for each color pixel.
- Other cameras employ broad-band CCD receptors, and deduce the received colors through internal data manipulation which may not be accessible externally.
- the present 3-D imaging system invention may be practiced in a variety of aspects, with various features of the invention employed as appropriate for a particular camera and a particular structured light source.
- Region detection and color cross-talk compensation: Whether single or multiple colors are used, the resulting image data must be examined to assign regions as either light or dark.
- Since a light bar (or "line") location is the feature which must be determined to calculate the 3D profile, the center of these lines must be located as accurately as possible. To accomplish this, noise and interference should be eliminated as far as possible.
- color crosstalk noise comes from the color grating of plural-color embodiments of grating 14, from object colors, and from color detectors used by camera 30 or 44.
- Color crosstalk may be understood by examining the intensities of color-encoded light from a structured light source as detected by a camera. First, the grating which produces the color pattern should be understood.
- Figure 7a shows a color grating including red, green, and blue color lines, which is a preferred color version of grating 14 in Fig. 3.
- a real color grating is created therefrom by writing the designed color pattern on a high resolution film (e.g. Fujichrome Velvia) using a slide maker (e.g. Lasergraphics Mark LU ultrahigh resolution, 8000 x 16000).
- the grating is illuminated by a uniform white light (i.e. a light having a uniform distribution of intensity across at least the visible spectrum) and recorded by a digital camera (e.g. Kodak DC260). The color spectrum of the grating is obtained by analyzing the intensity distribution of different color lines of the recorded digital image, as shown in Fig. 7b.
- Fig. 7b shows severe color cross-talk among different color channels.
- the intensity of typical false blue peak 71 in the location of a green line is comparable to the intensity of typical true blue peak 75.
- the color cross-talk noise (e.g., an apparent but false color detected where it should not be) is about the same level as the color signal itself. This can lead to false line detection, in which one would detect a line (blue, in this case) that actually did not exist.
- typical red peak 77 does not cause a significant false peak in another color.
- color cross-talk can substantially shift the apparent peak locations of color lines from their actual locations. Since a shift in the peak location will result in errors in the depth calculation, it is important to compensate for this shifting effect.
- the color spectrums of the color grating are collected by taking pictures with different objects and different digital cameras.
- the objects include both neutral color objects such as a white plane, a white ball, and a white cube, and lightly- colored objects such as human faces with different skin colors including white, yellow, and black.
- Fig. 6 shows a graph of a portion of intensity data for light. It should be understood that the continuous line is an idealization of data from a multitude of pixels, or points, tied together. In reality there may be as few as three pixels in a region 63 which exceeds threshold level (TL) 61. As long as this convention is understood, the graphs showing continuous intensity data can be properly understood.
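The thresholding convention above can be sketched in a few lines. This is an illustrative reading of Fig. 6, not the patent's code: a region is accepted as a color peak area only when a run of at least three consecutive pixels exceeds threshold level TL 61. The three-pixel minimum follows the "as few as three pixels" remark above.

```python
# Sketch: scan one color channel's intensity row and return runs of pixels
# exceeding the threshold level TL, keeping only runs of >= 3 pixels.
def detect_regions(intensity, tl, min_pixels=3):
    """Return (start, end) pixel index pairs of runs exceeding tl."""
    regions = []
    start = None
    for i, v in enumerate(intensity):
        if v > tl:
            if start is None:
                start = i                 # run begins
        else:
            if start is not None and i - start >= min_pixels:
                regions.append((start, i - 1))
            start = None
    if start is not None and len(intensity) - start >= min_pixels:
        regions.append((start, len(intensity) - 1))
    return regions

row = [0, 1, 9, 12, 11, 8, 1, 0, 2, 10, 13, 12, 9, 1, 0]
print(detect_regions(row, tl=5))  # → [(2, 5), (9, 12)]
```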
- Fig. 8, a flowchart for color crosstalk compensation, will be described with reference also to the reference designators of Fig. 6.
- Step 81: image data is captured of an object reflecting color-encoded structured light.
- the color compensation algorithm is generally applied to only a portion of the image at a time, preferably to a square region covering about 10 colored lines; the size of the regions analyzed is a design variable. Thus it must be understood that this procedure may be repeated in toto for many sub-images.
- Step 84: Areas are tentatively assigned as peak areas of the subject color where the intensity of the subject color light exceeds TL 61.
- Step 85: A tentative peak area located in an area previously assigned to another color will be rejected as invalid, and will skip the assignment step of Step 86.
- Step 86: For tentative peak areas of the subject color which are not in regions previously assigned to another color, the area will be assigned to the subject color.
- Step 87: Repeat Steps 84-86 until assignment of the subject color is complete.
- Step 88: Repeat Steps 82-87 until each of the encoding colors has been tested.
- Step 89: Calculate a Color Crosstalk Matrix (CCM) for the image (or sub-image).
- Step 90: the compensation for color cross-talk can be implemented by using the inverse of the CCM, as given by

  [r', g', b']^T = CCM^(-1) · [r, g, b]^T

- r', g', b' represent the red, green, and blue colors after cross-talk compensation.
- the matrix is readily adjusted to accommodate N colors.
- the colors are represented as r, g and b by way of example.
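The compensation step can be sketched as a per-pixel matrix multiply. The CCM values below are hypothetical, chosen only so that the off-diagonal terms mimic the behavior described for Fig. 7b (e.g., strong leakage of green into the blue channel); the patent's actual matrix is measured from the system.

```python
# Sketch of Step 90: apply the inverse of a (hypothetical) Color Crosstalk
# Matrix to one pixel's measured RGB values.
import numpy as np

ccm = np.array([
    [1.00, 0.05, 0.02],   # measured red   as a mix of true r, g, b
    [0.08, 1.00, 0.06],   # measured green
    [0.03, 0.40, 1.00],   # measured blue (strong leakage from green)
])
ccm_inv = np.linalg.inv(ccm)

def compensate(rgb):
    """[r', g', b'] = CCM^-1 . [r, g, b] for one pixel."""
    return ccm_inv @ np.asarray(rgb, dtype=float)

# A pixel on a pure green line: the raw blue reading (40) is a false peak
# like peak 71 in Fig. 7b.
raw = [5.0, 100.0, 40.0]
r, g, b = compensate(raw)   # b is driven back toward zero
```

After compensation the false blue peak disappears, which is the effect shown in Fig. 9.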
- Fig. 9 shows the color spectrum after the color cross-talk compensation for the same color grating and same digital camera used to obtain Fig. 7(b).
- 3D imaging systems may employ bias adjusted centroid peak detection to determine the centers of light patterns, as shown in Fig. 10 and described below.
- Step 1: scanning along data rows which are generally perpendicular to the structured image lines, find start point 102 and end point 104 locations of the intensity profile 100 of each region assigned to a particular color. This step is preferably performed region by region, and is the same as steps 81-88 of Fig. 8.
- steps 81-88 are presently preferred to be performed as a separate step, upon data which has already been adjusted by color compensation, as described above.
- start 102 and end 104 points are in reality discrete pixel data; though both are above threshold level TL, one will be higher than the other.
- Step 3: refine the estimated center of the image, interpolating between pixels and calculating the refined center (RC) of each line using a bias adjusted centroid method, given by

  RC = [ sum from x = start to end of x · (I(x) − I_bias) ] / [ sum from x = start to end of (I(x) − I_bias) ]

  where I(x) is the detected intensity at pixel x and I_bias is a bias level determined from the intensities at start point 102 and end point 104.
- Step 4: repeat steps 1-3 for each row of data in each color.
- the average error of RC is about 0.2 pixel, about 1/2 of the error of C (without bias adjusted centroid detection). This, in turn, will double the accuracy of 3D imaging based upon RC.
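A sketch of the bias-adjusted centroid follows. The exact bias the patent subtracts is not fully specified in this excerpt; this sketch assumes it is the larger of the intensities at the start and end points, so that an asymmetric "shoulder" on one side of the peak no longer drags the centroid toward the higher side.

```python
# Sketch of Steps 1-3: refined center RC of one light line along a data row,
# using a bias-subtracted centroid.  The choice of bias is an assumption.
def refined_center(intensity, start, end):
    """Refined center RC of the region intensity[start..end]."""
    bias = max(intensity[start], intensity[end])  # assumed bias level
    num = den = 0.0
    for x in range(start, end + 1):
        w = max(intensity[x] - bias, 0.0)  # clamp so weights stay non-negative
        num += x * w
        den += w
    if den == 0.0:
        return 0.5 * (start + end)         # flat region: fall back to midpoint
    return num / den

# Symmetric peak at index 3 riding on an asymmetric shoulder at the right:
row = [0, 2, 6, 10, 6, 3, 0]
rc = refined_center(row, 1, 5)  # → 3.0 (plain centroid would be pulled to ~3.07)
```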
- Optional smoothing The location of the centers of lines determined may be smoothed, filtering the location of each point to reduce deviations from other nearby points of the line using any well-known filtering algorithm. Since the lines of the reference image should be smooth, heavy filtering is preferred and should reduce noise without impairing accuracy. With regard to structuring lines reflected from an object, excessive filtering may impair accuracy. It is preferred, therefore, to determine discontinuities along structuring lines in the reflected image, and then perform a modest filtering upon the (continuous) segments of the structuring lines between such discontinuities.
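The segment-wise smoothing preferred for object lines can be sketched as follows. The jump threshold used to declare a discontinuity and the 3-point moving average are hypothetical design choices standing in for "any well-known filtering algorithm."

```python
# Sketch: split a line's center locations at discontinuities, then apply a
# modest 3-point moving average within each continuous segment only, so
# real depth steps are not smeared.
def smooth_line(centers, jump=2.0):
    segments, seg = [], [centers[0]]
    for c in centers[1:]:
        if abs(c - seg[-1]) > jump:      # discontinuity: close the segment
            segments.append(seg)
            seg = []
        seg.append(c)
    segments.append(seg)

    out = []
    for seg in segments:
        for i, _ in enumerate(seg):      # modest 3-point moving average
            lo, hi = max(i - 1, 0), min(i + 1, len(seg) - 1)
            out.append(sum(seg[lo:hi + 1]) / (hi - lo + 1))
    return out

line = [10.0, 10.2, 10.1, 25.0, 25.3, 25.1]  # one real depth step at index 3
smoothed = smooth_line(line)                 # step preserved, noise reduced
```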
- a line-by-line calibration procedure may be employed in imaging systems embodying the present invention to compensate system modeling error, the aberration of both projector and camera imaging lenses, and defocusing effects due to object depth variation, as enumerated and detailed below:
- Three-dimensional data is often deduced from comparison of (a) object-reflected structured image point locations to (b) the theoretically expected locations of structured light image points reflected from a reference plane.
- the theoretical expected location generally is calculated upon assumptions, including that the structured light image is perfectly created as intended, perfectly projected, and accurately captured by the camera. Due to manufacturing tolerances of the structured light source and of the camera receptors, and to lens aberrations such as coma and chromatic distortion, each of these assumptions is mistaken to some degree in the real world, introducing a variety of errors which shift the actual location of the structured line from its theoretical location.
- optical axis 121 is the z axis, and is perpendicular to reference plane 123.
- Baseline 124 is parallel to reference plane 123, and is defined by the respective centers 126 and 125 of the principal planes of the lenses of source 16 and camera 44.
- the x axis direction is defined parallel to baseline 124
- y-axis 122 is defined to be perpendicular to the plane of baseline 124 and optical axis 121.
- D is the distance between points 126 and 125, the respective centers of structured light source 16 and digital camera 44.
- L is the distance between baseline 124 and reference plane 123, a white surface.
- Point 130 is seen by camera 44 as a point having an x-value, translated to reference plane 123, of x_c 128.
- the structured light line crossing point 130 is known from a recorded image of reference plane 123 to have an x-value x_p 127.
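From this geometry the depth follows by similar triangles. The relation below is the standard structured-light triangulation for the Fig. 12 configuration, sketched here rather than quoted from the patent; the numeric values are hypothetical.

```python
# Sketch: height z of object point 130 above reference plane 123.  The
# triangle formed by point 130 and the two reference-plane intersections
# (base d = |x_c - x_p|, height z) is similar to the triangle formed by
# point 130 and the two lens centers (base D, height L - z):
#     d / z = D / (L - z)   =>   z = L * d / (D + d)
def depth_from_shift(x_c, x_p, d_baseline, l_ref):
    """Depth z from the observed shift on the reference plane (same units)."""
    d = abs(x_c - x_p)
    return l_ref * d / (d_baseline + d)

# Hypothetical numbers: baseline D = 200 mm, plane distance L = 1000 mm,
# observed shift of 50 mm on the reference plane.
z = depth_from_shift(x_c=150.0, x_p=100.0, d_baseline=200.0, l_ref=1000.0)
# z = 1000 * 50 / 250 = 200 mm
```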
- Fig. 11 is a flow chart of general data processing steps using line-by-line calibration and some of the other accuracy enhancements taught herein.
- Step 112: using a device arranged in a known configuration such as shown in Fig. 12, capture a reference image using perfect white reference plane 123, then digitize if necessary and output the digital image data to a processing system.
- Step 113: capture an object image using the same system configuration as in Step 112.
- Step 114: perform cross-talk compensation on both object and reference data.
- Step 115: perform bias adjusted centroid peak detection to refine the location of center peaks for lines in the object, and optionally smooth the lines along segments between determined discontinuities.
- Step 118: Use center-weighted averaging of the height determined by each of three adjacent structured light lines at that y-location. Do not perform averaging if the adjacent structured light lines are determined to be discontinuous. Adjacent lines are considered discontinuous if the difference between the heights determined by any 2 of the three lines exceeds a threshold.
- the threshold is a design choice, but is preferably about 2 mm in the setup shown in Fig. 12.
- the fluctuation error of 3D profiles may be reduced by averaging the z value determined for points which are near each other.
- the present invention prefers to perform center weighted averaging of nearby points around the image point being adjusted.
- weighted averaging is not performed at all points.
- the threshold is a design variable, but preferably is about 2 mm in the setup of Fig. 12.
- This weighted technique improves accuracy by 0.3/0.1 ≈ 3-fold, substantially better than the ≈ 1.73-fold improvement resulting from conventional 3-point averaging.
- This technique is effective for all embodiments of the grating according to the present invention. However, since the errors between different colors tend to be independent, while errors between same colored lines may be partly coherent, it is believed that this weighted averaging technique will be more effective for adjacent different colored lines than for adjacent same-colored lines.
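The center-weighted averaging of Step 118 can be sketched as follows. The actual weights are not given in this excerpt; a 1/4, 1/2, 1/4 split is assumed here, and the ~2 mm discontinuity threshold follows the Fig. 12 setup described above.

```python
# Sketch: smoothed z at one y-location from three adjacent structured
# light lines, skipping the average entirely when any two of the three
# heights disagree by more than the discontinuity threshold.
def center_weighted_height(z_left, z_center, z_right, threshold=2.0):
    """Center-weighted average of three adjacent line heights (mm)."""
    zs = (z_left, z_center, z_right)
    for a in zs:
        for b in zs:
            if abs(a - b) > threshold:   # lines discontinuous: keep raw value
                return z_center
    return 0.25 * z_left + 0.5 * z_center + 0.25 * z_right

print(center_weighted_height(10.0, 10.4, 10.2))   # averaged: 10.25
print(center_weighted_height(10.0, 10.4, 30.0))   # discontinuity: 10.4 kept
```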
- the principles of the present invention permit accurate three dimensional imaging and measurement of an object using a single camera in a known spatial relationship to a single structured light source, taking a single image of the object reflecting the structured illumination.
- the light may be provided by a flash, which due to its very short duration substantially freezes motion, permitting profile imaging on moving objects.
- One preferred embodiment utilizes a standard commercial photographic flash unit as an illumination source, the flash unit separated from but synchronized with a standard digital camera. By simply placing in front of the flash unit an optical unit including a grating as described above, and one or more focussing lenses, the flash will cause structured light to be projected upon an object.
- a relative 3D image may be obtained by processing only the data obtained from the digital camera image. If the orientation measurements of the structured flash, camera and object are known, then absolute 3D information and measurements of the image can be obtained.
- the invention can use film cameras, but the structured-light-illuminated image must then be digitized to complete enhancements and 3D image determination.
Object reconstruction in full color

- Another preferred embodiment of the invention permits true-color reconstruction of an object, such as a human face. Using two separate exposures, one under structured light and one under white light, a 3D image and a two-dimensional color profile are obtained separately at almost the same moment. A model is constructed from the 3D profile information, and the color profile of the original object is projected onto the model. Alignment of the color profile and model is accomplished by matching features of the object and of the image.
- the two images are preferably close in time to each other.
- the structured illumination for one image may be color encoded or not, and is directed at the object from a baseline distance away from the camera, as described in detail above.
- the data therefrom is processed to obtain a 3D profile of the object.
- the unstructured white light image is taken either before or after the other image, preferably capturing a color profile of substantially the same image from substantially the same perspective as the other image.
- the unstructured light source may be simply a built-in flash of the camera, but it may also include lighting of the object from more than one direction to reduce shadows.
- the two images are taken at nearly the same time, and are short duration exposures synchronized with flash illumination sources, so as to permit 3D color image reconstruction of even living or moving objects.
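The projection of the white-light color profile onto the 3D model, described above, can be pictured with a simple pinhole-camera mapping. This is only an illustrative sketch under assumed conditions: the pinhole model, the focal length in pixels, and the principal point (cx, cy) are assumptions for illustration, not the patent's feature-matching alignment method.

```python
def texture_points(points_3d, color_image, focal_px, cx, cy):
    """Project each (x, y, z) camera-space point into the white-light color
    image and attach the sampled RGB value to the point, assuming the two
    exposures share substantially the same perspective.  Points projecting
    outside the image bounds are dropped."""
    h, w = len(color_image), len(color_image[0])
    textured = []
    for x, y, z in points_3d:
        u = int(round(focal_px * x / z + cx))   # image column
        v = int(round(focal_px * y / z + cy))   # image row
        if 0 <= u < w and 0 <= v < h:
            textured.append(((x, y, z), color_image[v][u]))
    return textured
```

A point on the optical axis, for example, lands at the principal point and picks up the RGB value stored there; points outside the camera's field of view receive no color.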
- This can be accomplished by using two cameras.
- Two cameras of the same type (e.g., Kodak DC 260) may be used.
- the first camera drives one flash, and the flash control signal from the first camera is also input to a circuit which, in response, sends an exposure initiate signal to the second camera after a delay.
- the second camera in turn controls the second flash.
- the delay between the flashes is preferably about 20-30 ms, to allow for flash and shutter duration and timing jitter, but this will be a design choice depending on the expected movement of the object and upon the characteristics of the chosen cameras.
- a single camera having "burst" mode operation may be used.
- In burst mode, a single camera (e.g. Fuji DS 300) takes a plurality of exposures closely spaced in time. Presently, the time spacing may be as little as 100 ms.
- a means must be added to control the two separate flash units.
- the preferred means is to connect the flash control signal from the camera to an electronic switching circuit which directs the signal first to one flash unit and then to the other; many suitable circuits will be apparent to those skilled in the electronic arts.
- This embodiment presently has two advantages, due to using a single camera: setup is simpler, and the orientation of the camera is identical for the two shots. However, the period between exposures is longer, and the cost of cameras having burst mode is presently much higher than for other types, so this is not the most preferred embodiment at this time.
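The flash-steering behavior described for the burst-mode embodiment can be sketched as a simple alternating switch. This toy model is an illustration only; the actual switching means is an electronic design choice, as the text notes.

```python
class FlashSwitch:
    """Toy model of the switching circuit: each flash-control pulse from
    the single camera is routed alternately to the structured-light flash
    unit and then to the white-light flash unit."""

    def __init__(self):
        self._next = "structured"

    def trigger(self):
        """Route one flash-control pulse and report which unit fired."""
        fired = self._next
        self._next = "white" if fired == "structured" else "structured"
        return fired
```

The first burst exposure thus fires the structured-light flash and the second fires the white-light flash, giving the two images needed for full-color reconstruction from one camera position.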
- Principles of the present invention for accurately determining the 3D profile of an image include structured light grating opaque areas, color compensation, bias adjusted centroid detection, calibration, filtering and weighted averaging of determined heights. All of these principles work together to enable one to produce high accuracy 3D profiles of even a living or moving object with virtually any commercially available image-capturing device, particularly an off-the-shelf digital camera, and any compatible, separate commercially available flash (or other lighting) unit. To these off-the-shelf items, only a light structuring optical device and an image data processing program need be added to implement this embodiment of the invention. However, some aspects of the present invention may be employed while omitting others to produce still good-quality 3D images under many circumstances. Accordingly, the invention is conceived to encompass use of less than all of the disclosed principles.
- a preferred embodiment of the present invention may be practiced and tested as follows.
- a Kodak DC 260 digital camera is used.
- a well-defined triangular object is imaged.
- the object is 25 mm in height, 25 mm thick, and 125 mm long.
- Although the DC 260 camera has 1536 x 1024 pixels, the test object occupies only 600 x 570 pixels, due to limitations of the camera zoom lens.
- the structured light source employs opaque areas between each of 3 separate colored bars in a repeating pattern, as described above.
- Fig. 13a shows a one-dimensional scanned profile derived from the basic test setup. The worst-case range error is about 1.5 mm, so the relative error is about 1.5/25 = 6%, an accuracy comparable to that reported by other groups using a single encoded color frame. The theoretical range accuracy Δz based upon camera resolution can be estimated by simply differentiating Eq. (6).
- Fig. 13b shows the result of processing the image data with cross-talk compensation as described above, which reduces the measured error to about 0.8 mm, an approximately 1.9-fold improvement in range accuracy.
- Fig. 13c shows the result of adding line-by-line reference plane calibration, as discussed above, to the color compensated data.
- the maximum measured error is reduced to about 0.5 mm, a relative improvement of about 1.6-fold, for about 3-fold total improvement in accuracy.
- Fig. 13d shows the calculated profile after adding bias adjusted centroid peak detection to the cross-talk compensated and line-by-line calibrated data.
- the maximum error is reduced to about 0.25 mm, a 2-fold relative improvement and an approximately 6-fold total improvement.
- the measured accuracy now exceeds the error estimate based on camera resolution by 2-fold.
- Fig. 13e shows the calculated profile after weighted averaging. Averaging the data between adjacent lines artificially improves the effective camera resolution, and weighted averaging as taught above improves it more than uniform averaging does. Accordingly, the resolution is not limited by the 0.5-pixel basic camera resolution.
- the maximum error is now reduced to 0.1 mm, about a 2.5-fold relative improvement, and an overall 15-fold (1.5 mm / 0.1 mm) improvement in accuracy.
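The improvement chain reported for Figs. 13a through 13e is easy to verify arithmetically; the worst-case error values below are taken directly from the text.

```python
# Worst-case range error (mm) after each processing stage, per the text.
errors_mm = [
    ("raw profile, Fig. 13a", 1.5),
    ("cross-talk compensation, Fig. 13b", 0.8),
    ("+ line-by-line calibration, Fig. 13c", 0.5),
    ("+ bias-adjusted centroid, Fig. 13d", 0.25),
    ("+ weighted averaging, Fig. 13e", 0.1),
]

# Relative improvement of each stage over the previous one, and the
# overall improvement of the final profile over the raw one.
for (_, prev), (name, cur) in zip(errors_mm, errors_mm[1:]):
    print(f"{name}: {prev / cur:.2f}-fold over previous stage")
overall = errors_mm[0][1] / errors_mm[-1][1]
print(f"overall: {overall:.0f}-fold")   # 1.5 mm / 0.1 mm = 15-fold
```

The stage-by-stage ratios (about 1.9, 1.6, 2.0, and 2.5) and the 15-fold overall factor match the figures quoted in the text.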
- Fig. 14a shows a human face
- Fig. 14b shows the face as illuminated by structured light in three colors
- Fig. 14c shows the reconstructed profile of the face after processing as described above.
- a cross-section of the height profile is shown in Fig. 14d.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002373284A CA2373284A1 (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
PCT/US1999/010756 WO2000070303A1 (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
JP2000618688A JP2002544510A (en) | 1999-05-14 | 1999-05-14 | Color structured optical 3D imaging system |
EP99923100A EP1190213A1 (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
CNB998166340A CN1159566C (en) | 1999-05-14 | 1999-05-14 | 3D-imaging system |
AU39947/99A AU3994799A (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US1999/010756 WO2000070303A1 (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000070303A1 true WO2000070303A1 (en) | 2000-11-23 |
Family
ID=22272767
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1999/010756 WO2000070303A1 (en) | 1999-05-14 | 1999-05-14 | Color structured light 3d-imaging system |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP1190213A1 (en) |
JP (1) | JP2002544510A (en) |
CN (1) | CN1159566C (en) |
AU (1) | AU3994799A (en) |
CA (1) | CA2373284A1 (en) |
WO (1) | WO2000070303A1 (en) |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1217328A1 (en) * | 2000-12-20 | 2002-06-26 | Olympus Optical Co., Ltd. | 3d image acquisition apparatus and 3d image acquisition method |
WO2002059545A1 (en) * | 2001-01-26 | 2002-08-01 | Genex Technologies, Inc. | Three-dimensional surface profile imaging method and apparatus using single spectral light condition |
WO2002066925A1 (en) * | 2001-02-21 | 2002-08-29 | A4 Vision S.A. | Device for contactless inspection of linear dimensions of three-dimensional objects |
WO2002075244A1 (en) * | 2001-03-19 | 2002-09-26 | A4 Vision Sa | Method for noncontact inspection of dimensions of three-dimensional objects |
WO2004114243A1 (en) * | 2003-06-19 | 2004-12-29 | Honeywell International Inc. | Method and apparatus for detecting objects using structured light patterns |
WO2006031143A1 (en) | 2004-08-12 | 2006-03-23 | A4 Vision S.A. | Device for contactlessly controlling the surface profile of objects |
US7146036B2 (en) | 2003-02-03 | 2006-12-05 | Hewlett-Packard Development Company, L.P. | Multiframe correspondence estimation |
US7174033B2 (en) | 2002-05-22 | 2007-02-06 | A4Vision | Methods and systems for detecting and recognizing an object based on 3D image data |
EP1788348A1 (en) * | 2004-08-12 | 2007-05-23 | A4 Vision S.A. | Device for biometrically controlling a face surface |
US7257236B2 (en) | 2002-05-22 | 2007-08-14 | A4Vision | Methods and systems for detecting and recognizing objects in a controlled wide area |
DE10250954B4 (en) * | 2002-10-26 | 2007-10-18 | Carl Zeiss | Method and device for carrying out a televisite and televisite receiving device |
US7415151B2 (en) | 2003-06-20 | 2008-08-19 | Industrial Technology Research Institute | 3D color information acquisition method and 3D color information acquisition device |
US7495758B2 (en) | 2006-09-06 | 2009-02-24 | Theo Boeing Company | Apparatus and methods for two-dimensional and three-dimensional inspection of a workpiece |
US7646896B2 (en) | 2005-08-02 | 2010-01-12 | A4Vision | Apparatus and method for performing enrollment of user biometric information |
US8050486B2 (en) | 2006-05-16 | 2011-11-01 | The Boeing Company | System and method for identifying a feature of a workpiece |
WO2014016001A1 (en) * | 2012-07-25 | 2014-01-30 | Siemens Aktiengesellschaft | Colour coding for 3d measurement, more particularly for transparent scattering surfaces |
CN103697815A (en) * | 2014-01-15 | 2014-04-02 | 西安电子科技大学 | Method for acquiring three-dimensional information of frequency mixing structured light based on phase encoding |
WO2014074003A1 (en) | 2012-11-07 | 2014-05-15 | Артек Европа С.А.Р.Л. | Method for monitoring linear dimensions of three-dimensional objects |
US9052294B2 (en) | 2006-05-31 | 2015-06-09 | The Boeing Company | Method and system for two-dimensional and three-dimensional inspection of a workpiece |
US9147253B2 (en) | 2010-03-17 | 2015-09-29 | Microsoft Technology Licensing, Llc | Raster scanning for depth detection |
WO2015185608A1 (en) | 2014-06-05 | 2015-12-10 | BSH Hausgeräte GmbH | Cooking device with light pattern projector and camera |
WO2016137351A1 (en) * | 2015-02-25 | 2016-09-01 | Андрей Владимирович КЛИМОВ | Method and device for the 3d registration and recognition of a human face |
CN108693538A (en) * | 2017-04-07 | 2018-10-23 | 北京雷动云合智能技术有限公司 | Accurate confidence level depth camera range unit based on binocular structure light and method |
CN109348607A (en) * | 2018-10-16 | 2019-02-15 | 华为技术有限公司 | A kind of illuminating module bracket and terminal device |
US10728519B2 (en) | 2004-06-17 | 2020-07-28 | Align Technology, Inc. | Method and apparatus for colour imaging a three-dimensional structure |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100417976C (en) * | 2002-09-15 | 2008-09-10 | 深圳市泛友科技有限公司 | Three-dimensional photographic technology |
CN100387065C (en) * | 2003-07-07 | 2008-05-07 | 财团法人工业技术研究院 | Three-dimensional color information acquisition method and device therefor |
JP3831946B2 (en) * | 2003-09-26 | 2006-10-11 | ソニー株式会社 | Imaging device |
JP2005164434A (en) * | 2003-12-03 | 2005-06-23 | Fukuoka Institute Of Technology | Noncontact three-dimensional measuring method and apparatus |
DE102004007829B4 (en) * | 2004-02-18 | 2007-04-05 | Isra Vision Systems Ag | Method for determining areas to be inspected |
WO2007105205A2 (en) * | 2006-03-14 | 2007-09-20 | Prime Sense Ltd. | Three-dimensional sensing using speckle patterns |
JP2008170281A (en) * | 2007-01-11 | 2008-07-24 | Nikon Corp | Shape measuring device and shape measuring method |
DK2401575T3 (en) * | 2009-02-25 | 2020-03-30 | Dental Imaging Technologies Corp | Method and apparatus for generating a display of a three-dimensional surface |
DE102010064593A1 (en) * | 2009-05-21 | 2015-07-30 | Koh Young Technology Inc. | Form measuring device and method |
JP5633719B2 (en) * | 2009-09-18 | 2014-12-03 | 学校法人福岡工業大学 | 3D information measuring apparatus and 3D information measuring method |
CN102022981B (en) * | 2009-09-22 | 2013-04-03 | 重庆工商大学 | Peak-valley motion detection method and device for measuring sub-pixel displacement |
CN102052900B (en) * | 2009-11-02 | 2013-09-25 | 重庆工商大学 | Peak valley motion detection method and device for quickly measuring sub-pixel displacement |
US20110187878A1 (en) * | 2010-02-02 | 2011-08-04 | Primesense Ltd. | Synchronization of projected illumination with rolling shutter of image sensor |
CN101975994B (en) * | 2010-08-27 | 2012-03-28 | 中国科学院自动化研究所 | Three-dimensional imaging system of multi-stage lens |
US20120062725A1 (en) * | 2010-09-10 | 2012-03-15 | Gm Global Technology Operations, Inc. | System for error-proofing manual assembly operations using machine vision |
CN102161291B (en) * | 2010-12-08 | 2013-03-27 | 合肥中加激光技术有限公司 | Three-dimensional imaging crystal internally carving pavilion |
TW201315962A (en) * | 2011-10-05 | 2013-04-16 | Au Optronics Corp | Projection image recognition apparatus and method thereof |
CN102628693A (en) * | 2012-04-16 | 2012-08-08 | 中国航空无线电电子研究所 | Method for registering camera spindle and laser beam in parallel |
US10368053B2 (en) | 2012-11-14 | 2019-07-30 | Qualcomm Incorporated | Structured light active depth sensing systems combining multiple images to compensate for differences in reflectivity and/or absorption |
EP2799810A1 (en) * | 2013-04-30 | 2014-11-05 | Aimess Services GmbH | Apparatus and method for simultaneous three-dimensional measuring of surfaces with multiple wavelengths |
TWI464367B (en) * | 2013-07-23 | 2014-12-11 | Univ Nat Chiao Tung | Active image acquisition system and method |
CN104243843B (en) | 2014-09-30 | 2017-11-03 | 北京智谷睿拓技术服务有限公司 | Pickup light shines compensation method, compensation device and user equipment |
WO2016100933A1 (en) | 2014-12-18 | 2016-06-23 | Oculus Vr, Llc | System, device and method for providing user interface for a virtual reality environment |
CN104809940B (en) * | 2015-05-14 | 2018-01-26 | 广东小天才科技有限公司 | Geometry stereographic projection device and projecting method |
CN105157613A (en) * | 2015-06-03 | 2015-12-16 | 五邑大学 | Three-dimensional fast measurement method utilizing colored structured light |
CN105021138B (en) * | 2015-07-15 | 2017-11-07 | 沈阳派特模式识别技术有限公司 | 3-D scanning microscope and fringe projection 3-D scanning method |
CN106403838A (en) * | 2015-07-31 | 2017-02-15 | 北京航天计量测试技术研究所 | Field calibration method for hand-held line-structured light optical 3D scanner |
TWI550253B (en) * | 2015-08-28 | 2016-09-21 | 國立中正大學 | Three-dimensional image scanning device and scanning method thereof |
CN105300319B (en) * | 2015-11-20 | 2017-11-07 | 华南理工大学 | A kind of quick three-dimensional stereo reconstruction method based on chromatic grating |
US10884127B2 (en) * | 2016-08-02 | 2021-01-05 | Samsung Electronics Co., Ltd. | System and method for stereo triangulation |
CN108732066A (en) * | 2017-04-24 | 2018-11-02 | 河北工业大学 | A kind of Contact-angle measurement system |
KR101931773B1 (en) | 2017-07-18 | 2018-12-21 | 한양대학교 산학협력단 | Method for shape modeling, device and system using the same |
CN109855559B (en) * | 2018-12-27 | 2020-08-04 | 成都市众智三维科技有限公司 | Full-space calibration system and method |
TWI763206B (en) | 2020-12-25 | 2022-05-01 | 宏碁股份有限公司 | Display driving device and operation method thereof |
CN113654487B (en) * | 2021-08-17 | 2023-07-18 | 西安交通大学 | Dynamic three-dimensional measurement method and system for single color fringe pattern |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4349277A (en) * | 1980-06-11 | 1982-09-14 | General Electric Company | Non-contact measurement of surface profile |
EP0076866A1 (en) * | 1981-10-09 | 1983-04-20 | Ibm Deutschland Gmbh | Interpolating light section process |
US5615003A (en) * | 1994-11-29 | 1997-03-25 | Hermary; Alexander T. | Electromagnetic profile scanner |
US5640962A (en) * | 1993-01-21 | 1997-06-24 | Technomed Gesellschaft fur Med. und Med. Techn. Systeme mbH | Process and device for determining the topography of a reflecting surface |
1999
- 1999-05-14 CA CA002373284A patent/CA2373284A1/en not_active Abandoned
- 1999-05-14 CN CNB998166340A patent/CN1159566C/en not_active Expired - Fee Related
- 1999-05-14 JP JP2000618688A patent/JP2002544510A/en active Pending
- 1999-05-14 EP EP99923100A patent/EP1190213A1/en not_active Withdrawn
- 1999-05-14 AU AU39947/99A patent/AU3994799A/en not_active Abandoned
- 1999-05-14 WO PCT/US1999/010756 patent/WO2000070303A1/en not_active Application Discontinuation
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4349277A (en) * | 1980-06-11 | 1982-09-14 | General Electric Company | Non-contact measurement of surface profile |
EP0076866A1 (en) * | 1981-10-09 | 1983-04-20 | Ibm Deutschland Gmbh | Interpolating light section process |
US5640962A (en) * | 1993-01-21 | 1997-06-24 | Technomed Gesellschaft fur Med. und Med. Techn. Systeme mbH | Process and device for determining the topography of a reflecting surface |
US5615003A (en) * | 1994-11-29 | 1997-03-25 | Hermary; Alexander T. | Electromagnetic profile scanner |
Non-Patent Citations (2)
Title |
---|
TAJIMA J ET AL: "3-D DATA ACQUISITION BY RAINBOW RANGE FINDER", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, ATLANTIC CITY, JUNE 16 - 21, 1990 CONFERENCE A: COMPUTER VISION AND CONFERENCE B: PATTERN RECOGNITI SYSTEMS AND APPLICATIONS, vol. 1, no. CONF. 10, 16 June 1990 (1990-06-16), INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, pages 309 - 313, XP000166402, ISBN: 0-8186-2062-5 * |
WUST C ET AL: "Surface profile measurement using color fringe projection", MACHINE VISION AND APPLICATIONS, SUMMER 1991, USA, vol. 4, no. 3, pages 193 - 203, XP002114480, ISSN: 0932-8092 * |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6556706B1 (en) | 2000-01-28 | 2003-04-29 | Z. Jason Geng | Three-dimensional surface profile imaging method and apparatus using single spectral light condition |
EP1217328A1 (en) * | 2000-12-20 | 2002-06-26 | Olympus Optical Co., Ltd. | 3d image acquisition apparatus and 3d image acquisition method |
US7176440B2 (en) | 2001-01-19 | 2007-02-13 | Honeywell International Inc. | Method and apparatus for detecting objects using structured light patterns |
WO2002059545A1 (en) * | 2001-01-26 | 2002-08-01 | Genex Technologies, Inc. | Three-dimensional surface profile imaging method and apparatus using single spectral light condition |
WO2002066925A1 (en) * | 2001-02-21 | 2002-08-29 | A4 Vision S.A. | Device for contactless inspection of linear dimensions of three-dimensional objects |
WO2002075244A1 (en) * | 2001-03-19 | 2002-09-26 | A4 Vision Sa | Method for noncontact inspection of dimensions of three-dimensional objects |
US7174033B2 (en) | 2002-05-22 | 2007-02-06 | A4Vision | Methods and systems for detecting and recognizing an object based on 3D image data |
US7257236B2 (en) | 2002-05-22 | 2007-08-14 | A4Vision | Methods and systems for detecting and recognizing objects in a controlled wide area |
DE10250954B4 (en) * | 2002-10-26 | 2007-10-18 | Carl Zeiss | Method and device for carrying out a televisite and televisite receiving device |
US7146036B2 (en) | 2003-02-03 | 2006-12-05 | Hewlett-Packard Development Company, L.P. | Multiframe correspondence estimation |
WO2004114243A1 (en) * | 2003-06-19 | 2004-12-29 | Honeywell International Inc. | Method and apparatus for detecting objects using structured light patterns |
US7415151B2 (en) | 2003-06-20 | 2008-08-19 | Industrial Technology Research Institute | 3D color information acquisition method and 3D color information acquisition device |
US10944953B2 (en) | 2004-06-17 | 2021-03-09 | Align Technology, Inc. | Method and apparatus for colour imaging a three-dimensional structure |
US10924720B2 (en) | 2004-06-17 | 2021-02-16 | Align Technology, Inc. | Systems and methods for determining surface topology and associated color of an intraoral structure |
US10812773B2 (en) | 2004-06-17 | 2020-10-20 | Align Technology, Inc. | Method and apparatus for colour imaging a three-dimensional structure |
US10764557B2 (en) | 2004-06-17 | 2020-09-01 | Align Technology, Inc. | Method and apparatus for imaging a three-dimensional structure |
US10750152B2 (en) | 2004-06-17 | 2020-08-18 | Align Technology, Inc. | Method and apparatus for structure imaging a three-dimensional structure |
US10750151B2 (en) | 2004-06-17 | 2020-08-18 | Align Technology, Inc. | Method and apparatus for colour imaging a three-dimensional structure |
US10728519B2 (en) | 2004-06-17 | 2020-07-28 | Align Technology, Inc. | Method and apparatus for colour imaging a three-dimensional structure |
US8238661B2 (en) | 2004-08-12 | 2012-08-07 | Bioscrypt, Inc. | Device for contactlessly controlling the surface profile of objects |
AU2005285558B2 (en) * | 2004-08-12 | 2012-02-02 | A4 Vision S.A | Device for biometrically controlling a face surface |
WO2006031143A1 (en) | 2004-08-12 | 2006-03-23 | A4 Vision S.A. | Device for contactlessly controlling the surface profile of objects |
EP1788348A1 (en) * | 2004-08-12 | 2007-05-23 | A4 Vision S.A. | Device for biometrically controlling a face surface |
EP1788348A4 (en) * | 2004-08-12 | 2008-10-22 | A4 Vision S A | Device for biometrically controlling a face surface |
AU2005285558C1 (en) * | 2004-08-12 | 2012-05-24 | A4 Vision S.A | Device for biometrically controlling a face surface |
US9117107B2 (en) | 2004-08-12 | 2015-08-25 | Bioscrypt, Inc. | Device for biometrically controlling a face surface |
US7646896B2 (en) | 2005-08-02 | 2010-01-12 | A4Vision | Apparatus and method for performing enrollment of user biometric information |
US8050486B2 (en) | 2006-05-16 | 2011-11-01 | The Boeing Company | System and method for identifying a feature of a workpiece |
US9052294B2 (en) | 2006-05-31 | 2015-06-09 | The Boeing Company | Method and system for two-dimensional and three-dimensional inspection of a workpiece |
US7495758B2 (en) | 2006-09-06 | 2009-02-24 | Theo Boeing Company | Apparatus and methods for two-dimensional and three-dimensional inspection of a workpiece |
US9147253B2 (en) | 2010-03-17 | 2015-09-29 | Microsoft Technology Licensing, Llc | Raster scanning for depth detection |
WO2014016001A1 (en) * | 2012-07-25 | 2014-01-30 | Siemens Aktiengesellschaft | Colour coding for 3d measurement, more particularly for transparent scattering surfaces |
US9404741B2 (en) | 2012-07-25 | 2016-08-02 | Siemens Aktiengesellschaft | Color coding for 3D measurement, more particularly for transparent scattering surfaces |
WO2014074003A1 (en) | 2012-11-07 | 2014-05-15 | Артек Европа С.А.Р.Л. | Method for monitoring linear dimensions of three-dimensional objects |
CN103697815A (en) * | 2014-01-15 | 2014-04-02 | 西安电子科技大学 | Method for acquiring three-dimensional information of frequency mixing structured light based on phase encoding |
CN103697815B (en) * | 2014-01-15 | 2017-03-01 | 西安电子科技大学 | Mixing structural light three-dimensional information getting method based on phase code |
US10228145B2 (en) | 2014-06-05 | 2019-03-12 | BSH Hausgeräte GmbH | Cooking device with light pattern projector and camera |
DE102014210672A1 (en) | 2014-06-05 | 2015-12-17 | BSH Hausgeräte GmbH | Cooking device with light pattern projector and camera |
WO2015185608A1 (en) | 2014-06-05 | 2015-12-10 | BSH Hausgeräte GmbH | Cooking device with light pattern projector and camera |
WO2016137351A1 (en) * | 2015-02-25 | 2016-09-01 | Андрей Владимирович КЛИМОВ | Method and device for the 3d registration and recognition of a human face |
CN108693538A (en) * | 2017-04-07 | 2018-10-23 | 北京雷动云合智能技术有限公司 | Accurate confidence level depth camera range unit based on binocular structure light and method |
CN109348607A (en) * | 2018-10-16 | 2019-02-15 | 华为技术有限公司 | A kind of illuminating module bracket and terminal device |
Also Published As
Publication number | Publication date |
---|---|
EP1190213A1 (en) | 2002-03-27 |
CA2373284A1 (en) | 2000-11-23 |
CN1159566C (en) | 2004-07-28 |
JP2002544510A (en) | 2002-12-24 |
AU3994799A (en) | 2000-12-05 |
CN1350633A (en) | 2002-05-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000070303A1 (en) | Color structured light 3d-imaging system | |
TW385360B (en) | 3D imaging system | |
US6341016B1 (en) | Method and apparatus for measuring three-dimensional shape of object | |
Tajima et al. | 3-D data acquisition by rainbow range finder | |
US6549288B1 (en) | Structured-light, triangulation-based three-dimensional digitizer | |
JP3884321B2 (en) | 3D information acquisition apparatus, projection pattern in 3D information acquisition, and 3D information acquisition method | |
JP3519698B2 (en) | 3D shape measurement method | |
US4842411A (en) | Method of automatically measuring the shape of a continuous surface | |
JP4040825B2 (en) | Image capturing apparatus and distance measuring method | |
CA2369710C (en) | Method and apparatus for high resolution 3d scanning of objects having voids | |
KR20100017234A (en) | Single-lens,3-d imaging device using a polarization coded aperture mask combined with a polarization sensitive sensor | |
WO1997035439A1 (en) | System and method for rapid shape digitizing and adaptive mesh generation | |
US6765606B1 (en) | Three dimension imaging by dual wavelength triangulation | |
CA2017518A1 (en) | Colour-range imaging | |
JP3482990B2 (en) | 3D image capturing device | |
US11212508B2 (en) | Imaging unit and system for obtaining a three-dimensional image | |
JP2714152B2 (en) | Object shape measurement method | |
JP4516590B2 (en) | Image capturing apparatus and distance measuring method | |
CN109242895B (en) | Self-adaptive depth constraint method based on real-time three-dimensional measurement of multi-camera system | |
JP3912666B2 (en) | Optical shape measuring device | |
JP2001304821A (en) | Image pickup apparatus and range measuring method | |
JP4204746B2 (en) | Information acquisition method, imaging apparatus, and image processing apparatus | |
JP4141627B2 (en) | Information acquisition method, image capturing apparatus, and image processing apparatus | |
JPH0723684Y2 (en) | Range finder | |
JP2006058092A (en) | Three-dimensional shape measuring device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 99816634.0 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
ENP | Entry into the national phase |
Ref document number: 2000 618688 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1999923100 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2373284 Country of ref document: CA Ref document number: 2373284 Country of ref document: CA Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWP | Wipo information: published in national office |
Ref document number: 1999923100 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1999923100 Country of ref document: EP |