WO2005031253A1 - Three-dimensional shape detection device, imaging device, and three-dimensional shape detection program - Google Patents
- Publication number
- WO2005031253A1 (PCT/JP2004/014167; JP2004014167W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern light
- pixel
- parameter
- projection image
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
Definitions
- 3D shape detection device, imaging device, and 3D shape detection program
- the present invention relates to a three-dimensional shape detection device that detects a three-dimensional shape of a target object using pattern light, an imaging device, and a three-dimensional shape detection program.
- Conventionally, a whiteboard, a book, or the like is imaged as a target object, and the three-dimensional shape of the target object is detected from the captured image.
- an imaging apparatus is known that includes a correcting unit which corrects the captured image so that, even if the whiteboard, book, or the like is imaged from an oblique position, the image appears as if it had been captured from the front.
- FIG. 1 and the like in Japanese Patent Application Laid-Open No. Hei 9-289611 (hereinafter referred to as Document 1) disclose a portable digital camera provided with such a correction unit.
- Japanese Patent No. 3282331 (hereinafter referred to as Document 2) discloses, in the tenth paragraph, FIG. 3, and the like, a technique for detecting the three-dimensional shape of a target object as the parameters necessary for the above-described correction unit.
- Document 2 discloses a stationary three-dimensional shape measuring apparatus. The apparatus extracts the slit light by subtracting an image that captures the target object with no slit light projected from a slit light projection image that captures the target object with the slit light projected, and detects the three-dimensional shape of the target object based on the extracted slit light.
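As a rough illustration (not part of the patent text; the function name and values are made up), the subtraction-based extraction described for Document 2 can be sketched as follows:

```python
import numpy as np

def extract_slit_by_subtraction(img_with, img_without):
    """Document-2-style extraction: subtract the frame captured without the
    slit light from the frame captured with it, so that ideally only the
    slit light trajectory remains in the result."""
    diff = img_with.astype(np.int16) - img_without.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

If the two frames are taken from exactly the same position, everything except the slit light cancels; any shift between the two frames breaks this cancellation.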
- Since the above-described three-dimensional shape measuring apparatus is stationary, its freedom in imaging is limited, which is inconvenient; it would therefore be desirable for the apparatus to be portable. However, if the three-dimensional shape measuring apparatus is made portable and used hand-held, the imaging position of the image taken with the slit light projected and that of the image taken without it may shift due to camera shake. In such a case, a gap arises between the two images, and the slit light can no longer be extracted accurately by simply subtracting one image from the other.
- the present invention has been made in view of the above, and an object thereof is to provide a three-dimensional shape detection device, an imaging device, and a three-dimensional shape detection program that can detect the pattern light with high accuracy from a pattern light projection image, that is, an image of the target object captured with the pattern light projected.
- an aspect of the present invention provides a three-dimensional shape detecting apparatus comprising: light projecting means for projecting pattern light;
- imaging means for capturing a pattern light projection image of a target object in a state in which the pattern light is projected; pattern light position extracting means for extracting the position of the pattern light projected on the target object based on the pattern light projection image captured by the imaging means;
- and three-dimensional shape calculating means for calculating a three-dimensional shape of the target object based on the position of the pattern light extracted by the pattern light position extracting means.
- the three-dimensional shape detection device further includes storage means for storing color value data of the pattern light projection image captured by the imaging means;
- hue parameter calculation means for calculating, in pixel units based on the color value data stored in the storage means, a hue parameter corresponding to the main hue constituting each pixel;
- luminance parameter calculation means for calculating a luminance parameter in pixel units based on the color value data stored in the storage means; and
- pattern light detecting means for detecting pixels including the pattern light based on the hue parameter and the luminance parameter.
- the pattern light position extracting means extracts the position of the pattern light based on the pixel including the pattern light detected by the pattern light detecting means.
- Since the pixels including the pattern light are detected based on the hue parameter and the luminance parameter corresponding to the main hue constituting the pattern light,
- even if a pixel including the pattern light lies in an illumination reflection part of low saturation, or in a printed part of low brightness printed in a color close to that of the pattern light, the difference between the pixels including the pattern light and the other pixels remains clear, and the pixels including the pattern light can be detected with high accuracy. As a result, the position of the pattern light can be extracted with high accuracy.
- another aspect of the present invention provides a three-dimensional shape detection device which includes:
- light projecting means for projecting pattern light; and
- imaging means for capturing a pattern light projection image of a target object in a state in which the pattern light is projected,
- the imaging means being configured to also capture a pattern light non-projection image of the target object in a state in which the pattern light is not projected
- pattern light position extraction means for extracting the position of the pattern light projected on the target object based on the pattern light projection image captured by the imaging means
- a three-dimensional shape calculating means for calculating a three-dimensional shape of the target object based on the position of the pattern light extracted by the pattern light position extracting means.
- the three-dimensional shape detection device further includes a pattern light detecting unit that detects pixels including the pattern light from the pattern light projection image, and a search unit that searches whether or not a pixel corresponding to a pixel detected by the pattern light detecting unit exists in the pattern light non-projection image.
- the pattern light position extracting means extracts the position of the pattern light based on a pixel detected by the pattern light detecting means when the search means does not find the corresponding pixel in the pattern light non-projection image.
- even if the pattern light detecting unit detects pixels including the pattern light in the pattern light projection image, when corresponding pixels are also found in the pattern light non-projection image, those pixels are judged not to be pattern light,
- and the position of the pattern light is not extracted from them (that is, such pixels are not mistakenly extracted as pattern light).
- Only when a pixel including the pattern light is detected from the pattern light projection image and no corresponding pixel is found in the pattern light non-projection image is the position of the pattern light extracted from it. Therefore, the position of the pattern light can be extracted with high accuracy.
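A minimal sketch of this confirm-by-search idea (illustrative only; the function names, the red-difference score, the threshold, and the window radius are assumptions, not taken from the patent):

```python
import numpy as np

def red_score(rgb):
    """Rd = R - (G + B)/2, a simple 'red difference' used here to flag
    candidate pattern-light pixels (the pattern light is red)."""
    rgb = rgb.astype(np.float32)
    return rgb[..., 0] - (rgb[..., 1] + rgb[..., 2]) / 2.0

def confirm_pattern_pixels(img_with, img_without, thresh=60.0, radius=2):
    """Keep a candidate only if no similarly scoring pixel exists near the
    same position in the non-projection image; such a match would mean the
    redness belongs to the scene, not to the projected pattern light."""
    s_with = red_score(img_with)
    s_without = red_score(img_without)
    confirmed = []
    for y, x in zip(*np.where(s_with >= thresh)):
        win = s_without[max(0, y - radius):y + radius + 1,
                        max(0, x - radius):x + radius + 1]
        if win.max() < thresh:          # nothing similar without projection
            confirmed.append((y, x))    # -> genuinely pattern light
    return confirmed
```

A candidate that also appears without projection (e.g. red printing on the document) is rejected; an isolated candidate survives.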
- an aspect of the present invention provides a three-dimensional shape detection program for detecting the three-dimensional shape of a target object onto which pattern light is projected.
- the program includes a pattern light position extracting step of extracting the position of the pattern light projected on the target object based on the pattern light projection image captured by imaging means, and a three-dimensional shape calculating step of calculating the three-dimensional shape of the target object based on the position of the pattern light extracted in the pattern light position extracting step.
- the three-dimensional shape detection program further includes a storage step of storing color value data of the pattern light projection image, a hue parameter calculation step of calculating, based on the color value data stored in the storage step, a hue parameter corresponding to the main hue constituting the pattern light, a luminance parameter calculation step of calculating a luminance parameter, and a pattern light detection step of detecting pixels including the pattern light.
- In the pattern light position extraction step, the position of the pattern light is extracted based on the pixels including the pattern light detected in the pattern light detection step.
- Since the pixels including the pattern light are detected based on the hue parameter and the luminance parameter corresponding to the main hue constituting the pattern light, even if a pixel including the pattern light lies in a low-saturation illumination reflection part or a low-brightness printed part,
- the difference between the pixels including the pattern light and the other pixels becomes clear, and the pixels including the pattern light can be detected accurately. As a result, the position of the pattern light can be extracted with high accuracy.
- FIG. 1 (a) is an external perspective view of an imaging device
- FIG. 1 (b) is a schematic sectional view of the imaging device 1.
- FIG. 2 is a diagram showing a configuration of a slit light projecting unit.
- FIG. 3 (a) and FIG. 3 (b) are diagrams for explaining the angular width of slit light.
- FIG. 4 is a block diagram showing an electrical configuration of the imaging device.
- FIG. 5 is a flowchart showing a processing procedure in a processor.
- FIGS. 6 (a) and 6 (b) are diagrams for explaining the principle of the slit light trajectory extraction processing.
- FIG. 7 is a flowchart showing slit light locus extraction processing.
- FIG. 8 is a flowchart showing slit light barycenter calculation processing.
- FIG. 9 (a) shows a captured image of a document P in a state where slit light is irradiated
- FIG. 9 (b) is an enlarged view schematically showing peripheral pixels at a slit light detection position cX.
- FIG. 10 (a) and FIG. 10 (b) are diagrams for explaining an image with slit light.
- FIG. 11 (a) and FIG. 11 (b) are diagrams for explaining a three-dimensional space position calculation method.
- FIGS. 12 (a), 12 (b), and 12 (c) are diagrams for explaining a coordinate system at the time of document orientation calculation.
- FIG. 13 is a flowchart showing a plane conversion process.
- FIG. 14 is a block diagram showing an electrical configuration of an imaging device according to a second embodiment.
- FIG. 15 is a flowchart showing a slit light locus extraction process according to the second embodiment.
- FIG. 16 is a diagram for explaining a method of setting a search range when searching for a corresponding pixel in an image without slit light.
- FIG. 17 is a diagram for explaining a method of setting a search range in a slit light non-image in consideration of a camera shake amount.
- Imaging device 1 (including the 3D shape detection device)
- Slit light projecting unit 20 (pattern light projecting means)
- FIG. 1A is an external perspective view of an imaging device 1 according to an embodiment of the present invention
- FIG. 1B is a schematic sectional view of the imaging device 1. Note that the three-dimensional shape detection device according to the embodiment of the present invention is included in the imaging device 1.
- the imaging device 1 is provided with a rectangular box-shaped main body case 10, an imaging lens 31 provided on the front surface of the main body case 10, a CCD image sensor 32 on the rear side of the imaging lens 31 (inside the imaging device 1), and a slit light projecting unit 20 provided below the imaging lens 31. Further, the imaging device 1 has a processor 40 built into the main body case 10, a release button 52 and a mode switching switch 59 provided on the upper part of the main body case 10, and a memory card 55 built into the main body case 10. These components are connected by signal lines as shown in FIG. 4.
- the imaging device 1 also includes an LCD (Liquid Crystal Display) 51 provided on the back of the main body case 10, and a finder 53 that passes through the main body case 10 from front to back; both are used when the user determines the imaging range of the imaging device 1.
- the imaging lens 31 includes a plurality of lenses.
- the imaging apparatus 1 has an autofocus function, and drives the imaging lens 31, automatically adjusting the focal length and the aperture, so that light from outside is focused on the CCD image sensor 32.
- the CCD image sensor 32 is configured by arranging photoelectric conversion elements such as CCD (Charge Coupled Device) elements in a matrix.
- the CCD image sensor 32 generates a signal corresponding to the color and intensity of the light of the image formed on the surface, converts the signal into digital data, and outputs the digital data to the processor 40.
- the data for one CCD element forms one pixel of the image; this data is called pixel data, and the image data is composed of pixel data corresponding to the number of CCD elements.
- FIG. 2 is a diagram showing a configuration of the slit light projecting unit 20.
- FIG. 3 is a diagram for explaining the angular width of the slit light.
- the slit light projecting unit 20 includes a laser diode 21, a collimator lens 22, an aperture 23, a transparent flat plate 24, a cylindrical lens 25, a reflection mirror 26, and a rod lens 27.
- the laser diode 21 emits a red laser beam.
- the emission and stopping of the laser beam by the laser diode 21 are switched under control of the processor 40.
- the output of the laser diode 21 is adjusted with respect to its maximum rated output (for example, 5 mW) so that, allowing for individual variation in the spread angle of the laser beam, a constant output (for example, 1 mW) is obtained at the point where the beam passes through the aperture 23.
- the collimator lens 22 focuses the laser beam from the laser diode 21 so that it comes to a focus at a reference distance VP (for example, 330 mm) from the slit light projecting unit 20.
- the aperture 23 is formed of a plate having a rectangular opening; the laser beam from the collimator lens 22 passes through the opening and is thereby shaped into a rectangular cross section.
- the transparent flat plate 24 is made of a transparent flat plate made of a solid glass material or the like, and has an AR coat (anti-reflection coating) on the back surface.
- the transparent flat plate 24 is disposed at a predetermined angle
- the transparent flat plate 24 reflects about 5% (about 50 µW) of the power of the laser beam incident from the aperture 23 at its surface and transmits about 95% (about 950 µW). Note that the direction in which the laser beam is reflected by the transparent flat plate 24 (the direction in front of the imaging device 1 and upward by 33 degrees with respect to the horizontal plane) is referred to as the second direction.
- the reflection mirror 26 is made of a member such as a mirror that totally reflects the laser beam.
- the reflection mirror 26 is disposed at an angle of 45 degrees on the front side of the main body case 10 downstream of the laser beam transmitted through the transparent flat plate 24, and totally reflects the laser beam to change the direction of the optical path by 90 degrees.
- the direction in which the laser beam is reflected by the reflection mirror 26 (the direction of 0 ° with respect to the horizontal plane in front of the imaging device 1) is referred to as a first direction.
- the rod lens 27 is formed of a cylindrical lens having a short positive focal length.
- the rod lens 27 is provided downstream of the laser beam reflected by the reflection mirror 26 so that the axial direction of the cylindrical shape is vertical.
- since the rod lens 27 has a short focal length, as shown in FIG. 3 (a), the laser beam that has passed through the rod lens 27 immediately begins to spread from the focal position near the rod lens 27, and is emitted in the first direction as slit light having a predetermined spread angle β (for example, 48 degrees).
- the slit light emitted from the rod lens 27 is referred to as a first slit light 71.
- the cylindrical lens 25 is a lens having a concave shape in one direction so as to have a negative focal length.
- the cylindrical lens 25 is disposed downstream of the laser beam reflected by the transparent flat plate 24 so that the lens surface is orthogonal to the second direction.
- the cylindrical lens 25 emits the laser beam incident from the transparent flat plate 24 as slit light which spreads at a spread angle ε.
- the slit light emitted from the cylindrical lens 25 is referred to as a second slit light 72.
- the divergence angle ε given by the cylindrical lens 25 is set so that the ratio of the divergence angle ε of the second slit light 72 to the divergence angle β of the first slit light 71 equals the power ratio when the laser beam is split by the transparent flat plate 24. That is, the spread angle ε of the second slit light 72 is 5% of the spread angle β of the first slit light 71 (2.4 degrees).
- the slit light projecting unit 20 emits a laser beam from the laser diode 21 in response to a command from the processor 40, and emits the first slit light 71 in the first direction and the second slit light 72 in the second direction from the window 29 provided below the imaging lens 31 of the main body case 10. Since both originate from the single red laser beam emitted by the laser diode 21, the first slit light 71 and the second slit light 72 are mainly composed of the red value R.
- the power of the first slit light 71 split off by the transparent flat plate 24 is about 95% of the power output by the laser diode 21.
- the power of the second slit light 72 is as low as about 5%; however, in terms of power per angular width, the power per unit angle of the first slit light 71, with its spread angle of 48 degrees, is about 20 µW/deg,
- and the power per unit angle of the second slit light 72, with its spread angle of 2.4 degrees, is about 21 µW/deg, which is almost the same.
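The quoted figures can be checked with a few lines of arithmetic (a sketch; the variable names are ours, and the inputs are the approximate values stated above):

```python
# Approximate values from the text above.
total_uW = 1000.0                 # ~1 mW of beam power passing the aperture 23
transmit = 0.95                   # transparent flat plate 24 transmits ~95%

p_first_uW = total_uW * transmit          # first slit light 71: ~950 uW
p_second_uW = total_uW * (1 - transmit)   # second slit light 72: ~50 uW

beta_deg = 48.0                           # spread angle of first slit light 71
eps_deg = beta_deg * (1 - transmit)       # 5% of beta -> 2.4 degrees

per_deg_first = p_first_uW / beta_deg     # ~19.8 uW per degree
per_deg_second = p_second_uW / eps_deg    # ~20.8 uW per degree
```

Rounded to the text's precision, these give the quoted "about 20" and "about 21" µW per degree, i.e. nearly equal brightness for the two slit lights.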
- the illuminance of the first slit light 71 and the second slit light 72 is about 1260 lux, so even in a place with a general indoor brightness of 500-1000 lux, there is a sufficient luminance difference between the locus of the slit light and the document P. Therefore, the trajectory image of the slit light can be reliably extracted by the slit light locus extraction program 422 described later.
- the release button 52 is constituted by a push button type switch, and is connected to the processor 40.
- the processor 40 detects that the user has pressed down the release button 52.
- the mode switching switch 59 is configured by a slide switch or the like that can be switched to two positions.
- One switch position in the mode switching switch 59 is assigned by the processor 40 so as to be detected as "normal mode", and the other switch position is detected as "corrected imaging mode”.
- the "normal mode" is a mode in which the image of the document P is captured as image data as-is.
- the "corrected imaging mode" is a mode in which, when the document P is imaged from an oblique direction, the image data is corrected so as to appear as if the document P had been imaged from the front.
- the memory card 55 is constituted by a nonvolatile and rewritable memory, and is detachable from the main body case 10.
- the LCD 51 is configured by a liquid crystal display or the like for displaying an image, and displays an image in response to an image signal from the processor 40. Depending on the situation, the processor 40 sends the LCD 51 an image signal for displaying the real-time image received by the CCD image sensor 32, an image stored in the memory card 55, characters indicating the device settings, and so on.
- the finder 53 is configured by an optical lens.
- the viewfinder 53 is configured so that, when the user looks into it from the rear side of the imaging device, a range substantially coinciding with the range in which the imaging lens 31 forms an image on the CCD image sensor 32 can be seen.
- FIG. 4 is a block diagram showing an electrical configuration of the imaging device 1.
- the processor 40 mounted on the imaging device 1 includes a CPU 41, a ROM 42, and a RAM 43.
- the CPU 41, using the RAM 43 and in accordance with programs stored in the ROM 42, performs various processes such as detecting a press of the release button 52, capturing image data from the CCD image sensor 32, writing the image data to the memory card 55, detecting the state of the mode switching switch 59, and switching the emission of the slit light by the slit light projecting unit 20.
- the ROM 42 stores a camera control program 421, a slit light locus extraction program 422, a triangulation calculation program 423, a document attitude calculation program 424, and a plane conversion program 425.
- the camera control program 421 is a program for executing the processing of the flowchart shown in FIG. 5 (details will be described later).
- the slit light locus extraction program 422 is a program for extracting the locus of the slit light from the image of the document P on which the slit light is projected.
- the triangulation calculation program 423 is a program for calculating the three-dimensional spatial position of each pixel of the locus of the slit light extracted by the slit light locus extraction program 422.
- the document attitude calculation program 424 is a program for estimating the three-dimensional shape of the document P from the three-dimensional spatial positions of the locus 71a of the first slit light and the locus 72a of the second slit light.
- the plane conversion program 425 is a program for converting the image data stored in the slit light non-image storage section 432 into an image as if captured from the front of the document P, given the position and orientation of the document P.
- the RAM 43 is assigned, as storage areas, a slit light image storage unit 431 for storing image data from the CCD image sensor 32 as color value data expressed in RGB values, and a slit light non-image storage section 432.
- the RAM 43 is also assigned a detection target pixel value temporary storage unit 433 for saving, for each pixel included in the search range in the slit light image, the product (Rd·Y value) of the red difference value Rd, obtained by subtracting the average of the green value G and the blue value B from the red value R, and the luminance value Y.
- the RAM 43 further has a triangulation calculation result storage unit 434 for storing the result of calculating the position of each point of the slit light image, and a document attitude calculation result storage unit 435 for storing the calculation result of the position and orientation of the document P.
- the RAM 43 also has a slit light trajectory information storage unit 436 for storing the barycenter positions calculated in the slit light barycenter position calculation process described later, and a working area 437 for temporarily storing data used for calculations by the CPU 41.
- FIG. 5 is a flowchart showing a processing procedure in the processor 40 of the imaging device 1.
- in step S110, the position of the mode switching switch 59 is detected, and it is determined whether or not the switch is at the "corrected imaging mode" position (S110). If the result of the determination is that it is at the "corrected imaging mode" position (S110: Yes), the process proceeds to step S120.
- in step S120, the slit light projecting unit 20 is commanded to make the laser diode 21 emit, and with the first slit light 71 and the second slit light 72 emitted, image data expressed in RGB values is read from the CCD image sensor 32 as a slit light image. Further, in S120, the read image data is stored in the slit light image storage unit 431 of the RAM 43.
- in step S130, the slit light projecting unit 20 is commanded to stop the emission of the laser diode 21, and with the first slit light 71 and the second slit light 72 not emitted, image data expressed in RGB values is read from the CCD image sensor 32 as a slit-light-free image. Further, in S130, the read image data is stored in the slit light non-image storage section 432 of the RAM 43 (S130).
- in step S140, the slit light trajectory extraction process (S140) is performed by the slit light trajectory extraction program 422 to extract the slit light from the image data of the slit light image read into the slit light image storage unit 431.
- the slit light trajectory extraction processing (S140) will be described in detail with reference to FIG. 6 to FIG.
- the slit light trajectory extraction process is a process that clarifies the difference between pixels that contain slit light and pixels that do not in the slit light image, and thereby extracts the locus of the slit light from that image with high accuracy.
- FIG. 6A shows a captured image of the document P in a state where the slit light is irradiated.
- On the document are a character portion M arranged in a plurality of rows in the width direction of the document, an illumination reflection portion S shown as a rectangle, a printed portion I, surrounded by a circle, whose main color component is the red (R) component,
- and the trajectories 71a and 72a of the first and second slit lights extending in the width direction of the document P.
- An alternate long and short dash line extending in a direction orthogonal to the width direction of the document P indicates a slit light detection position, and an intersection between the slit light detection position and the locus 71a of the first slit light is defined as a slit light detection pixel K.
- Fig. 6 (b) is a graph showing predetermined parameter values at the slit light detection position (see a dashed line in the figure).
- an extension line drawn straight from the slit light detection position through each graph shows the respective predetermined parameter values at the slit light detection position. That is, the position on the vertical axis of each graph in FIG. 6 (b) corresponds to the vertical position in the image of FIG. 6 (a).
- Graph A1 adopts the red value R
- graph A2 adopts the red difference value Rd
- graph A3 adopts the brightness value Y
- graph A4 adopts the product value Rd·Y of the red difference value Rd and the luminance value Y as the predetermined parameter.
- the red difference value Rd is calculated by subtracting the average of the green value G and the blue value B from the red value R. That is, with the red difference value Rd, the red value R corresponding to the R component, which is the main component of the slit light, is emphasized relative to the other components (G value, B value) at the slit light detection position.
- a pixel whose red value R is close to its green value G and blue value B has a low red difference value Rd,
- while a pixel whose red value R is higher than its green value G and blue value B has a high red difference value Rd.
- the luminance value Y indicates the luminance of each pixel at the slit light detection position.
- the luminance value Y is a Y value in the YCbCr space, and is converted from the RGB space to the YCbCr space by the following formula.
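The text does not reproduce the conversion formula, so the following sketch assumes the standard ITU-R BT.601 luma weights (an assumption, not a quote from the patent):

```python
def luminance_y(r, g, b):
    """Y component of YCbCr computed from RGB, using the ITU-R BT.601
    weights (assumed here; the patent text omits the exact coefficients)."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```

With these weights, pure white (255, 255, 255) maps to Y = 255 and pure red (255, 0, 0) to roughly Y = 76, so a bright red slit pixel keeps a moderate luminance while dark red print does not.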
- the illumination reflection portion S has a lower red difference value Rd than the slit light detection pixel K and the printed portion I having the R component. Therefore, if the slit light detection pixel K is detected based on the level of the red difference value Rd, even when the slit light detection pixel K is included in the illumination reflection portion S, the difference between the two in the red difference value Rd is clear, so the slit light detection pixel K can be accurately detected within the illumination reflection portion S. However, when the slit light detection pixel K is included in the printed portion I having the R component, there is no clear difference between the two in the red difference value Rd, so the slit light detection pixel K cannot be detected accurately.
- the printed portion I having the R component has a lower luminance value Y than the slit light detection pixel K and the illumination reflection portion S. Therefore, if the slit light detection pixel K is detected based on the level of the luminance value Y, even when the slit light detection pixel K is included in the printed portion I having the R component, the difference between the two in the luminance value Y is clear, so the slit light detection pixel K can be detected within the printed portion having the R component.
- However, when the slit light detection pixel K is included in the illumination reflection portion S, there is no clear difference between the two in the luminance value Y, so the slit light detection pixel K cannot be accurately detected within the illumination reflection portion S.
- Therefore, focusing on the fact that the slit light detection pixel K has both a red difference value Rd and a luminance value Y higher than those of the illumination reflection portion S and the printed portion I having the R component, pixels including the slit light are detected based on the level of the product value Rd·Y (hereinafter, the Rd·Y value) of the red difference value Rd and the luminance value Y.
- the Rd·Y value of the slit light detection pixel K is higher than the Rd·Y value of the illumination reflection portion S and that of the printed portion I having the R component. Therefore, even when the slit light detection pixel K is included in the illumination reflection portion S or the printed portion I having the R component, the difference between the two in the Rd·Y value is clear, so the slit light detection pixel K can be accurately detected within either.
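Putting the two parameters together, a minimal sketch of Rd·Y-based detection (illustrative only; the RGB triples below are hypothetical stand-ins for the slit light pixel K, the illumination reflection portion S, and the printed portion I, and BT.601 luma weights are assumed):

```python
import numpy as np

def rd_y_score(rgb):
    """Per-pixel Rd*Y value: the red difference Rd = R - (G+B)/2 multiplied
    by the luminance Y (BT.601 weights assumed)."""
    r, g, b = (rgb[..., i].astype(np.float32) for i in range(3))
    rd = r - (g + b) / 2.0                 # high only for strongly red pixels
    y = 0.299 * r + 0.587 * g + 0.114 * b  # high only for bright pixels
    return rd * y

# Hypothetical RGB values for the three cases discussed above:
slit    = np.array([[[250,  60,  60]]], np.uint8)  # K: bright AND red
glare   = np.array([[[240, 235, 230]]], np.uint8)  # S: bright, low saturation
red_ink = np.array([[[120,  30,  30]]], np.uint8)  # I: red but dark
```

The slit pixel scores high on both factors, the glare scores high on Y but near zero on Rd, and the red ink scores high on Rd but low on Y, so the product separates K from both S and I.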
- FIG. 7 is a flowchart of the slit light locus extraction process.
- FIG. 8 is a flowchart of the slit light barycenter calculation processing included in the slit light trajectory extraction processing.
- FIG. 9A shows a captured image of the document P in a state where the slit light is irradiated.
- FIG. 9B is an enlarged view schematically showing peripheral pixels at the slit light detection position cX.
- First, search parameters for specifying the search range for extracting the trajectory 72a of the second slit light are set (S701). As shown in FIG. 9, these search parameters are set at the position cX2 in the ccdx direction on the trajectory 72a of the second slit light, and in the range from yMin2 to yMax2 in the ccdy direction.
- the range from yMin2 to yMax2 is set within the upper half area of the captured image, in the range of 0 to 799.
- a slit light barycenter position calculation process described later is executed (S702).
- the barycentric position calculated in the slit light barycentric position calculation process (S702) is stored in the slit light locus information storage unit 436 (S703).
- a search parameter for specifying a search range for extracting the trajectory 71a of the first slit light is set (S704).
- This search parameter is set in the range from yMin1 to yMax1 in the ccdy direction.
- The entire lower half area is not set as the search range because, in the present embodiment, the first slit light 71 is parallel to the optical axis of the imaging lens 31 and is emitted from below the imaging lens 31. The range in which the first slit light 71 can appear can therefore be calculated back from the possible distances between the document P and the imaging lens 31 when imaging the document P, so the search range is narrowed down in advance and the processing is performed at high speed.
- a search parameter in the ccdx direction is set.
- The initial value of the variable cX indicating the position where the trajectory 71a of the first slit light is detected is 0, the maximum value of cX is cXMax, the number of pixels in the width W direction is 1, and the detection interval dx is 20 (pixels).
- Detection proceeds sequentially from the initial value of cX to the maximum value cXMax at every detection interval dx. This corresponds to the repetition processing from S705 to S709.
- For each cX, the slit light centroid calculation process (S707) described later is executed, and the barycentric position calculated in that process (S707) is stored in the slit light locus information storage unit 436 (S708).
- the detection position is updated by adding the detection interval dx to the variable cX (S709), and the processing from S706 to S709 is repeated. Then, when the variable cX becomes larger than the maximum value cXMax (S706: No), the process ends.
- The position of a pixel detected as including the slit light and the luminance center position of the slit light do not necessarily coincide, because of the characteristics of the individual laser beams constituting the slit light and the fine irregularities on the surface of the imaging object. Therefore, the center of gravity of the Rd'Y values is calculated within a certain range around the detected pixel, and that center of gravity is taken as the position including the slit light.
- In the ccdx direction, the range xRange is 2 (pixels).
- A red difference value Rd and a luminance value Y are calculated for each pixel in the range of xMin ≤ ccdx ≤ xMax and yMin ≤ ccdy ≤ yMax (S802, S803).
- The Rd'Y value is calculated for each pixel by multiplying the red difference value Rd calculated for each pixel by the luminance value Y, and the result is stored in the detection target pixel value temporary storage unit 433 (S804).
- This is repeated while the variable ccX does not exceed the image range (S807: Yes).
- The pixel having the maximum Rd'Y value within the search range is highly likely to be a pixel including the slit light.
- The additional condition that the pixel exceed the threshold value vTh ensures that a pixel is not detected merely for having the maximum Rd'Y value when the slit light has missed the imaging target. When the slit light strikes a distant object beyond the target, it is imaged with very weak brightness; such pixels are removed from the candidates for pixels containing the slit light, so that pixels including the slit light are detected with higher accuracy.
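The maximum-value selection, the threshold vTh, and the barycenter calculation described above can be combined into one minimal sketch. The threshold and window size here are illustrative values, not figures from the patent.

```python
def find_slit_centroid(rdy_column, v_th=1000.0, half_window=3):
    """Locate the slit light in one search column of Rd'Y values.

    rdy_column: list of Rd'Y values along ccdy for a fixed ccdx.
    Returns the sub-pixel centroid of Rd'Y around the maximum, or
    None when the maximum does not exceed the threshold vTh (e.g.
    the slit missed the target and only weak stray light was seen).
    v_th and half_window are illustrative assumptions.
    """
    # Pixel with the maximum Rd'Y value in the search range.
    peak = max(range(len(rdy_column)), key=lambda i: rdy_column[i])
    if rdy_column[peak] <= v_th:
        return None  # candidate rejected: not bright/red enough
    # Rd'Y-weighted centroid within a window around the peak.
    lo = max(0, peak - half_window)
    hi = min(len(rdy_column), peak + half_window + 1)
    weights = rdy_column[lo:hi]
    total = sum(weights)
    return sum(i * w for i, w in zip(range(lo, hi), weights)) / total
```

For a symmetric Rd'Y bump the centroid lands on the peak pixel; for an asymmetric bump it shifts toward the brighter side, which is the point of using a centroid rather than the raw maximum.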
- the aberration correction process is a process for correcting image distortion that depends on the angle from the optical axis.
- This triangulation calculation processing is processing in which the triangulation calculation program 423 calculates the three-dimensional spatial position of each pixel of the trajectory 71a of the first slit light and the trajectory 72a of the second slit light.
- The vertical peak (barycentric) positions of the trajectory 71a of the first slit light and the trajectory 72a of the second slit light are obtained on the basis of the image data read into the detection target pixel value temporary storage unit 433. The calculation is performed for each horizontal coordinate of the image data, and the three-dimensional spatial position for each peak extraction coordinate is calculated as follows.
- FIGS. 10A and 10B are diagrams for explaining an image with slit light.
- FIGS. 11A and 11B are diagrams for explaining a method of calculating the three-dimensional spatial position of the slit light.
- the position separated by the distance VP is defined as the origin position of the X, Y, and Z axes.
- the number of pixels in the X-axis direction of the CCD image sensor 32 is called ResX, and the number of pixels in the Y-axis direction is called ResY.
- The upper end of the region where the CCD image sensor 32 is projected through the imaging lens 31 onto the X-Y plane is called Yftop, the lower end Yfbottom, the left end Xfstart, and the right end Xfend.
- The distance from the optical axis of the imaging lens 31 to the optical axis of the first slit light 71 emitted from the slit light projecting unit 20 is D, the position in the Y-axis direction at which the first slit light 71 intersects the X-Y plane is las1, and the position in the Y-axis direction at which the second slit light 72 intersects the X-Y plane is las2.
- The three-dimensional spatial position (X1, Y1, Z1) corresponding to the coordinates (ccdx1, ccdy1) on the CCD image sensor 32 of a point of interest 1, taken on one of the pixels of the image of the first slit light trajectory 71a, is obtained from the relational expressions set up for the triangle formed by the corresponding point on the image plane of the CCD image sensor 32, the emission point of the first slit light 71 or the second slit light 72, and the point of intersection with the X-Y plane.
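The similar-triangle relation can be illustrated with a minimal sketch. It assumes an ideal (distortion-free, non-inverting) pinhole camera at the origin looking along +Z, and that the first slit light 71 travels parallel to the optical axis in the plane Y = −D, as the text describes; all parameter names (focal length, pixel pitch) are illustrative assumptions, not the patent's notation.

```python
def slit1_world_point(ccdx, ccdy, f_mm, pixel_mm, res_x, res_y, d_mm):
    """3-D position of a point lit by the first slit light (sketch).

    The ray through pixel (ccdx, ccdy) is intersected with the slit
    light plane Y = -d_mm (the slit travels parallel to the optical
    axis at distance D below the lens).  f_mm is the focal length,
    pixel_mm the pixel pitch; these names are assumptions here.
    """
    # Ray direction through the pixel (image centre at res/2).
    x_dir = (ccdx - res_x / 2.0) * pixel_mm
    # y_dir must be non-zero: slit pixels lie below the optical axis.
    y_dir = (ccdy - res_y / 2.0) * pixel_mm
    # Intersect the ray t*(x_dir, y_dir, f_mm) with the plane Y = -D.
    t = -d_mm / y_dir
    return (t * x_dir, t * y_dir, t * f_mm)
```

The same construction, with the slit plane replaced by the (tilted) plane of the second slit light 72, yields the 3-D positions along the trajectory 72a.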
- the document posture calculation process is a process of calculating the position and posture of the document P by the document posture calculation program 424 based on the three-dimensional spatial position of each slit light.
- The document posture calculation processing will be described with reference to FIGS. 12A to 12C.
- FIGS. 12A to 12C are diagrams for explaining a coordinate system at the time of document orientation calculation.
- First, the points in the three-dimensional space corresponding to the trajectory 71a of the first slit light are approximated by a regression curve. A straight line is then assumed that connects the point where the X-axis position of this curve is "0" with the three-dimensional position where the X-axis position of the trajectory 72a of the second slit light is "0". The point at which this straight line intersects the Z axis, that is, the point at which the optical axis intersects the document P, is defined as the three-dimensional spatial position (0, 0, L) of the document P (see FIG. 12A).
- The angle between this straight line and the X-Y plane is defined as the inclination θ of the document P about the X axis.
- Next, the line obtained by approximating the trajectory 71a of the first slit light with the regression curve is rotationally transformed about the X axis in the opposite direction by the previously obtained inclination θ; that is, a document P parallel to the X-Y plane is considered.
- In this way, the cross-sectional shape of the document P in the X-axis direction is obtained, and the displacement in the Z-axis direction of the cross section of the document P in the X-Z plane is obtained at a plurality of points in the X-axis direction.
- From these, the curvature φ(X), a function giving the tilt with the position in the X-axis direction as a variable, is obtained, and the process ends.
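The position L and tilt θ described above can be sketched for the simplest case where only the two X = 0 points of the slit-light curves are considered; the function name and the reduction to the Y-Z plane are illustrative assumptions.

```python
import math

def document_pose(p1, p2):
    """Position L and tilt theta of the document (sketch).

    p1, p2: the points where the first and second slit-light curves
    cross X = 0, given as (y, z) pairs in the Y-Z plane.  Returns
    (L, theta): L is the Z coordinate where the line through the two
    points crosses the Z axis (the optical axis), and theta is the
    angle of that line with respect to the X-Y plane, i.e. the tilt
    of the document about the X axis.
    """
    (y1, z1), (y2, z2) = p1, p2
    dy, dz = y2 - y1, z2 - z1
    l_pos = z1 - y1 * dz / dy   # intersection with the Z axis (Y = 0)
    theta = math.atan2(dz, dy)  # inclination about the X axis
    return l_pos, theta
```

A document facing the camera squarely gives θ = 0 (both points at the same Z); a document whose top edge leans away gives a positive tilt.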
- The plane conversion processing is processing in which the plane conversion program 425 converts the image data stored in the storage section 432 for the image without slit light into image data of an image viewed from the front, based on the three-dimensional shape data.
- FIG. 13 is a flowchart showing the plane conversion process.
- First, a processing area for this processing is allocated in the working area 437 of the RAM 43, and the initial values of the variables used in the processing, such as counter variables, are set (S1300).
- The area of the erect image, which is the image obtained when the surface of the document P on which the characters and the like are written is observed from a substantially vertical direction, is determined based on the calculation result of the document posture calculation program 424: the points at the four corners of the image without slit light are converted and set, and the number of pixels a included in this area is found (S1301).
- Next, the set erect image area is arranged on the X-Y plane (S1303), and for each pixel included in it, the three-dimensional spatial position is calculated based on the curvature φ(X) (S1304), rotated about the X axis by the inclination θ (S1305), and shifted by the distance L in the Z-axis direction (S1306).
- The obtained three-dimensional spatial position is converted into coordinates (ccdcx, ccdcy) on the CCD image captured by an ideal camera, using the above triangulation relational expressions (S1307), and these are then converted into coordinates (ccdx, ccdy) on the CCD image captured by the actual camera by a known calibration method (S1308).
- The state of the corresponding pixel of the image is obtained and stored in the working area 437 of the RAM 43 (S1309). This is repeated for the number of pixels a (S1310, S1302) to generate the image data of the erect image. After the processing area of the working area 437 is released (S1311), the processing is terminated.
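The per-pixel mapping of steps S1303 to S1307 can be sketched as follows, under the same ideal-pinhole assumption as before (the actual-camera calibration of S1308 is omitted); camera parameters and names are illustrative assumptions.

```python
import math

def erect_to_ccd(x, y, phi, theta, l_dist, f, pixel, res_x, res_y):
    """Map one pixel of the erect (flattened) image back onto the CCD.

    Follows S1303-S1307 described above: place the erect image on the
    X-Y plane, displace each point in Z by the curvature phi(x),
    rotate it about the X axis by the tilt theta, shift it by the
    distance L along Z, then project it through an ideal pinhole
    camera onto CCD coordinates.  f, pixel pitch and resolution are
    illustrative assumptions.
    """
    # Curvature displacement in Z before the document is tilted back.
    px, py, pz = x, y, phi(x)
    # Rotate about the X axis by theta (S1305).
    ry = py * math.cos(theta) - pz * math.sin(theta)
    rz = py * math.sin(theta) + pz * math.cos(theta)
    # Shift along Z by the distance L (S1306).
    rz += l_dist
    # Ideal pinhole projection back to CCD coordinates (S1307).
    ccdx = px * f / (rz * pixel) + res_x / 2.0
    ccdy = ry * f / (rz * pixel) + res_y / 2.0
    return ccdx, ccdy
```

Sampling the image without slit light at the returned (ccdx, ccdy) for every erect-image pixel is exactly the inverse-mapping loop of S1309/S1310.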
- With the laser diode 21 of the slit light projecting unit 20 not emitting light, an image without slit light is read from the CCD image sensor 32 (S200).
- the read image data is written to the memory card 55 (S210).
- In the "normal mode", the three-dimensional spatial position L, the inclination θ, and the curvature φ of the document P described above are not calculated, and these data are not written into the memory card 55.
- The imaging apparatus 1 projects the first slit light 71 and the second slit light 72 onto the document P, forms an image of the document P on the CCD image sensor 32 through the imaging lens 31, and first captures an image of the document P with the slit light projected.
- The Rd'Y value is then used to detect the pixel including the slit light for each predetermined range.
- Even if the pixel including the slit light is included in the illumination reflection portion S, the illumination reflection portion S has a low red difference value Rd and therefore a low Rd'Y value, so the difference from the pixel containing the slit light, which has the maximum Rd'Y value, is clear, and the pixel containing the slit light can be detected with high accuracy. Likewise, even if the pixel including the slit light is included in the print portion I having the R component, that portion has a low luminance value Y and therefore a low Rd'Y value, so the difference between the print portion I having the R component and the pixel including the slit light with the maximum Rd'Y value is clear, and the pixel including the slit light can be detected with high accuracy.
- the locus of the slit light is thus extracted, and the three-dimensional spatial position of each part of the locus of the slit light is calculated based on the principle of triangulation.
- The three-dimensional shape data and the image data of the image without slit light are written to the memory card 55.
- The user switches the mode switching switch 59 to the "corrected imaging mode" side and frames the desired range of the document P within the imaging range using the viewfinder 53 or the LCD 51; image data as if the flat document P had been imaged from the front can thereby be stored in the memory card 55.
- As a measure against camera shake when the 3D shape detection device is hand-held, it is conceivable to extract the slit light from the image with slit light projected alone, without taking the difference between the image with slit light projected and the image without slit light projected.
- It is also conceivable to use slit light in the infrared region.
- In the second embodiment, the pixel at the barycentric position calculated in the slit light barycenter calculation process described above is not immediately treated as a pixel including the slit light; instead, it is searched whether a pixel corresponding to that pixel exists in the image without slit light. If no pixel corresponding to the barycentric pixel exists in the image without slit light, the pixel is determined to be unique to the image with slit light, that is, a pixel including the slit light.
- FIG. 14 is a block diagram showing an electrical configuration of the imaging apparatus 1 that executes the slit light trajectory extraction processing according to the second embodiment.
- FIG. 14 is a diagram corresponding to FIG. 4. The same components as those shown in FIG. 4 are denoted by the same reference numerals, and description thereof will be omitted.
- The ROM 42 includes, in addition to the programs described with reference to FIG. 4, a luminance variance calculation program 426 for calculating the standard deviation of color values for each small area in the image without slit light, a cross-correlation coefficient calculation program 427 for calculating the amount of deviation of the image without slit light from the image with slit light, and a corresponding pixel search program 428 for searching whether a pixel detected from the image with slit light exists in the image without slit light.
- The RAM 43 includes, in addition to the various storage units described with reference to FIG. 4, a camera shake amount storage unit 438 that stores the shift amount between the image with slit light and the image without slit light calculated by the cross-correlation coefficient calculation program 427.
- the slit light trajectory extraction processing according to the second embodiment will be described with reference to the flowchart in FIG.
- the amount of deviation between the image with slit light and the image without slit light is calculated (S1501).
- The amount of deviation between the image with slit light and the image without slit light is obtained by calculating the cross-correlation coefficient cc between the two images with the cross-correlation coefficient calculation program 427.
- The cross-correlation coefficient cc takes a value from −1 to 1, and the position having the maximum value gives the displacement amount.
- The image without slit light is divided into four large regions 1 to 4, and each large region is divided into small areas from its outer end toward the center (the upper right in region 1, the upper left in region 2, the lower left in region 3, and the lower right in region 4); the standard deviation of the luminance Y is calculated for each of the small areas.
- The standard deviation σ of the luminance Y is calculated by the luminance variance calculation program 426 using the following formula.
- Here, (xc, yc) denotes the central pixel of the small area, and Rd denotes half the size of the small area.
- the image size is about 1200 pixels X 1600 pixels.
- The size of each small area is about 41 pixels × 41 pixels (Rd in the calculation formula being half of this).
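As an illustrative sketch of the luminance standard deviation σ over one small area — the function name and the plain-list image representation are assumptions, not the patent's notation:

```python
def luminance_stddev(y_img, xc, yc, rd):
    """Standard deviation of luminance Y in one small area (sketch).

    y_img: 2-D list of luminance values indexed [row][col];
    (xc, yc) is the small-area centre pixel and rd half its size,
    so the window is (2*rd + 1) pixels square.
    """
    vals = [y_img[yy][xx]
            for yy in range(yc - rd, yc + rd + 1)
            for xx in range(xc - rd, xc + rd + 1)]
    n = len(vals)
    mean = sum(vals) / n
    return (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
```

A flat (textureless) area gives σ = 0; the small area with the largest σ in each large region carries the most texture and is therefore the best anchor for the correlation search that follows.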
- The center coordinates of the small area having the largest standard deviation in each of the large regions 1 to 4 are taken as the center position (xc, yc) for obtaining the correlation coefficient cc. Near these center coordinates of the two images, with and without slit light, the cross-correlation coefficient cc(xd, yd) is calculated for each pixel position difference (xd, yd), and the (xd, yd) with the largest cross-correlation coefficient is taken as the deviation amount.
- Here, the difference in pixel position between the image with slit light and the image without slit light is (xd, yd), the luminance of a pixel in the image with slit light is Y1, and the luminance of the corresponding pixel in the image without slit light is Y2.
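The shift estimation can be sketched as a normalized cross-correlation search. The exact cc formula is not reproduced in the extracted text, so the standard normalized cross-correlation between the Y1 and Y2 windows is assumed here; function names and the plain-list images are illustrative.

```python
def shift_by_cross_correlation(img1, img2, xc, yc, rd, rs):
    """Estimate the (xd, yd) shift of img2 relative to img1 (sketch).

    img1: image with slit light, img2: image without, as 2-D lists
    [row][col].  The normalised cross-correlation coefficient cc
    (range -1..1) is evaluated for every candidate shift within
    +/- rs pixels around the small area centred at (xc, yc) with
    half-size rd; the shift with the largest cc is returned.
    """
    def window(img, x0, y0):
        return [img[y0 + dy][x0 + dx]
                for dy in range(-rd, rd + 1) for dx in range(-rd, rd + 1)]

    def ncc(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        da = sum((u - ma) ** 2 for u in a) ** 0.5
        db = sum((v - mb) ** 2 for v in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    ref = window(img1, xc, yc)
    best, best_cc = (0, 0), -2.0
    for yd in range(-rs, rs + 1):
        for xd in range(-rs, rs + 1):
            cc = ncc(ref, window(img2, xc + xd, yc + yd))
            if cc > best_cc:
                best, best_cc = (xd, yd), cc
    return best
```

Because cc is normalized, the estimate is insensitive to the overall brightness change caused by switching the slit light off between the two exposures.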
- Next, the trajectory 72a of the second slit light is extracted. In the same manner as described for S701 and S702 in FIG. 7, a search parameter specifying the search range is set (S1502), and the slit light barycenter position calculation process is executed (S1503). Then, it is searched whether or not a pixel corresponding to the pixel at the barycentric position calculated in the slit light barycenter position calculation process (S1503) exists in the image without slit light (S1504).
- Suppose that a pixel detected as a pixel including slit light from the image with slit light is located at (xp, yp) in large region 4, and let (dx4, dy4) be the shift amount of large region 4 in the image without slit light calculated in S1501.
- the amount of camera shake is about several tens of pixels, so Rs may be set to about several tens of pixels.
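The corresponding-pixel search can be sketched as follows. The shift-compensated center (xp + dx4, yp + dy4) follows from the setup above; the matching criterion `predicate` is an assumption here, since the extracted text does not spell out how a "corresponding" pixel is recognized.

```python
def corresponds_in_no_slit_image(no_slit_img, xp, yp, dx4, dy4, rs, predicate):
    """Search the no-slit image for a pixel matching a detected one.

    The detected pixel lies at (xp, yp) in the slit-light image; the
    camera-shake shift of its large region is (dx4, dy4), so the
    corresponding position in the no-slit image is expected near
    (xp + dx4, yp + dy4).  Only a +/- rs window around that position
    is scanned (the text suggests rs of a few tens of pixels).
    `predicate` decides whether a pixel qualifies -- its exact form
    is an assumption.
    """
    cx, cy = xp + dx4, yp + dy4
    for yy in range(cy - rs, cy + rs + 1):
        for xx in range(cx - rs, cx + rs + 1):
            if 0 <= yy < len(no_slit_img) and 0 <= xx < len(no_slit_img[0]):
                if predicate(no_slit_img[yy][xx]):
                    return True  # present in both images: not slit light
    return False  # unique to the slit-light image: treated as slit light
```

Restricting the scan to the shift-compensated window is what lets the search stay fast while still tolerating camera shake between the two exposures.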
- If the corresponding pixel is found (S1505: Yes), the pixel exists in both the image with slit light and the image without slit light. That is, since such a pixel detected in the image with slit light is not recognized as a pixel including the slit light, the pixels around the calculated barycentric position are removed from the extraction targets (S1506), and the processing from S1503 to S1505 is repeated.
- If the corresponding pixel is not found in the image without slit light (S1505: No), the pixel does not exist in the image without slit light and is unique to the image with slit light. That is, the pixel is determined to be a pixel including the slit light, and the calculated barycentric position is stored in the slit light locus information storage unit 436 (S1507).
- When the trajectory 72a of the second slit light has been extracted in this way, the trajectory 71a of the first slit light is then extracted: in the same manner as in the processing of S704 in FIG. 7, a search parameter for specifying the search range is set (S1509), and the processing from S1509 to S1516, corresponding to the processing from S705 to S711 in FIG. 7, is repeated.
- It is searched whether a pixel corresponding to the barycentric position calculated in the slit light centroid calculation processing (S1511) exists in the image without slit light (S1512). If the corresponding pixel is found as a result of searching the image without slit light (S1513: Yes), the pixels around the calculated barycentric position are excluded from the extraction targets (S1514), and the processing from S1511 to S1513 is performed again.
- If the corresponding pixel is not found in the image without slit light (S1513: No), the calculated barycentric position is stored in the slit locus information storage unit 436 (S1515).
- In this way, a pixel that is common to the image with slit light and the image without slit light, that is, a pixel that does not include the slit light, is excluded from the extraction targets, while a pixel that exists only in the image with slit light, that is, a pixel including the slit light, is determined to be an extraction target pixel. Therefore, compared with the slit light trajectory extraction processing of the first embodiment, pixels including the slit light can be detected with higher accuracy, and the trajectory of the slit light can be extracted with higher accuracy.
- the process of S140 in the flowchart of FIG. 5 is positioned as a pattern light position extraction unit or a pattern light position extraction step.
- the processing of S150 to S170 in the flowchart of FIG. 5 is positioned as a three-dimensional shape calculation means or a three-dimensional shape detection step.
- The processing of S802 in the flowchart of FIG. 8 is positioned as hue parameter calculation means or a hue parameter calculation step.
- the processing of S803 in the flowchart of FIG. 8 is positioned as a luminance parameter calculation unit or a luminance parameter calculation step.
- the process of S804 in the flowchart of FIG. 8 is positioned as an emphasis parameter calculation unit or an emphasis parameter calculation step.
- the processing of S1504 and S1505 and the processing of S1512 and S1513 in the flowchart of FIG. 15 are positioned as search means.
- the process of S1501 in the flowchart of FIG. 15 is positioned as a movement amount calculation unit.
- The main hue constituting the slit light is not limited to the red component; the green component or the blue component may be used instead.
- The red difference value Rd is calculated as Rd = R − (G + B)/2, that is, by subtracting the average value of the green value G and the blue value B from the red value R.
- As a method of calculating the red difference value Rd, a weighted average of the green value G and the blue value B may instead be subtracted from the red value R.
- The object to be imaged by the imaging apparatus 1 may be, in addition to the sheet-shaped document P, a smooth surface of a solid block or the surface of an object having a ridgeline; in such cases as well, the effect of detecting the three-dimensional shape of the target object can be similarly obtained.
- The entire shape of the document P is estimated by regarding the trajectory 71a of the first slit light as the cross-sectional shape of the document P, so that image correction for shape deformation such as the curvature of the document P can be performed.
- the slit light projecting unit 20 is configured to emit two rows of slit lights of the first slit light 71 and the second slit light 72.
- the emitted slit light is not limited to two rows, but may be configured to emit three or more rows.
- For example, a third slit light similar to the second slit light 72 may be formed, and the slit light projecting unit 20 may be configured so that it is projected onto the document P above the second slit light 72. In that case, the vertical curved shape of the document P can be estimated from the positions of the points of the trajectories of the first, second, and third slit lights, so that the image without slit light can be corrected into an easier-to-view image.
- In the present embodiment, a laser diode 21 that emits a red laser beam is used as the light source; however, any other device that can output a light beam, such as a surface emitting laser, an LED, or an EL device, may be used.
- As the optical element, a transparent flat plate having, on one surface, a diffraction grating that diffracts a predetermined proportion of the power of the incident laser beam in a predetermined direction may be used. In that case, the first-order laser beam diffracted by the diffraction grating can be used as the second slit light 72, and the zeroth-order laser beam transmitted as it is can be used as the first slit light 71.
- The slit light emitted from the slit light projecting unit 20 may be a striped light pattern having a constant width, in addition to a fine line sharply narrowed in the direction orthogonal to the longitudinal direction.
- The vertical positional relationship between the first slit light 71 and the second slit light 72 may be reversed; that is, the optical elements may be arranged so that the second slit light 72 is located on the lower side as viewed from the imaging device 1 and the first slit light 71 is formed in the upward direction.
- the imaging device 1 is configured to capture an image with slit light and an image without slit light using the imaging lens 31 and the CCD image sensor 32.
- However, the imaging apparatus may be provided with a separate imaging lens and CCD image sensor for capturing the image with slit light, in addition to the imaging lens 31 and the CCD image sensor 32.
- In this case, it is possible to eliminate the lapse of time between capturing the image with slit light and the image without slit light (the time for transferring the image data of the CCD image sensor 32, and so on). Therefore, there is no shift in the imaging range of the image without slit light with respect to the image with slit light, and the accuracy of the detected three-dimensional shape of the target object can be improved.
- On the other hand, the imaging device 1 of the present embodiment, which shares a single lens and sensor, can be made smaller and less expensive with fewer components.
- Preferably, the hue parameter calculating means is configured to calculate, as the hue parameter, a parameter obtained by subtracting the average value of the other color values from the color value corresponding to the main hue constituting the pattern light. According to such a configuration, since the hue parameter is obtained by subtracting the average of the other color values from the color value corresponding to the main hue constituting the pattern light, a pixel having a large color value corresponding to the main hue can be emphasized over other pixels. That is, since a pixel including the pattern light has a large color value corresponding to the main hue constituting the pattern light, the pixel including the pattern light can be emphasized over other pixels. Conversely, a pixel whose color values are all close to one another can be excluded from the detection targets.
- The three-dimensional shape detecting device may further include emphasis parameter calculating means for calculating, on a pixel basis and based on the luminance parameter calculated by the luminance parameter calculating means and the hue parameter calculated by the hue parameter calculating means, an emphasis parameter for emphasizing a pixel including the pattern light in the pattern light projection image over other pixels. In that case, the pattern light detecting means detects the pixels including the pattern light from the pattern light projection image based on the emphasis parameter calculated by the emphasis parameter calculating means.
- The emphasis parameter calculating means may be configured to calculate, as the emphasis parameter, a parameter obtained by multiplying the hue parameter by the luminance parameter. Since the emphasis parameter is then large only when both the hue parameter and the luminance parameter are high, the difference between a pixel including the pattern light and the other pixels becomes clearer, and the pixel including the pattern light can be detected with higher accuracy.
- A predetermined threshold value may be set for the emphasis parameter, and the pattern light detecting means may be configured to detect, from the pattern light projection image, a pixel whose emphasis parameter exceeds the threshold value as a pixel including the pattern light.
- The pattern light detecting means may be configured to detect the pixel including the pattern light for each predetermined region along the pattern light in the pattern light projection image, detecting the pixel having the maximum emphasis parameter within each predetermined region as the pixel including the pattern light. According to such a configuration, the pixel most likely to contain the pattern light can be detected from within each predetermined region.
- In addition to the pattern light projection image, the imaging means may be configured to capture, in correspondence with the pattern light projection image, a pattern light non-projection image of the target object in a state where the pattern light is not projected.
- The three-dimensional shape detection apparatus may further include search means for searching whether or not a pixel corresponding to a pixel including the pattern light detected by the pattern light detecting means exists in the pattern light non-projection image. In that case, the pattern light position extracting means extracts the position of the pattern light based on those pixels, among the pixels detected by the pattern light detecting means, for which no corresponding pixel is found in the pattern light non-projection image by the search means.
- In other words, the pattern light position extracting means extracts the trajectory of the pattern light based on the detected pixels. Therefore, from the pixels detected by the pattern light detecting means, only the pixels peculiar to the pattern light projection image, which are not present in the pattern light non-projection image, that is, the pixels including the pattern light, can be extracted. As a result, the extraction accuracy of the position of the pattern light can be further improved.
- the three-dimensional shape detection device may further include a movement amount calculating means for calculating a movement amount of the pattern light non-projection image with respect to the pattern light projection image.
- In that case, the search means can search the pattern light non-projection image based on the movement amount of the pattern light non-projection image with respect to the pattern light projection image calculated by the movement amount calculating means. Therefore, according to the above configuration, it is possible to set a search range wide enough to cover the case where the pattern light projection image and the pattern light non-projection image are shifted due to camera shake while, compared with searching the entire image, still narrowing the search range down so that the search can be executed at high speed and with high accuracy.
- An imaging apparatus may include the three-dimensional shape detection device and plane image correction means for correcting, based on the three-dimensional shape of the target object calculated by the three-dimensional shape calculation means of the three-dimensional shape detection device, the pattern light non-projection image of the target object captured by the imaging means of the three-dimensional shape detection device in a state where the pattern light is not projected, into a plane image in which the predetermined surface of the target object is observed from a substantially vertical direction.
- According to such a configuration, the position of the pattern light can be extracted with high accuracy by the three-dimensional shape detection device and an accurate three-dimensional shape can be calculated, so that the image can be corrected into a planar image.
- The three-dimensional shape detection program may further include an emphasis parameter calculation step of calculating, on a pixel basis and based on the luminance parameter calculated in the luminance parameter calculation step and the hue parameter calculated in the hue parameter calculation step, an emphasis parameter for emphasizing a pixel including the pattern light in the pattern light projection image over other pixels. In the pattern light detection step, a pixel including the pattern light may then be detected from the pattern light projection image based on the emphasis parameter calculated in the emphasis parameter calculation step.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/390,067 US7391522B2 (en) | 2003-09-29 | 2006-03-28 | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003337066 | 2003-09-29 | ||
JP2003-337066 | 2003-09-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/390,067 Continuation-In-Part US7391522B2 (en) | 2003-09-29 | 2006-03-28 | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005031253A1 true WO2005031253A1 (ja) | 2005-04-07 |
Family
ID=34386112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/014167 WO2005031253A1 (ja) | 2003-09-29 | 2004-09-28 | 3次元形状検出装置、撮像装置、及び、3次元形状検出プログラム |
Country Status (2)
Country | Link |
---|---|
US (1) | US7391522B2 (ja) |
WO (1) | WO2005031253A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007271530A (ja) * | 2006-03-31 | 2007-10-18 | Brother Ind Ltd | 3次元形状検出装置及び3次元形状検出方法 |
EP2198780B1 (en) * | 2008-12-19 | 2018-01-31 | Sirona Dental Systems GmbH | Method and device for optical scanning of three-dimensional objects by means of a dental 3D camera using a triangulation method |
DE102008054985B4 (de) * | 2008-12-19 | 2012-02-02 | Sirona Dental Systems Gmbh | Verfahren und Vorrichtung zur optischen Vermessung von dreidimensionalen Objekten mittels einer dentalen 3D-Kamera unter Verwendung eines Triangulationsverfahrens |
TWI379224B (en) * | 2009-06-30 | 2012-12-11 | Cheng Uei Prec Ind Co Ltd | Optical positing apparatus and positing method thereof |
JP2015172493A (ja) * | 2014-03-11 | 2015-10-01 | 株式会社東芝 | 距離測定装置 |
JP6970376B2 (ja) * | 2017-12-01 | 2021-11-24 | オムロン株式会社 | 画像処理システム、及び画像処理方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1194530A (ja) * | 1997-09-12 | 1999-04-09 | Toyota Motor Corp | 塗布物の塗布状態判定方法およびその装置 |
JP2003254727A (ja) * | 2002-03-01 | 2003-09-10 | Ckd Corp | 三次元計測装置 |
JP2004108950A (ja) * | 2002-09-18 | 2004-04-08 | Ricoh Co Ltd | 光学式形状測定システム |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4634279A (en) * | 1984-02-06 | 1987-01-06 | Robotic Vision Systems, Inc. | Method of three-dimensional measurement with few projected patterns |
US4846577A (en) * | 1987-04-30 | 1989-07-11 | Lbp Partnership | Optical means for making measurements of surface contours |
US5668631A (en) | 1993-12-20 | 1997-09-16 | Minolta Co., Ltd. | Measuring system with improved method of reading image data of an object |
JP3282331B2 (ja) | 1993-12-20 | 2002-05-13 | ミノルタ株式会社 | 3次元形状測定装置 |
US5561526A (en) * | 1994-05-26 | 1996-10-01 | Lockheed Missiles & Space Company, Inc. | Three-dimensional measurement device and system |
US6028672A (en) * | 1996-09-30 | 2000-02-22 | Zheng J. Geng | High speed three dimensional imaging method |
US6690474B1 (en) * | 1996-02-12 | 2004-02-10 | Massachusetts Institute Of Technology | Apparatus and methods for surface contour measurement |
JP3493886B2 (ja) | 1996-04-23 | 2004-02-03 | ミノルタ株式会社 | デジタルカメラ |
JPH10267628A (ja) * | 1997-01-23 | 1998-10-09 | Hitachi Ltd | 3次元形状検出方法およびその装置並びに基板の製造方法 |
DE19821611A1 (de) * | 1998-05-14 | 1999-11-18 | Syrinx Med Tech Gmbh | Verfahren zur Erfassung der räumlichen Struktur einer dreidimensionalen Oberfläche |
US6341016B1 (en) * | 1999-08-06 | 2002-01-22 | Michael Malione | Method and apparatus for measuring three-dimensional shape of object |
US6449044B1 (en) | 2001-08-06 | 2002-09-10 | General Motors Corporation | Method for checking cam lobe angles |
2004
- 2004-09-28 WO PCT/JP2004/014167 patent/WO2005031253A1/ja active Application Filing
2006
- 2006-03-28 US US11/390,067 patent/US7391522B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1194530A (ja) * | 1997-09-12 | 1999-04-09 | Toyota Motor Corp | 塗布物の塗布状態判定方法およびその装置 |
JP2003254727A (ja) * | 2002-03-01 | 2003-09-10 | Ckd Corp | 三次元計測装置 |
JP2004108950A (ja) * | 2002-09-18 | 2004-04-08 | Ricoh Co Ltd | 光学式形状測定システム |
Also Published As
Publication number | Publication date |
---|---|
US7391522B2 (en) | 2008-06-24 |
US20060192082A1 (en) | 2006-08-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI668997B (zh) | 產生全景深度影像的影像裝置及相關影像裝置 | |
US9826216B1 (en) | Systems and methods for compact space-time stereo three-dimensional depth sensing | |
EP3288259B1 (en) | Array detector for depth mapping | |
US11774769B2 (en) | Depth measurement using a pulsed structured light projector | |
US9958383B2 (en) | Range camera | |
KR20180033222A (ko) | 프리즘-기반 눈 추적 | |
EP1413850A3 (en) | Optical sensor for measuring position and orientation of an object in three dimensions | |
WO2005029408A1 (ja) | 画像処理装置、及び、撮像装置 | |
US7440119B2 (en) | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting method | |
JP2009043139A (ja) | 位置検出装置 | |
US10679370B2 (en) | Energy optimized imaging system with 360 degree field-of-view | |
US20140240228A1 (en) | User interface display device | |
US7391522B2 (en) | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program | |
US20230013134A1 (en) | Electronic device | |
US7365301B2 (en) | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program | |
JP2003207324A (ja) | 3次元情報取得装置及び3次元情報取得方法 | |
JP3477279B2 (ja) | 視線検出装置 | |
TWI258706B (en) | Method and device for optical navigation | |
JP4148700B2 (ja) | 目画像撮像装置 | |
JP2005128006A (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出プログラム | |
JP2005148813A5 (ja) | ||
US7372580B2 (en) | Three-dimensional shape detecting device, three-dimensional shape detecting system, and three-dimensional shape detecting program | |
JP4360145B2 (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 | |
JP4608855B2 (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 | |
Wang | High resolution 2D imaging and 3D scanning with line sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11390067 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 11390067 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |