WO2015189985A1 - Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium - Google Patents
- Publication number
- WO2015189985A1 (application PCT/JP2014/065751)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- attention area
- pattern
- unit
- region
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/2416—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures of gears
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/0006—Industrial image inspection using a design-rule based approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- the present invention relates to a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a recording medium.
- the shape measuring device is an optical measuring device that uses a light-section method, and includes a projection unit that projects a predetermined pattern such as slit light onto the object to be measured, and an imaging unit that captures the pattern image drawn by the light projected onto the measurement target region (see, for example, Patent Document 1).
- Patent Document 1 describes a displacement sensor that detects an image included in a set measurement target region from image data acquired by imaging and measures the displacement of the measurement target. Further, Patent Document 1 describes that the measurement target region is moved in the displacement measurement direction by following the movement of the reference surface.
- the shape measuring apparatus described in Patent Document 1 moves the measurement target region in the displacement measurement direction by following the movement of the reference plane; specifically, the measurement target region is moved based on the movement of the step boundary line on the object to be measured.
- however, the movement direction and movement distance of the pattern position on the image data caused by the relative movement of the pattern and the object change depending on the shape of the object, the direction of that relative movement, and the relative positions of the projection unit, the imaging unit, and the object; the measurement target region therefore may not be set appropriately.
- An object is to provide a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a recording medium.
- according to a first aspect, a shape measuring device is provided that includes a projection unit that projects a pattern onto a measurement target, an imaging unit that captures the measurement target on which the pattern is projected by the projection unit, a moving unit that relatively moves the projection unit and the measurement target so that the projection position of the pattern can be moved on the measurement target, and an attention area setting unit that sets an attention area, for acquiring information used for measurement of the measurement target, in at least a part of the area imaged by the imaging unit so as to include the image of the pattern.
- according to a second aspect, a shape measuring apparatus is provided that includes a projection unit that projects a pattern onto a measurement target, an imaging unit that captures the measurement target on which the pattern is projected by the projection unit, a moving unit that relatively moves the projection unit and the measurement target so that the projection position of the pattern can be moved on the measurement target, a reference attention area setting unit that can set a reference attention area for detecting the presence state of the image of the pattern projected on the measurement target and captured by the imaging unit, and an attention area generation unit that generates an attention area, a region for acquiring information used for the measurement, according to the presence state of the pattern image in the reference attention area.
- according to a third aspect, a structure manufacturing system is provided that includes a molding apparatus that molds a structure based on design information related to the shape of the structure, the shape measuring device according to the first or second aspect that measures the shape of the structure molded by the molding apparatus, and a control device that compares shape information indicating the shape of the structure measured by the shape measuring device with the design information.
- according to a fourth aspect, a shape measuring method is provided in which a pattern is projected onto a measurement target, and an image of the pattern projected onto the measurement target is acquired from a direction different from the projection direction of the pattern to obtain image data.
- according to a fifth aspect, a structure manufacturing method is provided that includes forming a structure based on design information related to the shape of the structure, measuring the shape of the formed structure by the shape measuring method of the fourth aspect, and comparing shape information indicating the measured shape of the structure with the design information.
- according to a sixth aspect, a shape measuring program is provided for measuring the shape of a measurement target based on a pattern image, the program causing a computer to execute: projecting a pattern onto the measurement target while the projection position of the pattern moves relative to the measurement target; capturing an image of the projected pattern from a direction different from the pattern projection direction to acquire image data; setting an attention area, for acquiring information used for measurement of the measurement target, in at least a part of the area captured by the imaging unit so as to include the pattern image; and measuring the measurement target based on the position of the pattern image located within the attention area of the image data.
- according to a seventh aspect, a computer-readable recording medium is provided that records the shape measuring program of the sixth aspect.
- a pattern image can be appropriately extracted and used for measurement of a measurement target, and the shape of the object to be measured can be measured under more appropriate conditions.
- FIG. 1 is a perspective view showing a shape measuring apparatus according to this embodiment.
- FIG. 2 is a schematic diagram showing the configuration of the shape measuring apparatus of the present embodiment.
- FIG. 3 is a block diagram showing a schematic configuration of the control device of the shape measuring apparatus of the present embodiment.
- FIG. 4 is a block diagram showing a schematic configuration of the attention area setting unit of the control device.
- FIG. 5 is an explanatory diagram for explaining the measurement operation of the shape measuring apparatus according to the present embodiment.
- FIG. 6 is an explanatory diagram for explaining an example of a screen displayed on the shape measuring apparatus of the present embodiment.
- FIG. 7 is an explanatory diagram for explaining the measurement operation of the shape measuring apparatus according to the present embodiment.
- FIG. 8 is a flowchart illustrating an example of the attention area setting operation of the shape measuring apparatus according to the present embodiment.
- FIG. 9 is a flowchart illustrating an example of the second region-of-interest generation determination process of the shape measuring apparatus according to the present embodiment.
- FIG. 10 is a flowchart illustrating an example of the second region-of-interest generation determination process of the shape measuring apparatus according to the present embodiment.
- FIG. 11 is an explanatory diagram of an example of reference attention area data.
- FIG. 12 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 13 is a flowchart showing an example of the measuring operation of the shape measuring apparatus according to the present embodiment.
- FIG. 14 is a flowchart illustrating an example of the attention area setting operation of the shape measuring apparatus according to the present embodiment.
- FIG. 15 is a block diagram illustrating a schematic configuration of the attention area setting unit and the storage unit of the control device.
- FIG. 16 is a flowchart illustrating an example of the attention area setting operation of the shape measuring apparatus according to the present embodiment.
- FIG. 17 is an explanatory diagram of an example of attention area template data.
- FIG. 18 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 19 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 20 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 21 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 22 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 23 is a schematic diagram showing a configuration of a system having a shape measuring apparatus.
- FIG. 24 is a diagram showing a configuration of the structure manufacturing system of the present embodiment.
- FIG. 25 is a flowchart showing the structure manufacturing method of the present embodiment.
- an XYZ orthogonal coordinate system is set, and the positional relationship of each part will be described with reference to this XYZ orthogonal coordinate system.
- the Z-axis direction is set, for example, in the vertical direction
- the X-axis direction and the Y-axis direction are set, for example, in directions that are parallel to the horizontal direction and orthogonal to each other.
- the rotation (inclination) directions around the X, Y, and Z axes are the θX, θY, and θZ directions, respectively.
- FIG. 1 is a diagram illustrating an appearance of a shape measuring apparatus 1 according to the present embodiment.
- FIG. 2 is a schematic diagram showing a schematic configuration of the shape measuring apparatus of the present embodiment.
- FIG. 3 is a block diagram showing a schematic configuration of the control device of the shape measuring apparatus of the present embodiment.
- the shape measuring apparatus 1 measures the three-dimensional shape of the measurement target object (object to be measured) M using, for example, a light cutting method.
- the shape measuring device 1 includes a probe moving device 2, an optical probe 3, a control device 4, a display device 5, an input device 6, and a holding and rotating device 7.
- the shape measuring device 1 measures the shape of the object M to be measured that is held by the holding and rotating device 7 provided on the base B.
- the optical probe 3 captures an image of the line pattern on the object to be measured M while projecting the line pattern on the object to be measured M.
- the probe moving device 2 and the holding and rotating device 7 serve as a moving mechanism that relatively moves the probe and the object M to be measured.
- the probe moving device 2 moves the optical probe 3 relative to the measurement target M so that the line pattern projected from the optical probe 3 is projected onto the measurement target region of the measurement target M, and so that the projection position of the line pattern can be moved sequentially on the measurement target M.
- the probe moving device 2 includes a drive unit 10 and a position detection unit 11.
- the drive unit 10 includes an X moving unit 50X, a Y moving unit 50Y, a Z moving unit 50Z, a first rotating unit 53, and a second rotating unit 54.
- the X moving part 50X is provided so as to be movable with respect to the base B in the direction of the arrow 62, that is, in the X axis direction.
- the Y moving unit 50Y is provided so as to be movable in the direction of the arrow 63, that is, in the Y-axis direction with respect to the X moving unit 50X.
- the Y moving unit 50Y is provided with a holding body 52 extending in the Z-axis direction.
- the Z moving part 50Z is provided so as to be movable with respect to the holding body 52 in the direction of the arrow 64, that is, in the Z-axis direction.
- the X moving unit 50X, the Y moving unit 50Y, and the Z moving unit 50Z, together with the first rotating unit 53 and the second rotating unit 54, constitute a mechanism that moves the optical probe 3 in the X-axis, Y-axis, and Z-axis directions.
- the first rotating portion 53 changes the posture of the optical probe 3 by rotating the optical probe 3, supported by a holding member (holding portion) 55 described later, around a rotation axis 53a parallel to the X axis, that is, in the direction of the arrow 65; in particular, the first rotating unit 53 changes the projection direction of the pattern projected from the optical probe 3 onto the measurement object M.
- the second rotating portion 54 changes the posture of the optical probe 3 by rotating the optical probe 3, supported by the holding member 55, around an axis parallel to the direction in which a first holding portion 55A described later extends, that is, in the direction of the arrow 66.
- the second rotating unit 54 changes the longitudinal direction of the line pattern projected from the optical probe 3 with respect to the object M to be measured.
- the shape measuring apparatus 1 has a reference sphere 73a or a reference sphere 73b used for correcting the relative position between the optical probe 3 and the holding member 55 holding the optical probe 3.
- the holding and rotating device 7 includes a table 71 that holds the object M, a rotation driving unit 72 that rotates the table 71 in the θZ direction, that is, the direction of the arrow 68, and a position detection unit 73 that detects the position of the table 71 in the rotation direction.
- the position detection unit 73 is an encoder device that detects the rotation of the rotary shaft of the table 71 or the rotation drive unit 72.
- the holding and rotating device 7 rotates the table 71 by the rotation driving unit 72 based on the result detected by the position detecting unit 73.
- the linear pattern projected from the optical probe 3 can be projected onto an arbitrary measurement target area of the object M by the holding and rotating device 7 and the probe moving device 2.
- the driving of the X moving unit 50X, the Y moving unit 50Y, the Z moving unit 50Z, the first rotating unit 53, the second rotating unit 54, and the holding and rotating device 7 is controlled by the control device 4 based on the detection results of the position detection unit 11, which is configured by an encoder device or the like.
- the optical probe 3 is supported by the holding member 55.
- the holding member 55 includes a first holding portion (first portion, first member) 55A that extends in a direction orthogonal to the rotation axis 53a and is supported by the first rotating portion 53, and a second holding portion (second portion, second member) 55B that is provided on the first holding portion 55A and extends in parallel with the rotation axis 53a.
- the +Z side end of the first holding portion 55A is disposed on the far side from the object to be measured M.
- the first holding unit 55A and the second holding unit 55B are orthogonal to each other.
- the optical probe 3 is supported on the + X side end portion of the second holding portion 55B.
- the position of the rotation axis 53a of the first rotating unit 53 is arranged closer to the object to be measured M than the optical probe 3.
- a counter balance 55c is provided at the end of the first holding portion 55A on the side closer to the object M to be measured. Therefore, the moment generated on the holding member 55 side and the moment generated on the counter balance 55c side are balanced with respect to the rotation axis 53a of the first rotating portion 53.
- the optical probe 3 includes a light source device 8 and an imaging device 9.
- the light source device 8 and the imaging device 9 are fixed to a common housing, so the positional relationship between the projection direction of the line pattern by the light source device 8 and the shooting direction of the imaging device 9 is kept fixed. Therefore, the three-dimensional position of the measurement target region of the object M can be obtained by triangulation from the position of the line-pattern image detected by the imaging device 9, together with the projection direction of the line pattern, the shooting direction of the imaging device 9, and the positional relationship between them.
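- the triangulation described above can be sketched as follows; this is a minimal illustration rather than the patent's actual implementation, assuming the camera ray passes through the camera origin and the projected line pattern lies on a known light plane (the function name `triangulate_point` and the plane parameterization are assumptions):

```python
import numpy as np

def triangulate_point(pixel_ray, plane_point, plane_normal):
    """Intersect a camera ray (through the camera origin) with the
    projected light plane to recover a 3-D point on the surface."""
    denom = np.dot(plane_normal, pixel_ray)
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the light plane")
    # Ray is t * pixel_ray; solve n . (t * ray - plane_point) = 0 for t.
    t = np.dot(plane_normal, plane_point) / denom
    return t * pixel_ray
```

With the light plane at z = 2 and a ray through pixel direction (0.5, 0, 1), the recovered point (1, 0, 2) lies on the plane, as expected for this arrangement.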
- the light source device (projection unit) 8 of the optical probe 3 is controlled by the control device 4 and projects a line-shaped pattern onto the measurement area of the object M held by the holding and rotating device 7; the light source device 8 is provided with a light source 12 and an illumination optical system 13.
- the light source 12 of this embodiment includes a laser diode, for example.
- the light source 12 may include a solid light source such as a light emitting diode (LED) other than the laser diode.
- the light emission amount of the light source 12 of this embodiment is controlled by the control device 4. In particular, it is controlled by the dimming control unit 38 in the control device 4.
- the illumination optical system 13 adjusts the spatial light intensity distribution of the light emitted from the light source 12.
- the illumination optical system 13 of the present embodiment is composed of a plurality of optical elements including, for example, a cylindrical lens.
- the illumination optical system 13 may be a single optical element or may include a plurality of optical elements.
- the light emitted from the light source 12 is emitted in the first direction from the light source device 8 toward the object to be measured M, with the spot being expanded in the direction in which the cylindrical lens has positive power.
- the light source device 8 and the imaging device 9 are arranged on a plane orthogonal to the rotation axis 53a, the traveling direction of the light projected from the light source device 8 lies in that plane, and the spot of the projected light is elongated in the direction parallel to the rotation axis 53a; that is, the line pattern is parallel to the rotation axis 53a.
- the longitudinal direction of the linear pattern can be changed by the second rotating unit 54 described above.
- by changing the longitudinal direction of the line-shaped pattern according to the spreading direction of the surface of the object to be measured M, measurement can be performed efficiently.
- when the posture of the optical probe 3 is changed, the shooting direction of the imaging device 9 also changes; therefore, even if the object to be measured has a shape in which convex portions such as gear teeth are arranged, the shape of the tooth root can be measured by setting the shooting direction along the tooth trace.
- the illumination optical system 13 may include a diffractive optical element such as CGH, and the spatial light intensity distribution of the illumination light beam L emitted from the light source 12 may be adjusted by the diffractive optical element.
- the projection light whose spatial light intensity distribution is adjusted may be referred to as a pattern.
- the illumination light beam L is an example of a pattern.
- the imaging device (imaging unit) 9 includes an imaging element 20, an imaging optical system 21, a diaphragm 23, and a diaphragm driving unit 24.
- the illumination light beam L projected from the light source device 8 onto the object to be measured M is reflected and scattered by the surface of the object to be measured M, and at least a part thereof enters the imaging optical system 21.
- the imaging optical system 21 forms the image of the line pattern projected onto the surface of the object to be measured M by the light source device 8, together with the image of the object to be measured M, on the image sensor 20.
- the image sensor 20 captures an image formed by the imaging optical system 21.
- the image processing unit 25 generates image data from the light reception signal received by the image sensor 20.
- the diaphragm 23 has an opening whose size can be changed, and the amount of light passing through the imaging optical system 21 can be controlled by changing the size of the opening.
- the size of the opening of the diaphragm 23 can be adjusted by the diaphragm driving unit 24.
- the diaphragm driving unit 24 is controlled by the control device 4. In particular, it is controlled by the dimming control unit 38 in the control device 4.
- an object plane 21a is set so as to include the projection direction of the line pattern projected from the light source device 8, and the object plane 21a and the light-receiving surface 20a (image plane) of the image sensor 20 are conjugate.
- the imaging optical system 21 and the image sensor 20 are arranged so as to satisfy the following relationship.
- the propagation direction of the illumination light beam L is substantially parallel to a plane including the projection direction of the illumination light beam L from the light source device 8 and the longitudinal direction of the spot shape of the illumination light beam L.
- the control device 4 controls each part of the shape measuring device 1.
- the control device 4 calculates the three-dimensional shape of the measurement target region of the object M to be measured based on the imaging result of the optical probe 3 and the positional information of the probe moving device 2 and the holding and rotating device 7.
- the shape information in the present embodiment contains information indicating at least one of the shape, size, unevenness distribution, surface roughness, and position (coordinates) of a point on the measurement target surface, regarding at least a part of the measurement target M.
- a display device 5 and an input device 6 are connected to the control device 4. As illustrated in FIG. 3, the control device 4 includes a control unit 30 and a storage unit 31.
- the control unit 30 has a circuit block necessary for accurately measuring the object M to be measured.
- the functions necessary for correctly measuring the object M to be measured may be realized by executing a corresponding program using the central processing unit.
- the control unit 30 includes an attention area setting unit 32, a measurement range setting unit 36, a dimming area setting unit 37, a dimming control unit 38, a measuring unit 39, and an operation control unit 40.
- the attention area setting unit 32 sets an attention area indicating an area on the image data.
- this attention area is used to limit the search range for the image of the illumination light beam L so that the image of the illumination light beam L projected onto the measurement object M and captured by the image sensor 20 can be easily extracted, or to limit the range over which brightness information of the image of the illumination light beam L is acquired for exposure control during imaging or for controlling the illumination light quantity of the illumination light beam L.
- when the object to be measured M has a shape in which the pitch of convex portions such as gear teeth is narrow and its surface is glossy, the surface of the object M acts like a mirror, and an image of the illumination light beam L is also formed outside the directly irradiated region; such an image is called a multiple reflection image.
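- limiting the pattern search to the attention area can be sketched as follows; a minimal illustration, not the patent's implementation, assuming a grayscale image in which the pattern line is the brightest pixel in each column of the attention area (the function name and the `(top, bottom, left, right)` ROI format are assumptions):

```python
import numpy as np

def detect_line_in_roi(image, roi):
    """Find the pattern-line row in each column, searching only inside
    the attention area so that spurious multiple-reflection images
    lying outside the ROI are ignored."""
    top, bottom, left, right = roi
    sub = image[top:bottom, left:right]
    rows = sub.argmax(axis=0) + top        # brightest row per column
    cols = np.arange(left, right)
    return np.column_stack([cols, rows])   # (x, y) pixel positions
```

Here a bright multiple-reflection image below the ROI would be skipped entirely, because only pixels inside the attention area are examined.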
- the attention area setting unit 32 receives image data from the imaging device 9, and receives information on the operation of each unit, specifically information on the relative position between the optical probe 3 and the object M to be measured, from the operation control unit 40.
- the attention area setting unit 32 acquires, from the attention area position setting data 42 in the storage unit 31, position information indicating where the attention area is to be set with respect to the area that can be imaged by the imaging device 9 (or the visual field range that can be acquired by the image sensor 20).
- the attention area setting unit 32 acquires reference attention area data from the position setting data 44 of the reference attention area.
- the attention area setting unit 32 outputs the position information of the attention area set within the imaging range by the user or the like to the position setting data 42 of the attention area in the storage unit 31.
- the measurement range setting unit 36 sets the measurement range based on the position setting data 42 of the region of interest or an instruction input from the input device 6.
- the measurement range setting unit 36 outputs the set measurement range to the measurement unit 39.
- the dimming area setting unit 37 sets the dimming area settable range and the dimming area based on the position setting data 42 of the attention area or an instruction input from the input device 6.
- the dimming area setting unit 37 outputs the set dimming area to the dimming control unit 38.
- the dimming control unit 38 receives the dimming area from the dimming area setting unit 37.
- the dimming control unit 38 determines dimming conditions, for example operating conditions of the light source device 8 or the imaging device 9 when acquiring image data, based on the image data within the dimming region.
- the dimming control unit 38 outputs the determined dimming conditions to the light source device 8 or the imaging device 9.
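- a dimming condition update of this kind can be sketched as follows; a simplified proportional rule, not the patent's actual control law, assuming the goal is to bring the peak brightness inside the dimming region near a target level without saturating the sensor (the function name and constants are assumptions):

```python
def next_exposure(current_exposure, roi_pixels, target=200, max_step=4.0):
    """Scale the exposure (or laser power) so the peak brightness in
    the dimming region approaches the target level."""
    peak = max(roi_pixels)
    if peak == 0:
        return current_exposure * 2          # nothing visible: brighten
    scale = target / peak
    return current_exposure * min(scale, max_step)  # limit the step size
```

For example, a peak of 100 against a target of 200 doubles the exposure, while a very dim image is brightened by at most the capped step factor per frame.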
- the measurement unit 39 receives the measurement range set by the measurement range setting unit 36.
- the measurement unit 39 receives image data acquired by the imaging device 9.
- the measurement unit 39 receives information on the operation of each unit from the operation control unit 40, specifically, information on the relative position between the optical probe 3 and the object M to be measured.
- the measurement unit 39 associates the image of the line pattern included in the measurement range of the image data (also referred to as the image of line light in the present embodiment) with the relative position between the optical probe 3 that acquired the image data and the object M to be measured, and measures the outer shape of the object M based on the image of the pattern.
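- combining a point obtained from the pattern image in the probe's own frame with the relative position of the optical probe can be sketched as a rigid transform; a minimal illustration under the assumption that the probe pose is reported as a rotation matrix plus a translation (the function name and pose representation are assumptions, not from the patent):

```python
import numpy as np

def to_world(point_probe, probe_position, probe_rotation):
    """Transform a point measured in the probe frame into the machine
    (world) frame using the probe pose reported by the encoders."""
    return probe_rotation @ np.asarray(point_probe) + np.asarray(probe_position)
```

Points triangulated at successive probe poses can then be accumulated in one common frame to build up the outer shape of the object.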
- the operation control unit 40 controls the operation of the probe moving device 2, the optical probe 3, and the holding and rotating device 7.
- the operation control unit 40 outputs operation control information to the attention area setting unit 32 and the measurement unit 39.
- FIG. 4 is a block diagram showing a schematic configuration of the attention area setting unit of the control device.
- the attention area setting unit 32 includes an attention area setting section 34 as shown in FIG.
- the attention area setting section 34 includes an image data acquisition unit 80, a movement information acquisition unit 82, a projection pattern image detection unit 84, an attention area determination unit 86, and a second attention area generation unit 88.
- the image data acquisition unit 80 acquires image data captured by the imaging device 9.
- the movement information acquisition unit 82 acquires, via the operation control unit 40, the drive information of the probe moving device 2, the optical probe 3, and the holding and rotating device 7 from the encoder of each unit, and detects how the current measurement target area of the optical probe 3 is moving; for example, it determines in which direction and by how much the measurement target area has changed since the previous image acquisition.
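- this movement detection can be sketched as a difference of successive encoder read-outs; a minimal illustration, not the patent's implementation (the pose-tuple layout, e.g. x, y, z plus a rotation angle, is an assumption):

```python
def region_shift(prev_pose, curr_pose):
    """Component-wise difference of two encoder read-outs, giving the
    direction and amount the measurement target area moved since the
    previous image acquisition."""
    return tuple(c - p for p, c in zip(prev_pose, curr_pose))
```

The sign of each component gives the movement direction along that axis, and the magnitude gives the movement amount between frames.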
- the projection pattern image detection unit 84 acquires the position setting data 42 of the attention area, and acquires image data from the image data acquisition unit 80.
- the attention area is an attention area on the image data.
- One or more attention areas can be set for one image data.
- the projection pattern image detection unit 84 detects the position of the pattern image on the image data only within the attention area of the image data.
- the attention area determination unit 86 uses the position of the pattern image on the image data detected by the projection pattern image detection unit 84 as a reference, and determines the outline of the attention area at positions separated from the pattern image by a predetermined number of pixel pitches, with the position of the pattern image at the center.
- the attention area determination unit 86 determines the area enclosed by this outline as the attention area.
- the number of attention areas provided in the image data is not particularly limited, and may be one, two, or three or more.
- the attention area determination section 86 also determines, as an attention area, a second attention area that is different from the attention area set based on the position of the pattern.
- in the present embodiment, the attention area determination unit 86 sets the attention area as follows.
- the region of interest is set by setting, as an outer periphery, a position that is separated by a predetermined distance in all directions around the position of the pattern image detected by the projection pattern image detection unit 84.
- the distance used to set the outer periphery is calculated from how far the measurement target region will move on the object M to be measured by the next time the imaging device 9 captures an image.
- distance information for determining the outer periphery of the region of interest is set.
- the movement information acquisition unit 82 acquires drive information of the X movement unit 50X, the Y movement unit 50Y, the Z movement unit 50Z, the first rotation unit 53, the second rotation unit 54, and the holding rotation device 7.
- instead of setting the attention area so as to widen its range in all directions around the position of the pattern image, when the moving direction of the line light on the image data can be estimated, the attention area may be moved on the image data in the moving direction of the line light, with the position of the line light as a reference. Alternatively, when expanding the area around the pattern image, the attention area may be set so that its outer periphery is placed farther from the pattern image in the moving direction of the line light than in the other directions.
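As one illustrative reading of the outline-setting rule above, a rectangular attention area can be placed around the detected pattern position with a uniform margin, then pushed farther out in the estimated moving direction of the line light. This is a minimal sketch; the function name, the rectangular representation, and the direction encoding are assumptions, not details from the source.

```python
def attention_area_bounds(pattern_xy, base_margin, move_dir=None, extra=0):
    """Return (x_min, y_min, x_max, y_max) of a rectangular attention area.

    The outline is placed base_margin pixels from the pattern image position
    in every direction; when the moving direction of the line light is known,
    the outline is pushed `extra` pixels farther in that direction only.
    """
    x, y = pattern_xy
    x_min, y_min = x - base_margin, y - base_margin
    x_max, y_max = x + base_margin, y + base_margin
    if move_dir == "right":
        x_max += extra
    elif move_dir == "left":
        x_min -= extra
    elif move_dir == "down":
        y_max += extra
    elif move_dir == "up":
        y_min -= extra
    return (x_min, y_min, x_max, y_max)
```

With no estimated direction the area is symmetric; with a direction it is anisotropic, matching the alternative described above.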
- the second attention area generation section 88 includes a reference attention area projection pattern image detection section 90, a new attention area generation determination section 92, and a second attention area determination unit 94.
- the reference attention area position setting data 44 defines an area that can be designated by the user in advance and is set within the field of view that can be imaged by the imaging device 9.
- the reference attention area can be used in the same manner as the position setting data 42 of the attention area described above, and is also used to determine whether or not to create a new attention area.
- the reference attention area projection pattern image detection unit 90 acquires, from the image data acquired from the image data acquisition unit 80, the presence state of the pattern image, including the presence or absence of the pattern image and its position, within the attention area set based on the position setting data 44 of the reference attention area.
- the reference attention area projection pattern image detection unit 90 functions as a reference attention area setting section that determines the pixel range of the image sensor 22 and the reference attention area on the image data based on the position setting data 44 of the reference attention area.
- the reference attention area projection pattern image detection unit 90 measures the length of the line light image within the reference attention area of the image data.
- the length of the image of the line pattern can be measured by the following method, for example.
- in each maximal pixel search column, which is approximately orthogonal to the image of the line pattern, it is detected sequentially from top to bottom or from bottom to top whether there is a pixel at which the change in luminance value shows a maximum.
- the maximal pixel search column extends in a direction along the epipolar line.
- the direction in which the maximum pixel search string extends is the detection direction.
- the length of the image of the line pattern is determined by evaluating over how many consecutive positions a maximal pixel exists.
- the arrangement direction of the pixels of the image sensor 20 may be substantially coincident with the direction of the epipolar line.
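The maximal-pixel search described above can be sketched as follows. The sketch assumes each search column is a literal column of a grayscale image (one value per row) within a rectangular region, and keeps, per column, only a brightest pixel that is a strict local maximum against its neighbors in the column; all names are illustrative assumptions.

```python
def extract_peak_pixels(image, region):
    """For each maximal pixel search column within `region`, find the
    brightest pixel and keep it only if its luminance is a strict local
    maximum relative to the adjacent pixels in the column.

    `image` is a list of rows of luminance values;
    `region` = (col_lo, col_hi, row_lo, row_hi), half-open ranges.
    Returns a list of (column, row) peak-pixel positions.
    """
    col_lo, col_hi, row_lo, row_hi = region
    peaks = []
    for c in range(col_lo, col_hi):
        column = [image[r][c] for r in range(row_lo, row_hi)]
        i = max(range(len(column)), key=column.__getitem__)
        # An interior pixel brighter than both neighbors counts as a peak.
        if 0 < i < len(column) - 1 and column[i] > column[i - 1] \
                and column[i] > column[i + 1]:
            peaks.append((c, row_lo + i))
    return peaks
```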
- the new attention area generation determination unit 92 determines whether or not to generate a second attention area, that is, a new attention area, based on the presence state of the pattern detected by the reference attention area projection pattern image detection section 90, for example, the length of the line light image.
- the second attention area determination unit 94 determines an area for creating the second attention area when the new attention area generation determination unit 92 determines to generate the second attention area.
- the second attention area determination unit 94 of the present embodiment determines the same area as the reference attention area as an area for creating the second attention area.
- the second attention area determination unit 94 may set the second attention area in the same manner as the attention area setting method performed by the attention area setting section 34 described above, centered on the position of the line light image located within the reference attention area.
- the attention area cancellation determination unit 89 determines whether or not to cancel the attention area set based on the position or size of the attention area.
- the attention area cancellation determination unit 89 cancels the attention area when the attention area, which is set around the pattern present in the image data based on the movement distance estimated for the next imaging by the imaging device 9, falls outside the field of view of the image data.
- the attention area cancellation determination unit 89 cancels the attention area when the area of the attention area set on the image data is smaller than the threshold area.
- the attention area information set by the attention area setting unit 32 is used by the measurement range setting section 36 and the light control area setting section 37 described below.
- the measurement range setting unit 36 sets the attention area set by the attention area setting unit 32 as the position detection area of the pattern image used for calculating the position data of the measurement area of the measurement object M.
- the measurement range is a range in which the position of the pattern image is detected from the image of the object M measured by the image processing unit 25.
- the brightest pixel value is searched for each pixel column only within the range set as the measurement range. Note that the process of searching for the brightest pixel value for each pixel column only within the set range is performed in the same manner when a pattern is detected in the dimming area.
- the light control area setting unit 37 sets a light control area.
- the dimming area is an area for detecting the brightness of the captured image within the imaging range of the imaging device 20. Although the details of the dimming area in the present embodiment will be described later, the dimming area is also changed in conjunction with the attention area. By doing so, the range of brightness information referred to when performing exposure control of the image sensor 20 or irradiation light amount control of the illumination light beam L can be limited according to the point cloud generation region, and the harmful effect of regular specular reflection light can be minimized.
- the light control area setting unit 37 of the present embodiment sets a light control area settable range based on the position information of the measurement range, and sets the light control area based on the set light control area settable range.
- the light control area settable range is a range that is set based on the position information of the measurement range within the imaging range by the imaging unit, and is a range in which the light control area can be set.
- in addition to the measurement range, the dimming area setting unit 37 can set the dimming area settable range based on at least one of the data stored in the storage unit 31 and the input data received by the input device 6.
- in addition to the dimming area settable range, the dimming area setting unit 37 can also set the dimming area based on at least one of the data stored in the storage unit 31 and the input data received by the input device 6.
- the dimming control unit 38 acquires the pixel value of the brightest pixel for each pixel row detected in the dimming region, and controls the amount of light emitted to the light source 12 according to the size of the acquired pixel value.
- instead, the exposure time used when acquiring one piece of image data may be controlled, or the time during which the imaging surface is exposed may be controlled by a mechanical shutter (not shown) incorporated in the imaging device 20.
- depending on the magnitude of the acquired pixel value, the dimming control unit 38 controls the light projection amount of the projection unit, the amount of light received by the imaging unit, the exposure amount used when the imaging unit acquires image data, or the input/output characteristics of the imaging unit (the sensitivity or amplification factor for signals detected at each pixel of the image sensor), that is, the various conditions (light control conditions) under which image data is acquired by the optical probe 3.
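As a hedged illustration of light-amount control driven by the brightest pixel value found in the dimming region, one might use a simple proportional rule as below. The target value, the clamp range, and the proportional form are assumptions for the sketch, not values from the source.

```python
def next_light_amount(current, peak_value, target=200, full_scale=255,
                      lo=0.1, hi=1.0):
    """Scale the projected light amount so that the brightest pixel in the
    dimming region approaches `target` (out of `full_scale`), clamped to
    the source's assumed working range [lo, hi]."""
    if peak_value <= 0:
        return hi  # nothing detected: raise the light amount
    scaled = current * target / peak_value
    return max(lo, min(hi, scaled))
```

A saturated image (peak near full scale) lowers the light amount; a dim image raises it, up to the clamp.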
- the measuring unit 39 measures the shape of the object to be measured based on the position of the pattern image projected by the light source device 8 within the measurement range of the image data, and on the position information of each moving member of the X moving unit 50X, the Y moving unit 50Y, the Z moving unit 50Z, the first rotating unit 53, the second rotating unit 54, and the holding and rotating device 7 at the time the image data was acquired.
- the measuring unit 39 moves the optical probe 3 and the object to be measured relative to each other by means of the probe moving device 2 and the holding and rotating device 7, thereby sequentially moving the position where the pattern is projected, and causes the imaging device 9 to capture an image of the measurement target region.
- the position information of the probe moving device 2 and the holding and rotating device 7 from the position detecting unit 11 is acquired at the timing when the image is picked up by the imaging device 9.
- based on the position information of the probe moving device 2 and the holding and rotating device 7 from the position detection unit 11, acquired at the timing of imaging by the imaging device 9, and on the pattern image in the measurement range acquired by the imaging device 9, the measurement unit 39 calculates the position at which the pattern is projected onto the object to be measured M and outputs the shape data of the object to be measured M.
- the shape measuring apparatus 1 sets the imaging timing of the imaging device 9 to a constant interval and controls the moving speeds of the probe moving device 2 and the holding and rotating device 7 based on the measurement point interval information input from the input device 6.
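The combination of detected pattern-image positions with the probe position recorded at the imaging timing might be organized as in this sketch. The calibrated mapping from an image position to a 3-D point in probe coordinates is not spelled out in the source, so it is taken here as a caller-supplied function rather than guessed.

```python
def measurement_points(pattern_rows_cols, probe_position, to_probe_coords):
    """Combine the pattern-image positions detected in one frame with the
    probe/stage position recorded at the imaging timing to obtain points
    on the measured object.

    `to_probe_coords(row, col)` stands for the calibrated image-to-probe
    coordinate mapping (an assumption of this sketch).
    """
    px, py, pz = probe_position
    points = []
    for (row, col) in pattern_rows_cols:
        x, y, z = to_probe_coords(row, col)
        # Offset each probe-frame point by the probe position at capture time.
        points.append((x + px, y + py, z + pz))
    return points
```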
- the operation control unit 40 controls the operation of each part of the shape measuring apparatus 1 including the probe moving device 2, the optical probe 3, and the holding and rotating device 7.
- the operation control unit 40 controls the operation of the probe moving device 2, the optical probe 3, and the holding and rotating device 7 based on the operation control information created by the control unit 30. Further, the control unit 30 controls the image acquisition operation by the optical probe 3.
- the storage unit 31 is a storage device, such as a hard disk or a memory, that stores various programs and data.
- the storage unit 31 includes attention area position setting data 42, reference attention area position setting data 44, a condition table 46, a shape measurement program 48, and specification data 49.
- the storage unit 31 stores various programs and data used for controlling the operation of the shape measuring apparatus 1 in addition to these programs and data.
- the attention area position setting data 42 stores attention area information on the image data and information on the relative position between the optical probe 3 and the object M to be measured in association with each other.
- in the attention area position setting data 42, the attention area information set by the attention area setting unit 32 is written in association with the relative position information at the time of setting.
- the reference attention area position setting data 44 stores information on the reference attention area corresponding to the outer edge of the image.
- the reference attention area is information on an area where the position with respect to the image is fixed.
- the reference attention area is a single area, but it may be a plurality of areas.
- the condition table 46 stores conditions set by the control unit 30 and various conditions input in advance.
- the shape measurement program 48 stores a program for executing the processing of each unit of the control device 4. That is, by executing the program stored as the shape measurement program 48, the control device 4 sequentially performs the operations of each unit described above, controlling the apparatus so that an image of the pattern projected onto the object to be measured M is captured while the measurement target region is sequentially moved.
- the shape measurement program 48 includes both a program for measuring the workpiece M generated by the control unit 30 and a program for the control unit 30 to generate the program.
- the shape measurement program 48 may be stored in the storage unit 31 in advance, but is not limited to this.
- the shape measurement program 48 may be read from a storage medium on which it is stored and then stored in the storage unit 31, or may be acquired from outside by communication.
- the specification data 49 stores design data, CAD data, and condition data that can define the shape of the object M to be measured.
- the control device 4 controls the drive unit 10 of the probe moving device 2 and the rotation drive unit 72 of the holding rotation device 7 so that the relative position between the optical probe 3 and the object M to be measured has a predetermined positional relationship. Further, the control device 4 controls the dimming and the like of the optical probe 3 so that the projected line pattern on the object to be measured M is imaged with an optimum light amount.
- the control device 4 acquires the position information of the optical probe 3 from the position detection unit 11 of the probe moving device 2, and acquires data (captured image data) indicating an image obtained by capturing the measurement region from the optical probe 3.
- the control device 4 calculates and acquires shape information regarding the three-dimensional shape of the measurement target by associating the surface position of the measurement object M obtained from the captured image data with the corresponding position of the optical probe 3, the projection direction of the line light, and the imaging direction of the imaging device.
- the display device 5 is configured by, for example, a liquid crystal display device, an organic electroluminescence display device, or the like.
- the display device 5 displays measurement information related to the measurement by the shape measuring device 1.
- the measurement information includes, for example, image data captured by the image sensor 20, information indicating the position of the attention area (or measurement area) set in the imaging area of the image sensor 20, information indicating the position of the dimming area, and information indicating the position of the light control area settable range.
- the position information of the attention area (or measurement area), the light control area, and the light control area settable range is displayed superimposed on the image data captured by the image sensor 20.
- while the object to be measured M and the optical probe 3 are moved relative to each other, images are captured by the imaging device 9 at a predetermined imaging rate; accordingly, the pattern image also moves in accordance with the relative movement between the optical probe 3 and the object to be measured M.
- the attention area following this movement can also be displayed with a halftone frame or color display.
- the measurement information also includes setting information indicating settings related to measurement, progress information indicating the progress of the measurement, shape information indicating the result of the measurement, and the like.
- the display device 5 of the present embodiment is supplied with image data indicating measurement information from the control device 4 and displays an image indicating measurement information according to the image data.
- the input device 6 includes various input devices such as a keyboard, a mouse, a joystick, a trackball, and a touch pad.
- the input device 6 receives various information input to the control device 4.
- the various information includes, for example, command information indicating a command for causing the shape measuring device 1 to start measurement, setting information related to measurement by the shape measuring device 1, and operation information for manually operating at least a part of the shape measuring device 1.
- the control device 4 includes a control unit 30 and a storage unit 31.
- a display device 5 and an input device 6 are connected to the control device 4.
- the control device 4, the display device 5, and the input device 6 may be, for example, a computer connected to the shape measuring device 1, or a host computer provided in the building where the shape measuring device 1 is installed; they are not limited to the building in which the shape measuring device 1 is installed, and a computer located away from the shape measuring device 1 may be connected to it using a communication means such as the Internet.
- the control device 4, the display device 5, and the input device 6 may be held at different places. For example, apart from the computer including the input device 6 and the display device 5, the control device 4 may be supported inside the optical probe 3 of the shape measuring device 1; in this case, the information acquired by the shape measuring apparatus 1 is passed to the computer using a communication means.
- FIG. 5 is an explanatory diagram for explaining the measurement operation of the shape measuring apparatus according to the present embodiment.
- FIG. 6 is an explanatory diagram for explaining an example of a screen displayed on the shape measuring apparatus of the present embodiment.
- FIG. 7 is an explanatory diagram for explaining the measurement operation of the shape measuring apparatus according to the present embodiment.
- FIG. 8 is a flowchart illustrating an example of the attention area setting operation of the shape measuring apparatus according to the present embodiment.
- FIG. 9 and FIG. 10 are flowcharts each illustrating an example of the second region-of-interest generation determination process of the shape measuring apparatus according to the present embodiment.
- FIG. 11 is an explanatory diagram of an example of reference attention area data.
- FIG. 12 is an explanatory diagram for explaining an example of the attention area setting operation.
- FIG. 13 is a flowchart showing an example of the measuring operation of the shape measuring apparatus according to the present embodiment.
- the shape measuring apparatus 1 will be described for the case of measuring the shape of an object to be measured Ma on which a shape is repeatedly formed in the circumferential direction.
- the shape measuring apparatus 1 projects the illumination light beam L onto a tooth, which is one unit of the repetitive shape of the object to be measured Ma, and acquires an image of the pattern projected on the object to be measured Ma, thereby measuring the shape of the object to be measured Ma.
- the shape measuring apparatus 1 according to the present embodiment can measure the shape of one tooth by acquiring an image of the pattern projected on the measurement object Ma while moving the illumination light beam L along the direction of the tooth trace.
- the shape measuring apparatus 1 can measure the shape of the measurement object Ma by sequentially measuring the shape of the teeth of the measurement object Ma.
- the object to be measured Ma is a bevel gear in which teeth having substantially the same shape in design are formed at predetermined intervals in the circumferential direction.
- the measurement object Ma is a bevel gear, but the shape can be measured using objects of various shapes as the measurement object.
- the type of gear is not particularly limited.
- the shape measuring apparatus 1 can also take a spur gear, a helical gear, a worm gear, a pinion, a hypoid gear, and the like as the object to be measured Ma.
- the shape measuring apparatus 1 is not limited to measuring the shape of one tooth or the entire shape of the object to be measured Ma; it can also measure the shape of any single portion of the object to be measured Ma.
- the shape measuring apparatus 1 displays a screen 100 shown in FIG.
- the screen 100 is displayed in a mode for setting conditions for measuring the shape of the workpiece Ma, for example, a teaching mode.
- the screen 100 includes an image window 102 and windows 104 and 106.
- the image window 102 displays an image captured by the imaging device 9 and acquired.
- the image displayed in the image window 102 is an entire image within the imaging range by the imaging device 9.
- the image includes the outer shape 140 of the object to be measured Ma, the patterns 142a, 142b, and 142c formed by projecting the illumination light beam L onto the outer shape 140, and the bright line 144 generated by multiple reflection or irregular reflection of the illumination light beam L projected onto the outer shape 140.
- measurement ranges 150a and 150b, a dimming area settable range 152, and a dimming area 154 are displayed so as to overlap the image.
- the measurement ranges 150a and 150b and the dimming area settable range 152 are areas set by the measurement range setting unit 36.
- the dimming area 154 is an area set by the dimming area setting unit 37.
- the dimming area settable range 152 shown in FIG. 7 is a rectangular range that includes all of the measurement ranges 150a and 150b and circumscribes them.
- the dimming area settable range 152 is a rectangular range whose sides run along the longitudinal direction of the image of the line pattern formed when the illumination light beam L is projected onto a measurement object Ma having a flat, horizontal surface, and along the direction perpendicular to that longitudinal direction.
- the dimming area 154 is a range included in the dimming area settable range 152.
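The circumscribed rectangle containing all measurement ranges, used as the dimming area settable range (cf. range 152), can be computed as in this sketch. Rectangles are (x_min, y_min, x_max, y_max) tuples and the names are assumptions.

```python
def settable_range(measurement_ranges):
    """Return the smallest axis-aligned rectangle that includes and
    circumscribes every measurement range in the list."""
    xs0, ys0, xs1, ys1 = zip(*measurement_ranges)
    return (min(xs0), min(ys0), max(xs1), max(ys1))
```

Any dimming area is then constrained to lie inside the returned rectangle, as the description above requires.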
- the window 104 has a measurement condition column 112 and a button 114.
- the measurement condition column 112 displays information such as the acquisition method used when acquiring a shape by the light section method, the scanning speed of the illumination light beam L, the distance pitch that is the interval at which image data is acquired, the distance to be measured, and the relative movement distance in each direction between the optical probe 3 and the object to be measured Ma.
- the control device 4 updates the information displayed in the measurement condition column 112 when various conditions are updated.
- the button 114 is a button that is operated when a pre-scan process or a scan process (measurement process) described later is started. When the control unit 30 detects that the button 114 is operated by an input to the input device 6, the control unit 30 performs a pre-scan process or a scan process (measurement process).
- the window 106 has a check box 120 for selecting whether or not to display various profiles related to the displayed image data, a dimming area selection point 122, a measurement range selection point 124, a range column 126, and buttons 128 and 130.
- the check box 120 is a box for selecting whether or not to display a light intensity profile of an image and a histogram.
- the histogram is a distribution of the maximum luminance value for each column created by extracting the peak value in the width direction of the pattern on the line for each column of the image sensor.
- the control unit 30 displays the image light amount distribution or the histogram of the maximum luminance value distribution overlaid on the image window 102.
- when the dimming area selection point 122 is checked, the coordinate positions on the diagonal of the dimming area range are displayed in the range column 126.
- when the measurement range selection point 124 is checked, the measurement range, that is, the measurement area in which the image is acquired by the imaging device, can be set as a rectangle; the coordinate positions on the diagonal of the measurement area are displayed in the range column 126.
- the button 128 is a button operated when the automatic light control process is started.
- the button 130 is a button operated when setting a range and a region on the selected side of the dimming region selection point 122 and the measurement range selection point 124.
- the control unit 30 of the shape measuring apparatus 1 performs an arithmetic process using the central processing unit and sets an attention area used for measuring the shape of the measurement object.
- the control unit 30 of the shape measuring apparatus 1 executes the process shown in FIG.
- other processing described later is also realized by executing processing in each unit of the control unit 30 based on a program stored in the storage unit.
- the process shown in FIG. 8 is realized by executing the process by the attention area setting unit 32 of the control unit 30.
- the control unit 30 moves the optical probe 3 and the object to be measured relative to each other and, by executing the process shown in FIG. 8 for each position where the pattern image is projected, sets the attention area at each position used for measuring the shape of the object to be measured M.
- the control unit 30 acquires the image data captured by the imaging device 9 by the image data acquisition unit 80 (step S12). After acquiring the image data, the control unit 30 causes the projection pattern image detection unit 84 to read the attention area data stored in the attention area data 42 (step S14). The control unit 30 reads attention area data set based on an image taken before the optical probe 3 and the object to be measured are relatively moved. The control unit 30 reads the initially set attention area data at the start of processing. The initially set attention area data may be set by user input.
- the control unit 30 uses the projection pattern image detection unit 84 to detect the image of the line light in the attention area (step S16). After detecting the position of the line light image, the control unit 30 determines the position of the attention area (step S18) based on the detected position of the line light image and on information (scanning information) indicating how far the measurement target region will transition by the next time an image is captured by the imaging device 9. The transition distance is calculated based on the movement information of the optical probe input in the teaching mode and the rotation angle information of the holding and rotating device 7, and is converted from a distance on the object plane of the imaging device 9 into a transition distance on the imaging surface of the imaging device 9.
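The conversion of the transition distance from the object plane to the imaging surface might look like the following. Using a single optical magnification factor and the sensor's pixel pitch is an assumed simplification of the actual calibration, not a detail given by the source.

```python
def transition_on_image(object_distance_mm, magnification, pixel_pitch_mm):
    """Convert the distance the measurement target region will move on the
    object plane before the next imaging into a shift in pixels on the
    imaging surface of the imaging device."""
    return object_distance_mm * magnification / pixel_pitch_mm
```

For example, a 0.5 mm move at 0.2x magnification onto a 5 um pixel pitch corresponds to a 20-pixel shift of the attention area.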
- after determining the position of the attention area, the control unit 30 performs a second attention area generation determination process (step S20).
- the control unit 30 acquires the reference attention area data from the position setting data 44 of the reference attention area by using the reference attention area projection pattern image detection section 90 (step S40). After acquiring the reference attention area data, the control unit 30 uses the reference attention area projection pattern image detection section 90 to specify a maximal pixel search sequence in the reference attention area of the image data (step S42). The control unit 30 then specifies the pixel having the highest luminance in the maximal pixel search sequence (step S44) and determines whether the specified pixel is a pixel showing a maximum in that sequence (step S46).
- the reference attention area projection pattern image detection unit 90 detects the change in luminance value between the specified pixel and the pixels adjacent to it in the maximal pixel search sequence, and thereby detects whether the luminance value reaches a maximum at the specified pixel.
- when it is determined that the specified pixel shows a maximum (Yes in step S46), the control unit 30 extracts the specified pixel as a peak pixel (step S48). When it is determined that the specified pixel does not show a maximum (No in step S46), or when the process of step S48 has been performed, the control unit 30 causes the reference attention area projection pattern image detection unit 90 to determine whether the determination has been completed for the entire reference attention area (step S50).
- when it is determined that the determination has not been completed for the entire reference attention area (No in step S50), the control unit 30 moves the maximal pixel search sequence in a direction substantially orthogonal to it (step S52) and returns to step S44. That is, the position of the maximal pixel search sequence that serves as the reference for detecting peak pixels is moved from the top of the image data toward the bottom, or from the bottom toward the top, and the above processing is performed again.
- when it is determined that the determination has been completed for the entire reference attention area (Yes in step S50), the control unit 30 determines, by means of the new attention area generation determination unit 92, whether the number of peak pixels is equal to or greater than a threshold (step S54).
- when the number of peak pixels is determined to be equal to or greater than the threshold (Yes in step S54), the control unit 30 determines to generate the second attention area (step S56). When the number of peak pixels is determined to be less than the threshold (No in step S54), the control unit 30 determines not to generate the second attention area (step S58).
- The new attention area generation determination unit 92 further determines, based on the determined attention area information, whether an attention area has already been set for the detected line light image; if an attention area has already been set for that line light image, it determines that a second attention area is not to be generated.
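The peak-pixel search and generation decision of steps S40 to S58 can be sketched as follows (a minimal Python illustration; the array layout, function names, and threshold handling are assumptions for illustration, not the patented implementation):

```python
import numpy as np

def find_peak_pixels(region):
    """Collect peak pixels in a reference attention area (steps S42-S52).

    Each column of `region` plays the role of one maximum pixel search
    sequence; the sequence is moved sideways across the area (step S52).
    """
    peaks = []
    rows, cols = region.shape
    for c in range(cols):
        column = region[:, c]
        r = int(np.argmax(column))                   # brightest pixel (step S44)
        above = column[r - 1] if r > 0 else -np.inf
        below = column[r + 1] if r < rows - 1 else -np.inf
        if column[r] > above and column[r] > below:  # local maximum? (step S46)
            peaks.append((r, c))                     # extract as peak pixel (step S48)
    return peaks

def generate_by_peak_count(region, threshold):
    """Decide whether to generate the second attention area (steps S54-S58)."""
    return len(find_peak_pixels(region)) >= threshold
```

A column whose luminance is flat yields no peak, so empty sequences do not inflate the count.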
- Alternatively, the second attention area generation section 88 may detect the line light image with the reference attention area projection pattern image detection section 90, and the new attention area generation determination section 92 may determine, based on the length of the line light image, whether to generate the second attention area.
- A process in which the new attention area generation determination unit 92 determines whether to generate the second attention area based on the length of the line light image will be described with reference to FIG.
- The control unit 30 causes the reference attention area projection pattern image detection unit 90 to acquire the reference attention area data from the position setting data 44 of the reference attention area (step S40). After acquiring the reference attention area data, the control unit 30 causes the reference attention area projection pattern image detection unit 90 to detect the line light image in the reference attention area of the image data (step S62).
- the control unit 30 detects the length of the line light image (step S64).
- the length of the line light image is the length of the line light image in the longitudinal direction.
- Specifically, the reference attention area projection pattern image detection unit 90 detects, while scanning the image data sequentially from top to bottom or from bottom to top, over how many consecutive maximum pixel search sequences the line light image is detected.
- Even when there are several pixels in which the line light image is not detected, the reference attention area projection pattern image detection unit 90 may detect the length as if those pixels were connected.
- This is because the image of the line pattern acquired by the image sensor 20 is not necessarily continuous; part of it may be lost due to a defect on the surface of the object to be measured. For such cases, an exception process may be inserted that treats the image as continuously connected even when one, two, or a few pixels are missing.
- After detecting the length of the line light image, the control unit 30 causes the new attention area generation determination unit 92 to determine whether the length of the line light image is equal to or greater than a threshold (step S66). When it is determined that the length of the line light image is equal to or greater than the threshold (Yes in step S66), the control unit 30 determines to generate the second attention area (step S68). When it is determined that the length of the line light image is less than the threshold (No in step S66), the control unit 30 determines not to generate the second attention area (step S69).
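The length measurement with the gap-tolerant exception process (steps S62 to S69) can be sketched as follows (a minimal Python illustration; the flag representation, names, and default gap size are assumptions):

```python
def line_image_length(detected, max_gap=2):
    """Length of the line light image in its longitudinal direction
    (steps S62-S64).

    `detected` holds one flag per maximum pixel search sequence, scanned
    top-to-bottom, saying whether the line image was found there. Gaps of
    up to `max_gap` missing pixels are bridged, mirroring the exception
    process for images broken by surface defects.
    """
    best = run = gap = 0
    for hit in detected:
        if hit:
            run = run + gap + 1 if run else 1  # bridge a tolerated gap
            gap = 0
        else:
            gap += 1
            if gap > max_gap:                  # gap too wide: image is broken here
                best = max(best, run)
                run = gap = 0
    return max(best, run)

def generate_by_length(detected, threshold, max_gap=2):
    """Decide whether to generate the second attention area (steps S66-S69)."""
    return line_image_length(detected, max_gap) >= threshold
```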
- Since the second attention area generation unit 88 of the control unit 30 can generate the second attention area based on the length of the line light image, it can generate a new attention area based on the line light image in the reference attention area and set an attention area from which the line light image included in the image data is extracted.
- The control section 30 causes the second attention area determination section 94 to determine the second attention area (step S24). The second attention area is determined based on the reference attention area, but it may instead be determined based on the position of the line light image included in the reference attention area.
- Next, the control section 30 causes the attention area cancellation determination section 89 to perform the attention area cancellation determination process (step S26).
- The attention area cancellation determination unit 89 determines whether to cancel a set attention area based on the position or size of that attention area.
- For example, when the pattern present in the image data is estimated, based on the movement distance, to move outside the field of view at the next imaging by the imaging device 9, the attention area cancellation determination unit 89 determines to cancel the attention area set around that pattern.
- The attention area cancellation determination unit 89 also determines to cancel the attention area when the area of the attention area set on the image data is smaller than a threshold area.
- When the area of the attention area on the image data is used as a criterion for determining cancellation, range information such as an effective shooting range may be set on the image data, and the determination may be made based on the area of the region that lies within both the effective shooting range and the attention area.
- Since the image in the peripheral region of the image data is easily affected by the various aberrations of the imaging optical system, the intensity distribution in the short direction of the pattern image is degraded there, which affects measurement accuracy. It is therefore possible to avoid such a peripheral region and set the effective shooting range only in the central region of the image data.
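The cancellation tests above can be sketched as follows (a minimal Python illustration; the rectangle convention and names are assumptions, and the predicted movement stands in for the estimate based on the next imaging):

```python
def should_cancel(area, field, move, min_area):
    """Attention area cancellation determination (step S26), simplified.

    Rectangles are (x, y, w, h); `move` is the expected shift of the
    pattern on the image before the next imaging by the imaging device 9.
    Cancel when the shifted attention area leaves the effective shooting
    range, or when its overlap with that range is smaller than `min_area`.
    """
    x, y, w, h = area
    x, y = x + move[0], y + move[1]                   # predicted position at next frame
    fx, fy, fw, fh = field                            # effective shooting range
    ox = max(0, min(x + w, fx + fw) - max(x, fx))     # overlap width
    oy = max(0, min(y + h, fy + fh) - max(y, fy))     # overlap height
    return ox * oy < min_area                         # out of view or too small
```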
- When cancellation is determined, the control section 30 causes the attention area determination section 86 to cancel the target attention area (step S30).
- The control section 30 then causes the attention area determination section 86 to store the determined and generated attention areas in the attention area data 42 (step S32).
- The control unit 30 stores the relative position between the optical probe 3 and the object to be measured, acquired by the movement information acquisition unit 82, in the attention area data 42 in association with the attention area.
- FIG. 12 shows a case where the line light image moves upward in the image data.
- the shape measuring apparatus 1 has data in which the reference region of interest 402 is set for the frame 400 that overlaps the entire image data shown in FIG. 11 as the position setting data 44 of the reference region of interest.
- the shape measuring apparatus 1 will be described with respect to a case where the attention area is not set in the initial setting.
- the control unit 30 acquires the image data F (N-1) as shown in step ST101.
- the control unit 30 superimposes the reference attention area data 402 on the acquired image data F (N ⁇ 1), and detects an image of the line light included in the reference attention area 402.
- the second region of interest is not generated.
- After processing the image data F (N−1) in step ST101, the control unit 30 acquires the image data F (N) as the next frame, as shown in step ST102.
- the image data F (N) is an image acquired at a relative position in which the relative position between the optical probe 3 and the object M to be measured is moved by a predetermined distance from the relative position captured by the image data F (N-1).
- As shown in step ST102, the control unit 30 superimposes the reference attention area 402 on the acquired image data F (N) and detects the line light image 410 included in the reference attention area 402.
- When the control unit 30 detects the line light image 410 included in the reference attention area 402 and detects that the line light image 410 satisfies the condition, the control unit 30 generates an attention area (second attention area) 412 with reference to the position of the line light image 410, as shown in step ST103.
- the generated data of the attention area 412 is stored in the attention area data 42.
- After setting the attention area for the image data F (N) in step ST103, the control unit 30 acquires the image data F (N+1) as the next frame, as shown in step ST104. Because the relative position between the optical probe 3 and the object M to be measured has moved, the line light image 410a included in the image data F (N+1) is at a position closer to the edge of the attention area 412 than the line light image 410 was. As shown in step ST105, the control unit 30 detects the line light image 410a included in the reference attention area 402; however, since this line light image is a target that can already be detected in the attention area 412, the control section 30 does not generate a second attention area. As shown in step ST106, the control unit 30 generates the attention area 412a with reference to the position of the line light image 410a.
- The control unit 30 performs the processing from step ST104 to step ST106 every time image data of the next frame is acquired.
- the line light image moves to the upper side of the screen as the frame advances.
- the control unit 30 performs the processing from step ST104 to step ST106 each time image data of the next frame is acquired, thereby moving the position of the attention area to the upper side of the screen in accordance with the movement of the line light image.
- Even when the control unit 30 acquires the image data F (N+m), advanced by (m−1) frames from the image data F (N+1) of step ST106, the control unit 30 detects the line light image based on the attention area information, as shown in step ST107, and generates the attention area 412b with reference to the detected line light image 410b. In parallel, the control unit 30 also detects line light images included in the reference attention area 402.
- Also for the image data F (N+l−1), further advanced from the image data F (N+m) of step ST107, the control unit 30 detects the line light image based on the attention area information of the previous frame, as shown in step ST108, and generates the attention area 412c with reference to the detected line light image 410c. In the image data F (N+l−1), another line light image 420 is present below the line light image 410c on the screen.
- As shown in step ST109, the control unit 30 detects the line light image based on the attention area information of the previous frame and generates the attention area 412d with reference to the detected line light image 410d.
- Further, since the reference attention area 402 includes another line light image 420a below the line light image 410c on the screen, an attention area (second attention area) 422 is generated with reference to the position of the line light image 420a. Thereby, in the case of the image data F (N+l), two attention areas 412d and 422 are set in the imaging area of the imaging element or in the image data acquired from the imaging element.
- By setting the attention area based on the position of the line light image in this way, the shape measuring apparatus 1 can set a region including the line light image as the attention area. That is, the shape measuring apparatus 1 can set an attention area that moves in response to the movement of the position of the line light image on the image data.
- Further, since the shape measuring apparatus 1 uses as the attention area a region that extends a set distance from the line light image, the line light image remains within the attention area even when its position deviates during actual measurement.
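The frame-by-frame behaviour of steps ST101 to ST109 can be sketched as a one-dimensional tracking loop (a simplified Python illustration; the band representation, helper names, and half-height are assumptions, not the patented implementation):

```python
def track_attention_areas(frames, reference_band, half_height=15):
    """Per-frame attention area update (steps ST101-ST109, simplified).

    `frames` yields, per frame, the row positions of the line light
    images detected inside the reference attention area (limited here to
    `reference_band`). Each attention area is a (top, bottom) band
    re-centred on the image it tracks; a line image that no existing
    area covers spawns a second attention area.
    """
    areas = []                      # no attention area in the initial setting
    history = []
    for rows in frames:
        rows = [r for r in rows if reference_band[0] <= r <= reference_band[1]]
        # move each area to follow the line image it already covers
        areas = [(r - half_height, r + half_height) for r in rows
                 if any(top <= r <= bottom for top, bottom in areas)]
        # a line image not covered by any area -> new (second) attention area
        for r in rows:
            if not any(top <= r <= bottom for top, bottom in areas):
                areas.append((r - half_height, r + half_height))
        history.append(sorted(areas))
    return history
```

With a single line image moving up the screen and a second image entering from below, the loop ends with two attention areas, matching the two areas 412d and 422 of the example.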
- the control unit 30 of the shape measuring apparatus 1 displays the screen 100 shown in FIG. 6 and executes the processing shown in FIG.
- The control unit 30 presets the measurement range, the dimming region, and the dimming region settable range to the entire range that can be imaged by the imaging device 9.
- The image data is transferred from the imaging device 9 to the control unit 30 and displayed on the window 102.
- When a range smaller than the entire range that can be captured by the imaging device 9, that is, a range smaller than the entire range of the acquired image data, is designated, the control unit 30 sets it as the measurement range.
- The measurement range can be specified by enclosing part of the image data in an arbitrary shape such as a circle, rectangle, or ellipse on the window 102.
- the control unit 30 sets the range specified by the user as the measurement range. Further, the control unit 30 may set a measurement range based on information set in the condition table 46. Further, the control unit 30 may extract an image of a pattern onto which an illumination light beam is projected from an image acquired by the imaging device 9 and set a range in which the image of the pattern is extracted as a measurement range.
- The measurement range setting unit 36 performs the measurement range setting process based on the position coordinates of the image data and writes the result to the condition table 46 of the storage unit 31 (step S114).
- the control unit 30 sets a range smaller than the range that can be captured by the imaging device 9 as the measurement range.
- When the setting that uses the attention area as the measurement range is selected, the control unit 30 sets the attention area as the measurement range.
- The control unit 30 then determines whether the dimming area is set by the input device 6 (step S116).
- When it is determined that the dimming area is set (Yes in step S116), the control unit 30 performs the dimming area setting process by the dimming area setting unit 37 (step S118). When the setting that uses the attention area as the dimming area is selected, the control unit 30 sets the attention area as the dimming area.
- the control unit 30 determines whether to perform pre-scanning (step S120).
- In the pre-scan, the optical probe 3 and the object to be measured are relatively moved based on the set conditions, the position where the pattern image is projected is moved, and the projected line light image is displayed on the display device 5. While the object to be measured is relatively moved, image data including the line pattern image acquired by the imaging device 9 at a predetermined frame rate is displayed one after another.
- the control unit 30 executes the pre-scan process (Step S122).
- The control unit 30 then determines whether the setting is completed (step S124). When it is determined that the setting has not been completed (No in step S124), the control unit 30 returns to step S112 and executes the above-described process again.
- The control unit 30 generates the shape measurement program after the measurement range position information and the dimming range position information stored in the storage unit 31 and the measurement conditions such as the scanning path of the optical probe 3 have been set (step S128).
- the control unit 30 generates a shape measurement program for measuring the object to be measured Ma including the measurement range, the light control region, the light control method, and the measurement coordinate calculation region, and stores the shape measurement program in the storage unit 31.
- Based on various conditions, the control unit 30 determines the measurement path and measurement speed, the movement path in the XYZ-axis directions by the probe moving device 2, the rotation speed in the Zθ direction by the holding and rotating device 7, and the operation of the optical probe 3, and generates a shape measurement program including information on the determined operations, information on the set dimming conditions, and information on the measurement range for extracting the position of the pattern image from the acquired image.
- the control unit 30 determines whether to perform measurement after generating the shape measurement program (step S130).
- In the measurement, the optical probe 3 and the object to be measured Ma are relatively moved based on the set conditions, the position where the pattern image is projected is moved, and the shape is measured by obtaining the coordinate values (point group data) of each part of the object onto which the pattern image is projected within the measurement region.
- The control unit 30 repeats imaging by the imaging device 9 at a predetermined frame rate.
- From the captured image data, the dimming control unit 38 detects the luminance value of the pixel having the maximum value in each maximum pixel search sequence included in the dimming range, detects the largest luminance value among them (the value of the brightest pixel) as the maximum pixel value, and outputs dimming control information to the light source device 8 and the imaging device 9 (step S131). Next, based on the dimming conditions, the imaging device 9 captures an image of the pattern projected on the object Ma to be measured and sends the image data at that time to the measuring unit 39 of the control unit 30 (step S132).
- The measurement unit 39 obtains the position of the pattern image from the image data based on the measurement range information and calculates the three-dimensional coordinate value of the portion of the object to be measured Ma onto which the pattern is projected from the position information of the probe moving device 2 and the position information of the pattern image (step S133).
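The brightest-pixel search of step S131 can be sketched as follows (a minimal Python illustration; the array layout and names are assumptions):

```python
import numpy as np

def max_pixel_value(image, dimming_range):
    """Step S131, simplified: within the dimming range (top, bottom,
    left, right), take the maximum of each column (each maximum pixel
    search sequence) and return the largest of those maxima as the
    maximum pixel value used for dimming control."""
    top, bottom, left, right = dimming_range
    roi = image[top:bottom, left:right]
    per_sequence = roi.max(axis=0)     # brightest pixel of each sequence
    return float(per_sequence.max())   # value of the brightest pixel overall
```

The dimming control unit would compare this value with a target level to adjust the light source intensity or the exposure; that comparison is not shown here.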
- the control unit 30 of the present embodiment can extract the line light image with high accuracy by setting the attention area and setting the attention area as the measurement range.
- Since the measurement range can be set on the basis of the line light image, inclusion of the bright line 144 in the measurement range can be suppressed. As a result, the line light image (pattern image) can be extracted more reliably.
- By setting the attention area and using it as the dimming area, the control unit 30 of the present embodiment can reduce the possibility that light other than the line light enters the dimming area. Thereby, dimming conditions under which the line light is easy to detect can be set.
- As described above, the control unit 30 of the present embodiment can automatically extract, based on the image data, an attention area whose position changes with the position of the line light image. Accordingly, a measurement range and a dimming region whose positions move can be set without confirming the image data and setting them manually for each relative position between the optical probe 3 and the object M to be measured.
- Since the control unit 30 enables the pre-scan process to be performed, the variation of the image within the measurement range and the dimming area while the object to be measured M and the illumination light beam L move relative to each other can be confirmed visually. Thereby, the measurement range and the dimming area can be set appropriately.
- The attention area setting unit 32 of the above embodiment sets a set region around the line light image as the attention area on the basis of the position of the line light image, but the attention area setting method is not limited to this.
- FIG. 14 is a flowchart showing an example of the attention area setting operation of the shape measuring apparatus of the present embodiment.
- the attention area setting operation shown in FIG. 14 is the same as the attention area setting operation shown in FIG. 8 except for some processes.
- the same processes as those of the attention area setting operation shown in FIG. 8 are denoted by the same steps, and detailed description thereof is omitted. This series of setting operations is also performed at each photographing timing.
- the control unit 30 acquires the image data captured by the imaging device 9 by the image data acquisition unit 80 (step S12). After acquiring the image data, the control unit 30 causes the projection pattern image detection unit 84 to read the attention area data stored in the attention area data 42 (step S14). Next, the control unit 30 causes the projection pattern image detection unit 84 to detect an image of line light in the attention area (step S16).
- After detecting the position of the line light image, the control unit 30 causes the attention area determination unit 86 to detect the distance between the edge of the attention area and the detected line light image (step S140).
- the attention area determination unit 86 detects the distance between the edge of the attention area and the line light image.
- the distance between the edge of the region of interest and the line light image may be detected at a plurality of set representative points or at all positions.
- the distance between the edge of the region of interest and the line light image is the distance in the direction along the maximum pixel search sequence.
- After detecting the distance between the edge of the attention area and the detected line light image, the control section 30 causes the attention area determination section 86 to determine whether the distance is within a threshold (step S141). When the attention area determination section 86 determines that the distance is within the threshold (Yes in step S141), that is, when the line light image is close to the edge of the attention area, the control section 30 moves the position of the attention area based on the position of the line light image (step S142). When the attention area determination section 86 determines that the distance is not within the threshold (No in step S141), that is, when the line light image is not close to the edge of the attention area, the control section 30 maintains the position of the attention area (step S144).
- the attention area determination unit 86 moves the attention area in a direction in which the line light image approaches the center of the attention area.
- That is, the control section 30 moves the position of the attention area based on the position of the line light image.
- After performing the processing of step S142 or step S144 and thus determining the position of the attention area, the control unit 30 performs the second attention area generation determination process (step S20).
- the subsequent processing is equivalent to the processing shown in FIG.
- the shape measuring apparatus 1 can simplify the process of setting the attention area by maintaining the shape of the attention area.
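The edge-distance check of steps S140 to S144 can be sketched in one dimension (a Python illustration; the parameter names and the re-centring rule are assumptions for illustration):

```python
def update_area_position(area_top, area_height, line_row, margin):
    """Measure the distance along the maximum pixel search sequence from
    the line light image to each edge of the attention area (step S140);
    when the image comes within `margin` of an edge (step S141), slide
    the area, keeping its shape, so the image returns to the centre
    (step S142); otherwise keep the position (step S144)."""
    dist_top = line_row - area_top
    dist_bottom = (area_top + area_height) - line_row
    if min(dist_top, dist_bottom) <= margin:   # line image close to an edge
        return line_row - area_height // 2     # re-centre the area
    return area_top                            # position maintained
```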
- FIG. 15 is a block diagram illustrating a schematic configuration of the attention area setting unit and the storage unit of the control device.
- FIG. 15 shows an attention area setting unit 32A and a storage unit 31A of the control unit 30A of the control device 4A.
- the storage unit 31A includes a condition table 46, a shape measurement program 48, attention area template data 182, read address data 184, and image data 186.
- The storage unit 31A stores various programs and data used for controlling the operation of the shape measuring apparatus 1 in addition to these programs and data.
- The attention area template data 182 can be designated in advance by the user and defines an area set corresponding to the field of view that can be imaged by the imaging device 9. The position at which this attention area template data is superimposed on that field of view can be moved.
- The read address data 184 stores the relationship between the position in the image data at which the attention area template data 182 is superimposed and the relative position between the optical probe 3 and the object M to be measured.
- the image data 186 stores image data acquired by the image data acquisition unit 80 and captured by the imaging device 9.
- the attention area setting unit 32A includes an attention area setting section 34A, a distance measurement area setting section 178, and a superimposed image generation section 179.
- The attention area setting section 34A includes an image data acquisition section 80, a movement information acquisition section 82, a projection pattern image detection section 84, a detection attention area generation section 172, a distance measurement section 174, and an attention area position determination section 176.
- The detection attention area generation section 172 generates a detection attention area used for setting the attention area, based on the attention area template data 182 and the read address data 184.
- The detection attention area is attention area information set based on the image data acquired immediately before the movement to the target relative position between the optical probe 3 and the object M to be measured.
- The distance measurement section 174 measures the distance between the line light image included in the detection attention area and the edge of the detection attention area.
- the distance between the detection region of interest and the line light image may be detected for a plurality of set representative points or at all positions.
- the distance between the edge of the region of interest for detection and the image of the line light is the distance in the direction along the maximum pixel search sequence.
- The distance in the direction along the maximum pixel search sequence between the edge of the detection attention area and the line light image may be the average of the distances at a plurality of representative points, or the average measured over the entire attention area in the direction substantially orthogonal to the maximum pixel search sequence.
- the attention area position determination unit 176 determines the position where the attention area template data is arranged with respect to the image data.
- the attention area position determination unit 176 stores the determined position information in the read address data 184 in association with the relative positions of the optical probe 3 and the object M to be measured.
- the distance measurement area setting unit 178 sets a position for measuring the distance between the detection attention area and the line light image.
- The superimposed image generation unit 179 generates an image in which the line light images extracted from each of the plurality of image data, acquired while the optical probe 3 and the object M to be measured are relatively moved, are superimposed.
- The superimposed image generation unit 179 extracts each line light image by extracting only the attention area portion of the image data based on the attention area information.
- FIG. 16 is a flowchart showing an example of the attention area setting operation of the shape measuring apparatus of the present embodiment.
- FIG. 17 is an explanatory diagram of an example of attention area template data.
- 18 and 19 are explanatory diagrams for explaining an example of the attention area setting operation.
- The process illustrated in FIG. 16 is realized by executing the process in the attention area setting unit 32A of the control unit 30A. The control unit 30A relatively moves the optical probe 3 and the object to be measured M and, by executing the process shown in FIG. 16 for each position where the pattern image is projected, sets the attention area at each position used for measuring the shape of the object.
- The control unit 30A acquires the image data captured by the imaging device 9 by the image data acquisition section 80 (step S212).
- the control unit 30A acquires the attention region template data 182 by using the detection attention region generation unit 172 (step S214), and acquires the read address data 184 (step S216).
- A detection attention area is then generated from the attention area template data (step S218).
- the detection attention area is an attention area in which a position to be overlapped with the image data is specified.
- the shape measuring apparatus 1 has an attention area template 500 shown in FIG. 17 as attention area template data 182.
- the attention area template 500 is data indicating the positions of the two attention areas 502 and 504.
- the attention area template data 182 stores the attention area template 500 in, for example, a lookup table.
- the control unit 30A determines the position of the attention area template 500 to be overlaid on the image data Fa based on the address information.
- the control unit 30 shifts the image data Fa by a distance of ⁇ 1 in the ⁇ direction and a distance of ⁇ 1 in the ⁇ direction.
- A detection attention area having the attention areas 502 and 504 is generated by superimposing the attention area template 500 at the determined position.
- After generating the detection attention area, the control unit 30A detects the line light image in the detection attention area by the projection pattern image detection section 84 (step S220). After detecting the position of the line light image, the control unit 30A detects the distance in the direction along the maximum pixel search sequence between the edge of the attention area and the line light image (step S222).
- Here, the attention area is the detection attention area or an attention area obtained by changing the position of the detection attention area.
- The control unit 30A detects the distance between the line light image and the edge of the attention area in the direction along the maximum pixel search sequence at six points: points P1, P2, and P3 of the attention area 502 and points P4, P5, and P6 of the attention area 504.
- The distance in the direction along the maximum pixel search sequence between the edge of the attention area and the line light image may be the average of the distances at a plurality of representative points, or the average measured over the entire attention area in the direction substantially orthogonal to the maximum pixel search sequence.
- Next, the control unit 30A causes the attention area position determination unit 176 to determine whether the detected distance is within a threshold range (step S224). When the control unit 30A determines that the distance is not within the threshold (No in step S224), the attention area position determination unit 176 changes the address of the attention area (step S226), generates an attention area whose position with respect to the image data is changed by using the address and the attention area template data (step S228), and returns to step S222.
- When it is determined that the distance is within the threshold (Yes in step S224), the control unit 30A causes the attention area position determination unit 176 to store the address of the attention area in the read address data 184. In addition, the control unit 30A associates the relative position between the optical probe 3 and the object to be measured acquired by the movement information acquisition unit 82 with the address of the attention area and stores them in the read address data 184.
- By setting the address position for the image data acquired while the relative position is moving, that is, for the image data of each frame, the control device 30A can move the attention area along the arrow 510 in the moving direction of the line light image, as shown in FIG. Further, the control device 30A preferably switches the correspondence between the attention areas and the line light images at a predetermined timing so that the line light image detected in the attention area 504 comes to be detected in the attention area 502. Thereby, the line light images can be detected efficiently in a plurality of attention areas.
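The address-adjustment loop of steps S222 to S228 can be sketched in one dimension (a Python illustration; the unit step size, names, and termination guard are assumptions):

```python
def settle_template_address(edge_offset, line_row, address, target, tol,
                            max_iter=500):
    """The attention area template is overlaid at `address`, so the edge
    tracked at a representative point sits at address + edge_offset.
    The address is changed (step S226) and the attention area regenerated
    (step S228) until the edge-to-line distance (step S222) is within
    `tol` of `target` (step S224); the settled address is what would be
    stored in the read address data 184."""
    for _ in range(max_iter):
        distance = line_row - (address + edge_offset)   # step S222
        if abs(distance - target) <= tol:               # step S224
            return address
        address += 1 if distance > target else -1       # steps S226/S228
    raise RuntimeError("attention area address did not settle")
```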
- the control device 30A can set the relationship between the relative movement of the optical probe 3 and the object to be measured M and the moving direction of the line light image on the image data. This makes it easier to find the line light image when detecting it based on the region of interest.
- the control device 30A displays the image captured by the imaging device 9 together with the screen for setting the moving direction.
- the image data Fc including the line light image 610 illustrated in FIG. 20
- the image data Fd including the line light image 610a illustrated in FIG.
- the image data Fc is image data of the frame at the start of measurement.
- the image data Fd is image data of the frame at the end of measurement.
- the control device 30A preferably displays both the image data Fc and the image data Fd.
- the control device 30A displays the image generated by the superimposed image generation unit 179 on the screen for setting the moving direction.
- the control device 30A adds the line light images of the frames between the image data Fc and the image data Fd, and displays image data 600 in which the line light images are superimposed.
- the image data 600 includes both a line light image 610 of the image data Fc and a line light image 610a of the image data Fd.
- by displaying the image generated by the superimposed image generation unit 179, the control device 30A can display an image in which the direction of the arrow 630 is easily recognized.
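The superimposition performed by the superimposed image generation unit 179 can be illustrated as a per-pixel maximum over frames. This is a sketch only: the actual compositing rule used by the unit is not specified in the source, and a per-pixel maximum is an assumption chosen so that every frame's line remains visible.

```python
def superimpose_frames(frames):
    """Combine the line light images of several frames into one image by
    taking the per-pixel maximum, so each frame's line stays visible."""
    if not frames:
        raise ValueError("no frames given")
    height, width = len(frames[0]), len(frames[0][0])
    out = [[0] * width for _ in range(height)]
    for frame in frames:
        for y in range(height):
            for x in range(width):
                if frame[y][x] > out[y][x]:
                    out[y][x] = frame[y][x]
    return out
```

With two frames whose bright line sits in different positions, the result contains both lines, which is what makes the overall movement direction (arrow 630) easy to see.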
- as in the above-described embodiment, the shape measuring apparatus 1 preferably detects the position of the line light image based on the image data, including the line light image, acquired by the imaging device 9, and sets the attention area based on that position.
- however, the shape measuring apparatus 1 may set the region of interest without using image data.
- for example, the shape measuring apparatus 1 may set the region of interest based on position information obtained when the line light is projected onto the object to be measured. As an example, the shape measuring apparatus 1 may perform a simulation based on the specification data 49 and various conditions of the apparatus, estimate the position of the line light image, and set the region of interest based on the estimated position of the line light image.
- the shape measuring apparatus 1 may detect the moving direction of the line light image and move the position of the region of interest on the screen data based on the moving direction.
- the movement amount may be constant or may vary according to a set rule.
- the shape measuring apparatus 1 may detect the moving direction based on the relative movement between the optical probe 3 and the object to be measured M, and move the attention area accordingly.
- the present invention is not limited to this; a system in which the line pattern is formed by scanning may be used, in which case the longitudinal direction of the line pattern corresponds to the scanning direction of the deflection scanning mirror.
- FIG. 23 is a schematic diagram showing a configuration of a system having a shape measuring apparatus.
- the shape measuring system 300 includes a shape measuring device 1, a plurality (two in the figure) of shape measuring devices 1 a, and a program creation device 302.
- the shape measuring devices 1 and 1a and the program creating device 302 are connected by a wired or wireless communication line.
- the shape measuring apparatus 1a has the same configuration as the shape measuring apparatus 1 except that the attention area setting unit 32 is not provided.
- the program creation device 302 creates the various settings and programs that are otherwise created by the control device 4 of the shape measuring device 1 described above.
- the program creation device 302 creates a shape measurement program together with information on the measurement range, the dimming-area settable range, and the dimming area.
- the program creation device 302 outputs the created program and data to the shape measuring devices 1 and 1a.
- the shape measuring device 1a acquires area and range information and a shape measuring program from the shape measuring device 1 and the program creation device 302, and performs processing using the acquired data and program.
- the shape measurement system 300 can make effective use of the created data and programs by having the shape measuring device 1a execute measurement using, as its measurement program, the data and programs created by the shape measuring device 1 and the program creation device 302.
- the shape measuring apparatus 1a can perform the measurement without being provided with the attention area setting unit 32 or the other units used for making these settings.
- FIG. 24 is a block diagram of the structure manufacturing system.
- the structure manufacturing system 200 of this embodiment includes the shape measuring device 201 described in the above embodiment, a design device 202, a forming device 203, a control device (inspection device) 204, and a repair device 205.
- the control device 204 includes a coordinate storage unit 210 and an inspection unit 211.
- the design device 202 creates design information related to the shape of the structure, and transmits the created design information to the molding device 203.
- the design apparatus 202 stores the created design information in the coordinate storage unit 210 of the control apparatus 204.
- the design information includes information indicating the coordinates of each position of the structure.
- the forming apparatus 203 creates the above structure based on the design information input from the design apparatus 202.
- the molding performed by the molding apparatus 203 includes, for example, casting, forging, and cutting.
- the shape measuring device 201 measures the coordinates of the created structure (measurement object) and transmits information (shape information) indicating the measured coordinates to the control device 204.
- the coordinate storage unit 210 of the control device 204 stores design information.
- the inspection unit 211 of the control device 204 reads design information from the coordinate storage unit 210.
- the inspection unit 211 compares information (shape information) indicating coordinates received from the shape measuring apparatus 201 with design information read from the coordinate storage unit 210.
- the inspection unit 211 determines whether or not the structure has been molded according to the design information based on the comparison result. In other words, the inspection unit 211 determines whether or not the created structure is a non-defective product.
- when the structure has not been molded according to the design information, the inspection unit 211 determines whether or not the structure can be repaired. When the structure can be repaired, the inspection unit 211 calculates the defective portion and the repair amount based on the comparison result, and transmits information indicating the defective portion and information indicating the repair amount to the repair device 205.
- the repair device 205 processes the defective portion of the structure based on the information indicating the defective portion received from the control device 204 and the information indicating the repair amount.
- FIG. 25 is a flowchart showing the flow of processing by the structure manufacturing system.
- the design device 202 creates design information related to the shape of the structure (step S301).
- the molding apparatus 203 creates the structure based on the design information (step S302).
- the shape measuring apparatus 201 measures the shape of the created structure (step S303).
- the inspection unit 211 of the control device 204 inspects whether or not the structure has been created according to the design information by comparing the shape information obtained by the shape measuring device 201 with the design information (step S304).
- the inspection unit 211 of the control device 204 determines whether or not the created structure is a good product (step S305).
- when the inspection unit 211 determines that the created structure is a non-defective product (Yes in step S305), the structure manufacturing system 200 ends the process. If the inspection unit 211 determines that the created structure is not a non-defective product (No in step S305), the inspection unit 211 determines whether the created structure can be repaired (step S306).
- when the inspection unit 211 determines that the created structure can be repaired (Yes in step S306), the repair device 205 reworks the structure (step S307), and the process returns to step S303.
- when the inspection unit 211 determines that the created structure cannot be repaired (No in step S306), the structure manufacturing system 200 ends the process of the flowchart shown in FIG. 25.
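The flow of FIG. 25 (steps S301 to S307) can be sketched as a control loop. The predicates below are placeholders standing in for the inspection unit 211's comparison of shape information against design information; the bounded rework count is an assumption added so the sketch always terminates.

```python
def manufacture(design_info, mold, measure, is_good, is_repairable, repair,
                max_rework=3):
    """Structure manufacturing flow of FIG. 25: mold (S302), measure (S303),
    inspect (S304/S305), and rework (S306/S307) until good or unrepairable."""
    structure = mold(design_info)                  # step S302
    for _ in range(max_rework):
        shape = measure(structure)                 # step S303
        if is_good(shape, design_info):            # steps S304-S305
            return structure, "good"
        if not is_repairable(shape, design_info):  # step S306
            return structure, "unrepairable"
        structure = repair(structure)              # step S307, back to S303
    return structure, "unrepairable"
```

With a toy numeric model (target 10, first molding at 7, each rework adding 2, tolerance 1, repair limit 5), the loop reworks once and then accepts the structure.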
- because the shape measuring apparatus 201 of the above embodiment can measure the coordinates of the structure with high accuracy, the structure manufacturing system 200 of the present embodiment can determine whether or not the created structure is a good product. In addition, when the structure is not a good product, the structure manufacturing system 200 can repair it by reworking the structure.
- the repair process executed by the repair device 205 in the present embodiment may be replaced with a process in which the molding apparatus 203 re-executes the molding process. In that case, the molding apparatus 203 re-executes the molding.
- the shape measuring apparatus 1 in the above embodiment has exemplified a configuration in which the holding member 55 holds the optical probe 3 in a cantilever manner, but the configuration is not limited to this, and the optical probe may be held at both ends. Holding it at both ends reduces the deformation generated in the holding member 55 during rotation and improves the measurement accuracy.
- line light is projected from the optical probe 3 as the illumination light beam L, and the line pattern reflected from the object to be measured is imaged.
- the type of the optical probe 3 is not limited to this.
- the illumination light emitted from the optical probe 3 may be projected onto a predetermined plane all at once.
- the method described in US Pat. No. 6,075,605 may be used.
- the illumination light emitted from the optical probe may project a point-like spot light.
- the shape measuring apparatus can be preferably used for measuring an object to be measured that, as in the above embodiment, has a concavo-convex shape repeated in the circumferential direction and extending in a direction different from the circumferential direction.
- by setting a measurement range, a dimming-area settable range, and a dimming area for one of the repeated shapes, the shape measuring apparatus can use the set conditions for measuring the other repeated shapes.
- the object to be measured is not limited to a shape that is repeated in the circumferential direction and has a concavo-convex shape extending in a direction different from the circumferential direction; it may have various shapes, including shapes without repetition.
Abstract
Description
FIG. 1 is a diagram showing the external appearance of the shape measuring apparatus 1 according to the present embodiment. FIG. 2 is a schematic diagram showing the schematic configuration of the shape measuring apparatus of the present embodiment. FIG. 3 is a block diagram showing the schematic configuration of the control device of the shape measuring apparatus of the present embodiment.
In the shape measuring apparatus according to the present embodiment, setting a region of interest in the image data acquired by the imaging element 20 has the effect of reducing erroneous recognition of such multiple-reflection images and other false images as the image of the illumination light beam L.
The attention area setting unit 32 receives data from the imaging device 9 and receives, from the operation control unit 40, information on the operation of each unit, specifically information on the relative position between the optical probe 3 and the object to be measured M. The attention area setting unit 32 acquires, from the attention area position setting data 42 in the storage unit 31, position information indicating at which position the attention area is to be set within the area that can be imaged by the imaging device 9 (or the field-of-view range that can be acquired by the imaging element 20). The attention area setting unit 32 acquires reference attention area data from the reference attention area position setting data 44. The attention area setting unit 32 outputs position information of the attention area set within the imaging range by a user or the like to the attention area position setting data 42 in the storage unit 31.
The measurement range setting unit 36 sets the measurement range based on the attention area position setting data 42 or on an instruction input from the input device 6. The measurement range setting unit 36 outputs the set measurement range to the measurement unit 39.
The dimming area setting unit 37 sets the dimming-area settable range and the dimming area based on the attention area position setting data 42 or on an instruction input from the input device 6. The dimming area setting unit 37 outputs the set dimming area to the dimming control unit 38.
The dimming control unit 38 receives the dimming area from the dimming area setting unit 37. Based on the information of the image data in the dimming area, the dimming control unit 38 determines dimming conditions, for example the operating conditions of the light source device 8 or the imaging device 9 when acquiring image data. The dimming control unit 38 outputs the determined dimming conditions to the light source device 8 or the imaging device 9.
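The decision made by the dimming control unit 38 can be sketched as a simple exposure adjustment driven by the brightness inside the dimming area. The target brightness level, the proportional-gain rule, and the exposure limits below are illustrative assumptions, not values from the source.

```python
def next_exposure(pixels_in_dimming_area, current_exposure,
                  target=128, min_exp=0.1, max_exp=10.0):
    """Scale the exposure so that the mean brightness inside the dimming
    area approaches the target level (the role of dimming control unit 38)."""
    mean = sum(pixels_in_dimming_area) / len(pixels_in_dimming_area)
    if mean == 0:
        return max_exp  # area is completely dark: open up as far as allowed
    exposure = current_exposure * target / mean
    # Clamp to the operating range of the light source / imaging device.
    return max(min_exp, min(max_exp, exposure))
```

For instance, if the dimming area reads a mean brightness of 64 at an exposure of 1.0, the next exposure doubles to 2.0 to reach the assumed target of 128.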
The measurement unit 39 receives the measurement range set by the measurement range setting unit 36. The measurement unit 39 receives the image data acquired by the imaging device 9. The measurement unit 39 receives, from the operation control unit 40, information on the operation of each unit, specifically information on the relative position between the optical probe 3 and the object to be measured M. For the relative position between the optical probe 3 and the object to be measured M at which the image data was acquired, the measurement unit 39 detects the image of the line-shaped pattern (in this embodiment also referred to as the line light image) included in the measurement range of the image data, and measures the outer shape of the object to be measured M based on the pattern image. The operation control unit 40 controls the operation of the probe moving device 2, the optical probe 3, and the holding and rotating device 7. The operation control unit 40 outputs operation-control information to the attention area setting unit 32 and the measurement unit 39.
The reference attention area projected pattern image detection unit 90 has a function as a reference attention area setting unit that determines the pixel range of the imaging element 22 and the reference attention area on the image data based on the reference attention area position setting data 44.
When the pattern image is a line light image having a line-shaped intensity distribution, the reference attention area projected pattern image detection unit 90 measures the length of the line light image within the reference attention area of the image data.
The length of the line-shaped pattern image can be measured, for example, by the following method. When the longitudinal direction of the line-shaped pattern is taken as the vertical direction of the image data, the image is examined sequentially, from top to bottom or from bottom to top, along a direction substantially orthogonal to the line-shaped pattern image (hereinafter referred to as the maximum-pixel search column), to detect whether there is a pixel at which the change in luminance value shows a maximum. By evaluating, when the image data is examined sequentially from top to bottom or from bottom to top in this way, how far the pixels detected in the respective maximum-pixel search columns exist continuously, the length can be measured. The maximum-pixel search column lies in the direction along the epipolar line. The direction in which the maximum-pixel search column extends is the detection direction. The existence state of the length of the line-shaped pattern image is thus determined by evaluating over what length the maximum pixels exist continuously.
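The length measurement described above can be sketched as follows: scan each column (a stand-in for the maximum-pixel search column along the epipolar direction), find the peak luminance, and count the longest run of consecutive columns whose peak exceeds a threshold. The luminance threshold and the horizontal orientation of the line are assumptions made for illustration.

```python
def line_image_length(image, threshold):
    """Measure the length of a line-shaped pattern image whose longitudinal
    direction is horizontal: for each column (maximum-pixel search column),
    take the peak luminance scanned top to bottom, then return the longest
    run of consecutive columns whose peak is at or above the threshold."""
    n_cols = len(image[0])
    best = run = 0
    for x in range(n_cols):
        peak = max(row[x] for row in image)  # scan the column top to bottom
        run = run + 1 if peak >= threshold else 0
        best = max(best, run)
    return best
```

On a 3x5 image whose bright line spans the three middle columns, the measured length is 3; comparing this length against a set value is the kind of judgment described for the reference attention area.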
Since it is necessary to search for the pixel positions at which the detected brightness shows a maximum, the pixel array direction of the imaging element 20 may be made to substantially coincide with the direction of the epipolar line.
The measurement information also includes setting information indicating settings related to the measurement, progress information indicating the progress of the measurement, shape information indicating the result of the measurement, and the like. The display device 5 of this embodiment is supplied with image data indicating the measurement information from the control device 4 and displays an image indicating the measurement information in accordance with this image data.
Although the area of the attention region on the image data is used as the criterion for deciding whether to cancel the attention region, the criterion is not limited to this. For example, range information such as an effective imaging range may be set on the image data, and the decision may be made based on the area of the region where the effective imaging range and the attention region overlap. In particular, since the image in the peripheral region of the image data is easily affected by the various aberrations of the imaging optical system, the intensity distribution in the transverse direction of the pattern image is degraded by those aberrations. Because this affects the measurement accuracy, the effective imaging range may be set only in the central region of the image data, avoiding such peripheral regions.
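The overlap-area criterion mentioned above can be sketched as an axis-aligned rectangle intersection. Representing both regions as (x, y, width, height) tuples is an assumption for the sketch; the source does not specify how the regions are parameterized.

```python
def overlap_area(rect_a, rect_b):
    """Area of the region where two axis-aligned rectangles (x, y, w, h)
    overlap, e.g. the effective imaging range and the attention region."""
    ax, ay, aw, ah = rect_a
    bx, by, bw, bh = rect_b
    w = min(ax + aw, bx + bw) - max(ax, bx)
    h = min(ay + ah, by + bh) - max(ay, by)
    return w * h if w > 0 and h > 0 else 0

def keep_attention_region(attention_rect, effective_rect, min_area):
    """Cancel the attention region when too little of it lies inside the
    effective imaging range (the alternative criterion described above)."""
    return overlap_area(attention_rect, effective_rect) >= min_area
```

An attention region that has drifted into the peripheral band overlaps the central effective range only slightly, so its overlap area falls below `min_area` and it is cancelled.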
When the control unit 30 determines via the attention area determination unit 86 that the distance is within the threshold (Yes in step S141), that is, when the line light image is close to the edge of the attention area, the control unit 30 moves the position of the attention area based on the position of the line light image.
2 Probe moving device
3 Optical probe
4 Control device
5 Display device
6 Input device
7 Holding and rotating device
8 Light source device
9 Imaging device
10 Drive unit
11 Position detection unit
12 Light source
13 Illumination optical system
20 Imaging element
20a Light-receiving surface
21 Imaging optical system
21a Object plane
30 Control unit
31 Storage unit
32, 32A Attention area setting unit
34 Attention area setting section
36 Measurement range setting unit
37 Dimming area setting unit
38 Dimming control unit
39 Measurement unit
40 Operation control unit
42 Attention area position setting data
44 Reference attention area position setting data
46 Condition table
48 Shape measurement program
49 Specification data
50X, 50Y, 50Z Moving units
51X, 51Y, 51Z Guide units
52 Holder
53 First rotating unit
53a Rotation axis
54 Second rotating unit
55 Holding member
55A First holding part
55B Second holding part
62, 63, 64, 65, 66, 68 Arrows
71 Table
72 Rotation drive unit
73a, 73b Reference spheres
80 Image data acquisition unit
82 Movement information acquisition unit
84 Projected pattern image detection unit
86 Attention area determination unit
88 Second attention area generation unit
89 Attention area cancellation determination unit
90 Reference attention area projected pattern image detection unit
92 New attention area generation determination unit
94 Second attention area determination unit
100 Screen
102, 102a, 102b, 102c Image windows
104, 106 Windows
112 Measurement condition field
114, 128, 130 Buttons
120 Check box
122 Dimming area selection point
124 Measurement range selection point (point cloud extraction range)
126 Range field
140 Object to be measured
142a, 142b, 142c Line lights
144 Bright line
150a, 150b Measurement areas
152 Dimming-area settable range
154 Dimming area
172 Detection attention area generation unit
174 Distance measurement unit
176 Attention area position determination unit
178 Distance measurement area setting unit
179 Superimposed image generation unit
182 Attention area template data
184 Read address data
186 Image data
200 Structure manufacturing system
201 Shape measuring device
202 Design device
203 Molding device
204 Control device
205 Repair device
210 Coordinate storage unit
211 Inspection unit
300 Shape measuring system
302 Program creation device
AX Rotation axis center
B Base
M Object to be measured
L Illumination light beam
Claims (29)
- a projection unit that projects a pattern onto a measurement object;
an imaging unit that images the measurement object onto which the pattern has been projected by the projection unit;
a moving unit capable of moving the projection position of the pattern on the measurement object by moving the projection unit and the measurement object relative to each other; and
an attention area setting unit that sets an attention area for acquiring information used for measurement of the measurement object in at least a part of the area imaged by the imaging unit so as to include the image of the pattern;
a shape measuring apparatus comprising the above. - The shape measuring apparatus according to claim 1, wherein the attention area setting unit sets the attention area based on an image that includes the image of the pattern and is acquired by the imaging unit.
- The shape measuring apparatus according to claim 1, wherein the attention area setting unit sets the attention area based on position information at the time when the pattern is projected onto the measurement object.
- The shape measuring apparatus according to any one of claims 1 to 3, wherein
the attention area setting unit sets the attention area based on movement information including relative movement conditions between the projection unit and the measurement object. - The shape measuring apparatus according to claim 4, wherein
the movement information includes a moving direction of the projection position of the pattern with respect to the measurement object, and
the attention area setting unit sets the attention area based on the moving direction. - The shape measuring apparatus according to claim 4 or 5, wherein
the attention area setting unit sets the attention area based on the image of the pattern before the relative movement and on a movement distance of the pattern image on the imaging surface of the imaging unit calculated from the relative movement conditions. - The shape measuring apparatus according to claim 6,
further comprising a reference attention area setting unit that sets, in at least a part of the area imaged by the imaging unit, a reference attention area for acquiring information used for measurement of the measurement object, without being based on the movement of the projection position of the pattern. - The shape measuring apparatus according to claim 7, wherein
the attention area setting unit sets the attention area based on the reference attention area and the movement information. - The shape measuring apparatus according to claim 7 or 8, wherein
the attention area setting unit includes a second attention area generation unit that newly generates a second attention area at the position where the reference attention area was set. - The shape measuring apparatus according to claim 9, wherein
the second attention area generation unit includes a new attention area generation determination unit that determines whether to newly generate the second attention area based on a feature amount of the image formed inside the reference attention area. - The shape measuring apparatus according to claim 10, wherein
the projection unit projects line light by projecting, as the pattern projected onto the measurement object, a light beam having a line-shaped light intensity distribution, and
the new attention area generation determination unit detects, for the line light pattern image within the reference attention area, a pixel having a peak value detected along the detection direction of the line light pattern image at each position in the direction intersecting the detection direction, and performs control so as to newly generate the attention area when the number of the peak pixels is equal to or greater than a set value. - The shape measuring apparatus according to claim 10, wherein
the new attention area generation determination unit performs control so as to newly generate the attention area when the length of the line light pattern image within the reference attention area is equal to or greater than a set length. - The shape measuring apparatus according to claim 9, wherein
the attention area setting unit includes an attention area cancellation determination unit that determines whether to cancel the second attention area based on a feature of the image formed inside the second attention area. - The shape measuring apparatus according to claim 4, wherein
the attention area setting unit further resets the attention area based on a movement distance or movement direction of the pattern image caused by the relative movement of the projection position of the pattern. - The shape measuring apparatus according to claim 14, wherein
the attention area setting unit calculates the movement distance or movement direction of the pattern image from a plurality of images taken before and after the relative movement. - The shape measuring apparatus according to claim 14, wherein
the attention area setting unit moves the attention area based on the distance from the outer edge of the initial attention area to the image of the pattern. - The shape measuring apparatus according to claim 16,
further comprising a distance measurement area setting unit that sets a distance measurement area specifying the location at which the distance from the outer edge of the initial attention area to the image of the pattern is measured. - The shape measuring apparatus according to claim 14,
further comprising a superimposed image generation unit that generates an image in which the pattern images of a plurality of image data acquired by imaging during the relative movement are superimposed,
wherein the attention area setting unit calculates the movement distance or movement direction of the pattern image based on the image in which the pattern images generated by the superimposed image generation unit are superimposed. - The shape measuring apparatus according to claim 14, wherein
the attention area setting unit calculates the movement distance or movement direction of the pattern image based on the relative movement direction of the projection unit with respect to the measurement object obtained from the moving unit and on specification data of the measurement object. - The shape measuring apparatus according to any one of claims 1 to 19,
further comprising a probe having a housing that holds the projection unit and the imaging unit such that the projection direction of the line light of the projection unit and the imaging direction of the imaging unit differ from each other. - The shape measuring apparatus according to any one of claims 1 to 20, wherein
the attention area is a point cloud generation area, and
the apparatus further comprises a measurement unit that measures the shape of the measurement object based on the position of the pattern image located within the point cloud generation area of the image data. - The shape measuring apparatus according to any one of claims 1 to 21, wherein
the attention area is a dimming area, and
the apparatus further comprises a dimming control unit that controls, according to the brightness of the captured image detected in the dimming area of the image data, the amount of light projected from the projection unit, the amount of light received by the imaging unit, the exposure amount when the imaging unit acquires the image data, or the input-output characteristics of the imaging unit. - A projection unit that projects a pattern onto a measurement object;
an imaging unit that images the measurement object onto which the pattern has been projected by the projection unit;
a moving unit capable of moving the projection position of the pattern on the measurement object by moving the projection unit and the measurement object relative to each other; and
an attention area generation unit that enables setting of a reference attention area for detecting the existence state of the image of the pattern projected onto the measurement object and captured by the imaging unit, and that generates, according to the existence state of the pattern image within the reference attention area, an attention area defining an area for acquiring information used for the measurement;
a shape measuring apparatus comprising the above. - The shape measuring apparatus according to claim 23, further comprising an attention area setting unit that changes the attention area generated by the attention area generation unit based on the movement of the projection position of the pattern on the measurement object.
- A molding apparatus that molds a structure based on design information relating to the shape of the structure;
the shape measuring apparatus according to any one of claims 1 to 24, which measures the shape of the structure molded by the molding apparatus; and
a control device that compares shape information indicating the shape of the structure measured by the shape measuring apparatus with the design information; a structure manufacturing system comprising the above. - A shape measuring method for projecting a pattern onto a measurement object, capturing the image of the pattern projected onto the measurement object from a direction different from the projection direction of the pattern to acquire image data, and measuring the shape of the measurement object based on the image of the pattern in the image data, the method comprising:
moving the projection position of the pattern relative to the measurement object;
setting an attention area for acquiring information used for measurement of the measurement object in at least a part of the area imaged by the imaging unit so as to include the image of the pattern; and
measuring the shape of the measurement object based on the position of the pattern image located within the attention area of the image data. - A structure manufacturing method comprising:
molding a structure based on design information relating to the shape of the structure;
measuring the shape of the molded structure by the shape measuring method according to claim 26; and
comparing shape information indicating the measured shape of the structure with the design information. - A shape measurement program for projecting a pattern onto a measurement object, capturing the image of the pattern projected onto the measurement object from a direction different from the projection direction of the pattern to acquire image data, and measuring the shape of the measurement object based on the image of the pattern in the image data, the program causing a computer to execute:
moving the projection position of the pattern relative to the measurement object;
setting an attention area for acquiring information used for measurement of the measurement object in at least a part of the area imaged by the imaging unit so as to include the image of the pattern; and
measuring the shape of the measurement object based on the position of the pattern image located within the attention area of the image data. - A computer-readable recording medium on which the shape measurement program according to claim 28 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/065751 WO2015189985A1 (ja) | 2014-06-13 | 2014-06-13 | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium |
JP2016527593A JP6350657B2 (ja) | 2014-06-13 | 2014-06-13 | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium |
EP14894726.0A EP3156763B1 (en) | 2014-06-13 | 2014-06-13 | Shape measurement device |
US15/318,409 US10482592B2 (en) | 2014-06-13 | 2014-06-13 | Shape measuring device, structured object manufacturing system, shape measuring method, structured object manufacturing method, shape measuring program, and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/065751 WO2015189985A1 (ja) | 2014-06-13 | 2014-06-13 | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015189985A1 true WO2015189985A1 (ja) | 2015-12-17 |
Family
ID=54833113
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/065751 WO2015189985A1 (ja) | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium | 2014-06-13 | 2014-06-13 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10482592B2 (ja) |
EP (1) | EP3156763B1 (ja) |
JP (1) | JP6350657B2 (ja) |
WO (1) | WO2015189985A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108204791A (zh) * | 2017-12-30 | 2018-06-26 | Beijing University of Technology | Six-axis line-laser gear measuring device |
JP2018146365A (ja) * | 2017-03-03 | 2018-09-20 | Ricoh Elemex Corporation | Image inspection device |
CN111795657A (zh) * | 2020-07-16 | 2020-10-20 | 南京大量数控科技有限公司 | Device and method for rapidly measuring the flatness of flexible sheet material |
US20210072019A1 (en) * | 2018-02-27 | 2021-03-11 | Nikon Corporation | Image analysis device, analyis device, shape measurement device, image analysis method, measurement condition determination method, shape measurement method, and program |
CN112525078A (zh) * | 2019-09-18 | 2021-03-19 | Industrial Technology Research Institute | Three-dimensional measuring device and operating method thereof |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10824315B2 (en) * | 2015-05-29 | 2020-11-03 | Canon Medical Systems Corporation | Medical image processing apparatus, magnetic resonance imaging apparatus and medical image processing method |
TWI653563B (zh) * | 2016-05-24 | 2019-03-11 | Compal Electronics, Inc. | Image selection method for projection touch |
DE102018004592A1 (de) * | 2017-06-20 | 2018-12-20 | Mitutoyo Corporation | Measuring apparatus for three-dimensional geometry and measuring method for three-dimensional geometry |
US20190012782A1 (en) * | 2017-07-05 | 2019-01-10 | Integrated Vision Systems LLC | Optical inspection apparatus and method |
JP6962145B2 (ja) * | 2017-11-13 | 2021-11-05 | Fujitsu Limited | Image processing program, image processing method, and information processing device |
US10692236B2 (en) * | 2017-12-22 | 2020-06-23 | Symbol Technologies, Llc | Container use estimation |
KR101921021B1 (ko) * | 2018-04-06 | 2018-11-21 | (주)이즈미디어 | Rotary camera module inspection apparatus |
US11040452B2 (en) * | 2018-05-29 | 2021-06-22 | Abb Schweiz Ag | Depth sensing robotic hand-eye camera using structured light |
JP6606234B1 (ja) * | 2018-07-13 | 2019-11-13 | DMG Mori Co., Ltd. | Measuring device |
CN109238132B (zh) * | 2018-09-12 | 2020-03-27 | Beijing University of Technology | Simulation method for a double-base-circle disc involute template measuring optical system based on heterodyne interferometry |
JP7219034B2 (ja) * | 2018-09-14 | 2023-02-07 | Mitutoyo Corporation | Three-dimensional shape measuring device and three-dimensional shape measuring method |
US11475571B2 (en) * | 2019-03-13 | 2022-10-18 | Canon Kabushiki Kaisha | Apparatus, image processing apparatus, and control method |
US11545114B2 (en) * | 2020-07-14 | 2023-01-03 | Qualcomm Incorporated | Methods and apparatus for data content integrity |
DE102021102122B4 (de) * | 2021-01-29 | 2023-12-28 | Klingelnberg GmbH. | Method and device for measuring a gear toothing |
US11663761B2 (en) * | 2021-08-25 | 2023-05-30 | Sap Se | Hand-drawn diagram recognition using visual arrow-relation detection |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63149507A (ja) * | 1986-12-13 | 1988-06-22 | Kobe Steel Ltd | Automatic work line detection method |
WO1991010111A1 (en) * | 1989-12-28 | 1991-07-11 | Kabushiki Kaisha Toyota Chuo Kenkyusho | Apparatus for measuring three-dimensional coordinate |
WO2001057471A1 (fr) * | 2000-01-31 | 2001-08-09 | Omron Corporation | Displacement visual sensor |
JP2005241570A (ja) * | 2004-02-27 | 2005-09-08 | Sunx Ltd | Surface shape detector |
JP2013234854A (ja) * | 2012-05-02 | 2013-11-21 | Nikon Corp | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, and program therefor |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6858826B2 (en) * | 1996-10-25 | 2005-02-22 | Waveworx Inc. | Method and apparatus for scanning three-dimensional objects |
JP4111592B2 (ja) * | 1998-06-18 | 2008-07-02 | Konica Minolta Sensing, Inc. | Three-dimensional input device |
JP2001289614A (ja) | 2000-01-31 | 2001-10-19 | Omron Corp | Displacement sensor |
JP4282643B2 (ja) * | 2005-08-30 | 2009-06-24 | Daihatsu Motor Co., Ltd. | Strain evaluation device and strain evaluation method |
JP2008096123A (ja) | 2006-10-05 | 2008-04-24 | Keyence Corp | Optical displacement meter, optical displacement measuring method, optical displacement measuring program, computer-readable recording medium, and device on which the program is recorded |
US8531650B2 (en) * | 2008-07-08 | 2013-09-10 | Chiaro Technologies LLC | Multiple channel locating |
DE102010064593A1 (de) * | 2009-05-21 | 2015-07-30 | Koh Young Technology Inc. | Formmessgerät und -verfahren |
DE102010029319B4 (de) * | 2009-05-27 | 2015-07-02 | Koh Young Technology Inc. | Vorrichtung zur Messung einer dreidimensionalen Form und Verfahren dazu |
JP5425601B2 (ja) * | 2009-12-03 | 2014-02-26 | Hitachi High-Technologies Corporation | Charged particle beam device and image quality improvement method therefor |
US9179106B2 (en) * | 2009-12-28 | 2015-11-03 | Canon Kabushiki Kaisha | Measurement system, image correction method, and computer program |
WO2011085225A1 (en) * | 2010-01-08 | 2011-07-14 | Wake Forest University Health Sciences | Delivery system |
WO2012133926A2 (en) | 2011-04-01 | 2012-10-04 | Nikon Corporation | Profile measuring apparatus, method for measuring profile, and method for manufacturing structure |
US20140157579A1 (en) * | 2012-12-08 | 2014-06-12 | 8 Tree, Llc | Networked marketplace for custom 3D fabrication |
US9717461B2 (en) * | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9041914B2 (en) * | 2013-03-15 | 2015-05-26 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
- 2014-06-13 US US15/318,409 patent/US10482592B2/en active Active
- 2014-06-13 WO PCT/JP2014/065751 patent/WO2015189985A1/ja active Application Filing
- 2014-06-13 EP EP14894726.0A patent/EP3156763B1/en active Active
- 2014-06-13 JP JP2016527593A patent/JP6350657B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63149507A (ja) * | 1986-12-13 | 1988-06-22 | Kobe Steel Ltd | Automatic work line detection method |
WO1991010111A1 (en) * | 1989-12-28 | 1991-07-11 | Kabushiki Kaisha Toyota Chuo Kenkyusho | Apparatus for measuring three-dimensional coordinate |
WO2001057471A1 (fr) * | 2000-01-31 | 2001-08-09 | Omron Corporation | Displacement visual sensor |
JP2005241570A (ja) * | 2004-02-27 | 2005-09-08 | Sunx Ltd | Surface shape detector |
JP2013234854A (ja) * | 2012-05-02 | 2013-11-21 | Nikon Corp | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, and program therefor |
Non-Patent Citations (1)
Title |
---|
See also references of EP3156763A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018146365A (ja) * | 2017-03-03 | 2018-09-20 | Ricoh Elemex Corporation | Image inspection device |
CN108204791A (zh) * | 2017-12-30 | 2018-06-26 | Beijing University of Technology | Six-axis line-laser gear measuring device |
US20210072019A1 (en) * | 2018-02-27 | 2021-03-11 | Nikon Corporation | Image analysis device, analyis device, shape measurement device, image analysis method, measurement condition determination method, shape measurement method, and program |
CN112525078A (zh) * | 2019-09-18 | 2021-03-19 | Industrial Technology Research Institute | Three-dimensional measuring device and operating method thereof |
CN111795657A (zh) * | 2020-07-16 | 2020-10-20 | 南京大量数控科技有限公司 | Device and method for rapidly measuring the flatness of flexible sheet material |
Also Published As
Publication number | Publication date |
---|---|
US20170132784A1 (en) | 2017-05-11 |
JPWO2015189985A1 (ja) | 2017-04-20 |
EP3156763B1 (en) | 2019-02-06 |
EP3156763A4 (en) | 2018-01-03 |
JP6350657B2 (ja) | 2018-07-04 |
US10482592B2 (en) | 2019-11-19 |
EP3156763A1 (en) | 2017-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6350657B2 (ja) | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium | |
JP5436431B2 (ja) | Method and apparatus for detecting unevenness of a test object | |
JP6184289B2 (ja) | Three-dimensional image processing device, three-dimensional image processing method, three-dimensional image processing program, computer-readable recording medium, and device on which the program is recorded | |
RU2762619C2 (ru) | Graphic overlay for measuring the dimensions of features using a video inspection device | |
JP5109598B2 (ja) | Article inspection method | |
JP2012045610A (ja) | Device and method for determining the shape of a bead end portion | |
US20160054119A1 (en) | Shape measurement device, structure production system, shape measurement method, structure production method, and shape measurement program | |
JP2013064644A (ja) | Shape measuring device, shape measuring method, structure manufacturing system, and structure manufacturing method | |
JP2005121486A (ja) | Three-dimensional measuring device | |
JP2014145735A (ja) | Shape measuring device, structure manufacturing system, evaluation device, shape measuring method, structure manufacturing method, and shape measuring program | |
JP6248510B2 (ja) | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium | |
JP5170622B2 (ja) | Shape measuring method, program, and shape measuring device | |
JP2014126381A (ja) | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, and shape measuring program | |
JP2023176008A (ja) | Image analysis device, analysis device, shape measurement device, image analysis method, measurement condition determination method, shape measurement method, and program | |
JP6887141B2 (ja) | Surface shape measuring method and surface shape measuring device | |
US20170069110A1 (en) | Shape measuring method | |
JP7491664B2 (ja) | Shape measuring device, structure manufacturing system, shape measuring method, and structure manufacturing method | |
JP2018141810A (ja) | Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and recording medium | |
JP2012093234A (ja) | Three-dimensional shape measuring device, three-dimensional shape measuring method, structure manufacturing method, and structure manufacturing system | |
JP7001947B2 (ja) | Surface shape measuring method | |
JP5786999B2 (ja) | Three-dimensional shape measuring device and calibration method for the three-dimensional shape measuring device | |
JP6270264B2 (ja) | Information processing device, information processing method, program, measuring device, and measuring method | |
JP6820516B2 (ja) | Surface shape measuring method | |
US20020159073A1 (en) | Range-image-based method and system for automatic sensor planning | |
JP2009069063A (ja) | Measuring method, shape measuring method, measuring device, and shape measuring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14894726 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2016527593 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
REEP | Request for entry into the european phase |
Ref document number: 2014894726 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 15318409 Country of ref document: US Ref document number: 2014894726 Country of ref document: EP |