US20190012782A1 - Optical inspection apparatus and method - Google Patents
- Publication number
- US20190012782A1 (U.S. application Ser. No. 15/660,600)
- Authority
- US
- United States
- Prior art keywords
- optical
- component
- data
- optical sensor
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/0006—Industrial image inspection using a design-rule based approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
Definitions
- This disclosure relates to computerized optical inspection systems.
- Optical inspection systems are employed to determine whether a workpiece, such as a manufactured object, is within design tolerance.
- An optical inspection system typically includes a camera or other light sensor.
- An optical inspection system includes a first optical sensor, a second optical sensor, and a drive system.
- The drive system supports the first and second optical sensors and is configured to selectively move the first optical sensor and second optical sensor with respect to each other. Accordingly, the drive system is capable of moving the first and second optical sensors independently of one another.
- The ability to move the first and second optical sensors independently of one another facilitates the inspection of differently configured parts with a single inspection apparatus, thereby providing cost savings compared to purchasing or building multiple custom inspection fixtures.
- A corresponding method is also provided.
- FIG. 1 is a schematic, perspective view of an optical inspection system having a processor, a drive system, a platform, and first and second optical sensors;
- FIG. 2 is a flow chart depicting the control logic of the processor of FIG. 1;
- FIG. 3 is a schematic top view of the inspection system of FIG. 1 with a first component on the platform, the first optical sensor in a first position, and the second optical sensor in a second position;
- FIG. 4 is a schematic top view of the optical inspection system of FIG. 1 with a second component on the platform, the first optical sensor in a third position, and the second optical sensor in a fourth position;
- FIG. 5 is a flow chart depicting a method of using the optical inspection system of FIG. 1;
- FIG. 6 is a schematic top view of the drive system of FIG. 1;
- FIG. 7 is a schematic top view of the drive system of FIG. 6 in a different configuration;
- FIG. 8 is a schematic depiction of a portion of an alternative optical inspection system having processing distributed between multiple processors;
- FIG. 9 is a schematic, cross-sectional side view of an actuator and mechanism for use with the optical inspection system of FIG. 1 to selectively move the drive system and first and second optical sensors vertically; and
- FIG. 10 is a schematic side view of one of the optical sensors of FIG. 1 with an actuator attached thereto to selectively rotate the optical sensor in an alternative embodiment.
- Referring to FIG. 1, an optical inspection system 10 is schematically depicted. The optical inspection system 10 includes a support structure 12 that supports a drive system 14 above an inspection platform 18.
- The system 10 further includes a first optical sensor 22 and a second optical sensor 26 operatively connected to the drive system 14 and suspended above the inspection platform 18.
- In one embodiment, the optical sensors 22, 26 are cameras.
- The drive system 14 is configured to selectively move the first and second optical sensors 22, 26 with respect to the platform 18 and with respect to each other.
- A processor 30 is operatively connected to, and configured to control, the drive system 14 and optical sensors 22, 26.
- The system 10 also includes an input device 34 through which a user of the system 10 may instruct the processor 30 or otherwise input information to the processor 30.
- An output device 38 is operatively connected to the processor 30 to receive signals from the processor 30 and generate a user-perceptible indicator in response thereto.
- The system 10 also includes a data storage medium 42 that is operatively connected to the processor 30 such that data stored on the medium 42 is selectively retrievable by the processor 30, i.e., the processor 30 can selectively obtain data stored on the medium 42.
- The optical inspection system 10 is configured to inspect a plurality of different components having different sizes and/or geometries.
- The data storage medium 42 stores data for a first component, i.e., stored first component data file 46.
- The data storage medium 42 also stores data for a second component, i.e., stored second component data file 50.
- One example of a first component is shown at 56 in FIG. 3; one example of a second component is shown at 110 in FIG. 4.
- Referring to FIG. 2, a method of operation of the optical inspection system 10 is schematically depicted.
- The method of operation depicted in FIG. 2 represents exemplary control logic used by the processor 30.
- The processor 30 is programmed and configured to carry out the steps shown and described in FIG. 2.
- Referring to FIG. 3, a first component 56 is disposed on the upper surface of the platform 18 for inspection.
- With reference to FIGS. 1-3, the method includes, at step 52, receiving a signal 54 from the input device 34 indicating which of a plurality of different components a user (not shown) desires to be inspected by the system 10.
- A screen may display the options available for the user to select.
- Signals may take any form within the scope of the claimed invention, including, but not limited to, electronic, wireless, etc., and may be digital or analog.
- The processor 30 determines which of the data files 46, 50 to retrieve from the storage medium 42 based on the signal 54 from the input device 34. More specifically, if the signal 54 indicates that the first component 56 is selected by the user, then the processor 30 proceeds to step 62. At step 62, the processor 30 obtains the stored first component data file 46 from the data storage medium 42.
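The selection logic of steps 52 and 62 can be sketched as a simple dispatch from the input-device signal to the stored data file. This is a minimal, hypothetical sketch: the dictionary, function name, and string identifiers are illustrative stand-ins, not anything specified in the patent.

```python
# Hypothetical sketch of the FIG. 2 selection logic: map the user's
# component selection (signal 54) to the stored data file to retrieve
# (file 46 for the first component, file 50 for the second).
COMPONENT_DATA_FILES = {
    "first_component": "data_file_46",
    "second_component": "data_file_50",
}

def select_data_file(signal: str) -> str:
    """Return the identifier of the stored data file for the component
    named by the input-device signal; reject unknown selections."""
    try:
        return COMPONENT_DATA_FILES[signal]
    except KeyError as exc:
        raise ValueError(f"unknown component selection: {signal!r}") from exc
```

In practice the retrieved value would be the contents of the stored data file (positions and image files) rather than a name, but the branching structure is the same.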
- The stored first component data file 46 includes data describing a first position 66, data describing a second position 68, a first image file 70, and a second image file 72.
- The processor 30 then proceeds to step 76.
- At step 76, the processor 30 controls the drive system 14 to cause the movement of the first optical sensor 22 to the first position, as shown in FIG. 3.
- The processor 30 then proceeds to step 78, at which the processor 30 controls the drive system 14 to cause the movement of the second optical sensor 26 to the second position, as shown in FIG. 3.
- The first and second positions are predetermined vantage points at which the optical sensors 22, 26 will capture images ("image data sets" or "sets of optical data") of respective portions of the first component 56 within their respective fields of view.
- The processor 30 causes the first optical sensor 22 to obtain a first image data set at step 82; the processor 30 then causes the second optical sensor 26 to obtain a second image data set at step 86.
- The first image file 70 includes data describing the design geometry of the portion 90 of the first component 56 to be sensed by the first optical sensor 22 in the first position.
- The second image file 72 includes data describing the design geometry of the portion 94 of the first component 56 to be sensed by the second optical sensor 26 in the second position.
- At step 98, the processor 30 compares the first image data set obtained at step 82 to the stored first image file 70 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion 90 of the first component 56 captured by the first optical sensor 22 is within design specification.
- At step 102, the processor 30 compares the second image data set obtained at step 86 to the stored second image file 72 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion 94 of the first component 56 captured by the second optical sensor 26 is within design specification.
- That is, the processor 30 determines whether the first image data set is within a predetermined amount of variance (i.e., within design tolerance) from the first image file 70. Similarly, the processor 30 determines whether the second image data set is within a predetermined amount of variance (i.e., within design tolerance) from the second image file 72.
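The patent leaves the comparison metric unspecified ("in a manner understood by those skilled in the art"), so the variance check can only be sketched under an assumption. The sketch below uses mean absolute difference between the captured data and the stored reference purely as an illustrative stand-in; the function name and threshold are hypothetical.

```python
def within_tolerance(image_data, reference, max_variance):
    """Return True when the captured image data set deviates from the
    stored reference (e.g. image file 70) by no more than max_variance
    on average. Mean absolute difference stands in here for whatever
    comparison a real optical-inspection pipeline would use."""
    if len(image_data) != len(reference):
        raise ValueError("image data and reference must be the same size")
    mean_abs_diff = sum(abs(a - b) for a, b in zip(image_data, reference)) / len(reference)
    return mean_abs_diff <= max_variance
```

A production system would more likely compare extracted geometric features or use template matching rather than raw per-pixel differences, but the pass/fail decision against a predetermined variance is the same.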
- At step 106, the processor 30 causes the output device 38 to generate an indicator, perceptible to a user of the system 10, that informs the user whether the first component 56 is within design specification, as determined by the processor at steps 98 and 102.
- The output device 38 may include a visual display such as an LCD screen, lights, speakers, etc.
- The indicator may, for example, be an icon or color on a screen, a light, a sound, etc.
- If the signal 54 indicates that the second component 110 is selected by the user, the processor 30 proceeds to step 114, at which the processor 30 obtains the stored second component data file 50 from the data storage medium 42.
- The stored second component data file 50 includes data describing a third position 118, data describing a fourth position 120, a third image file 121, and a fourth image file 122.
- The processor 30 then proceeds to step 124.
- At step 124, the processor 30 controls the drive system 14 to cause the movement of the first optical sensor 22 to the third position relative to the platform 18, as shown in FIG. 4.
- The processor 30 then proceeds to step 126, at which the processor 30 controls the drive system 14 to cause the movement of the second optical sensor 26 to the fourth position relative to the platform 18, as shown in FIG. 4.
- The third and fourth positions are predetermined vantage points at which the optical sensors 22, 26 will capture images of respective portions of the second component 110.
- The processor 30 causes the first optical sensor 22 to obtain a third image data set at step 130; the processor 30 then causes the second optical sensor 26 to obtain a fourth image data set at step 134.
- The third image file 121 includes data describing the design geometry of the portion of the second component 110 to be sensed by the first optical sensor 22 in the third position.
- The fourth image file 122 includes data describing the design geometry of the portion of the second component 110 to be sensed by the second optical sensor 26 in the fourth position.
- At step 138, the processor 30 compares the third image data set obtained at step 130 to the stored third image file 121 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion of the second component 110 captured by the first optical sensor 22 at step 130 is within design specification.
- At step 142, the processor 30 compares the fourth image data set obtained at step 134 to the stored fourth image file 122 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion of the second component 110 captured by the second optical sensor 26 at step 134 is within design specification.
- That is, the processor 30 determines whether the third image data set is within a predetermined amount of variance (i.e., within design tolerance) from the third image file 121. Similarly, the processor 30 determines whether the fourth image data set is within a predetermined amount of variance (i.e., within design tolerance) from the fourth image file 122.
- At step 146, the processor 30 causes the output device 38 to generate an indicator, perceptible to a user of the system 10, that informs the user whether the second component 110 is within design specification.
- The system 10 thus enables a single device to effectively inspect at least two components, e.g., first component 56 and second component 110, having different shapes, sizes, and design specifications, thereby reducing costs compared to procuring a separate custom inspection apparatus for each component configuration. Further, the system 10 reduces the time required to inspect a complex part compared to an inspection apparatus having only a single optical sensor.
- FIG. 5 depicts a method of using the system 10.
- The method includes obtaining the system 10 at step 150.
- The method also includes storing a plurality of data files 46, 50 on the data storage medium 42 at step 154.
- The method also includes placing a first component (as shown at 56 in FIG. 3) on the platform 18 at step 158, and instructing the processor 30 (via input device 34) to retrieve and use data file 46 from the storage medium 42 at step 162.
- The method also includes removing the first component from the platform 18 and placing the second component (as shown at 110 in FIG. 4) on the platform 18 at step 166, and instructing the processor 30 (via input device 34) to retrieve and use data file 50 from the storage medium 42 at step 170.
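The FIG. 5 workflow is an ordered sequence of actions, which can be outlined as a sketch. Everything here is hypothetical scaffolding: the `system` object, its method names, and the string arguments are illustrative stand-ins for the physical and processor actions the patent describes.

```python
def inspect_two_components(system):
    """Drive the FIG. 5 method in order; `system` supplies the actions.
    Step numbers refer to the patent's flow chart."""
    system.store_data_files()              # step 154: store files 46, 50
    system.place_component("first")        # step 158: first component on platform
    system.run_inspection("data_file_46")  # step 162: inspect using file 46
    system.remove_component("first")       # step 166: swap components
    system.place_component("second")       # step 166 (cont.)
    system.run_inspection("data_file_50")  # step 170: inspect using file 50
```

The value of the outline is that the same `run_inspection` call serves both components; only the data file changes, which is the patent's central cost-saving point.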
- The drive system 14 may have any configuration that permits independent movement of the optical sensors 22, 26 relative to one another within the scope of the claims.
- For example, the drive system 14 may employ a rack and pinion system, belts and pulleys, etc.
- The drive system 14 in the embodiment depicted employs screw drive mechanisms. More specifically, and with reference to FIG. 6, wherein like reference numbers refer to like components from FIGS. 1-5, the drive system 14 includes lead screw linear drive mechanisms 174A, 174B, 174C, 174D.
- Drive mechanism 174A includes a lead screw 178A characterized by external helical threads 182A, as understood by those skilled in the art.
- Drive mechanism 174A further includes a drive nut 186A having internal helical threads.
- The lead screw 178A extends through the hole of drive nut 186A such that the threads 182A of the lead screw 178A are engaged with the threads of the drive nut 186A.
- Rotation of the lead screw 178A about its centerline causes the drive nut 186A to move linearly along the centerline of lead screw 178A.
- Optical sensor 22 is mounted to the drive nut 186A via a bracket 190A. Accordingly, rotation of lead screw 178A causes linear translation of the optical sensor 22.
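The rotation-to-translation relationship of a lead screw is simple: each full revolution advances the nut (and the attached sensor) by the screw's lead. The patent gives no dimensions, so the sketch below just encodes that kinematic relation; the millimeter units are an assumption.

```python
def nut_travel_mm(revolutions: float, lead_mm: float) -> float:
    """Axial travel of the drive nut: one full revolution of the lead
    screw advances the nut by the screw's lead."""
    return revolutions * lead_mm

def revolutions_for_travel(distance_mm: float, lead_mm: float) -> float:
    """Inverse relation: screw revolutions needed to translate the nut
    (and the mounted optical sensor) a given distance."""
    return distance_mm / lead_mm
```

For example, with an assumed 2 mm lead, ten revolutions move the sensor 20 mm along the screw axis.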
- Drive mechanism 174A also includes two cylindrical guide rods 194A, 198A that are parallel to each other and to the lead screw 178A.
- The bracket 190A defines two holes; each of the rods 194A, 198A extends through a respective one of the holes.
- The rods 194A, 198A thereby interact with the bracket 190A to substantially limit movement of the bracket 190A and the optical sensor 22 to linear translation.
- Drive mechanism 174A further includes an actuator 202A that is operatively connected to the lead screw 178A and configured to selectively apply torque to the lead screw 178A and thereby cause the lead screw 178A to rotate about its centerline.
- Actuator 202A is a stepper motor.
- Drive mechanism 174A is configured to selectively cause movement of the first optical sensor 22 in one of two opposite directions 204, 205 along the axis of rotation of the lead screw 178A.
- Drive mechanism 174B is configured to selectively cause movement of the first optical sensor 22 in either of two opposite directions 208, 209 perpendicular to the axis of rotation of the lead screw 178A and directions 204, 205.
- Drive mechanism 174B includes a lead screw 178B characterized by external helical threads 182B, as understood by those skilled in the art.
- Lead screw 178B is oriented perpendicularly to lead screw 178A.
- Drive mechanism 174B further includes a drive nut 186B having internal helical threads.
- The lead screw 178B extends through the hole of drive nut 186B such that the threads 182B of the lead screw are engaged with the threads of the drive nut 186B.
- Rotation of the lead screw 178B about its centerline causes the drive nut 186B to move linearly along the centerline of lead screw 178B.
- The drive nut 186B is mounted to drive mechanism 174A.
- Drive mechanism 174A may include a structural member 206 to which the drive nut 186B is mounted. Accordingly, rotation of lead screw 178B causes linear movement of the drive mechanism 174A and, correspondingly, the first optical sensor 22, in two opposite directions 208, 209 that are perpendicular to directions 204, 205.
- Drive mechanism 174A engages two cylindrical guide rods 210, 214 that are parallel to each other and to the lead screw 178B. The guide rods 210, 214 are supported by the support structure 12 and retain the drive mechanism 174A above the platform 18.
- Drive mechanism 174B further includes an actuator 202B that is operatively connected to the lead screw 178B and configured to selectively apply torque to the lead screw 178B and thereby cause the lead screw 178B to rotate about its centerline.
- Actuator 202B is a stepper motor.
- Drive mechanism 174C includes a lead screw 178C characterized by external helical threads 182C, as understood by those skilled in the art.
- Drive mechanism 174C further includes a drive nut 186C having internal helical threads.
- The lead screw 178C extends through the hole of drive nut 186C such that the threads 182C of the lead screw 178C are engaged with the threads of the drive nut 186C.
- Rotation of the lead screw 178C about its centerline causes the drive nut 186C to move linearly along the centerline of lead screw 178C.
- Optical sensor 26 is mounted to the drive nut 186C via a bracket 190C. Accordingly, rotation of lead screw 178C causes linear translation of the optical sensor 26.
- Drive mechanism 174C also includes two cylindrical guide rods 194C, 198C that are parallel to each other and to the lead screw 178C.
- The bracket 190C defines two holes; each of the rods 194C, 198C extends through a respective one of the holes.
- The rods 194C, 198C thereby interact with the bracket 190C to substantially limit movement of the bracket 190C and the optical sensor 26 to linear translation.
- Drive mechanism 174C further includes an actuator 202C that is operatively connected to the lead screw 178C and configured to selectively apply torque to the lead screw 178C and thereby cause the lead screw 178C to rotate about its centerline.
- Actuator 202C is a stepper motor.
- Drive mechanism 174C is configured to selectively cause movement of the second optical sensor 26 in either of two opposite directions 204, 205 along the axis of rotation of the lead screw 178C.
- Drive mechanism 174D is configured to selectively cause movement of the second optical sensor 26 in either of two opposite directions 208, 209 perpendicular to the axis of rotation of the lead screw 178C and directions 204, 205.
- Drive mechanism 174D includes a lead screw 178D characterized by external helical threads 182D, as understood by those skilled in the art.
- Lead screw 178D is oriented perpendicularly to lead screw 178C.
- Drive mechanism 174D further includes a drive nut 186D having internal helical threads.
- The lead screw 178D extends through the hole of drive nut 186D such that the threads 182D of the lead screw are engaged with the threads of the drive nut 186D.
- Rotation of the lead screw 178D about its centerline causes the drive nut 186D to move linearly along the centerline of lead screw 178D.
- The drive nut 186D is mounted to drive mechanism 174C.
- Drive mechanism 174C may include a structural member 206 to which the drive nut 186D is mounted. Accordingly, rotation of lead screw 178D causes linear movement of the drive mechanism 174C and, correspondingly, the second optical sensor 26, in two opposite directions 208, 209 that are perpendicular to directions 204, 205.
- Drive mechanism 174C engages two cylindrical guide rods 220, 224 that are parallel to each other and to the lead screw 178D.
- The guide rods 220, 224 are supported by the support structure 12 and retain the drive mechanism 174C above the platform 18.
- Drive mechanism 174D further includes an actuator 202D that is operatively connected to the lead screw 178D and configured to selectively apply torque to the lead screw 178D and thereby cause the lead screw 178D to rotate about its centerline.
- Actuator 202D is a stepper motor.
- The processor 30 is operatively connected to, and configured to control, the actuators 202A, 202B, 202C, 202D. Accordingly, the processor 30 selectively causes the movement of the first optical sensor 22 through application of torque by the actuators 202A, 202B. Similarly, the processor 30 selectively causes the movement of the second optical sensor 26 through application of torque by the actuators 202C, 202D.
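Since each actuator is a stepper motor turning a lead screw, commanding a sensor to a planar position reduces to converting each axis displacement into a signed step count. The sketch below assumes illustrative values (2 mm lead, 200 steps per revolution); neither figure appears in the patent.

```python
def steps_for_move(delta_mm: float, lead_mm: float = 2.0,
                   steps_per_rev: int = 200) -> int:
    """Signed stepper-motor step count to translate one axis by
    delta_mm; the sign selects between the two opposite directions
    (e.g. 204 vs. 205). Lead and steps/rev are assumed values."""
    return round(delta_mm / lead_mm * steps_per_rev)

def move_sensor(current, target, lead_mm: float = 2.0,
                steps_per_rev: int = 200):
    """Step counts for the two actuators (e.g. 202A and 202B) that
    position one optical sensor in the plane; positions are (x, y)
    pairs relative to a fixed datum such as the platform."""
    return tuple(steps_for_move(t - c, lead_mm, steps_per_rev)
                 for c, t in zip(current, target))
```

With these assumptions, moving a sensor 10 mm in one direction and 5 mm back in the other works out to 1000 steps on one actuator and -500 on the other.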
- Referring to FIG. 7, wherein like reference numbers refer to like components from FIGS. 1-6, exemplary movement of drive mechanisms 174A, 174C and optical sensors 22, 26 is depicted.
- The drive system 14 depicted is configured such that each of the optical sensors 22, 26 is movable to any point within a portion of a plane. Movement of the first optical sensor 22 in directions 204, 205 is achieved by the application of torque to lead screw 178A by actuator 202A; movement of the first optical sensor 22 in directions 208, 209 is achieved by the application of torque to lead screw 178B by actuator 202B. Movement of the second optical sensor 26 in directions 204, 205 is achieved by the application of torque to lead screw 178C by actuator 202C; movement of the second optical sensor 26 in directions 208, 209 is achieved by the application of torque to lead screw 178D by actuator 202D.
- The first, second, third, and fourth positions are relative to a fixed, stationary portion of the inspection system 10, such as the platform 18 or support structure 12.
- The first, second, third, and fourth positions may, for example, be expressed as points on a Cartesian plane, though other ways of expressing the positions may be employed within the scope of the claimed invention.
- A "processor" may include a plurality of processors that cooperate to perform the operations described herein.
- The data storage medium 42 may be a hard drive, read-only memory, writable read-only memory, optical media such as a compact disk, etc.
- A "data storage medium" may include multiple storage media that together store the data used by the processor.
- An "output device" may include one or more output devices.
- FIG. 8 schematically depicts an alternative embodiment within the scope of the claims that includes a plurality of processors that cooperate to perform the steps shown in FIG. 2.
- Optical inspection system 300 includes a personal computer 304, i.e., one processor, that includes an input device 308 that performs the same functions as the input device shown at 34 in FIG. 1.
- Personal computer 304 also includes a data storage medium 310 that performs the same functions as the data storage medium shown at 42 in FIG. 1.
- The personal computer 304 performs steps 52 and 58.
- When a user selects which data file to use via the input device 308, the personal computer 304 performs either step 62 or step 114 of FIG. 2, depending on which component is selected. The personal computer 304 then transmits the data obtained at step 62 or 114 to a microcontroller 314, i.e., another processor.
- Each actuator 202A, 202B, 202C, 202D is directly controlled by a respective stepper controller 318A, 318B, 318C, 318D.
- The microcontroller 314 transmits position data to each stepper controller 318A, 318B, 318C, 318D to effectuate movement of the optical sensors 22, 26 to their desired positions.
- Thus, microcontroller 314 cooperates with stepper controllers 318A, 318B, 318C, 318D to perform steps 76, 78, 124, and 126.
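The microcontroller's role of routing position data to the per-axis stepper controllers can be sketched as a dispatch loop. The callables standing in for the controller interfaces, and the '318A'-style keys, are illustrative assumptions; the patent does not describe the wire protocol.

```python
def dispatch_positions(targets, controllers):
    """Send each axis target to its stepper controller. `targets` maps
    a controller name (e.g. '318A') to a commanded step count;
    `controllers` maps the same names to callables that stand in for
    the real hardware interfaces."""
    for axis, steps in targets.items():
        controllers[axis](steps)
```

A real implementation would also consume the limit-switch feedback the patent mentions before declaring a move complete; that is omitted here for brevity.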
- The drive mechanisms interconnecting the optical sensors 22, 26 and the actuators 202A, 202B, 202C, 202D are not shown in FIG. 8 for clarity but are substantially similar to the ones shown in FIGS. 6 and 7.
- Optical sensors 22, 26 are stand-alone units, each having a respective processor (not shown) that performs steps 82-106 and steps 130-146.
- Microcontroller 314 is configured to transmit the design data (i.e., image files 70, 72, 121, 122) received by the personal computer 304 to the processors of the optical sensors 22, 26 via relay 322.
- The processor of optical sensor 22 performs steps 82, 98, and 106 (for the portion of the first component inspected by the first optical sensor 22), and the processor of optical sensor 26 performs steps 86, 102, and 106 (for the portion of the first component inspected by the second optical sensor 26).
- The processor of optical sensor 22 performs steps 130, 138, and 146 (for the portion of the second component inspected by the first optical sensor 22), and the processor of optical sensor 26 performs steps 134, 142, and 146 (for the portion of the second component inspected by the second optical sensor 26).
- Optical sensors 22, 26 may have respective output devices (not shown) to indicate whether components are within design specification.
- A plurality of limit switches 326A-D or other feedback devices is operatively connected to the microcontroller 314 to provide feedback to the microcontroller 314 regarding the movement and position of the optical sensors 22, 26, as understood by those skilled in the art.
- Referring to FIG. 9, a screw drive mechanism 174E is configured to selectively move the drive system 14, and correspondingly the optical sensors 22, 26, vertically.
- Drive mechanism 174E includes a lead screw 178E that is vertically oriented and that is engaged with a drive nut 186E.
- Drive nut 186E is mounted to the drive system 14.
- Actuator 202E is configured to selectively rotate lead screw 178E, and is controlled by processor 30.
- Referring to FIG. 10, an actuator 202F operatively interconnects optical sensor 22 to bracket 190A; actuator 202F is configured to selectively rotate optical sensor 22 about a horizontal axis 400.
- Fixtures may be employed to positively position the components 56, 110 on the platform 18.
- Pins on the fixture may be inserted into holes (not shown) in platform 18 to assist a user with properly positioning the component for inspection.
Abstract
An optical inspection system includes a first optical sensor, a second optical sensor, and a drive system operatively connected to the first and second optical sensors and configured to selectively move the first optical sensor and second optical sensor with respect to one another. The ability to move the first and second optical sensors independently of one another facilitates the inspection of differently configured parts with a single inspection apparatus, thereby providing cost savings compared to purchasing or building multiple custom inspection fixtures.
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 62/528,617, filed Jul. 5, 2017, which is hereby incorporated by reference in its entirety.
- The above features and advantages and other features and advantages of the present disclosure are readily apparent from the following detailed description of the best modes for carrying out the disclosure when taken in connection with the accompanying drawings.
FIG. 1, an optical inspection system 10 is schematically depicted. The optical inspection system 10 includes a support structure 12 that supports a drive system 14 above an inspection platform 18. The system 10 further includes a first optical sensor 22 and a second optical sensor 26 operatively connected to the drive system 14 and suspended above the inspection platform 18. In one embodiment, the optical sensors 22, 26. - More specifically, the
drive system 14 is configured to selectively move the first and second optical sensors 22, 26 with respect to the platform 18 and with respect to each other. A processor 30 is operatively connected to, and configured to control, the drive system 14 and optical sensors 22, 26. The system 10 also includes an input device 34 through which a user of the system 10 may instruct the processor 30 or otherwise input information to the processor 30. An output device 38 is operatively connected to the processor 30 to receive signals from the processor 30 and generate a user-perceptible indicator in response thereto. The system 10 also includes a data storage medium 42 that is operatively connected to the processor 30 such that data stored on the medium 42 is selectively retrievable by the processor 30, i.e., the processor 30 can selectively obtain data stored on the medium 42. - The
optical inspection system 10 is configured to inspect a plurality of different components having different sizes and/or geometries. The data storage medium 42 stores data for a first component, i.e., stored first component data file 46. The data storage medium 42 also stores data for a second component, i.e., stored second component data file 50. One example of a first component is shown at 56 in FIG. 3; one example of a second component is shown at 110 in FIG. 4. - Referring to
FIG. 2, a method of operation of the optical inspection system 10 is schematically depicted. The method of operation depicted in FIG. 2 represents exemplary control logic used by the processor 30. In the embodiment depicted, the processor 30 is programmed and configured to carry out the steps shown and described with reference to FIG. 2. Referring to FIG. 3, wherein like reference numbers refer to like components from FIG. 1, a first component 56 is disposed on the upper surface of the platform 18 for inspection. With reference to FIGS. 1-3, the method includes, at step 52, receiving a signal 54 from the input device 34 indicating which of a plurality of different components a user (not shown) desires to be inspected by the system 10. Those skilled in the art will recognize a variety of different input devices that may be employed, including, but not limited to, keyboards, computer mice, touchscreen displays, etc. In one embodiment, a screen may display the options available for the user to select. Signals may take any form within the scope of the claimed invention, including, but not limited to, electronic, wireless, etc., and may be digital or analog. - At
step 58, the processor 30 determines which of the data files 46, 50 to obtain from the storage medium 42 based on the signal 54 from the input device 34. More specifically, if the signal 54 indicates that the first component 56 is selected by the user, then the processor 30 proceeds to step 62. At step 62 the processor 30 obtains the stored first component data file 46 from the data storage medium 42. The stored first component data file 46 includes data describing a first position 66, data describing a second position 68, a first image file 70, and a second image file 72. The processor 30 then proceeds to step 76. At step 76, the processor 30 controls the drive system 14 to cause the movement of the first optical sensor 22 to the first position, as shown in FIG. 3. The processor 30 then proceeds to step 78, at which the processor 30 controls the drive system 14 to cause the movement of the second optical sensor 26 to the second position, as shown in FIG. 3. -
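The selection-and-positioning flow just described (steps 58 through 78, and 114 through 126 for the second component) can be modeled in a few lines. The Python below is an illustrative sketch only; the data-file layout, coordinate values, and every name in it are assumptions, not part of the disclosure.

```python
# Illustrative model of the FIG. 2 selection logic; all names and values
# here are assumptions for the sketch, not taken from the disclosure.
from dataclasses import dataclass

@dataclass
class ComponentDataFile:
    """A stored data file (cf. 46 or 50): two vantage points plus two design images."""
    first_position: tuple   # vantage point for the first optical sensor
    second_position: tuple  # vantage point for the second optical sensor
    image_a: str            # design geometry visible from first_position
    image_b: str            # design geometry visible from second_position

STORAGE = {  # stand-in for the data storage medium
    "first_component": ComponentDataFile((10, 40), (90, 40), "img_70", "img_72"),
    "second_component": ComponentDataFile((25, 15), (75, 65), "img_121", "img_122"),
}

moves = []  # records the drive-system commands this sketch would issue

def move_sensor(sensor, position):
    moves.append((sensor, position))  # placeholder for real drive-system control

def select_and_position(signal):
    """Step 58: pick the data file named by the input signal; then position both sensors."""
    data = STORAGE[signal]                              # step 62 or 114
    move_sensor("first_sensor", data.first_position)    # step 76 or 124
    move_sensor("second_sensor", data.second_position)  # step 78 or 126
    return data

selected = select_and_position("first_component")
```

The same dispatch extends to any number of stored component configurations, which is the point of the single-fixture design.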
optical sensors first component 56 within their respective fields of view. Followingstep 78, theprocessor 30 causes the firstoptical sensor 22 to obtain a first image data set atstep 82; theprocessor 30 then causes the secondoptical sensor 26 to obtain a second image data set atstep 86. - The
first image file 70 includes data describing the design geometry of the portion 90 of the first component 56 to be sensed by the first optical sensor 22 in the first position. The second image file 72 includes data describing the design geometry of the portion 94 of the first component 56 to be sensed by the second optical sensor 26 in the second position. At step 98, the processor 30 compares the first image data set obtained at step 82 to the stored first image file 70 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion 90 of the first component 56 captured by the first optical sensor 22 is within design specification. Similarly, at step 102, the processor 30 compares the second image data set obtained at step 86 to the stored second image file 72 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion 94 of the first component 56 captured by the second optical sensor 26 is within design specification. - More specifically, the
processor 30 determines whether the first image data set is within a predetermined amount of variance (i.e., within design tolerance) from the first data file. Similarly, the processor 30 determines whether the second image data set is within a predetermined amount of variance (i.e., within design tolerance) from the second data file. - At
step 106, the processor 30 causes the output device 38 to generate an indicator, perceptible to a user of the system 10, that informs the user whether the first component 56 is within design specification, which the processor determined at steps 98 and 102. - Returning to step 58, if the
signal 54 indicates that the second component (shown at 110 in FIG. 4) is selected by the user, then the processor 30 proceeds to step 114. Referring to FIGS. 1, 2, and 4, at step 114, the processor 30 obtains the stored second component data file 50 from the data storage medium 42. The stored second component data file 50 includes data describing a third position 118, data describing a fourth position 120, a third image file 121, and a fourth image file 122. The processor 30 then proceeds to step 124. At step 124, the processor 30 controls the drive system 14 to cause the movement of the first optical sensor 22 to the third position relative to the platform 18, as shown in FIG. 4. The processor 30 then proceeds to step 126, at which the processor 30 controls the drive system 14 to cause the movement of the second optical sensor 26 to the fourth position relative to the platform 18, as shown in FIG. 4. -
optical sensors second component 110. Followingstep 126, theprocessor 30 causes the firstoptical sensor 22 to obtain a third image data set atstep 130; theprocessor 30 then causes the secondoptical sensor 26 to obtain a fourth image data set atstep 134. - The
third image file 121 includes data describing the design geometry of the portion of the second component 110 to be sensed by the first optical sensor 22 in the third position. The fourth image file 122 includes data describing the design geometry of the portion of the second component 110 to be sensed by the second optical sensor 26 in the fourth position. At step 138, the processor 30 compares the third image data set obtained at step 130 to the stored third image file 121 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion of the second component 110 captured by the first optical sensor 22 at step 130 is within design specification. Similarly, at step 142, the processor 30 compares the fourth image data set obtained at step 134 to the stored fourth image file 122 in a manner understood by those skilled in the art of optical inspection and thereby determines whether the portion of the second component 110 captured by the second optical sensor 26 at step 134 is within design specification. - More specifically, the
processor 30 determines whether the third image data set is within a predetermined amount of variance (i.e., within design tolerance) from the third data file. Similarly, the processor 30 determines whether the fourth image data set is within a predetermined amount of variance (i.e., within design tolerance) from the fourth data file. - Following
steps 138 and 142, the processor 30 causes the output device 38 to generate an indicator, perceptible to a user of the system 10, that informs the user whether the second component 110 is within design specification at step 146. - Thus, the
system 10 enables a single device to effectively inspect at least two components, e.g., first component 56 and second component 110, having different shapes, sizes, and design specifications, thereby reducing costs compared to procuring a separate custom inspection apparatus for each component configuration. Further, the system 10 reduces the time required to inspect a complex part compared to an inspection apparatus having only a single optical sensor. -
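The in-tolerance determinations at steps 98, 102, 138, and 142 reduce to bounding the deviation between a captured image data set and its stored design file. A minimal sketch, assuming the two reduce to comparable numeric samples and a single scalar tolerance (both assumptions; the patent leaves the comparison method to those skilled in optical inspection):

```python
# Hypothetical sample-by-sample tolerance check; the patent does not
# specify the comparison, so the numeric form here is an assumption.
def within_variance(captured, design, tolerance):
    """True if every captured sample deviates from the design data by at most tolerance."""
    if len(captured) != len(design):
        return False  # mismatched data cannot be compared sample-by-sample
    return all(abs(c - d) <= tolerance for c, d in zip(captured, design))

design_file = [100.0, 102.0, 98.0, 101.0]  # stand-in for a stored image file
image_data = [101.0, 101.0, 99.0, 100.0]   # stand-in for a captured image data set
in_spec = within_variance(image_data, design_file, tolerance=2.0)  # True here
```

A production comparison would operate on registered images rather than raw samples, but the pass/fail structure is the same.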
FIG. 5 depicts a method of using the system 10. Referring to FIGS. 1 and 5, the method includes obtaining the system 10 at step 150. The method also includes storing a plurality of data files 46, 50 on the data storage medium 42 at step 154. The method also includes placing a first component (as shown at 56 in FIG. 3) on the platform 18 at step 158, and instructing the processor 30 (via input device 34) to retrieve and use data file 46 from the storage medium 42 at step 162. The method also includes removing the first component from the platform 18 and placing the second component (as shown at 110 in FIG. 4) on the platform 18 at step 166, and instructing the processor 30 (via input device 34) to retrieve and use data file 50 from the storage medium 42 at step 170. - The
drive system 14 may have any configuration that permits independent movement of the optical sensors 22, 26. For example, the drive system 14 may employ a rack and pinion system, belts and pulleys, etc. The drive system 14 in the embodiment depicted employs screw drive mechanisms. More specifically, and with reference to FIG. 6, wherein like reference numbers refer to like components from FIGS. 1-5, the drive system 14 includes lead screw linear drive mechanisms 174A, 174B, 174C, 174D. Drive mechanism 174A includes a lead screw 178A characterized by external helical threads 182A, as understood by those skilled in the art. -
Drive mechanism 174A further includes a drive nut 186A having internal helical threads. The lead screw 178A extends through the hole of drive nut 186A such that the threads 182A of the lead screw 178A are engaged with the threads of the drive nut 186A. As understood by those skilled in the art, rotation of the lead screw 178A about its centerline causes the drive nut 186A to move linearly along the centerline of lead screw 178A. Optical sensor 22 is mounted to the drive nut 186A via a bracket 190A. Accordingly, rotation of lead screw 178A causes linear translation of the optical sensor 22. -
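Because the drive nut advances one screw lead per revolution, commanded travel maps directly to stepper counts. A back-of-envelope sketch; the 8 mm lead, 200 full steps per revolution, and 16x microstepping are illustrative values, not figures from the disclosure:

```python
# Hypothetical lead-screw numbers: nothing below is specified by the patent.
def steps_for_travel(distance_mm, lead_mm=8.0, steps_per_rev=200, microsteps=16):
    """Microsteps an actuator such as 202A must issue to translate the nut distance_mm."""
    steps_per_mm = steps_per_rev * microsteps / lead_mm  # 400 steps per mm here
    return round(distance_mm * steps_per_mm)

count = steps_for_travel(25.0)  # 25 mm of travel -> 10000 microsteps
```

The open-loop repeatability of this arrangement is what lets stored positions serve as reusable vantage points.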
Drive mechanism 174A also includes two cylindrical guide rods 194A, 198A that are parallel to each other and to the lead screw 178A. The bracket 190A defines two holes; each of the rods 194A, 198A extends through a respective one of the holes. The rods 194A, 198A thereby interact with the bracket 190A to substantially limit movement of the bracket 190A and the optical sensor 22 to linear translation. Drive mechanism 174A further includes an actuator 202A that is operatively connected to the lead screw 178A and configured to selectively apply torque to the lead screw 178A and thereby cause the lead screw 178A to rotate about its centerline. In the embodiment depicted, actuator 202A is a stepper motor. - Thus,
drive mechanism 174A is configured to selectively cause movement of the first optical sensor 22 in either of two opposite directions parallel to the lead screw 178A. Drive mechanism 174B is configured to selectively cause movement of the first optical sensor 22 in either of two opposite directions that are perpendicular to the lead screw 178A. - More specifically,
drive mechanism 174B includes a lead screw 178B characterized by external helical threads 182B, as understood by those skilled in the art. Lead screw 178B is oriented perpendicularly to lead screw 178A. Drive mechanism 174B further includes a drive nut 186B having internal helical threads. The lead screw 178B extends through the hole of drive nut 186B such that the threads 182B of the lead screw are engaged with the threads of the drive nut 186B. As understood by those skilled in the art, rotation of the lead screw 178B about its centerline causes the drive nut 186B to move linearly along the centerline of lead screw 178B. - The
drive nut 186B is mounted to drive mechanism 174A. For example, drive mechanism 174A may include a structural member 206 to which the drive nut 186B is mounted. Accordingly, rotation of lead screw 178B causes linear movement of the drive mechanism 174A and, correspondingly, the first optical sensor 22, in either of two opposite directions parallel to lead screw 178B. Drive mechanism 174A engages two cylindrical guide rods that are parallel to the lead screw 178B. The guide rods are mounted to the support structure 12 and retain the drive mechanism 174A above the platform 18. The guide rods substantially limit movement of the drive mechanism 174A to directions parallel to lead screw 178B. Drive mechanism 174B further includes an actuator 202B that is operatively connected to the lead screw 178B and configured to selectively apply torque to the lead screw 178B and thereby cause the lead screw 178B to rotate about its centerline. In the embodiment depicted, actuator 202B is a stepper motor. - Drive mechanism 174C includes a
lead screw 178C characterized by external helical threads 182C, as understood by those skilled in the art. Drive mechanism 174C further includes a drive nut 186C having internal helical threads. The lead screw 178C extends through the hole of drive nut 186C such that the threads 182C of the lead screw 178C are engaged with the threads of the drive nut 186C. As understood by those skilled in the art, rotation of the lead screw 178C about its centerline causes the drive nut 186C to move linearly along the centerline of lead screw 178C. Optical sensor 26 is mounted to the drive nut 186C via a bracket 190C. Accordingly, rotation of lead screw 178C causes linear translation of the optical sensor 26. -
lead screw 178C. Thebracket 190C defines two holes; each of the rods 194C, 194C extends through a respective one of the holes. The rods 194C, 198C thereby interact with thebracket 190C to substantially limit movement of thebracket 190C and theoptical sensor 26 to linear translation. Drive mechanism 174C further includes anactuator 202C that is operatively connected to thelead screw 178C and configured to selectively apply torque to thelead screw 178C and thereby cause thelead screw 178C to rotate about its centerline. In the embodiment depicted,actuator 202C is a stepper motor. - Thus, drive mechanism 174C is configured to selectively cause movement of the second
optical sensor 26 in either of two opposite directions parallel to the lead screw 178C. Drive mechanism 174D is configured to selectively cause movement of the second optical sensor 26 in either of two opposite directions that are perpendicular to the lead screw 178C. - More specifically,
drive mechanism 174D includes a lead screw 178D characterized by external helical threads 182D, as understood by those skilled in the art. Lead screw 178D is oriented perpendicularly to lead screw 178C. Drive mechanism 174D further includes a drive nut 186D having internal helical threads. The lead screw 178D extends through the hole of drive nut 186D such that the threads 182D of the lead screw are engaged with the threads of the drive nut 186D. As understood by those skilled in the art, rotation of the lead screw 178D about its centerline causes the drive nut 186D to move linearly along the centerline of lead screw 178D. - The
drive nut 186D is mounted to drive mechanism 174C. For example, drive mechanism 174C may include a structural member 206 to which the drive nut 186D is mounted. Accordingly, rotation of lead screw 178D causes linear movement of the drive mechanism 174C and, correspondingly, the second optical sensor 26, in either of two opposite directions parallel to lead screw 178D. Drive mechanism 174C engages two cylindrical guide rods that are parallel to the lead screw 178D. The guide rods are mounted to the support structure 12 and retain the drive mechanism 174C above the platform 18. The guide rods substantially limit movement of the drive mechanism 174C to directions parallel to lead screw 178D. Drive mechanism 174D further includes an actuator 202D that is operatively connected to the lead screw 178D and configured to selectively apply torque to the lead screw 178D and thereby cause the lead screw 178D to rotate about its centerline. In the embodiment depicted, actuator 202D is a stepper motor. - The
processor 30 is operatively connected to, and configured to control, the actuators 202A, 202B, 202C, 202D. The processor 30 selectively causes the movement of the first optical sensor 22 through application of torque by the actuators 202A, 202B, and selectively causes the movement of the second optical sensor 26 through application of torque by the actuators 202C, 202D. Referring to FIG. 7, wherein like reference numbers refer to like components from FIGS. 1-6, exemplary movement of drive mechanisms 174A, 174C and optical sensors 22, 26 is depicted. - Accordingly, the
drive system 14 depicted is configured such that each of the optical sensors 22, 26 is independently movable in two perpendicular directions. Movement of the first optical sensor 22 along the axis of lead screw 178A is caused by rotation of screw 178A by actuator 202A; movement of the first optical sensor 22 perpendicular thereto is caused by rotation of screw 178B by actuator 202B. Movement of the second optical sensor 26 along the axis of lead screw 178C is caused by rotation of screw 178C by actuator 202C; movement of the second optical sensor 26 perpendicular thereto is caused by rotation of screw 178D by actuator 202D. - As used herein, the first, second, third, and fourth positions are relative to a fixed, stationary portion of the
inspection system 10, such as theplatform 18 orsupport structure 12. The first, second, third, and fourth positions may, for example, be expressed as points on a Cartesian plane, though other ways of expressing the positions may be employed within the scope of the claimed invention. - It should be noted that, in the context of the claimed invention, a “processor” may include a plurality of processors that cooperate to perform the operations described herein. Those skilled in the art will recognize a variety of data storage media that may be employed within the scope of the claimed invention. For example, data storage medium 194 may be a hard drive, read-only memory, writable read-only memory, optical media such as a compact disk, etc. It should also be noted that, in the context of the claimed invention, a “data storage medium” may include multiple storage media that together store the data used by the processor. Similarly, an “output device” may include one or more output devices.
-
FIG. 8 schematically depicts an alternative embodiment within the scope of the claims that includes a plurality of processors that cooperate to perform the steps shown in FIG. 2. Referring to FIG. 8, wherein like reference numbers refer to like components from FIGS. 1-7, optical inspection system 300 includes a personal computer 304, i.e., one processor, that includes an input device 308 that performs the same functions as the input device shown at 34 in FIG. 1. Personal computer 304 also includes a data storage medium 310 that performs the same functions as the data storage medium shown at 42 in FIG. 1. Referring to FIGS. 2 and 8, the personal computer 304 performs steps 52 and 58. When a user selects which data file to use via the input device 308, the personal computer 304 performs either step 62 or step 114 of FIG. 2, depending on which component is selected. The personal computer 304 then transmits the data obtained at step 62 or step 114 to a microcontroller 314, i.e., another processor. - Each
actuator 202A-D is operatively connected to a respective stepper controller. The microcontroller 314 transmits position data to each stepper controller to cause movement of the optical sensors 22, 26; that is, the microcontroller 314 cooperates with the stepper controllers to perform steps 76, 78, 124, and 126 by positioning the optical sensors 22, 26 via the actuators 202A-D. The drive mechanisms are omitted from FIG. 8 for clarity but are substantially similar to the ones shown in FIGS. 6 and 7. - In the embodiment depicted in
FIG. 8, optical sensors 22, 26 each include a respective processor. Microcontroller 314 is configured to transmit the design data (i.e., image files 70, 72, 121, 122) received by the personal computer 304 to the processors of the optical sensors 22, 26 via a relay 322. - If the first component is selected at
step 58, then the processor of optical sensor 22 performs steps 82 and 98 and the processor of optical sensor 26 performs steps 86 and 102. If the second component is selected at step 58, then the processor of optical sensor 22 performs steps 130 and 138 and the processor of optical sensor 26 performs steps 134 and 142. Optical sensors 22, 26. - A plurality of
limit switches 326A-D or other feedback devices is operatively connected to the microcontroller 314 to provide feedback to the microcontroller 314 regarding the movement and position of the optical sensors 22, 26. - Although the
inspection systems 10, 300 are described above with the optical sensors 22, 26 movable in a horizontal plane, the drive system 14 may be configured with additional actuators to cause vertical movement of the optical sensors 22, 26. Referring to FIG. 9, wherein like reference numbers refer to like components from FIGS. 1-8, a screw drive mechanism 174E is configured to selectively move the drive system 14, and correspondingly the optical sensors 22, 26, vertically. Drive mechanism 174E includes a lead screw 178E that is vertically oriented, and that is engaged with a drive nut 186E. Drive nut 186E is mounted to the drive system 14. Accordingly, rotation of the lead screw 178E about a vertical axis causes vertical movement of the optical sensors 22, 26. Actuator 202E is configured to selectively rotate lead screw 178E, and is controlled by processor 30. Referring to FIG. 10, an actuator 202F operatively interconnects optical sensor 22 to bracket 190A; actuator 202F is configured to selectively rotate optical sensor 22 about a horizontal axis 400. - It should be noted that fixtures (not shown) may be employed to positively position the
components 56, 110 on the platform 18. Pins on the fixture may be inserted into holes (not shown) in platform 18 to assist a user with properly positioning the component for inspection. - While the best modes for carrying out the invention have been described in detail, those familiar with the art to which this invention relates will recognize various alternative designs and embodiments for practicing the invention within the scope of the appended claims.
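The limit switches 326A-D described above also support a simple homing routine: step an axis toward its switch until the switch trips, then treat that point as the axis reference. The sketch below simulates the switch in software; the function names and step budget are assumptions, not part of the disclosure:

```python
def home_axis(read_switch, step_once, max_steps=20000):
    """Step toward the limit switch; return steps taken, or None if it never trips."""
    for steps in range(max_steps):
        if read_switch():   # switch tripped: axis is at its reference position
            return steps
        step_once()         # issue one step toward the switch
    return None             # safety stop: avoid driving past a failed switch

# Simulated axis whose switch trips after 120 steps of travel.
axis = {"pos": 120}
steps_taken = home_axis(lambda: axis["pos"] <= 0,
                        lambda: axis.update(pos=axis["pos"] - 1))
```

Homing each axis once at power-up gives the stored first through fourth positions a repeatable origin without closed-loop encoders.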
Claims (11)
1. An optical inspection system comprising:
a first optical sensor;
a second optical sensor; and
a drive system operatively connected to the first and second optical sensors and configured to selectively move the first optical sensor and second optical sensor with respect to one another.
2. The optical inspection system of claim 1 , further comprising a processor operatively connected to the drive system such that the movement of the first and second optical sensors is controllable by the processor.
3. The optical inspection system of claim 2 , further comprising a data storage medium on which data describing a first position, a second position, a third position, and a fourth position is storable;
said processor being operatively connected to the data storage medium and configured to obtain the data describing the first position, the second position, the third position, and the fourth position.
4. The optical inspection system of claim 3 , further comprising an input device operatively connected to the processor and configured to transmit a signal to the processor that is indicative of whether a user has selected to inspect a first component or a second component having different design geometry from the first component;
wherein the processor is configured to cause the first optical sensor to move to the first position and the second optical sensor to move to the second position in response to the signal being indicative of the user selecting to inspect the first component; and
wherein the processor is configured to cause the first optical sensor to move to the third position and the second optical sensor to move to the fourth position in response to the signal being indicative of the user selecting to inspect the second component.
5. The optical inspection system of claim 4 ,
wherein the data storage medium is configured such that a first data file, a second data file, a third data file, and a fourth data file are storable on the data storage medium for retrieval by the processor;
wherein the processor is configured to obtain the first and second data files in response to the signal being indicative of the user selecting to inspect the first component; and
wherein the processor is configured to obtain the third and fourth data files in response to the signal being indicative of the user selecting to inspect the second component.
6. The optical inspection system of claim 5 , wherein the first optical sensor is configured to selectively generate a first set of optical data and the second optical sensor is configured to selectively generate a second set of optical data;
wherein the processor is configured such that, in response to the signal being indicative of the user selecting to inspect the first component, the processor
causes the first optical sensor to generate the first set of optical data in the first position,
causes the second optical sensor to generate the second set of optical data in the second position,
determines whether the first set of optical data is within a predetermined amount of variance from the first data file,
determines whether the second set of optical data is within a predetermined amount of variance from the second data file, and
causes an output device to produce a perceptible indication if the first set of optical data is not within the predetermined amount of variance of the first data file or the second set of optical data is not within the predetermined amount of variance of the second data file.
7. The optical inspection system of claim 6 , wherein the processor is configured such that, in response to the signal being indicative of the user selecting to inspect the second component, the processor
causes the first optical sensor to generate the first set of optical data in the third position,
causes the second optical sensor to generate the second set of optical data in the fourth position,
determines whether the first set of optical data is within a predetermined amount of variance from the third data file,
determines whether the second set of optical data is within a predetermined amount of variance from the fourth data file, and
causes an output device to produce a perceptible indication if the first set of optical data is not within the predetermined amount of variance of the third data file or the second set of optical data is not within the predetermined amount of variance of the fourth data file.
8. A method comprising:
possessing an optical inspection system including
a first optical sensor,
a second optical sensor,
a drive system supporting the first and second optical sensors and configured to selectively move the first optical sensor and second optical sensor with respect to one another,
a data storage medium,
a processor operatively connected to the data storage medium and configured to selectively retrieve data from the data storage medium, operatively connected to the drive system and configured to control the movement of the first and second optical sensors, and
an input device configured to transmit a signal to the processor that is indicative of whether a user has selected to inspect a first component or a second component having different design geometry from the first component;
storing on the data storage medium a first position, a second position, a third position, and a fourth position, a first data file, a second data file, a third data file, and a fourth data file;
said first data file including design geometry of a first portion of a first component, said second data file including design geometry of a second portion of the first component, said third data file including design geometry of a first portion of the second component, and said fourth data file including design geometry of a second portion of the second component;
positioning the first component relative to the optical inspection system for inspection;
causing the drive system to move the first optical sensor to the first position;
causing the drive system to move the second optical sensor to the second position;
causing the first optical sensor to obtain a first image data set of the first portion of the first component; and
causing the second optical sensor to obtain a second image data set of the second portion of the first component.
9. The method of claim 8 , further comprising:
positioning the second component relative to the optical inspection system for inspection;
causing the drive system to move the first optical sensor to the third position;
causing the drive system to move the second optical sensor to the fourth position;
causing the first optical sensor to obtain a third image data set of the first portion of the second component; and
causing the second optical sensor to obtain a fourth image data set of the second portion of the second component.
10. The method of claim 9 , further comprising comparing the first image data set to the first data file and thereby determining whether the first portion of the first component is within a predetermined amount of variance of the design geometry of the first portion of the first component;
comparing the second image data set to the second data file and thereby determining whether the second portion of the first component is within a predetermined amount of variance of the design geometry of the second portion of the first component;
comparing the third image data set to the third data file and thereby determining whether the first portion of the second component is within a predetermined amount of variance of the design geometry of the first portion of the second component; and
comparing the fourth image data set to the fourth data file and thereby determining whether the second portion of the second component is within a predetermined amount of variance of the design geometry of the second portion of the second component.
11. The method of claim 10 , further comprising causing an output device to generate a perceptible indication of whether any of said portions are not within their respective design geometries.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/660,600 US20190012782A1 (en) | 2017-07-05 | 2017-07-26 | Optical inspection apparatus and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762528617P | 2017-07-05 | 2017-07-05 | |
US15/660,600 US20190012782A1 (en) | 2017-07-05 | 2017-07-26 | Optical inspection apparatus and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190012782A1 true US20190012782A1 (en) | 2019-01-10 |
Family
ID=64902802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/660,600 Abandoned US20190012782A1 (en) | 2017-07-05 | 2017-07-26 | Optical inspection apparatus and method |
Country Status (1)
Country | Link |
---|---|
US (1) | US20190012782A1 (en) |
US20130002850A1 (en) * | 2011-06-29 | 2013-01-03 | Rolls-Royce Plc | Inspection of a component |
US20140078498A1 (en) * | 2012-09-14 | 2014-03-20 | Keyence Corporation | Appearance Inspection Device, Appearance Inspection Method, And Program |
US20150002847A1 (en) * | 2011-12-28 | 2015-01-01 | Bridgestone Corporation | Appearance inspection apparatus and appearance inspection method |
US20150220799A1 (en) * | 2014-01-31 | 2015-08-06 | Omron Corporation | Image processing device, managing system, and managing method |
US20150221077A1 (en) * | 2014-02-03 | 2015-08-06 | Prosper Creative Co., Ltd. | Image inspecting apparatus and image inspecting program |
US20150237308A1 (en) * | 2012-02-14 | 2015-08-20 | Kawasaki Jukogyo Kabushiki Kaisha | Imaging inspection apparatus, control device thereof, and method of controlling imaging inspection apparatus |
US20160239976A1 (en) * | 2014-10-22 | 2016-08-18 | Pointivo, Inc. | Photogrammetric methods and devices related thereto |
US20170032177A1 (en) * | 2015-07-30 | 2017-02-02 | Keyence Corporation | Image Inspection Device, Image Inspection Method And Image Inspection Program |
US20170078514A1 (en) * | 2014-08-13 | 2017-03-16 | Pfu Limited | Image reading apparatus |
US20170132784A1 (en) * | 2014-06-13 | 2017-05-11 | Nikon Corporation | Shape measuring device, structured object manufacturing system, shape measuring method, structured object manufacturing method, shape measuring program, and recording medium |
US20170154417A1 (en) * | 2014-08-14 | 2017-06-01 | Krones Ag | Optical inspection method and optical inspection device for containers |
US20170161904A1 (en) * | 2015-12-08 | 2017-06-08 | Mitutoyo Corporation | Image measurement device and controlling method of the same |
US9735036B2 (en) * | 2011-08-19 | 2017-08-15 | Cognex Corporation | System and method for aligning a wafer for fabrication |
US20180005370A1 (en) * | 2016-06-30 | 2018-01-04 | Tokyo Electron Limited | Substrate defect inspection apparatus, method of adjusting sensitivity parameter value for substrate defect inspection, and non-transitory storage medium |
US10027928B2 (en) * | 2014-10-28 | 2018-07-17 | Exnodes Inc. | Multiple camera computational wafer inspection |
Worldwide Applications (1)
2017-07-26 | US | US15/660,600 (US20190012782A1) | Abandoned
Similar Documents
Publication | Title
---|---
US20160084631A1 (en) | Roundness measurement device and control method
KR102324287B1 (en) | Determination of Automated Splicing Sequences for Optical Splicing
US10024774B2 (en) | Hardness test apparatus and hardness testing method
WO2005098556A2 (en) | Programmable control system for automated actuator operation
US20190012782A1 (en) | Optical inspection apparatus and method
US20180054551A1 (en) | Observation apparatus, observation method and observation system
CN105136169B (en) | Laser gyro optical element assembly device
CN105277175B (en) | Image measuring apparatus and method for displaying measurement results
US20150287177A1 (en) | Image measuring device
US11295406B2 (en) | Image management device
CN110567354A (en) | Calibration device and method for direct-current differential transformer type displacement sensor
JP4687853B2 (en) | X-ray fluoroscopic equipment
JP6472935B2 (en) | Fixtures to support reel-to-reel inspection of semiconductor devices and other components
US20210063719A1 (en) | Universal Microscope Stage
CN208334824U (en) | Detection clamp tool
KR101822749B1 (en) | Sphericity measurement device
CN110355783A (en) | Detection device for articulated-robot positioning accuracy
JP2007041395A (en) | Material texture observing device
JPWO2018235234A1 (en) | Contact probe inspection apparatus and control method of contact probe inspection apparatus
JP7213108B2 (en) | Fastening system and fastening method
JP4042144B2 (en) | Lens device diagnosis system, diagnosis program, and recording medium
JP6680148B2 (en) | Pump device and manipulation system for manipulating minute objects
JP4042147B2 (en) | Controller diagnostic system
JP2007260799A (en) | Inspection apparatus using an articulated robot
CN112649797A (en) | Radar test support and radar test method
Legal Events
Date | Code | Title | Description
---|---|---|---
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION