EP3889590B1 - Apparatus for inspecting a substrate for detecting a foreign substance


Info

Publication number
EP3889590B1
EP3889590B1 (application EP21170906.8A)
Authority
EP
European Patent Office
Prior art keywords
substrate
foreign substance
image
inspection
information
Prior art date
Legal status
Active
Application number
EP21170906.8A
Other languages
German (de)
French (fr)
Other versions
EP3889590C0 (en)
EP3889590A2 (en)
EP3889590A3 (en)
Inventor
Hyun-Seok Lee
Jae-Sik Yang
Ja-Geun Kim
Hee-Tae Kim
Hee-Wook You
Current Assignee
Koh Young Technology Inc
Original Assignee
Koh Young Technology Inc
Priority date
Filing date
Publication date
Application filed by Koh Young Technology Inc filed Critical Koh Young Technology Inc
Publication of EP3889590A2 publication Critical patent/EP3889590A2/en
Publication of EP3889590A3 publication Critical patent/EP3889590A3/en
Application granted granted Critical
Publication of EP3889590C0 publication Critical patent/EP3889590C0/en
Publication of EP3889590B1 publication Critical patent/EP3889590B1/en

Classifications

    • G01N 21/94: Investigating contamination, e.g. dust
    • G01N 21/8806: Specially adapted optical and illumination features
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06T 7/001: Industrial image inspection using an image reference approach
    • G01N 2021/8829: Shadow projection or structured background, e.g. for deflectometry
    • G01N 2021/8845: Multiple wavelengths of illumination or detection
    • G01N 2021/95638: Inspecting patterns on the surface of objects for PCB's
    • G01N 2201/061: Sources
    • G01N 2201/0635: Structured illumination, e.g. with grating
    • G01N 2201/068: Optics, miscellaneous
    • G01N 2201/13: Standards, constitution
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10024: Color image
    • G06T 2207/30141: Printed circuit board [PCB]

Definitions

  • the present invention relates to an apparatus for inspecting a substrate for detecting a foreign substance.
  • an electronic apparatus includes at least one printed circuit board (PCB), and a circuit pattern, a connection pad and various circuit devices such as a driving chip electrically connected to the connection pad, etc. are mounted on the PCB.
  • a shape measuring apparatus may be used.
  • a conventional shape measuring apparatus displays an image of a substrate on a monitor screen so that an operator can perform a serial inspection.
  • a pad region where solder is applied is displayed by using Gerber data.
  • the image of a substrate displays only a portion such as a pad region, but not a stencil, a hole, etc., unlike the real image. Therefore, an operator may have difficulty inspecting a desired region, caused by the difference between the real image and the displayed image when performing the serial inspection.
  • identifying the real position of a problem on a substrate through the displayed image of the substrate may be hard and require too much time.
  • Such a foreign substance may induce a malfunction of the substrate. Therefore, an apparatus for inspecting a substrate for detecting a foreign substance is required.
  • US 2011002527 A1 discloses an inspection method including photographing a measurement target to acquire image data for each pixel of the measurement target, acquiring height data for each pixel of the measurement target, acquiring visibility data for each pixel of the measurement target, multiplying the acquired image data by at least one of the height data and the visibility data for each pixel to produce a result value, and setting a terminal area by using the produced result value.
  • US 2010295941 A1 discloses a shape measurement apparatus including a work stage supporting a target substrate, a pattern-projecting section including a light source, a grating part partially transmitting and blocking light generated by the light source to generate a grating image and a projecting lens part making the grating image on a measurement target of the target substrate, an image-capturing section capturing the grating image reflected by the measurement target of the target substrate, and a control section controlling the work stage.
  • US 2002088952 A1 discloses an optical inspection module and method for detecting particles on a surface of a substrate.
  • the technical problem of the present invention is to provide an apparatus for inspecting a substrate for detecting a foreign substance, through which a foreign substance can be easily and exactly detected.
  • the present invention is directed to an apparatus for inspecting a substrate for detecting a foreign substance as defined in appended independent claim 1.
  • Particular embodiments are defined in appended dependent claims 2 and 3.
  • the present invention is directed to an apparatus for inspecting a substrate for a foreign substance, comprising: a projecting part configured to provide at least one grid pattern light onto the substrate; a lighting part configured to provide a plurality of color lights onto the substrate; an image capturing part configured to capture at least one image of the substrate and a foreign substance on the substrate by receiving the grid pattern light and the color lights reflected by the foreign substance and the substrate; the apparatus further comprising: an infrared lighting to provide infrared light onto the substrate and a processing part connected to the image capturing part for receiving and processing the at least one image captured by the image capturing part, the processing part being configured to obtain 3-dimensional image information including height of the substrate by using the grid pattern light, detect a foreign substance of the substrate by using the 3-dimensional image information including height, obtain 2-dimensional image information of the substrate per color by using the color lights, detect a foreign substance of the substrate by using the 2-dimensional image information per color, combine a detection result of a foreign substance performed by using the 3-dimensional image information including height and a detection result of a foreign substance performed by using the 2-dimensional image information per color, and obtain mask information of the substrate by using the infrared light.
  • the mask information comprises information about a hole formed at the substrate such that the hole is removed from the substrate to be detected.
  • the processing part is configured to determine a foreign substance as a final foreign substance, when the foreign substance is considered as a foreign substance in both of inspection by using the 3-dimensional image information including height and inspection by using the 2-dimensional image information per color.
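The final decision described above is a per-pixel logical AND of the two inspection results. A minimal sketch with hypothetical boolean detection masks (the array names and values are illustrative, not from the patent):

```python
import numpy as np

# Hypothetical per-pixel candidate masks: True where the height-based
# 3-D inspection or the per-color 2-D inspection flags a candidate.
mask_3d = np.array([[False, True,  True],
                    [False, False, True]])
mask_2d = np.array([[False, True,  False],
                    [False, False, True]])

# A candidate counts as a final foreign substance only when BOTH
# inspections agree, which suppresses single-modality false alarms.
final_mask = mask_3d & mask_2d
```

Only pixels flagged by both modalities survive, so a height artifact alone or a color artifact alone is not reported.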
  • target object may be used instead of the term “foreign substance”, or the term “inspection target” may be used instead of the term “substrate”.
  • the image of the substrate, which is displayed for performing inspection, is renewed to be the captured inspection-region image, so that an operator can easily identify the real portion of the real substrate which corresponds to a portion of the displayed image.
  • a foreign substance can be easily detected by comparing the obtained image of the inspection region with the reference image of the substrate.
  • Since the detection result obtained by using the 3-dimensional image information including height and the detection result obtained by using the 2-dimensional image information per color are combined, the reliability of detecting a foreign substance of a substrate can be improved.
  • easier or more precise detection can be performed by excluding, from at least one captured image, a region that is formed on the substrate and that is unnecessary or may induce errors in inspecting the substrate for a foreign substance, by using the infrared lighting.
  • first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, or section discussed below could be termed a second element, component, or section without departing from the teachings of the present invention.
  • FIG. 1 is a conceptual view showing a 3-dimensional shape measuring apparatus according to an embodiment of the present invention.
  • a 3-dimensional shape measuring apparatus which can be applied to a method of inspecting 3-dimensional image according to an embodiment of the present invention, may include a measurement stage part 100, an image-capturing part 200, a first projecting part 300, a second projecting part 400, a lighting part 450, an image obtaining part 500, a module controlling part 600, and a central processing part 700.
  • the measurement stage part 100 may include a stage 110 which supports a target object 10 and a stage transfer unit 120 which transfers the stage 110.
  • the measurement position of the target object 10 may be changed as the target object 10 is moved by the stage 110 with respect to the image-capturing part 200 and the first and second projecting parts 300 and 400.
  • the image-capturing part 200 is arranged above the stage 110 to capture an image of a target object 10 by receiving a light reflected by the target object 10.
  • the image-capturing part 200 receives the light which is emitted from the first and second projecting parts 300 and 400 and reflected by the target object 10, and captures the image of the target object.
  • the image-capturing part 200 may include a camera 210, an image forming lens 220, a filter 230, and a circular lamp 240.
  • the camera 210 receives the light which is reflected by the target object 10 and captures a plane image of the target object 10, in one embodiment, the camera may be a CCD camera or CMOS camera.
  • the image forming lens 220 is arranged at the bottom of the camera 210, receives the light reflected by the target object 10, and forms an image on the camera 210.
  • the filter 230 is arranged at the bottom of the image forming lens 220 to filter the reflected light and provide to the image forming lens 220, in one embodiment, the filter may be one of a frequency filter, a color filter, and a light intensity adjustment filter.
  • the circular lamp 240 is arranged at the bottom of the filter 230 and provides light for capturing a specific image, such as a 2-dimensional image, of the target object 10.
  • the first projecting part 300 may be slantly disposed with respect to the stage 110 supporting the target object 10, for example, at a right side of the image-capturing part 200.
  • the first projecting part 300 may include a first lighting unit 310, a first grid unit 320, a first grid transfer unit 330 and a first condensing lens 340.
  • the first lighting unit 310 includes a lighting source and at least one lens to generate a light
  • the first grid unit 320 is arranged at the bottom of the first lighting unit 310 to convert the light emitted from the first lighting unit 310 to a first grid pattern light having grid pattern.
  • the first grid transfer unit 330 is connected to the first grid unit 320 to transfer the first grid unit 320.
  • a piezoelectric transfer unit or a fine linear transfer unit may be used as the first grid transfer unit 330.
  • the first condensing lens 340 is arranged at the bottom of the first grid unit 320 to condense the first grid pattern light, which has passed through the first grid unit 320, onto the target object 10.
  • the second projecting part 400 may be slantly disposed with respect to the stage 110 supporting the target object 10, for example, at a left side of the image-capturing part 200.
  • the second projecting part 400 may include a second lighting unit 410, a second grid unit 420, a second grid transfer unit 430 and a second condensing lens 440.
  • the elements of the second projecting part 400 are substantially the same as those of the first projecting part 300, so the same explanation is omitted.
  • When the first projecting part 300 projects N-number of first grid pattern lights onto the target object 10 while the first grid transfer unit 330 moves the first grid unit 320 step by step, the image-capturing part 200 may capture N-number of first pattern images in sequence by receiving the N-number of first grid pattern lights reflected by the target object 10. Likewise, when the second projecting part 400 projects N-number of second grid pattern lights onto the target object 10 while the second grid transfer unit 430 moves the second grid unit 420 step by step, the image-capturing part 200 may capture N-number of second pattern images in sequence by receiving the N-number of second grid pattern lights reflected by the target object 10.
  • the 'N' described above is an integer. For example, the 'N' may be three or four.
  • the number of the projecting part may be equal to or greater than three. That is, the grid pattern light may be emitted toward the target object 10 in various directions to get various pattern images. For example, when three projecting parts are arranged to form a triangle, three grid pattern lights may be emitted onto the target object 10 in different directions, and when four projecting parts are arranged to form a square, four grid pattern lights may be emitted onto the target object 10 in different directions. Further, the number of the projecting parts may be eight. In this case, images can be captured by emitting grid pattern light in eight directions.
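The text does not spell out here how the N pattern images are turned into height. As a sketch only, a standard N-step phase-shifting formula (an assumption, not taken from the patent) recovers a wrapped phase per pixel from the N grid pattern images; height then follows from phase unwrapping and system calibration:

```python
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase from N equally phase-stepped images.

    Assumes the standard model I_k = A + B*cos(phi + 2*pi*k/N) for the
    k-th grid pattern image; then
        phi = atan2(-sum(I_k*sin(d_k)), sum(I_k*cos(d_k))),
    with d_k = 2*pi*k/N. The result is wrapped to (-pi, pi].
    """
    n = len(images)
    deltas = 2 * np.pi * np.arange(n) / n
    num = sum(i * np.sin(d) for i, d in zip(images, deltas))
    den = sum(i * np.cos(d) for i, d in zip(images, deltas))
    return np.arctan2(-num, den)

# Synthetic check: four images of a flat scene with known phase 0.5 rad.
true_phi = 0.5
imgs = [100 + 50 * np.cos(true_phi + 2 * np.pi * k / 4) for k in range(4)]
recovered = float(wrapped_phase(imgs))
```

With N equally spaced phase steps the sine and cosine sums isolate the phase term exactly, which is why three or four steps (the values of N mentioned above) already suffice.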
  • the lighting part 450 irradiates a light toward the target object 10 to capture 2-dimensional image of the target object 10.
  • the lighting part 450 may include a red lighting 452, a green lighting 454, and a blue lighting 456.
  • the red lighting 452, the green lighting 454 and the blue lighting 456 may be arranged above the target object in a circular arrangement to irradiate red, green, and blue lights, respectively, toward the target object 10, or may be arranged at different heights as shown in FIG. 1.
  • the image obtaining part 500 is electrically connected to the camera 210 of the image-capturing part 200 to obtain and store pattern images generated by using the first and second projecting parts 300 and 400 from the camera 210. Also, the image obtaining part 500 obtains and stores 2-dimensional images generated by using the lighting part 450 from the camera 210.
  • the image obtaining part 500 includes an imaging system which receives and stores N-number of first pattern images and N-number of second pattern images captured by the camera 210.
  • the module controlling part 600 is electrically connected to the measurement stage part 100, the image-capturing part 200, the first projecting part 300 and second projecting part 400 to control them.
  • the module controlling part 600 includes a lighting controller, a grid controller, and a stage controller.
  • the lighting controller controls the first and second lighting units 310 and 410 to generate lights.
  • the grid controller controls the first and second grid transfer units 330 and 430 to move the first and second grid units 320 and 420.
  • the stage controller controls the stage transfer unit 120 to move the stage 110 up, down, left, and right.
  • the central processing part 700 is electrically connected to the image obtaining part 500 and the module controlling part 600 to control them.
  • the central processing part 700 may measure the 3-dimensional shape of the target object 10 by receiving and processing the N-number of first pattern images and N-number of second pattern images from the imaging system of the image obtaining part 500.
  • the central processing part 700 may control each of the lighting controller, the grid controller, and the stage controller.
  • the central processing part 700 may include an image processing board, a controlling board, and an interface board.
  • FIG. 2 is a flow chart showing a method of inspecting a substrate and a method of inspecting a substrate for detecting a foreign substance on a substrate.
  • FIG. 3 is a plan view showing an example of image information of a substrate displayed through a step of displaying image information of a substrate in FIG. 2
  • FIG. 4 is a plan view showing a real substrate corresponding to the image of the substrate in FIG. 3 .
  • an image information of a substrate 10 before applying solder is displayed (S110).
  • the image information of the substrate 10 may include Gerber information of the substrate 10.
  • the Gerber information of the substrate 10 may be information about a design standard of the substrate 10 before applying solder, and may be displayed on an operator's monitor as a Gerber image GI shown in FIG. 3.
  • the Gerber image GI may include pads GI-P with various shapes, on which solder is applied, and the Gerber information may be black-and-white image information.
  • At least one image of at least one inspection region on the substrate is captured to obtain an image of the captured inspection region (S120).
  • the inspection region is a target region to be measured or inspected on the substrate 10, and can be set automatically or by an operator.
  • the substrate 10 may be divided into several regions with definite area to be set as the inspection region, or the whole region of the substrate 10 may be set as the inspection region.
  • the region with definite area may be defined as a field of view (FOV) of the camera 210 of the image-capturing part 200.
  • FIG. 5 is a conceptual view showing an example of a step of obtaining an image of an inspection region on a substrate in FIG. 2 .
  • the inspection region FOV of the substrate 10 may be defined by the field of view of the camera 210, and may be captured along an arrow direction.
  • all regions of the substrate 10 are image-captured. However, some required regions may be selectively captured.
  • Image-capturing of the inspection region FOV may be performed, for example, by at least one of the projection parts 300 and 400 and the lighting part 450.
  • the projecting parts 300 and 400 may emit grid pattern light onto the inspection region FOV, so that the inspection region FOV may be image-captured.
  • the image of the inspection region FOV includes 3-dimensional image based on height information.
  • the lighting part 450 may emit at least one color light onto the inspection region FOV, so that the inspection region FOV may be image-captured.
  • the image of the inspection region FOV includes 2-dimensional plane image.
  • the renewed image information may be color image information.
  • the camera 210 providing the renewed image information may be color camera.
  • the camera 210 obtaining the image of the inspection region FOV may be a black-and-white camera.
  • the renewed image can be converted into a color image even though the camera 210 is a black-and-white camera, since the images obtained with the color lights emitted by the lighting part 450 differ from one another.
  • FIG. 6 is a plan view for explaining a process of displaying renewed image information in FIG. 2 .
  • the renewed image information corresponds to an internal area of bold line, and image information that is not renewed corresponds to an outer area of the bold line.
  • the image-captured inspection region as shown in FIG. 5 may be renewed in real time as shown in FIG. 6.
  • the image of the substrate 10, which is displayed, is renewed as the image of the image-captured inspection region FOV, so that an operator can easily match the displayed image of the substrate 10 with the real substrate 10.
  • Since the Gerber information is black-and-white and the renewed image information is color, the operator can more easily identify the position on the real substrate 10 by using the renewed color image of the displayed substrate 10.
  • this step can be omitted in the method of inspecting a substrate for detecting a foreign substance.
  • the existence of a foreign substance is inspected by comparing the obtained image of the inspection region FOV with a reference image of the substrate 10 (S140).
  • the reference image of the substrate 10 can be obtained from a reference substrate that is previously selected.
  • the reference substrate may be a master substrate or a master board.
  • the reference substrate is previously selected, and the inspection region FOV, which was previously explained, is image-captured from the reference substrate in the same way as previously explained to obtain the reference image.
  • the reference image of the substrate 10 may include an image captured by emitting at least one color light onto the inspection region FOV of the reference substrate by using the lighting part 450.
  • the reference image is a 2-dimensional plane image, and the reference image is compared with the captured image of the inspection region FOV, so that a differing portion may be considered a foreign substance.
  • the reference image does not contain the foreign substances FS1 and FS2 but the captured image CI contains the foreign substances FS1 and FS2 as shown in FIG. 6 . Therefore, the different portion CI-FS1 obtained by comparing the reference image and the captured image CI of the inspection region FOV may be considered as a foreign substance.
  • the reference image of the substrate 10 may further include an image captured through emitting at least one grid pattern light onto the inspection region FOV of the reference substrate by using the projecting parts 300 and 400.
  • the reference image is 3-dimensional image based on height, and the reference image is compared with the captured image of the inspection region FOV so that a different portion may be considered as a foreign substance.
  • the reference image does not contain the foreign substances FS1 and FS2 but the heights of the foreign substances FS1 and FS2 in the captured image are different from those in the reference image. Therefore, the different portion obtained by comparing the reference image and the captured image CI of the inspection region FOV may be considered as a foreign substance.
  • the foreign substance can be detected by using only the obtained 3-dimensional image of the inspection region FOV based on height instead of inspecting the foreign substance by comparing the obtained 3-dimensional image of the inspection region FOV with the reference image of the substrate 10.
  • When the first and second foreign substances FS1 and FS2 exist on the substrate 10 as shown in FIG. 4, the heights of the first and second foreign substances FS1 and FS2 may abruptly increase or exceed a previously set value in the obtained height-based 3-dimensional image. Therefore, an abrupt height change or a portion exceeding the previously set value may be considered a foreign substance without comparing the reference image with the obtained image.
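The threshold test described above can be sketched on a hypothetical height map; the limit value and the array contents are illustrative only:

```python
import numpy as np

# Hypothetical height map of one inspection region (arbitrary units),
# as obtained from the grid-pattern 3-D measurement.
height_map = np.array([[2.0,  1.5, 2.1],
                       [1.8, 95.0, 2.0],   # 95.0: a tall particle
                       [2.2,  1.9, 2.0]])

HEIGHT_LIMIT = 50.0  # "previously set value" (illustrative)

# A pixel whose height exceeds the preset limit is flagged as a
# candidate foreign substance, with no reference image needed.
candidate_mask = height_map > HEIGHT_LIMIT
```

Detecting abrupt height *changes* rather than absolute values would instead threshold a gradient of `height_map`; the simple absolute threshold is shown because it needs no extra assumptions.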
  • the reference image can be obtained before the step S110 in which the image information of the substrate 10 is displayed.
  • the image of the substrate, which is displayed for performing inspection, is renewed to be the captured inspection-region image, so that an operator can easily identify the real portion of the real substrate which corresponds to a portion of the displayed image.
  • a foreign substance can be easily detected by comparing the obtained image of the inspection region with the reference image of the substrate.
  • FIG. 7 is a flow chart showing another method of inspecting a substrate for detecting a foreign substance.
  • master images of a master substrate are obtained per color by using a plurality of color lights (S210).
  • a red master image of the master substrate is obtained by using a red light emitted by the red lighting 452
  • a green master image of the master substrate is obtained by using a green light emitted by the green lighting 454
  • a blue master image of the master substrate is obtained by using a blue light emitted by the blue lighting 456.
  • inspection images of the substrate 10 are obtained per color by using the plurality of color lights (S220).
  • a red inspection image of the substrate 10 is obtained by using a red light emitted by the red lighting 452
  • a green inspection image of the substrate 10 is obtained by using a green light emitted by the green lighting 454
  • a blue inspection image of the substrate 10 is obtained by using a blue light emitted by the blue lighting 456.
  • Foreign substances may have various color characteristics.
  • a foreign substance may be classified as a bright foreign substance or a dark foreign substance.
  • the bright foreign substance and the dark foreign substance are characterized by their relative brightness with respect to the substrate 10.
  • the bright foreign substance and the dark foreign substance may be detected by different methods. Therefore, more precise detection may be performed.
  • a bright foreign substance such as dust, chip, etc. is detected by comparing the master images per color and the obtained images per color (S230).
  • FIG. 8 is a flow chart showing a process of detecting a bright foreign substance.
  • the obtained images per color are merged to form a substrate image regarding the substrate 10 (S232).
  • intensities of each of the red inspection image, the green inspection image and the blue inspection image are merged per pixel to form an image of the substrate.
  • the master images per color are merged to form an image of the master substrate (S234).
  • intensities of each of the red master image, the green master image and the blue master image are merged per pixel to form an image of the master substrate.
  • a bright foreign substance is detected by comparing the image and the master image (S236).
  • FIG. 9 is a conceptual view showing a process of detecting a bright foreign substance by comparing a substrate image and a master substrate image.
  • a master substrate image (MSI) is subtracted from the substrate image (ISI) to form a comparison image (PI).
  • the master substrate image (MSI) is a clean image without a foreign substance. Therefore, when the substrate image (ISI) contains a foreign substance (FM), there exists only the foreign substance (FM) in the comparison image (PI).
  • the comparison (PI) may be binary-coded based on a specific reference value to form a binary-coded image (BI).
  • a foreign substance (FM) region and other regions are clearly distinguished. Therefore, detection of a foreign substance is enhanced. Additionally, when noise removal is performed on the binary-coded image (BI), detection of a foreign substance is further enhanced.
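The subtraction-and-thresholding steps described above (forming a comparison image and binary-coding it by a reference value) can be illustrated with a short sketch. This is not the claimed implementation; the reference value of 30 and the toy image sizes are illustrative assumptions.

```python
def detect_bright_substance(substrate_img, master_img, reference=30):
    """Subtract the master substrate image from the substrate image
    per pixel, then binary-code the comparison image (PI) by a
    reference value to form a binary-coded image (BI)."""
    binary = []
    for s_row, m_row in zip(substrate_img, master_img):
        # Bright substances leave positive residues in the comparison image.
        binary.append([1 if (s - m) > reference else 0
                       for s, m in zip(s_row, m_row)])
    return binary

# A 3x3 master substrate and a substrate with one bright pixel.
master = [[100] * 3 for _ in range(3)]
substrate = [row[:] for row in master]
substrate[1][2] = 200  # a bright foreign substance (FM), e.g. a chip
bi = detect_bright_substance(substrate, master)
```

In the binary-coded image `bi`, only the pixel at the foreign substance is set, so the substance region is clearly distinguished from the rest of the substrate.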
  • a dark foreign substance such as a hair, an insulation tape, etc. is detected by using the master images per color and the obtained images per color, after detecting the bright foreign substance (S240).
  • FIG. 10 is a flow chart showing a process of detecting a dark foreign substance.
  • a chroma (or saturation) map regarding the substrate 10 is formed by using the obtained images per color (S242).
  • a chroma map regarding the master substrate is formed by using the master images per color (S244).
  • the chroma map may be formed by using chroma information of a red image, a green image and a blue image per pixel.
  • 'R' is the chroma information of each pixel in the red image
  • 'G' is the chroma information of each pixel in the green image
  • 'B' is the chroma information of each pixel in the blue image.
  • the chroma map 300 obtained by using Equation 1 has a value between 0 and 1 per pixel; the closer the value of a pixel is to 1, the closer that pixel is to a primary color.
  • a dark foreign substance is nearly achromatic, so that a dark foreign substance region is expressed as a region with a value close to 0.
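Equation 1 is not reproduced in this excerpt. A common per-pixel saturation measure consistent with the description (values in 0 to 1, achromatic pixels near 0, primary colors near 1) is (max − min) / max over the R, G and B values; the sketch below assumes that formula.

```python
def chroma(r, g, b):
    """Per-pixel chroma in [0, 1]: 0 for achromatic (gray) pixels,
    approaching 1 for primary colors. The (max - min) / max saturation
    formula is an assumption, since Equation 1 is not shown here."""
    hi, lo = max(r, g, b), min(r, g, b)
    return 0.0 if hi == 0 else (hi - lo) / hi

# A gray (achromatic) pixel, e.g. a dark foreign substance such as hair:
dark = chroma(80, 80, 80)     # -> 0.0
# A nearly pure red pixel of the substrate pattern:
bright = chroma(200, 20, 10)  # -> 0.95
```

Applying `chroma` to every pixel of the per-color images yields the chroma map, in which dark foreign substance regions stand out as values close to 0.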
  • a dark foreign substance is detected by comparing the chroma map regarding the substrate with the chroma map regarding the master substrate (S246).
  • FIG. 11 is a conceptual view showing a process of detecting a dark foreign substance by comparing a chroma map of a substrate and a chroma map of a master substrate.
  • the chroma map (TSM) regarding the substrate is subtracted from the chroma map (MSM) regarding the master substrate to form a comparison image (PM).
  • the comparison image (PM) may be binary-coded based on a specific reference value to form a binary-coded image (BM).
  • a foreign substance (FM) region and other regions are clearly distinguished. Therefore, detection of a foreign substance (FM) is enhanced.
  • the region with the foreign substance (FM) is set as a region of interest (ROI), and color analysis may be performed on the region of interest (ROI) for more precise detection. For example, after the chroma maps of the whole substrate are compared and analyzed, a region with a foreign substance (FM) is set as the region of interest (ROI). Then, the other regions are mask-processed to be excluded, and the region of interest (ROI) is selectively amplified to generate a region of interest (ROI) image by merging color images (for example, a red image, a green image and a blue image) of the region of interest (ROI). Then, binary-coding and noise removal are performed on the region of interest, so that a foreign substance can be detected more precisely.
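The mask-processing of regions outside the region of interest can be sketched as zeroing out all pixels beyond a suspected rectangle before further analysis. This is a minimal illustration with a hypothetical rectangular ROI, not the claimed processing.

```python
def apply_roi_mask(image, roi):
    """Mask-process regions outside the region of interest (ROI):
    pixels outside the ROI rectangle are zeroed so that only the
    suspected foreign substance region is analysed further."""
    (top, left), (bottom, right) = roi
    masked = []
    for y, row in enumerate(image):
        masked.append([v if top <= y < bottom and left <= x < right else 0
                       for x, v in enumerate(row)])
    return masked

image = [[5, 5, 5],
         [5, 9, 5],   # the center pixel is a suspected substance
         [5, 5, 5]]
roi = ((1, 1), (2, 2))  # rectangle enclosing the suspected region
masked = apply_roi_mask(image, roi)
```

Binary-coding and noise removal would then operate on `masked`, where everything outside the ROI has been excluded.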
  • a detection result of the bright foreign substance and a detection result of the dark foreign substance are merged (S250). That is, the detection results of the bright foreign substance and the dark foreign substance are merged to finally detect foreign substances on the substrate.
  • detection of both a dark foreign substance and a bright foreign substance may be performed, and the sequence of detecting a dark foreign substance and detecting a bright foreign substance is not limited.
  • a foreign substance corresponds exclusively to either a bright foreign substance or a dark foreign substance. Therefore, when a substance is considered either a bright foreign substance or a dark foreign substance, the substance is considered a foreign substance.
  • FIG. 12 is a flow chart showing a method of inspecting a substrate for detecting a foreign substance.
  • 3-dimensional information based on the height of a substrate is obtained by using at least one grid pattern light (S310). That is, 3-dimensional information including height information of the substrate is obtained by using at least one of the projecting parts 300 and 400.
  • in the 3-dimensional information based on the height of the substrate, a region of abrupt height change or a region exceeding a previously set reference height may be considered a foreign substance, thereby detecting a foreign substance.
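The height-based criterion above, flagging regions whose measured height exceeds a previously set reference height, can be sketched as follows. The toy height map and reference value are illustrative assumptions, not the claimed processing part.

```python
def detect_by_height(height_map, reference_height):
    """Mark pixels whose measured height exceeds a previously set
    reference height as foreign substance candidates."""
    return [[1 if h > reference_height else 0 for h in row]
            for row in height_map]

heights = [[0.1, 0.1, 0.1],
           [0.1, 2.5, 0.1],  # a raised spot, e.g. a chip on the board
           [0.1, 0.1, 0.1]]
candidates = detect_by_height(heights, reference_height=1.0)
```

Only the raised spot is flagged in `candidates`; a detector for abrupt height changes would compare each pixel against its neighbors instead of a fixed reference.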
  • 2-dimensional information per color is obtained by using a plurality of color lights (S330).
  • a foreign substance of the substrate is detected by using the 2-dimensional information per color (S340).
  • the method of detecting a foreign substance by using the 2-dimensional information is substantially the same as the embodiment explained referring to FIG. 7 through FIG. 11. Therefore, further explanation will be omitted.
  • a detection result obtained by using the 3-dimensional information based on height and a detection result obtained by using the 2-dimensional information per color are merged (S350). That is, the detection result obtained by using the 3-dimensional information based on height and the detection result obtained by using the 2-dimensional information per color are merged to finally detect a foreign substance.
  • a substance may be considered as a foreign substance when the substance is considered as a foreign substance in both of the detection by using the 3-dimensional information based on height and the detection by using the 2-dimensional information per color.
  • a substance may be considered as a foreign substance when the substance is considered as a foreign substance in one of the detection by using the 3-dimensional information based on height and the detection by using the 2-dimensional information per color.
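The two merging policies described above, requiring both inspections to agree or accepting either one, amount to a per-pixel AND or OR of the detection masks. A minimal sketch with hypothetical binary masks:

```python
def merge_detections(mask_3d, mask_2d, mode="and"):
    """Combine the height-based (3-D) and per-color (2-D) detection
    results. 'and' flags a substance only when both inspections agree;
    'or' flags it when either inspection does."""
    combine = (lambda a, b: a & b) if mode == "and" else (lambda a, b: a | b)
    return [[combine(a, b) for a, b in zip(r3, r2)]
            for r3, r2 in zip(mask_3d, mask_2d)]

m3 = [[1, 0], [1, 0]]  # detections from 3-dimensional (height) inspection
m2 = [[1, 0], [0, 1]]  # detections from 2-dimensional (color) inspection
strict = merge_detections(m3, m2, mode="and")  # both must agree
loose = merge_detections(m3, m2, mode="or")    # either suffices
```

The strict policy reduces false positives, while the loose policy reduces the chance of missing a real substance; the description permits either.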
  • a region which may be wrongly considered as a foreign substance or a region necessary to be removed may be removed before detecting a foreign substance by using the 3-dimensional information based on height and detecting a foreign substance by using the 2-dimensional information per color.
  • FIG. 13 is a flow chart showing another method of inspecting a substrate for detecting a foreign substance.
  • mask information of the substrate is first obtained (S410).
  • the mask information may include information about a hole formed at the substrate.
  • a hole formed at the substrate may differ in size from a hole formed at the master substrate, so that the hole may be wrongly considered a foreign substance when it is included in the inspection target that is to be detected. Therefore, in order to eliminate the possibility of wrong inspection, a hole may be removed from the inspection target that is to be detected.
  • the mask information, such as hole information, may be obtained by using infrared (IR) lighting.
  • the mask information may include information about a circuit pattern formed on the substrate.
  • the circuit pattern is generally formed through an etching process, and a circuit pattern formed on a substrate may differ in position from a circuit pattern formed on the master substrate, so that the circuit pattern may be wrongly considered a foreign substance when it is included in the inspection target that is to be detected. Therefore, in order to eliminate the possibility of wrong inspection, a circuit pattern may be removed from the inspection target that is to be detected.
  • the mask information regarding a substrate may be obtained from the substrate or the master substrate.
  • a mask is generated based on the mask information to remove material that may be wrongly inspected from the inspection target (S420).
  • a hole mask covering a hole of the substrate and an edge mask covering a circuit pattern of the substrate may be formed.
  • the hole mask and the edge mask can exclude a hole and a circuit pattern from the inspection target to be detected.
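A minimal sketch of how such masks might exclude holes and circuit patterns from a detection result; the mask layout and values are illustrative assumptions, not the claimed implementation.

```python
def exclude_masked_regions(detection, hole_mask, edge_mask):
    """Remove hole and circuit-pattern regions from the detection
    result so they are not wrongly reported as foreign substances."""
    return [[d if not (h or e) else 0
             for d, h, e in zip(dr, hr, er)]
            for dr, hr, er in zip(detection, hole_mask, edge_mask)]

detection = [[1, 1], [0, 1]]  # raw foreign substance candidates
hole_mask = [[1, 0], [0, 0]]  # a via hole that differs from the master
edge_mask = [[0, 0], [0, 1]]  # an etched circuit-pattern edge
result = exclude_masked_regions(detection, hole_mask, edge_mask)
```

Only the candidate that lies on neither the hole mask nor the edge mask survives, eliminating the two likely false detections.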
  • a foreign substance may be inspected based on the 3-dimensional information based on height and the 2-dimensional information per color (S430).
  • easier or more precise detection can be performed by excluding, by using infrared lighting, a region that is unnecessary or may induce error in inspecting a substrate for detecting a foreign substance.


Description

    [Technical Field]
  • The present invention relates to an apparatus for inspecting a substrate for detecting a foreign substance.
  • [Background Art]
  • In general, an electronic apparatus includes at least one printed circuit board (PCB), and a circuit pattern, a connection pad and various circuit devices such as a driving chip electrically connected to the connection pad, etc. are mounted on the PCB. In order to inspect whether the various circuit devices are properly mounted or disposed on the PCB, a shape measuring apparatus may be used.
  • A conventional shape measuring apparatus displays an image of a substrate on a monitor screen so that an operator can perform a serial inspection. In this case, a pad region, where solder is applied, is displayed by using gerber data.
  • However, the displayed image of a substrate shows only certain portions, such as a pad region, but not a stencil, a hole, etc., unlike a real image. Therefore, in performing the serial inspection, an operator may have difficulty inspecting a desired region, which is induced by the difference between the real image and the displayed image.
  • For example, identifying the real position of a problem on a substrate through a displayed image of the substrate may be difficult and may require too much time.
  • Therefore, a method of displaying, through which a real position corresponding to a specific region of a displayed substrate image can be easily identified, is required.
  • On the other hand, it is very difficult to detect a foreign substance on a substrate, since the substrate image displayed in order to perform a serial inspection is not a real image of the substrate.
  • Such a foreign substance may cause a malfunction of the substrate. Therefore, an apparatus for inspecting a substrate for detecting a foreign substance is required.
  • US 2011002527 A1 discloses an inspection method including photographing a measurement target to acquire image data for each pixel of the measurement target, acquiring height data for each pixel of the measurement target, acquiring visibility data for each pixel of the measurement target, multiplying the acquired image data by at least one of the height data and the visibility data for each pixel to produce a result value, and setting a terminal area by using the produced result value. US 2010295941 A1 discloses a shape measurement apparatus including a work stage supporting a target substrate, a pattern-projecting section including a light source, a grating part partially transmitting and blocking light generated by the light source to generate a grating image and a projecting lens part making the grating image on a measurement target of the target substrate, an image-capturing section capturing the grating image reflected by the measurement target of the target substrate, and a control section controlling the work stage. US 2002088952 A1 discloses an optical inspection module and method for detecting particles on a surface of a substrate.
  • [Disclosure] [Technical Problem]
  • Therefore, the technical problem of the present invention is to provide an apparatus for inspecting a substrate for detecting a foreign substance, through which a foreign substance can be easily and exactly detected.
  • [Technical Solution]
  • The present invention is directed to an apparatus for inspecting a substrate for detecting a foreign substance as defined in appended independent claim 1. Particular embodiments are defined in appended dependent claims 2 and 3.
  • In particular, the present invention is directed to an apparatus for inspecting a substrate for a foreign substance, comprising: a projecting part configured to provide at least one grid pattern light onto the substrate; a lighting part configured to provide a plurality of color lights onto the substrate; an image capturing part configured to capture at least one image of the substrate and a foreign substance on the substrate by receiving the grid pattern light and the color lights reflected by the foreign substance and the substrate; the apparatus further comprising: an infrared lighting to provide infrared light onto the substrate and a processing part connected to the image capturing part for receiving and processing the at least one image captured by the image capturing part, the processing part being configured to obtain 3-dimensional image information including height of the substrate by using the grid pattern light, detect a foreign substance of the substrate by using the 3-dimensional image information including height, obtain 2-dimensional image information of the substrate per color by using the color lights, detect a foreign substance of the substrate by using the 2-dimensional image information per color, combine a detection result of a foreign substance performed by using the 3-dimensional image information including height and a detection result of a foreign substance performed by using the 2-dimensional image information per color, wherein the processing part is configured to: obtain, before detecting a foreign substance of the substrate, mask information about the substrate by using the infrared lighting; and exclude, from the at least one image, a region formed on the substrate and which may be wrongly considered as a foreign substance or a region formed on the substrate and which is necessary to be removed from the substrate to be detected by using a mask that is generated based on the mask information.
  • Preferably, the mask information comprises information about a hole formed at the substrate such that the hole is removed from the substrate to be detected. Preferably, the processing part is configured to determine a foreign substance as a final foreign substance when the foreign substance is considered as a foreign substance in both of inspection by using the 3-dimensional image information including height and inspection by using the 2-dimensional image information per color.
  • The terminology used in the following part of the description may be inconsistent. For example, the term "target object" may be used instead of the term "foreign substance", or the term "inspection target" may be used instead of the term "substrate". However, irrespective of what is stated in the following part of the description and which terms are used, the scope of the present invention is solely defined and limited by the appended claims.
  • [Advantageous Effects]
  • According to the present invention described above, the image of the substrate, which is displayed for performing inspection, is renewed with the captured inspection region image, so that an operator can easily identify the portion of the real substrate which corresponds to a portion in the displayed image.
  • Additionally, a foreign substance can be easily detected by comparing the obtained image of the inspection region with the reference image of the substrate.
  • Further, when a bright foreign substance and a dark foreign substance are separately detected and the results are merged in inspecting a substrate for a foreign substance, the reliability of detecting a foreign substance can be improved.
  • Further, when the detection result obtained by using the 3-dimensional image information including height and the detection result obtained by using the 2-dimensional image information per color are combined, the reliability of detecting a foreign substance of a substrate can be improved.
  • Further, easier or more precise detection can be performed by excluding, from at least one captured image by using infrared lighting, a region formed on a substrate that is unnecessary or may induce error in inspecting the substrate for detecting a foreign substance.
  • [Description of Drawings]
    • FIG. 1 is a conceptual view showing a 3-dimensional shape measuring apparatus according to an embodiment of the present invention.
    • FIG. 2 is a flow chart showing a method of inspecting a substrate and a method of inspecting a substrate for detecting a foreign substance on a substrate according to an embodiment of the present disclosure.
    • FIG. 3 is a plan view showing an example of image information of a substrate displayed through a step of displaying image information of a substrate in FIG. 2.
    • FIG. 4 is a plan view showing a real substrate corresponding to the image of the substrate in FIG. 3.
    • FIG. 5 is a conceptual view showing an example of a step of obtaining an image of an inspection region on a substrate in FIG. 2.
    • FIG. 6 is a plan view for explaining a process of displaying renewed image information in FIG. 2.
    • FIG. 7 is a flow chart showing a method of inspecting a substrate for detecting a foreign substance, which does not fall under the scope of the appended claims and is considered merely as an example suitable for understanding the invention.
    • FIG. 8 is a flow chart showing a process of detecting a bright foreign substance.
    • FIG. 9 is a conceptual view showing a process of detecting a bright foreign substance by comparing a substrate image and a master substrate image.
    • FIG. 10 is a flow chart showing a process of detecting a dark foreign substance.
    • FIG. 11 is a conceptual view showing a process of detecting a dark foreign substance by comparing a chroma map of a substrate and a chroma map of a master substrate.
    • FIG. 12 is a flow chart showing a method of inspecting a substrate for detecting a foreign substance, which does not fall under the scope of the appended claims and is considered merely as an example suitable for understanding the invention.
    • FIG. 13 is a flow chart showing a method of inspecting a substrate for detecting a foreign substance according to an embodiment of the present invention.
    [Mode for Invention]
  • Preferred embodiments of the present invention are described more fully hereinafter with reference to the accompanying drawings, in which example embodiments of the present invention are shown. The present invention may, however, within the scope defined by the appended claims, be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein. Rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention, defined by the appended claims, to those skilled in the art. In the drawings, the sizes and relative sizes of layers and regions may be exaggerated for clarity.
  • It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, or section discussed below could be termed a second element, component, or section without departing from the teachings of the present invention.
  • The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present invention. As mentioned above, the terminology used in the description may be inconsistent. For example, the term "target object" may be used instead of the term "foreign substance", or the term "inspection target" may be used instead of the term "substrate". However, irrespective of what is stated in the following part of the description and which terms are used, the scope of the present invention is solely defined and limited by the appended claims. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • For convenience, the same reference numerals are used for identical or similar elements.
  • Hereinafter, with reference to the drawings, preferred embodiments of the present invention will be described in detail.
  • FIG. 1 is a conceptual view showing a 3-dimensional shape measuring apparatus according to an embodiment of the present invention.
  • Referring to FIG. 1, a 3-dimensional shape measuring apparatus, which can be applied to a method of inspecting 3-dimensional image according to an embodiment of the present invention, may include a measurement stage part 100, an image-capturing part 200, a first projecting part 300, a second projecting part 400, a lighting part 450, an image obtaining part 500, a module controlling part 600, and a central processing part 700.
  • The measurement stage part 100 may include a stage 110 which supports a target object 10 and a stage transfer unit 120 which transfers the stage 110. In the embodiment, the measurement position of the target object 10 may be changed as the target object 10 is moved by the stage 110 with respect to the image-capturing part 200 and the first and second projecting parts 300 and 400.
  • The image-capturing part 200 is arranged above the stage 110 to capture an image of the target object 10 by receiving light reflected by the target object 10. In other words, the image-capturing part 200 receives the light which is emitted from the first and second projecting parts 300 and 400 and reflected by the target object 10, and captures the image of the target object 10.
  • The image-capturing part 200 may include a camera 210, an image forming lens 220, a filter 230, and a circular lamp 240. The camera 210 receives the light reflected by the target object 10 and captures a plane image of the target object 10. In one embodiment, the camera 210 may be a CCD camera or a CMOS camera. The image forming lens 220 is arranged at the bottom of the camera 210, receives the light reflected by the target object 10, and forms an image on the camera 210. The filter 230 is arranged at the bottom of the image forming lens 220 to filter the reflected light and provide it to the image forming lens 220. In one embodiment, the filter 230 may be one of a frequency filter, a color filter, and a light intensity adjustment filter. The circular lamp 240 is arranged at the bottom of the filter 230 and provides light for capturing a particular image, such as a 2-dimensional image of the target object 10.
  • The first projecting part 300 may be slantly disposed with respect to the stage 110 supporting the target object 10, for example, at a right side of the image-capturing part 200. The first projecting part 300 may include a first lighting unit 310, a first grid unit 320, a first grid transfer unit 330 and a first condensing lens 340. The first lighting unit 310 includes a lighting source and at least one lens to generate a light, and the first grid unit 320 is arranged at the bottom of the first lighting unit 310 to convert the light emitted from the first lighting unit 310 into a first grid pattern light having a grid pattern. The first grid transfer unit 330 is connected to the first grid unit 320 to transfer the first grid unit 320. In one embodiment, a piezoelectric transfer unit or a fine linear transfer unit may be used as the first grid transfer unit 330. The first condensing lens 340 is arranged at the bottom of the first grid unit 320 to condense the first grid pattern light which has passed the first grid unit 320 onto the target object 10.
  • The second projecting part 400 may be slantly disposed with respect to the stage 110 supporting the target object 10, for example, at a left side of the image-capturing part 200. The second projecting part 400 may include a second lighting unit 410, a second grid unit 420, a second grid transfer unit 430 and a second condensing lens 440. The elements of the second projecting part 400 are substantially the same as those of the first projecting part 300, and the same explanation is omitted.
  • When the first projecting part 300 projects N-number of first grid pattern lights onto the target object 10 while the first grid transfer unit 330 moves the first grid unit 320 step by step, the image-capturing part 200 may capture N-number of first pattern images in sequence by receiving the N-number of first grid pattern lights reflected by the target object 10. Further, when the second projecting part 400 projects N-number of second grid pattern lights onto the target object 10 while the second grid transfer unit 430 moves the second grid unit 420 step by step, the image-capturing part 200 may capture N-number of second pattern images in sequence by receiving the N-number of second grid pattern lights reflected by the target object 10. The 'N' described above is an integer. For example, the 'N' may be three or four.
  • In the present embodiment, only the first and second projecting parts 300 and 400 emit the first and second grid pattern lights respectively, but the number of the projecting part may be equal to or greater than three. That is, the grid pattern light may be emitted toward the target object 10 in various directions to get various pattern images. For example, when three projecting parts are arranged to form a triangle, three grid pattern lights may be emitted onto the target object 10 in different directions, and when four projecting parts are arranged to form a square, four grid pattern lights may be emitted onto the target object 10 in different directions. Further, the number of the projecting parts may be eight. In this case, images can be captured by emitting grid pattern light in eight directions.
  • The lighting part 450 irradiates light toward the target object 10 to capture a 2-dimensional image of the target object 10. In one embodiment, the lighting part 450 may include a red lighting 452, a green lighting 454, and a blue lighting 456. For example, the red lighting 452, the green lighting 454 and the blue lighting 456 may be arranged above the target object 10 in a circular arrangement to irradiate red, green, and blue lights respectively toward the target object 10, or may be arranged at different heights as shown in FIG. 1.
  • The image obtaining part 500 is electrically connected to the camera 210 of the image-capturing part 200 to obtain and store pattern images generated by using the first and second projecting parts 300 and 400 from the camera 210. Also, the image obtaining part 500 obtains and stores 2-dimensional images generated by using the lighting part 450 from the camera 210. For example, the image obtaining part 500 includes an imaging system which receives and stores N-number of first pattern images and N-number of second pattern images captured by the camera 210.
  • The module controlling part 600 is electrically connected to the measurement stage part 100, the image-capturing part 200, the first projecting part 300 and the second projecting part 400 to control them. For example, the module controlling part 600 includes a lighting controller, a grid controller, and a stage controller. The lighting controller controls the first and second lighting units 310 and 410 to generate light, and the grid controller controls the first and second grid transfer units 330 and 430 to move the first and second grid units 320 and 420. The stage controller controls the stage transfer unit 120 to move the stage 110 up, down, left, and right.
  • The central processing part 700 is electrically connected to the image obtaining part 500 and the module controlling part 600 to control them. In more detail, the central processing part 700 may measure the 3-dimensional shape of the target object 10 by receiving and processing the N-number of first pattern images and the N-number of second pattern images from the imaging system of the image obtaining part 500. Also, the central processing part 700 may control each of the lighting controller, the grid controller, and the stage controller. To that end, the central processing part 700 may include an image processing board, a controlling board, and an interface board.
  • Hereinafter, a method of inspecting a substrate, in which the target object 10 is a substrate, and a method of inspecting a substrate for detecting a foreign substance by using the 3-dimensional shape measuring apparatus described above will be explained referring to the figures.
  • FIG. 2 is a flow chart showing a method of inspecting a substrate and a method of inspecting a substrate for detecting a foreign substance on a substrate,
  • FIG. 3 is a plan view showing an example of image information of a substrate displayed through a step of displaying image information of a substrate in FIG. 2, and FIG. 4 is a plan view showing a real substrate corresponding to the image of the substrate in FIG. 3.
  • Referring to FIG. 2 through FIG. 4, in order to inspect a substrate 10, image information of the substrate 10 before applying solder is displayed (S110).
  • For example, the image information of the substrate 10 may include gerber information of the substrate 10. The gerber information of the substrate 10 may be information about the design standard of the substrate 10 before applying solder, and may be displayed on a monitor of an operator as a gerber image GI shown in FIG. 3.
  • As shown in FIG. 3, the gerber image GI may include pads GI-P with various shapes, on which solder is applied, and the gerber information may be black-and-white image information.
  • Then, at least one image of at least one inspection region on the substrate is captured to obtain an image of the captured inspection region (S120).
  • The inspection region is a target region to be measured or inspected on the substrate 10, and can be set automatically or by an operator. The substrate 10 may be divided into several regions with a definite area, each set as an inspection region, or the whole region of the substrate 10 may be set as the inspection region. For example, the region with a definite area may be defined by the field of view (FOV) of the camera 210 of the image-capturing part 200.
  • FIG. 5 is a conceptual view showing an example of a step of obtaining an image of an inspection region on a substrate in FIG. 2.
  • Referring to FIG. 5, the inspection region FOV of the substrate 10 may be defined by the field of view of the camera 210, and may be captured along an arrow direction.
  • In FIG. 5, all regions of the substrate 10 are image-captured. However, some required regions may be selectively captured.
  • Image-capturing of the inspection region FOV may be performed, for example, by at least one of the projection parts 300 and 400 and the lighting part 450.
  • That is, the projecting parts 300 and 400 may emit grid pattern light onto the inspection region FOV, so that the inspection region FOV may be image-captured. In this case, the image of the inspection region FOV includes a 3-dimensional image based on height information.
  • On the other hand, the lighting part 450 may emit at least one color light onto the inspection region FOV, so that the inspection region FOV may be image-captured. In this case, the image of the inspection region FOV includes a 2-dimensional plane image.
  • Then, displayed image information is renewed by using the obtained inspection region FOV image, and the renewed image information is displayed (S130).
  • In this case, the renewed image information may be color image information.
  • The camera 210 providing the renewed image information may be a color camera. Alternatively, the camera 210 obtaining the image of the inspection region FOV may be a black-and-white camera. When the image of the inspection region FOV is obtained by using the lighting part 450, the renewed image can be converted into a color image even though the camera 210 is a black-and-white camera, since the images captured under the different color lights emitted by the lighting part 450 differ from one another.
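The conversion described above, building a color image from black-and-white captures taken under red, green and blue lights, can be sketched as channel stacking. This is a minimal illustration assuming NumPy arrays for the per-color captures; the array values and the function name are hypothetical, not the apparatus's actual implementation.

```python
import numpy as np

def merge_channels_to_color(red_img, green_img, blue_img):
    # Stack three monochrome captures (one per color light) into an RGB image.
    return np.stack([red_img, green_img, blue_img], axis=-1)

# Hypothetical 2x2 captures under red, green and blue lighting.
r = np.array([[200, 10], [10, 10]], dtype=np.uint8)
g = np.array([[10, 200], [10, 10]], dtype=np.uint8)
b = np.array([[10, 10], [200, 10]], dtype=np.uint8)

color = merge_channels_to_color(r, g, b)
print(color.shape)           # (2, 2, 3)
print(color[0, 0].tolist())  # [200, 10, 10] -- a predominantly red pixel
```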
  • FIG. 6 is a plan view for explaining a process of displaying renewed image information in FIG. 2.
  • Referring to FIG. 6, the renewed image information corresponds to an internal area of bold line, and image information that is not renewed corresponds to an outer area of the bold line.
  • The image-captured inspection region shown in FIG. 5 may be renewed in real time as shown in FIG. 6.
  • As described above, in order to perform inspection, the displayed image of the substrate 10 is renewed with the image of the image-captured inspection region FOV, so that an operator can easily match the displayed image of the substrate 10 with the real substrate 10. Further, when the Gerber information is black-and-white and the renewed image information is color, the operator can more easily locate a position on the real substrate 10, since the operator uses the renewed color image of the displayed substrate 10.
  • On the other hand, this step can be omitted in the method of inspecting a substrate for detecting a foreign substance.
  • Then, the existence of a foreign substance is inspected by comparing the obtained image of the inspection region FOV with a reference image of the substrate 10 (S140).
  • For example, the reference image of the substrate 10 can be obtained from a reference substrate that is previously selected. For example, the reference substrate may be a master substrate or a master board. The reference substrate is previously selected, and the inspection region FOV explained above is image-captured from the reference substrate in the same way as previously explained, to obtain the reference image.
  • That is, the reference image of the substrate 10 may include an image captured by emitting at least one color light onto the inspection region FOV of the reference substrate by using the lighting part 450.
  • In this case, the reference image is a 2-dimensional plane image, and the reference image is compared with the captured image of the inspection region FOV, so that a differing portion may be considered a foreign substance.
  • For example, when there exist first and second foreign substances FS1 and FS2 on the substrate 10 as shown in FIG. 4, the reference image does not contain the foreign substances FS1 and FS2 but the captured image CI contains the foreign substances FS1 and FS2 as shown in FIG. 6. Therefore, the different portion CI-FS1 obtained by comparing the reference image and the captured image CI of the inspection region FOV may be considered as a foreign substance.
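The 2-dimensional comparison in step S140 amounts to thresholding the per-pixel difference between the captured image and the reference image. The sketch below assumes grayscale NumPy arrays and an illustrative threshold value; the patent does not specify concrete numbers.

```python
import numpy as np

def detect_foreign_2d(inspection_img, reference_img, threshold=40):
    # A pixel whose intensity differs from the reference by more than
    # `threshold` (an assumed tuning value) is a candidate foreign substance.
    diff = np.abs(inspection_img.astype(np.int16) - reference_img.astype(np.int16))
    return diff > threshold

reference = np.full((4, 4), 100, dtype=np.uint8)  # clean reference capture
captured = reference.copy()
captured[1, 2] = 220                              # a bright foreign substance
mask = detect_foreign_2d(captured, reference)
print(int(mask.sum()))   # 1 -- exactly one differing pixel
print(bool(mask[1, 2]))  # True
```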
  • Further, the reference image of the substrate 10 may further include an image captured through emitting at least one grid pattern light onto the inspection region FOV of the reference substrate by using the projecting parts 300 and 400.
  • In this case, the reference image is a 3-dimensional image based on height, and the reference image is compared with the captured image of the inspection region FOV, so that a differing portion may be considered a foreign substance.
  • For example, when there exist first and second foreign substances FS1 and FS2 on the substrate 10 as shown in FIG. 4, the reference image does not contain the foreign substances FS1 and FS2 but the heights of the foreign substances FS1 and FS2 in the captured image are different from those in the reference image. Therefore, the different portion obtained by comparing the reference image and the captured image CI of the inspection region FOV may be considered as a foreign substance.
  • Alternatively, in this step, a foreign substance can be detected by using only the obtained 3-dimensional image of the inspection region FOV based on height, instead of comparing the obtained 3-dimensional image of the inspection region FOV with the reference image of the substrate 10.
  • That is, when the image of the inspection region FOV obtained by using the projecting parts 300 and 400 is a 3-dimensional image based on height, an abrupt height change or a portion exceeding a previously set value may be considered a foreign substance.
  • For example, when the first and second foreign substances FS1 and FS2 exist on the substrate 10 as shown in FIG. 4, the heights of the first and second foreign substances FS1 and FS2 may abruptly increase or exceed a previously set value in the obtained 3-dimensional image based on height. Therefore, the abrupt height change or the portion exceeding a previously set value may be considered a foreign substance without comparing the reference image with the obtained image.
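Height-based detection without a reference image can be sketched as two per-pixel tests on the height map: an absolute limit ("a portion exceeding a previously set value") and a local-change limit ("an abrupt height change"). The limit values and the gradient-based change test below are illustrative assumptions.

```python
import numpy as np

def detect_foreign_by_height(height_map, max_height=1.0, max_step=0.5):
    # A pixel is a candidate foreign substance if its measured height exceeds
    # `max_height`, or the local height change exceeds `max_step`
    # (both limits are assumed example values, not from the patent).
    too_high = height_map > max_height
    gy, gx = np.gradient(height_map.astype(float))
    abrupt = np.hypot(gy, gx) > max_step
    return too_high | abrupt

heights = np.zeros((5, 5))
heights[2, 2] = 2.0            # e.g. a chip resting on the board surface
mask = detect_foreign_by_height(heights)
print(bool(mask[2, 2]))  # True  -- exceeds the height limit
print(bool(mask[0, 0]))  # False -- flat background
```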
  • On the other hand, the reference image can be obtained before the step S110 in which the image information of the substrate 10 is displayed.
  • According to the present disclosure, the image of the substrate, which is displayed for performing inspection, is renewed with the captured inspection region image, so that an operator can easily identify the portion of the real substrate that corresponds to a portion of the displayed image.
  • Additionally, a foreign substance can be easily detected by comparing the obtained image of the inspection region with the reference image of the substrate.
  • FIG. 7 is a flow chart showing another method of inspecting a substrate for detecting a foreign substance.
  • Referring to FIG. 7, in order to inspect a foreign substance on a substrate, master images of a master substrate are obtained per color by using a plurality of color lights (S210).
  • For example, a red master image of the master substrate is obtained by using a red light emitted by the red lighting 452, a green master image of the master substrate is obtained by using a green light emitted by the green lighting 454, and a blue master image of the master substrate is obtained by using a blue light emitted by the blue lighting 456.
  • Then, inspection images of the substrate 10 are obtained per color by using the plurality of color lights (S220).
  • For example, a red inspection image of the substrate 10 is obtained by using a red light emitted by the red lighting 452, a green inspection image of the substrate 10 is obtained by using a green light emitted by the green lighting 454, and a blue inspection image of the substrate 10 is obtained by using a blue light emitted by the blue lighting 456.
  • Then, a foreign substance is detected by comparing the master images per color and the obtained images per color.
  • Foreign substances may have various color characteristics. For example, foreign substances may be classified into bright foreign substances and dark foreign substances.
  • A bright foreign substance and a dark foreign substance are characterized by their relative brightness with respect to the substrate 10. The bright foreign substance and the dark foreign substance may be detected by different methods, so that more precise detection may be performed.
  • Hereinafter, a process for detecting a foreign substance according to its brightness, by comparing the master images per color with the obtained images per color, will be described.
  • First, a bright foreign substance such as dust, chip, etc. is detected by comparing the master images per color and the obtained images per color (S230).
  • FIG. 8 is a flow chart showing a process of detecting a bright foreign substance.
  • Referring to FIG. 8, the obtained images per color are merged to form a substrate image of the substrate 10 (S232). For example, the intensities of the red inspection image, the green inspection image and the blue inspection image are merged per pixel to form the image of the substrate.
  • Aside from forming the image of the substrate, the master images per color are merged to form an image of the master substrate (S234). For example, the intensities of the red master image, the green master image and the blue master image are merged per pixel to form the image of the master substrate.
  • After forming the image of the substrate and the image of the master substrate, a bright foreign substance is detected by comparing the two images (S236).
  • FIG. 9 is a conceptual view showing a process of detecting a bright foreign substance by comparing a substrate image and a master substrate image.
  • Referring to FIG. 9, a master substrate image (MSI) is subtracted from the substrate image (ISI) to form a comparison image (PI). The master substrate image (MSI) is a clean image without a foreign substance. Therefore, when the substrate image (ISI) contains a foreign substance (FM), there exists only the foreign substance (FM) in the comparison image (PI).
  • On the other hand, in order to enhance detection of the foreign substance (FM) in the comparison image (PI), the comparison image (PI) may be binary-coded based on a specific reference value to form a binary-coded image (BI). In the binary-coded image (BI), the foreign substance (FM) region and other regions are clearly distinguished, so detection of a foreign substance is enhanced. Additionally, when noise removal is performed on the binary-coded image (BI), detection of a foreign substance is further enhanced.
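The bright-substance pipeline of FIG. 9 — subtract the master substrate image, binary-code the comparison image, then remove noise — can be sketched as follows. The threshold value and the single-pixel noise filter are illustrative assumptions; the patent only states that a specific reference value and noise removal are used.

```python
import numpy as np

def binarize(diff, thresh=30):
    # Binary-code the comparison image: 1 where the difference exceeds
    # the reference value `thresh` (an assumed tuning constant).
    return (diff > thresh).astype(np.uint8)

def remove_isolated(mask):
    # Simple noise removal: keep a set pixel only if at least one
    # 4-neighbor is also set, which discards single-pixel speckle.
    padded = np.pad(mask, 1)
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:])
    return mask & (neighbors > 0)

def detect_bright(substrate_img, master_img, thresh=30):
    # Subtract the master substrate image from the substrate image; residue
    # brighter than the reference value is a candidate bright substance.
    diff = substrate_img.astype(np.int16) - master_img.astype(np.int16)
    return remove_isolated(binarize(diff, thresh))

master = np.full((6, 6), 80, dtype=np.uint8)
board = master.copy()
board[2:4, 2:4] = 200   # a 2x2 bright foreign substance (e.g. a chip)
board[0, 5] = 200       # single-pixel camera noise
result = detect_bright(board, master)
print(int(result.sum()))  # 4 -- the 2x2 blob survives, the speckle is removed
```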
  • Referring again to FIG. 7, a dark foreign substance such as a hair, an insulation tape, etc. is detected by using the master images per color and the obtained images per color, after detecting the bright foreign substance (S240).
  • FIG. 10 is a flow chart showing a process of detecting a dark foreign substance.
  • Referring to FIG. 10, a chroma (or saturation) map for the substrate 10 is formed by using the obtained images per color (S242).
  • Additionally, a chroma map for the master substrate is formed by using the master images per color (S244).
  • For example, the chroma map may be formed by using the chroma information of a red image, a green image and a blue image per pixel. In detail, the chroma map may be formed based on the chroma per pixel, which is calculated by the following Equation 1:

    Saturation = 1 - 3 * Min(R, G, B) / (R + G + B)    (Equation 1)
  • In Equation 1, 'R' is the value of each pixel in the red image, 'G' is the value of each pixel in the green image, and 'B' is the value of each pixel in the blue image.
  • The chroma map obtained by using Equation 1 has a value of 0 to 1 per pixel; a value approaching 1 indicates a primary color. In general, a dark foreign substance is close to an achromatic color, so a dark foreign substance region is expressed as a region with a value near 0.
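Equation 1 can be applied per pixel to the three per-color images to build the chroma map. The sketch below assumes floating-point NumPy arrays; the small epsilon guarding division by zero is an added assumption, not part of Equation 1.

```python
import numpy as np

def chroma_map(r, g, b, eps=1e-9):
    # Equation 1: Saturation = 1 - 3 * Min(R, G, B) / (R + G + B).
    # `eps` guards against division by zero on black pixels (an assumption).
    r, g, b = (x.astype(float) for x in (r, g, b))
    total = r + g + b
    return 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / np.maximum(total, eps)

# A saturated red pixel next to a near-achromatic (gray) pixel, e.g. a hair.
r = np.array([[250.0, 90.0]])
g = np.array([[5.0, 90.0]])
b = np.array([[5.0, 90.0]])
s = chroma_map(r, g, b)
print(round(float(s[0, 0]), 3))  # 0.942 -- near 1: a primary color
print(round(float(s[0, 1]), 3))  # 0.0   -- near 0: achromatic, dark-substance-like
```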
  • After forming the chroma map for the substrate and the chroma map for the master substrate through Equation 1, a dark foreign substance is detected by comparing the chroma map of the substrate with the chroma map of the master substrate (S246).
  • FIG. 11 is a conceptual view showing a process of detecting a dark foreign substance by comparing a chroma map of a substrate and a chroma map of a master substrate.
  • Referring to FIG. 11, the chroma map (TSM) of the substrate is subtracted from the chroma map (MSM) of the master substrate to form a comparison image (PM). When a foreign substance (FM) exists in the chroma map (TSM) of the substrate, the foreign substance (FM) can be seen in the comparison image (PM).
  • On the other hand, in order to enhance detection of the foreign substance (FM) in the comparison image (PM), the comparison image (PM) may be binary-coded based on a specific reference value to form a binary-coded image (BM). In the binary-coded image (BM), the foreign substance (FM) region and other regions are clearly distinguished, so detection of a foreign substance (FM) is enhanced.
  • Further, after a foreign substance (FM) region is first identified by using the binary-coded image (BM), the region with the foreign substance (FM) is set as a region of interest (ROI), and color analysis may be performed on the region of interest (ROI) for more precise detection. For example, after the chroma maps of the whole substrate are compared and analyzed, a region with a foreign substance (FM) is set as the region of interest (ROI). Then, other regions are mask-processed to be excluded, and the region of interest (ROI) is selectively amplified to generate a region of interest (ROI) image by merging the color images (for example, a red image, a green image and a blue image) of the region of interest (ROI). Then, binary-coding and noise removal are performed on the region of interest, so that a foreign substance can be more precisely detected.
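The ROI refinement described above — identify a candidate region, mask-process the rest, and re-analyze the merged color images inside the ROI only — might be sketched as follows. The bounding-box ROI, the margin, and the darkness threshold are hypothetical choices for illustration, not the patent's specified procedure.

```python
import numpy as np

def roi_from_mask(candidate_mask, margin=1):
    # Bounding box of the coarse candidate region, grown by a small margin
    # (the margin is an illustrative choice).
    ys, xs = np.nonzero(candidate_mask)
    h, w = candidate_mask.shape
    return (max(ys.min() - margin, 0), min(ys.max() + 1 + margin, h),
            max(xs.min() - margin, 0), min(xs.max() + 1 + margin, w))

def refine_in_roi(color_img, candidate_mask, thresh=120):
    # Mask-process everything outside the ROI, then re-analyze the merged
    # color intensities inside the ROI only (the threshold is assumed).
    t, b, l, r = roi_from_mask(candidate_mask)
    merged = color_img.astype(float).sum(axis=-1) / 3.0  # merge R, G, B
    refined = np.zeros(candidate_mask.shape, dtype=bool)
    refined[t:b, l:r] = merged[t:b, l:r] < thresh        # dark-substance test
    return refined

img = np.full((5, 5, 3), 200, dtype=np.uint8)
img[2, 1:4] = 40                    # a dark streak, e.g. a hair
coarse = np.zeros((5, 5), dtype=bool)
coarse[2, 2] = True                 # single hit from the coarse chroma comparison
fine = refine_in_roi(img, coarse)
print(int(fine.sum()))  # 3 -- the whole streak is recovered inside the ROI
```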
  • Referring again to FIG. 7, a detection result of the bright foreign substance and a detection result of the dark foreign substance are merged (S250). That is, the detection results of the bright foreign substance and the dark foreign substance are merged to finally detect foreign substances on the substrate.
  • As described above, when a bright foreign substance and a dark foreign substance are separately detected and the results are merged in inspecting a substrate for a foreign substance, the reliability of foreign substance detection can be improved.
  • On the other hand, when it can be estimated whether a foreign substance is bright or dark, or when detection of only one of a dark foreign substance and a bright foreign substance is required, only that one detection may be employed.
  • Alternatively, when it cannot be estimated whether a foreign substance is bright or dark, or when detection of both a dark foreign substance and a bright foreign substance is required, both detections may be performed, and the sequence of detecting a dark foreign substance and detecting a bright foreign substance is not limited.
  • On the other hand, a foreign substance exclusively corresponds to a bright foreign substance or a dark foreign substance. Therefore, when a substance is considered a bright foreign substance or a dark foreign substance, the substance is considered as a foreign substance.
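Since a foreign substance is exclusively bright or dark, merging the two detection results (step S250) reduces to a per-pixel union of the two binary maps, for example (the detection maps here are hypothetical):

```python
import numpy as np

# Hypothetical per-pixel detection results from the two separate passes.
bright_result = np.array([True, False, False, False])
dark_result = np.array([False, False, True, False])

# A foreign substance is exclusively bright or dark, so the merged result
# is the union of the two detection maps.
final = bright_result | dark_result
print(final.tolist())  # [True, False, True, False]
```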
  • FIG. 12 is a flow chart showing a method of inspecting a substrate for detecting a foreign substance.
  • Referring to FIG. 12, in order to inspect a foreign substance, 3-dimensional information based on the height of a substrate is obtained by using at least one grid pattern light (S310). That is, 3-dimensional information including height information of the substrate is obtained by using at least one of the projection parts 300 and 400.
  • Then, a foreign substance of the substrate is detected by using the 3-dimensional information based on height (S320).
  • For example, a region of abrupt height change or a region exceeding a previously set reference height may be considered as a foreign substance in the 3-dimensional information based on height of the substrate to detect a foreign substance.
  • Aside from detecting a foreign substance by using the 3-dimensional information, 2-dimensional information per color is obtained by using a plurality of color lights (S330).
  • Then, a foreign substance of the substrate is detected by using the 2-dimensional information per color (S340). The method of detecting a foreign substance by using the 2-dimensional information is substantially the same as the embodiment explained referring to FIG. 7 through FIG. 11. Therefore, further explanation will be omitted.
  • Then, a detection result obtained by using the 3-dimensional information based on height and a detection result obtained by using the 2-dimensional information per color are merged (S350). That is, the two detection results are merged to finally detect a foreign substance.
  • As described above, when the detection result obtained by using the 3-dimensional information based on height and the detection result obtained by using the 2-dimensional information per color are merged, the reliability of foreign substance detection can be improved.
  • On the other hand, for exact detection or minimal error, a substance may be considered a foreign substance only when it is detected as a foreign substance both in the detection using the 3-dimensional information based on height and in the detection using the 2-dimensional information per color.
  • Alternatively, a substance may be considered a foreign substance when it is detected as a foreign substance in either the detection using the 3-dimensional information based on height or the detection using the 2-dimensional information per color.
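The two merging policies just described correspond to a per-pixel AND (both detections must agree, minimizing false alarms) or OR (either detection suffices). A minimal sketch with hypothetical one-dimensional detection maps:

```python
import numpy as np

result_3d = np.array([True, True, False, False])  # from height information
result_2d = np.array([True, False, True, False])  # from per-color 2-D images

# Strict merge: a substance counts only when both detections agree.
strict = result_3d & result_2d
# Sensitive merge: a substance counts when either detection fires.
sensitive = result_3d | result_2d
print(strict.tolist())     # [True, False, False, False]
print(sensitive.tolist())  # [True, True, True, False]
```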
  • On the other hand, a region which may be wrongly considered a foreign substance, or a region that needs to be removed, may be excluded before detecting a foreign substance by using the 3-dimensional information based on height and by using the 2-dimensional information per color.
  • FIG. 13 is a flow chart showing another method of inspecting a substrate for detecting a foreign substance.
  • Referring to FIG. 13, in order to inspect a foreign substance of a substrate, mask information about the substrate is first obtained (S410).
  • For example, the mask information may include information about a hole formed in the substrate. A hole formed in the substrate may differ in size from the corresponding hole in the master substrate, so the hole may be wrongly considered a foreign substance when it is included in the inspection target. Therefore, in order to eliminate the possibility of wrong inspection, the hole may be removed from the inspection target. In this case, mask information such as hole information may be obtained by using infrared (IR) lighting.
  • For example, the mask information may include information about a circuit pattern formed on the substrate. A circuit pattern is generally formed through an etching process, and a circuit pattern formed on the substrate may differ in position from the corresponding circuit pattern on the master substrate, so the circuit pattern may be wrongly considered a foreign substance when it is included in the inspection target. Therefore, in order to eliminate the possibility of wrong inspection, the circuit pattern may be removed from the inspection target.
  • The mask information regarding a substrate may be obtained from the substrate or the master substrate.
  • Then, a mask is generated based on the mask information to remove wrongly inspectable material from the inspection target (S420).
  • For example, a hole mask including a hole of the substrate, and an edge mask including a circuit pattern of the substrate may be formed. The hole mask and the edge mask can exclude a hole and a circuit pattern from an inspection target to be detected.
  • After excluding wrongly inspectable material from the inspection target to be detected, a foreign substance may be inspected based on the 3-dimensional information based on height and the 2-dimensional information per color (S430).
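Excluding the hole mask and the edge mask from the inspection target can be sketched as clearing the masked pixels in the candidate detection map. The mask contents below are hypothetical; in the apparatus, the hole mask may be derived from the infrared-lighting image.

```python
import numpy as np

def apply_masks(candidate_mask, hole_mask, edge_mask):
    # Exclude holes and circuit-pattern edges from the detection result,
    # since both may legitimately differ from the master substrate.
    return candidate_mask & ~hole_mask & ~edge_mask

candidates = np.array([[True, True, True]])
holes = np.array([[True, False, False]])  # e.g. derived from the IR image
edges = np.array([[False, True, False]])  # circuit-pattern (edge) mask
final = apply_masks(candidates, holes, edges)
print(final.tolist())  # [[False, False, True]]
```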
  • As described above, easier or more precise detection can be performed by excluding a region that is unnecessary or may induce error, in inspecting a substrate for a foreign substance by using infrared lighting.

Claims (3)

  1. An apparatus for inspecting a substrate (10) for detecting a foreign substance comprising: a projecting part (300, 400) configured to provide at least one grid pattern light onto the substrate;
    a lighting part (450) configured to provide a plurality of color lights onto the substrate (10);
    an image capturing part (200) configured to capture at least one image of the substrate (10) and a foreign substance on the substrate (10) by receiving the grid pattern light and the color lights reflected by the foreign substance and the substrate (10); the apparatus further comprising: an infrared lighting to provide infrared light onto the substrate (10) and a processing part (700) connected to the image capturing part (200) for receiving and processing the at least one image captured by the image capturing part (200), the processing part (700) being configured to obtain 3-dimensional image information including height of the substrate (10) by using the grid pattern light, detect a foreign substance of the substrate (10) by using the 3-dimensional image information including height, obtain 2-dimensional image information of the substrate (10) per color by using the color lights, detect a foreign substance of the substrate (10) by using the 2-dimensional image information per color, combine a detection result of a foreign substance performed by using the 3-dimensional image information including height and a detection result of a foreign substance performed by using the 2-dimensional image information per color, and
    wherein the processing part (700) is configured to:
    obtain, before detecting a foreign substance of the substrate (10), mask information about the substrate (10) by using the infrared lighting; and
    exclude, from the at least one image, a region formed on the substrate (10) and which may be wrongly considered as a foreign substance or a region formed on the substrate (10) and which is necessary to be removed from the substrate (10) to be detected by using a mask that is generated based on the mask information.
  2. The apparatus of claim 1, wherein the mask information comprises information about a hole formed at the substrate (10) such that the hole is removed from the substrate (10) to be detected.
  3. The apparatus of claim 1, wherein the processing part (700) is configured to determine a foreign substance as a final foreign substance, when the foreign substance is considered as a foreign substance in both of inspection by using the 3-dimensional image information including height and inspection by using the 2-dimensional image information per color.
EP21170906.8A 2013-04-02 2014-04-01 Apparatus for inspecting a substrate for detecting a foreign substance Active EP3889590B1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20130036076 2013-04-02
KR1020140032676A KR101590831B1 (en) 2013-04-02 2014-03-20 Method of inspecting foreign substance on a board
PCT/KR2014/002784 WO2014163375A1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate
EP14778804.6A EP2982969B1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP14778804.6A Division EP2982969B1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate
EP14778804.6A Division-Into EP2982969B1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate

Publications (4)

Publication Number Publication Date
EP3889590A2 EP3889590A2 (en) 2021-10-06
EP3889590A3 EP3889590A3 (en) 2021-11-10
EP3889590C0 EP3889590C0 (en) 2024-03-20
EP3889590B1 true EP3889590B1 (en) 2024-03-20

Family

ID=51992596

Family Applications (2)

Application Number Title Priority Date Filing Date
EP14778804.6A Active EP2982969B1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate
EP21170906.8A Active EP3889590B1 (en) 2013-04-02 2014-04-01 Apparatus for inspecting a substrate for detecting a foreign substance

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP14778804.6A Active EP2982969B1 (en) 2013-04-02 2014-04-01 Method for inspecting for foreign substance on substrate

Country Status (6)

Country Link
US (2) US10060859B2 (en)
EP (2) EP2982969B1 (en)
JP (2) JP6574163B2 (en)
KR (1) KR101590831B1 (en)
CN (2) CN107064163A (en)
WO (1) WO2014163375A1 (en)

JP5678595B2 (en) 2010-11-15 2015-03-04 株式会社リコー Inspection device, inspection method, inspection program, and recording medium containing the program
KR101205970B1 (en) 2010-11-18 2012-11-28 주식회사 고영테크놀러지 Method for detecting a bridge connecting failure
JP2012117920A (en) 2010-12-01 2012-06-21 Djtech Co Ltd Visual inspection apparatus and printed solder inspection apparatus
JP2012209085A (en) 2011-03-29 2012-10-25 Sinterland Inc Discharge plasma sintering device
CN102721695B (en) * 2012-05-18 2015-01-07 深圳大学 Method for detecting printed circuit board defect
US8831285B2 (en) * 2012-07-26 2014-09-09 Hewlett-Packard Development Company, L.P. Detecting objects with a depth sensor
KR101590831B1 (en) * 2013-04-02 2016-02-03 주식회사 고영테크놀러지 Method of inspecting foreign substance on a board
CN105829829B (en) * 2013-12-27 2019-08-23 索尼公司 Image processing apparatus and image processing method

Also Published As

Publication number Publication date
JP2016519768A (en) 2016-07-07
JP6425755B2 (en) 2018-11-21
EP3889590C0 (en) 2024-03-20
US10705028B2 (en) 2020-07-07
JP6574163B2 (en) 2019-09-11
JP2017125861A (en) 2017-07-20
US10060859B2 (en) 2018-08-28
WO2014163375A1 (en) 2014-10-09
CN104335030B (en) 2017-06-09
CN104335030A (en) 2015-02-04
CN107064163A (en) 2017-08-18
KR101590831B1 (en) 2016-02-03
EP3889590A2 (en) 2021-10-06
EP2982969A4 (en) 2017-03-01
US20160025649A1 (en) 2016-01-28
US20180328857A1 (en) 2018-11-15
KR20140120821A (en) 2014-10-14
EP2982969B1 (en) 2021-06-09
EP3889590A3 (en) 2021-11-10
EP2982969A1 (en) 2016-02-10

Similar Documents

Publication Publication Date Title
EP3889590B1 (en) Apparatus for inspecting a substrate for detecting a foreign substance
JP5562407B2 (en) Substrate inspection apparatus and inspection method
JP5256251B2 (en) Inspection method of measurement object
JP2015194500A (en) Measuring method and measuring apparatus of three-dimensional shape
JP5411913B2 (en) Pin tip position setting method
KR101241175B1 (en) Mounting boards inspection apparatus and method thereof
KR101659302B1 (en) Three-dimensional shape measurement apparatus
JP5948496B2 (en) Height measuring method for a three-dimensional shape measuring apparatus
JP5621178B2 (en) Appearance inspection device and printed solder inspection device
CN107110789B (en) Method and apparatus for inspecting component-mounted substrate
JP2007155405A (en) Visual inspection method and visual inspection device
KR101684244B1 (en) Board inspection method
JP4131804B2 (en) Mounting component inspection method
KR101133972B1 (en) Method of inspecting terminal
KR101133641B1 (en) Method of inspecting three-dimensional shape
JP2009216475A (en) Surface inspection system and surface inspection method using the same
JPH036447A (en) Inspecting apparatus of packaged board
JP3038718B2 (en) Method and apparatus for inspecting solder bridge of lead component
JP2005189167A (en) Bridge inspection device of cap
KR101544763B1 (en) Method of inspecting terminal
JP2010078417A (en) Apparatus for inspecting color filter

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AC Divisional application: reference to earlier application

Ref document number: 2982969

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RIC1 Information provided on ipc code assigned before grant

Ipc: G01N 21/956 20060101ALI20211005BHEP

Ipc: G06T 7/00 20170101ALI20211005BHEP

Ipc: G01B 11/25 20060101ALI20211005BHEP

Ipc: G01N 21/94 20060101ALI20211005BHEP

Ipc: G01N 21/88 20060101AFI20211005BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20211210

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/30 20170101ALN20231123BHEP

Ipc: G06T 7/10 20170101ALN20231123BHEP

Ipc: G01N 21/956 20060101ALN20231123BHEP

Ipc: G06T 7/00 20170101ALI20231123BHEP

Ipc: G01N 21/94 20060101ALI20231123BHEP

Ipc: G01N 21/88 20060101AFI20231123BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/30 20170101ALN20231124BHEP

Ipc: G06T 7/10 20170101ALN20231124BHEP

Ipc: G01N 21/956 20060101ALN20231124BHEP

Ipc: G06T 7/00 20170101ALI20231124BHEP

Ipc: G01N 21/94 20060101ALI20231124BHEP

Ipc: G01N 21/88 20060101AFI20231124BHEP

INTG Intention to grant announced

Effective date: 20231212

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/30 20170101ALN20231201BHEP

Ipc: G06T 7/10 20170101ALN20231201BHEP

Ipc: G01N 21/956 20060101ALN20231201BHEP

Ipc: G06T 7/00 20170101ALI20231201BHEP

Ipc: G01N 21/94 20060101ALI20231201BHEP

Ipc: G01N 21/88 20060101AFI20231201BHEP

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AC Divisional application: reference to earlier application

Ref document number: 2982969

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014089768

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

U01 Request for unitary effect filed

Effective date: 20240320

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT SE SI

Effective date: 20240327

U20 Renewal fee paid [unitary effect]

Year of fee payment: 11

Effective date: 20240326