CN109804238B - Optical inspection device - Google Patents

Optical inspection device

Info

Publication number
CN109804238B
CN109804238B (application CN201780062143.7A)
Authority
CN
China
Prior art keywords
light
unit
region
imaging unit
light emitting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201780062143.7A
Other languages
Chinese (zh)
Other versions
CN109804238A (en)
Inventor
米泽良
Current Assignee
V Technology Co Ltd
Original Assignee
V Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by V Technology Co Ltd filed Critical V Technology Co Ltd
Publication of CN109804238A publication Critical patent/CN109804238A/en
Application granted granted Critical
Publication of CN109804238B publication Critical patent/CN109804238B/en

Classifications

    (Deduplicated; all codes fall under G — Physics, G01 — Measuring; Testing. G01N covers investigating or analysing materials by the use of optical means; G01B covers optical measurement of length and surface irregularities.)
    • G01N 21/8851 — Scan or image signal processing specially adapted for investigating the presence of flaws or contamination, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887 — Scan or image signal processing based on image processing techniques
    • G01B 11/30 — Measuring arrangements using optical techniques for measuring roughness or irregularity of surfaces
    • G01N 21/21 — Systems in which incident light is modified in accordance with the polarisation-affecting properties of the material investigated
    • G01N 21/84 — Systems specially adapted for particular applications
    • G01N 21/86 — Investigating moving sheets
    • G01N 21/8806 — Investigating the presence of flaws or contamination; specially adapted optical and illumination features
    • G01N 21/958 — Inspecting transparent materials or objects, e.g. windscreens

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The object of the invention is to provide an optical inspection device that can inspect a defect with a single device in a single inspection pass, regardless of where the defect lies on the end face of the object to be inspected. When the object (G) to be inspected is imaged from substantially vertically above by a one-dimensional imaging unit (13), the object (G) is irradiated with light from a plurality of light-emitting sections (30a) provided in a 1st region (31), which is a substantially semicylindrical surface, and in a 2nd region (32) and a 3rd region (33), which are substantially hemispherical or semi-spheroidal surfaces formed at the two ends of the 1st region (31); the central axis (ax) of the 1st region (31) lies on a central plane (S1), a substantially vertical plane containing the one-dimensional imaging unit (13). In the 1st region (31), the light-emitting sections (30a) are arranged in a direction substantially orthogonal to the conveying direction (F) of the object (G), in the region excluding the vicinity of the central plane (S1). In the 2nd region (32) and the 3rd region (33), light-emitting sections (30a) are arranged at positions on the central plane (S1).

Description

Optical inspection device
Technical Field
The present invention relates to an optical inspection apparatus.
Background
Patent document 1 discloses an imaging optical inspection apparatus in which a plurality of illumination devices are arranged so that the light that irradiates the object to be inspected and enters a one-dimensional imaging unit becomes, respectively, regular reflection light, diffuse reflection light, and transmitted light. The lighting timing of the illumination devices is switched every time the one-dimensional imaging unit transfers a line of image data, and the transferred image data are accumulated separately for each illumination device, so that one integrated image is generated per illumination device.
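The per-illumination integration scheme described above can be sketched as follows. This is a hypothetical illustration written for this text, not code from the patent; all names are invented.

```python
# Hypothetical sketch of patent document 1's integration scheme: the line
# camera transfers one row per trigger while the illuminations are lit in
# round-robin order, and rows captured under the same illumination are
# stacked into that illumination's integrated image.
import numpy as np

def integrate_by_illumination(rows, n_illuminations):
    """rows: 1-D arrays captured in round-robin lighting order.
    Returns one 2-D integrated image per illumination device."""
    images = [[] for _ in range(n_illuminations)]
    for i, row in enumerate(rows):
        images[i % n_illuminations].append(row)
    return [np.vstack(img) for img in images]

# Example: 6 scan lines under 3 illuminations -> 3 images of 2 lines each.
rows = [np.full(4, k) for k in range(6)]
imgs = integrate_by_illumination(rows, 3)
```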
Prior art documents
Patent document
Patent document 1: Japanese Unexamined Patent Application Publication (Kokai) No. 2012-42297
Disclosure of Invention
Problems to be solved by the invention
The invention described in patent document 1 is used to inspect a transparent plate-like object. A typical example of such an object is a cover glass used in mobile terminals and the like. The cover glass is substantially rectangular, and its end face is ground. In order to inspect the end-face finishing of such a cover glass for defects (for example, a notch in the end face), the entire end face must be irradiated with light in the same manner, and the reflected light must be made to enter the imaging unit.
In the invention described in patent document 1, however, the optical axis of the one-dimensional imaging unit is inclined by about 45 degrees with respect to the normal of the object to be inspected. As a result, although light reflected by part of the end face (call it end face I) enters the imaging unit, light reflected by the remainder of the end face (call it end face II) may not. A defect on end face II therefore cannot be imaged, which means it cannot be detected.
The present invention has been made in view of the above circumstances, and its object is to provide an optical inspection apparatus capable of inspecting a defect with a single apparatus in a single inspection pass, regardless of where the defect lies on the end face of the object to be inspected.
Means for solving the problems
To solve the above problem, an optical inspection apparatus according to the present invention includes, for example: a mounting portion on which the object to be inspected is placed horizontally; a conveying unit that moves the object placed on the mounting portion in a conveying direction; a one-dimensional imaging unit that images the object from substantially vertically above and is arranged so that its longitudinal direction is substantially orthogonal to the conveying direction; and a light irradiation unit having a plurality of light-emitting sections that irradiate the object with light. The light irradiation unit includes: a 1st region, a substantially semicylindrical surface whose central axis lies on a central plane, i.e., a substantially vertical plane containing the one-dimensional imaging unit; and a 2nd region and a 3rd region, each a substantially hemispherical or semi-spheroidal surface, formed at the two ends of the 1st region. The 1st region includes a plurality of band-shaped light-emitting units in which the light-emitting sections are lined up in a direction substantially orthogonal to the conveying direction; these band-shaped light-emitting units are provided in the 1st region outside the vicinity of the central plane, while in the 2nd and 3rd regions the light-emitting sections are arranged on the central plane.
According to the optical inspection apparatus of the present invention, when the object is imaged from substantially vertically above by the one-dimensional imaging unit, the object is irradiated with light from the plurality of light-emitting sections provided in the 1st region, a substantially semicylindrical surface, and in the 2nd and 3rd regions, substantially hemispherical or semi-spheroidal surfaces formed at the two ends of the 1st region; the central axis of the 1st region lies on the central plane, a substantially vertical plane containing the one-dimensional imaging unit. In the 1st region, light-emitting sections are arranged in a direction substantially orthogonal to the conveying direction of the object, outside the vicinity of the central plane. This allows light reflected by the front and rear end faces of the object to enter the one-dimensional imaging unit. In the 2nd and 3rd regions, the light-emitting sections are arranged at positions on the central plane. This allows light reflected by the left and right end faces of the object to enter the one-dimensional imaging unit. Therefore, a defect can be inspected with a single apparatus in a single inspection pass, regardless of where the defect lies on the end face of the object.
Here, the band-shaped light-emitting units may be arranged so that their optical axes intersect the line where the central plane meets the upper surface of the mounting portion, and they may include a 1st band-shaped light-emitting unit whose optical axis forms an angle of substantially 8 degrees or substantially 17 degrees with the central plane. This makes it possible to capture an image in which color unevenness of the printed portion can be detected, whether or not a pearl pigment is used.
Here, the apparatus may further include a control unit that controls the conveying unit to convey the object at a constant speed, drives the one-dimensional imaging unit to capture images at constant intervals, and, in synchronization with the imaging, lights in turn: a 1st light-emitting region, consisting of one half of the 1st region as divided by the central plane; a 2nd light-emitting region, consisting of the remainder of the 1st region; a 3rd light-emitting region, on the central plane in the 2nd region; a 4th light-emitting region, on the central plane in the 3rd region; and the 1st band-shaped light-emitting unit. Thus, during a single conveyance of the object, the following images can each be captured at their own timing: an image in which a defect on the front P-surface appears bright, an image in which a defect on the rear P-surface appears bright, an image in which a defect on the left or right P-surface appears bright, and an image in which a defective printed portion appears darker than the rest of the printed portion.
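This five-way synchronized lighting can be roughly illustrated as follows; the group names are invented for this sketch and do not come from the patent.

```python
# Hypothetical sketch of the control unit's lighting sequence: five
# light-emitting groups are lit in a fixed round-robin order, one group
# per line-camera trigger, so each group yields its own integrated image
# during a single conveyance of the object.
from itertools import cycle

GROUPS = [
    "region1_half_A",        # 1st light-emitting region (half of region 1)
    "region1_half_B",        # 2nd light-emitting region (other half)
    "region2_center_plane",  # 3rd light-emitting region (region 2)
    "region3_center_plane",  # 4th light-emitting region (region 3)
    "band_unit_1",           # 1st band-shaped light-emitting unit
]

def lighting_schedule(n_triggers):
    """Group to light for each of n_triggers successive line captures."""
    order = cycle(GROUPS)
    return [next(order) for _ in range(n_triggers)]

schedule = lighting_schedule(10)
```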
Here, the light irradiation unit may include a cylindrical lens disposed between the band-shaped light-emitting unit and the line where the central plane meets the upper surface of the mounting portion. This condenses the light emitted from the band-shaped light-emitting unit, which allows the imaging frequency of the one-dimensional imaging unit to be increased (i.e., the per-line imaging time to be shortened).
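To see why condensing the light matters, note that a line camera over a conveyor moving at constant speed has a fixed line period that caps the exposure time. The numbers below are illustrative assumptions only and do not appear in the patent.

```python
# Illustrative calculation (numbers are assumptions, not from the patent):
# at conveyance speed v, one scan line corresponds to one object-side pixel,
# so the line period is pixel_size / v, and the exposure must fit inside it.
def line_period_us(pixel_size_um, speed_mm_s):
    """Line period in microseconds for a given object-side pixel size."""
    return pixel_size_um / (speed_mm_s * 1000.0) * 1e6

# 20 um pixels at 100 mm/s give a 200 us line period; brighter (condensed)
# illumination is what permits exposures this short.
period = line_period_us(20.0, 100.0)
```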
Here, the apparatus may further include: a 1st imaging unit provided above or below the mounting portion; a 2nd imaging unit provided on the opposite side of the mounting portion from the 1st imaging unit so that its optical axis coincides with that of the 1st imaging unit; a 1st coaxial illumination that irradiates the object with parallel light from the normal direction and serves as coaxial illumination for the 1st imaging unit; and a 2nd coaxial illumination provided on the opposite side of the mounting portion from the 1st coaxial illumination and serving as coaxial illumination for the 2nd imaging unit. Light emitted from the 1st coaxial illumination and regularly reflected by the object enters the 1st imaging unit, while light emitted from the 2nd coaxial illumination and regularly reflected by the object, and light emitted from the 1st coaxial illumination and transmitted through the object, enter the 2nd imaging unit. This makes it possible to capture a transmission image in which defects in opaque portions of the object (e.g., a flaw in a printed portion or a notch in a printed edge) can be detected, and a regular (specular) reflection image in which flaws, foreign matter, and the like on the front or back surface of the object can be detected.
Here, the apparatus may further include a 2nd control unit that controls the conveying unit to convey the object at a constant speed, drives the 1st and 2nd imaging units, and lights the 1st or 2nd coaxial illumination in three illumination modes: in the 1st mode the 1st coaxial illumination is lit at a 1st intensity, in the 2nd mode the 2nd coaxial illumination is lit at the 1st intensity, and in the 3rd mode the 1st coaxial illumination is lit at a 2nd intensity. An image is acquired by the 1st imaging unit under the 1st mode, and by the 2nd imaging unit under the 2nd and 3rd modes. This makes it possible to capture a transmission image and a specular reflection image, each at its own timing, during a single conveyance of the object.
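The three modes and their camera assignments can be tabulated as in the following sketch; the identifiers are invented here for illustration, not taken from the patent.

```python
# Hypothetical sketch of the 2nd control unit's mode cycle: each mode
# names the coaxial illumination lit, its intensity, and which camera
# acquires the image; the modes cycle once per line trigger.
MODES = {
    1: {"light": "coaxial_1", "intensity": "I1", "camera": "camera_1"},  # specular (top)
    2: {"light": "coaxial_2", "intensity": "I1", "camera": "camera_2"},  # specular (bottom)
    3: {"light": "coaxial_1", "intensity": "I2", "camera": "camera_2"},  # transmission
}

def frame_plan(n_lines):
    """Mode settings for each line trigger, cycling 1 -> 2 -> 3."""
    return [MODES[(i % 3) + 1] for i in range(n_lines)]

plan = frame_plan(6)
```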
Here, the apparatus may further include: a 2nd imaging unit provided on the opposite side of the mounting portion from the one-dimensional imaging unit so that its optical axis coincides with that of the one-dimensional imaging unit; a 1st coaxial illumination that irradiates the object with parallel light from the normal direction and serves as coaxial illumination for the one-dimensional imaging unit; and a 2nd coaxial illumination provided on the opposite side of the mounting portion from the 1st coaxial illumination and serving as coaxial illumination for the 2nd imaging unit. The light irradiation unit is provided between the one-dimensional imaging unit and the conveying unit, and is configured so that light emitted from the light irradiation unit or the 1st coaxial illumination and regularly reflected by the object enters the one-dimensional imaging unit, while light emitted from the 2nd coaxial illumination and regularly reflected by the object, and light emitted from the 1st coaxial illumination and transmitted through the object, enter the 2nd imaging unit. Thus, a transmission image in which defects in opaque portions of the object can be detected, and a regular reflection image in which flaws, foreign matter, and the like on the front or back surface can be detected, can be captured with a smaller number of imaging units.
Here, the apparatus may further include a 3rd control unit that controls the conveying unit to convey the object at a constant speed, drives the one-dimensional imaging unit and the 2nd imaging unit, and lights the 1st or 2nd coaxial illumination in three illumination modes: in the 1st mode the 1st coaxial illumination is lit at a 1st intensity, in the 2nd mode the 2nd coaxial illumination is lit at the 1st intensity, and in the 3rd mode the 1st coaxial illumination is lit at a 2nd intensity. An image is acquired by the one-dimensional imaging unit under the 1st mode, and by the 2nd imaging unit under the 2nd and 3rd modes. This makes it possible to capture a transmission image and a specular reflection image, each at its own timing, during a single conveyance of the object.
Here, the light irradiation unit may have a light diffusion plate provided adjacent to the light-emitting sections to diffuse the light they emit. The light diffusion plate turns the light from the plural light-emitting sections into an elongated surface light source, which prevents point-like images of the individual light-emitting sections from being reflected on the object.
Here, the apparatus may further include: a focal-length-adjusting optical element that adjusts the focal length of the one-dimensional imaging unit; and a mirror provided adjacent to the mounting portion. The focal-length-adjusting optical element and the mirror are located on the central plane. In plan view, the mounting region (the region of the mounting portion on which the object is placed) is located vertically below the one-dimensional imaging unit, and the mirror is provided outside the mounting region, adjacent to it in a direction substantially orthogonal to the conveying direction. The reflecting surface of the mirror is substantially flat and extends substantially along the conveying direction, and the line of intersection of the reflecting surface and the central plane is inclined with respect to the horizontal plane. The focal-length-adjusting optical element is arranged on the line connecting the one-dimensional imaging unit and the mirror. Thus, even for a cover glass whose side surface is partially cylindrical or elliptic-cylindrical, defects on the side surface can be inspected in a single inspection pass. That is, the image of the side surface of the object is reflected by the mirror and guided to the one-dimensional imaging unit, and the focal length of the one-dimensional imaging unit is extended by the focal-length-adjusting optical element, so that the side surface below the plane of the object can be brought into focus.
Here, the focal-length-adjusting optical element may be a glass plate whose two faces, substantially orthogonal to the plate thickness direction, are horizontal. Light is then refracted when it enters and again when it leaves the glass plate, which shifts the focal position of the one-dimensional imaging unit farther away.
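The focal shift produced by such a plate follows the standard plane-parallel-plate result for near-normal incidence; the thickness and refractive index below are illustrative assumptions, not values from the patent.

```python
# Standard thin-plate optics (not specific to the patent): inserting a
# plane-parallel glass plate of thickness t and refractive index n into a
# converging beam shifts the focus away from the lens by approximately
# delta = t * (1 - 1/n) at near-normal incidence.
def focal_shift_mm(t_mm, n):
    return t_mm * (1.0 - 1.0 / n)

# e.g. a 10 mm plate of n = 1.5 glass extends the focus by about 3.33 mm,
# enough to refocus on a side surface lying below the object's top plane.
shift = focal_shift_mm(10.0, 1.5)
```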
Here, the apparatus may further include: a moving unit that moves the one-dimensional imaging unit in the vertical direction; a height acquisition unit that acquires the height of the object; and a movement control unit that controls the moving unit based on the information acquired by the height acquisition unit and moves the one-dimensional imaging unit vertically in accordance with the changing height of the object passing beneath it. Thus, even when the height of the object changes, the one-dimensional imaging unit can capture a sharp, in-focus image.
Here, the height acquisition unit may include: a surface light source that emits light in a direction substantially orthogonal to the conveying direction; and a side imaging unit that receives the light emitted from the surface light source after it passes the object. The side imaging unit thus captures a shadow image in which the portion where the object blocks the light is dark and the remainder is bright, from which the height of the object can be obtained accurately.
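One way such a shadow image yields the height is to count dark pixels along an image column. The sketch below is a hypothetical illustration with an assumed pixel pitch, not the patent's implementation.

```python
# Hypothetical sketch of the height acquisition: in the side camera's
# backlit image, rows where the object blocks the surface light source are
# dark, so the object height is the dark-pixel count times the pixel pitch.
import numpy as np

PIXEL_PITCH_MM = 0.05  # assumed object-side resolution of the side camera

def height_from_shadow(column, threshold=128):
    """column: grayscale pixels of one image column, top row first."""
    dark = column < threshold
    return int(np.count_nonzero(dark)) * PIXEL_PITCH_MM

col = np.array([255, 255, 10, 20, 15, 255], dtype=np.uint8)
h = height_from_shadow(col)  # 3 dark pixels
```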
Here, in the 2nd and 3rd regions, the light irradiation unit may include: light-emitting blocks in which the light-emitting sections are arranged in a row; and a 2nd cylindrical lens through which the light emitted from the light-emitting sections passes. In the 2nd and 3rd regions, the extending direction of each light-emitting block is inclined with respect to the horizontal, and the extending direction of the 2nd cylindrical lens is inclined with respect to that of the light-emitting block. This focuses the light emitted from the 2nd and 3rd regions onto the object. In addition, because the 2nd and 3rd regions can compensate for the 1st region, the number of light-emitting sections in the band-shaped light-emitting units of the 1st region can be reduced.
Here, the light irradiation section may have a heat radiation member formed of a material having high thermal conductivity, and the light emitting section may be provided on the heat radiation member. This enables heat generated from the light emitting section to be dissipated with good thermal efficiency.
Here, the heat radiating member may include a plurality of plates on which the light-emitting sections are provided, the plates extending in a direction substantially orthogonal to the conveying direction, and an air blowing unit may blow air along the direction in which the plates extend. This improves the cooling of the heat radiating member.
Here, the 2nd imaging unit may be provided below the mounting portion, with a circular polarizing filter above it; the filter is set so that its face, substantially orthogonal to its thickness direction, is slightly inclined with respect to the plane orthogonal to the optical axis of the 2nd imaging unit. This keeps light reflected by the lens surface of the 2nd imaging unit from entering the camera above the mounting portion, so that bright lines are not included in the image captured by that camera.
Effects of the invention
According to the present invention, a defect can be inspected with a single apparatus in a single inspection pass, regardless of where the defect lies on the end face of the object to be inspected.
Drawings
Fig. 1 is a front view showing an outline of an optical inspection apparatus 1 according to embodiment 1.
Fig. 2 is a diagram showing details of the stereoscopic illumination unit 30.
Fig. 3 is a schematic diagram showing details of the band-shaped light-emitting unit 31a.
Fig. 4 is a view schematically showing a cross section of the optical inspection apparatus 1, cut so as to include the 1st region 31 by a plane parallel to the xz plane.
Fig. 5 is a schematic plan view of the stereoscopic illumination portion 30.
Fig. 6 is a schematic diagram showing details of the band-shaped light-emitting unit 32a.
Fig. 7 is a block diagram showing an electrical configuration of the optical inspection apparatus 1.
Fig. 8 is a block diagram illustrating electrical connections between the output unit 73 and the respective components of the optical inspection apparatus 1.
Fig. 9 is a diagram for explaining signals output from the output unit 73 to the 1st camera 11, the 2nd camera 12, and the coaxial illumination unit 20.
Fig. 10 is a timing chart in the process shown in fig. 9.
Fig. 11 is a diagram for explaining signals output from the output unit 73 to the 3rd camera 13 and the stereoscopic illumination unit 30.
Fig. 12 is a diagram showing a correspondence between the signal output shown in fig. 11 and a defect included in an image captured by the stereoscopic illumination unit 30.
Fig. 13 is a diagram showing the state of light at the end face of the cover glass G, and the path of the light is shown by a two-dot chain line.
Fig. 14 is a timing chart in the process shown in fig. 11.
Fig. 15 shows an example of a transmission image of the cover glass G.
Fig. 16 shows an example of a specular reflection image of the cover glass G.
Fig. 17 is an example of an image (partially enlarged view) in which a defect in the P-surface at the front end of the cover glass G is highlighted.
Fig. 18 is an example of an image (partially enlarged view) in which a defect in a printed portion of the cover glass G is darker than other printed portions.
Fig. 19 is a view schematically showing a light-emitting block 30B-1 according to a modification, where (A) is a side view and (B) shows the state of (A) as viewed from below in the figure.
Fig. 20 is a front view showing an outline of the optical inspection apparatus 2 according to embodiment 2.
Fig. 21 is a diagram showing the correspondence between the order of imaging processing and the images captured by the 1st camera 11 and the 2nd camera 12.
Fig. 22 is a front view showing an outline of the optical inspection apparatus 3 according to embodiment 3.
Fig. 23 is a perspective view showing a part of the optical inspection apparatus 3 in an enlarged manner.
Fig. 24 is a diagram showing a schematic configuration in a state where the optical inspection apparatus 3 is cut at the center plane S1.
Fig. 25 is a diagram showing the relationship between the position of the cover glass G1 and an image captured by the 3rd camera 13, where (A) is an enlarged view of a side portion of the cover glass G1 and (B) shows part of the image captured by the 3rd camera 13.
Fig. 26 is a view schematically showing the measurement of the height of the cover glass G1, as viewed from a direction substantially orthogonal to the conveying direction F.
Fig. 27 is a view schematically showing a state where the height of the cover glass G1 is measured, and is a view seen in the conveying direction F.
Fig. 28 is a perspective view showing an outline of a stereoscopic illumination unit 30A provided in the optical inspection apparatus according to embodiment 3.
Fig. 29 is a schematic diagram showing details of the band-shaped light-emitting unit 31a-1.
Fig. 30 is a schematic diagram showing details of the band-shaped light-emitting unit 32a-1.
Fig. 31 is a diagram illustrating a path of light emitted from the stereoscopic illumination unit 30A.
Fig. 32 is a front view showing an outline of the optical inspection apparatus 5 according to embodiment 5.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the drawings, the same elements are denoted by the same reference numerals, and redundant portions are not described.
In the embodiments below, the optical inspection device inspects a cover glass G of a mobile terminal as the object to be inspected. Based on the images captured by the optical inspection device, defects such as scratches and polishing unevenness on the end face, front face, and back face of the cover glass G, as well as defects in the printing on the cover glass G such as print unevenness and chipping, can be inspected. Although the present embodiment takes the cover glass G of a mobile terminal or the like as an example, the object inspected by the optical inspection device is not limited to a cover glass.
The cover glass G is ground to a curved surface over its entire circumference, and partial printing is applied to its back surface. Hereinafter, the arc-shaped polished surface of the peripheral edge is referred to as the polished surface (P-surface). The printed portion of the cover glass G carries either monochrome printing, in which a single-color organic paint is applied, or pearl painting, in which a single-color ground-color pigment is applied and a transparent layer containing a pearl pigment (translucent particles coated with transparent titanium dioxide or the like) is applied over it.
< embodiment 1 >
Fig. 1 is a front view showing an outline of an optical inspection apparatus 1 according to embodiment 1. The optical inspection apparatus 1 mainly includes an imaging unit 10, a coaxial illumination unit 20, a stereoscopic illumination unit 30, a mounting unit 40, and a conveying unit 50 (see fig. 8).
The mounting portion 40 has a plurality of rollers 40a, on top of which the cover glass G is placed. The conveying unit 50 moves the cover glass G placed on the mounting unit 40 in the conveying direction F (here, the x direction), and includes, for example, an actuator (not shown) that rotates the rollers 40a. The mounting unit 40 and the conveying unit 50 are well known, and their description is therefore omitted.
The imaging unit 10 mainly includes a 1st camera 11, a 2nd camera 12, and a 3rd camera 13 (corresponding, in the present embodiment, to the 1st imaging unit, the 2nd imaging unit, and the one-dimensional imaging unit of the present invention, respectively). The 1st camera 11, the 2nd camera 12, and the 3rd camera 13 mainly include imaging lenses 11a, 12a, and 13a and line sensors 11b, 12b, and 13b such as line CCD or CMOS sensors. The line sensors 11b, 12b, and 13b are arranged with their longitudinal direction substantially orthogonal to the conveying direction F of the cover glass G (i.e., along the y direction). The imaging unit 10 is well known, and its description is therefore omitted.
The 1st camera 11 is provided on the upper side (+z side) of the mounting portion 40 and images the cover glass G substantially vertically from above (the +z direction). The 2nd camera 12 is provided on the side opposite the 1st camera 11 across the mounting portion 40, i.e., on the lower side (-z side), and images the cover glass G substantially vertically from below (the -z direction). The 1st camera 11 and the 2nd camera 12 are arranged so that their optical axes oax coincide.
The 3 rd camera 13 is provided on the upper side (+ z side) of the mounting portion 40, and captures an image of the cover glass G from above substantially vertically. Further, the 3 rd camera 13 is disposed so as to be different from the 1 st camera 11 and the 2 nd camera 12 in position in the horizontal direction (position on a plane parallel to the xy plane).
The coaxial illumination unit 20 includes: upper side coaxial illumination 21 as coaxial illumination of the 1 st camera 11; and lower coaxial illumination 22 as coaxial illumination for the 2 nd camera 12. The upper coaxial illumination 21 and the lower coaxial illumination 22 are so-called kohler illuminations, and parallel light is irradiated from the normal direction (z direction) to the cover glass G. The upper coaxial illumination 21 and the lower coaxial illumination 22 are provided on opposite sides so that the optical axes oax are aligned with each other and the mounting portion 40 is sandwiched therebetween (the upper coaxial illumination 21 is on the + z side of the mounting portion 40, and the lower coaxial illumination 22 is on the-z side of the mounting portion 40).
The structure of the upper coaxial illumination 21 and the lower coaxial illumination 22 will be described. The upper coaxial illumination 21 and the lower coaxial illumination 22 each include: light sources 21a, 22 a; glass integrators 21b and 22b for uniformizing the illuminance distribution; condenser lenses 21c and 22 c; diaphragms 21d, 22d provided at positions substantially coincident with the focal points of the condenser lenses 21c, 22 c; collimator lenses 21e and 22e for converting light into parallel light; mirrors 21f, 22f for bending the optical path; fresnel lenses 21g and 22g having parallel linear grooves and condensing light on a straight line; and half mirrors 21h, 22h for bending the optical path.
The light sources 21a and 22a each have light emitting members surface-mounted on a metal plate such as aluminum, with a heat sink provided on the back surface of the plate. Each light emitting member is a white LED (e.g., 5700 K color temperature) with an irradiation angle of about 120 degrees.
The light sources 21a, 22a and the integrators 21b, 22b are adjacently disposed. Since light is irradiated from the LED to a relatively wide area, a part of the light is leaked without being incident on the integrators 21b, 22b, but most of the light is incident on the integrators 21b, 22b and is irradiated on the cover glass G.
The half mirror 21h is disposed on the optical axis of the 1 st camera 11. Therefore, the cover glass G is vertically irradiated with the light reflected by the half mirror 21h, and the light regularly reflected by the cover glass G is incident on the 1 st camera 11. Further, the light reflected by the half mirror 21h is incident on the 2 nd camera 12 through the cover glass G.
The half mirror 22h is disposed on the optical axis of the 2 nd camera 12. Therefore, the light reflected by the half mirror 22h is vertically irradiated to the cover glass G, and the light regularly reflected by the cover glass G is incident on the 2 nd camera 12.
The cover glass G is irradiated with light from a plurality of directions by the stereoscopic illumination unit 30. The light irradiated from the stereo illumination section 30 and reflected by the cover glass G is incident on the 3 rd camera 13. Fig. 2 is a diagram showing details of the stereoscopic illumination unit 30. In fig. 2, the 3 rd camera 13 (the imaging lens 13a, the line sensor 13b) is shown by a dotted line, and the visual field position 13c of the 3 rd camera 13 and the path of the light of the 3 rd camera 13 are shown by a two-dot chain line.
The stereoscopic lighting unit 30 has a1 st region 31 of a substantially semicircular cylindrical surface, and 2 nd regions 32 and 3 rd regions 33 of a substantially hemispherical surface or a substantially hemispherical ellipsoidal surface. The 2 nd region 32 and the 3 rd region 33 have the same shape and are disposed at both ends of the 1 st region 31.
In the 1 st region 31, a light emitting portion 30a is provided on a substantially semi-cylindrical surface. The light emitting unit 30a includes a white LED and a heat radiating member, similarly to the light sources 21a and 22 a. The central axis ax of the 1 st region 31 is located on a central plane S1 which is a plane including the 3 rd camera 13 in the substantially vertical direction.
In the 2nd region 32 and the 3rd region 33, the light emitting portions 30a are provided on a substantially hemispherical or substantially hemispherical ellipsoidal surface. The center points of the 2nd region 32 and the 3rd region 33 are located on the center plane S1. In the present embodiment, the 2nd region 32 and the 3rd region 33 are substantially hemispherical surfaces, but their shapes are not limited thereto.
In the 1st region 31, the light emitting portions 30a are arranged between the 3rd camera 13 and the mounting portion 40 along the direction substantially orthogonal to the conveying direction F (i.e., the y direction), in the region other than the vicinity of the center plane S1. The light emitting portions 30a arranged in the y direction form band-shaped light emitting portions 31a, 31b, 31c, 31d, 31e, 31f, 31g, ... (described in detail later). In the present embodiment, ten band-shaped light emitting portions 31a to 31j are provided (see fig. 4), but the number of band-shaped light emitting portions is not limited to this.
Fig. 3 is a schematic diagram showing details of the band-shaped light emitting unit 31 a. Since the band-shaped light emitting units 31a to 31j have the same configuration, only the band-shaped light emitting unit 31a will be described, and descriptions of the band-shaped light emitting units 31b to 31j will be omitted.
The band-shaped light emitting section 31a has a total length of 3L and comprises three light emitting blocks 30b, each of longitudinal length L, in which the light emitting sections 30a are arranged in a row.
The lateral width w of each light emitting portion 30a is about 3.4 mm, its length h is about 3.4 mm, and the distance between adjacent light emitting portions 30a is approximately 0.2 mm. In the present embodiment, since the length L of a light emitting block 30b is approximately 50 mm, about 13 light emitting portions 30a are arranged in one light emitting block 30b. The band-shaped light emitting portion 31a thereby emits band-shaped (linear) light.
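As a quick consistency check of these dimensions, the number of light emitting portions that fit in one block can be computed as follows (a sketch; only the dimensions come from the text, the packing formula itself is an illustration):

```python
# Illustrative check of the LED layout figures above. Dimensions are from the
# text; the packing formula itself is an assumption for illustration.
LED_WIDTH_MM = 3.4      # lateral width w of one light emitting portion 30a
GAP_MM = 0.2            # distance between adjacent light emitting portions
BLOCK_30B_MM = 50.0     # length L of a light emitting block 30b
BLOCK_30D_MM = 18.5     # length l of a light emitting block 30d (2nd/3rd regions)

def leds_per_block(block_len, led_w=LED_WIDTH_MM, gap=GAP_MM):
    """n LEDs occupy n*led_w + (n-1)*gap, so n <= (block_len + gap)/(led_w + gap)."""
    return int((block_len + gap) // (led_w + gap))

print(leds_per_block(BLOCK_30B_MM))  # 13 -> "about 13 light emitting portions"
print(leds_per_block(BLOCK_30D_MM))  # 5  -> "about five" in the 2nd/3rd regions
```

The same formula reproduces both block counts quoted in the text, which suggests the stated dimensions are mutually consistent.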
In the present embodiment, a white LED is used as the light emitting portion 30a, but the device used in the light emitting portion 30a is not limited to the white LED. For example, any of a device obtained by combining a blue or ultraviolet LED and a yellow phosphor, a device obtained by combining three-color chips of red, green and blue, and a device obtained by combining a blue LED and red and green phosphors may be used for the light-emitting section 30 a.
Next, the arrangement of the band-shaped light emitting units 31a to 31j in the stereoscopic illumination unit 30 will be described. Fig. 4 is a view schematically showing a cross section of the optical inspection apparatus 1 when the optical inspection apparatus 1 is cut from a plane parallel to the xz plane to include the 1 st region 31. In fig. 4, paths of light emitted from the band-shaped light emitting portions 31a to 31j are shown by two-dot chain lines.
The central axis ax of the 1 st region 31 is located near the upper surface of the cover glass G, and the band-shaped light-emitting portions 31a to 31j are provided on the circumference of a radius R (see the dotted line in fig. 4) around the central axis ax. Since the cover glass G has a very small thickness, the position of the upper surface of the cover glass G substantially coincides with the intersection between the center plane S1 and the upper surface of the mounting portion 40 (not shown in fig. 4). The radius R is substantially the same as the length 3L of the band-shaped light emitting parts 31a to 31 j.
The belt-shaped light emitting portions 31a to 31j are provided so that the optical axes (see chain lines in fig. 4) intersect the central axis ax, which is the intersection line between the central surface S1 and the mounting portion 40 (the intersection line between the central surface S1 and the cover glass G).
The strip-shaped light-emitting units 31a to 31e are positioned on the + x side of the center plane S1, and the strip-shaped light-emitting units 31f to 31j are positioned on the-x side of the center plane S1.
The strip-shaped light emitting portions 31a and 31f are provided at the positions closest to the center plane S1. The angle θ1 formed by the optical axes of the strip-shaped light emitting portions 31a and 31f and the center plane S1 is substantially 8 degrees.
The strip-shaped light emitting portions 31b and 31g are provided outside and adjacent to the strip-shaped light emitting portions 31a and 31f. The angle θ2 formed by their optical axes and the center plane S1 is substantially 17 degrees.
The strip-shaped light emitting portions 31c and 31h are provided outside and adjacent to the strip-shaped light emitting portions 31b and 31g. The angle θ3 formed by their optical axes and the center plane S1 is substantially 26 degrees.
The strip-shaped light emitting portions 31d and 31i are provided outside and adjacent to the strip-shaped light emitting portions 31c and 31h. The angle θ4 formed by their optical axes and the center plane S1 is substantially 35 degrees.
The strip-shaped light emitting portions 31e and 31j are provided outside and adjacent to the strip-shaped light emitting portions 31d and 31i. The angle θ5 formed by their optical axes and the center plane S1 is substantially 44 degrees.
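Under the stated geometry, the position of each strip relative to the central axis ax follows from its angle and the radius R. The sketch below restates this; the angle values are from the text, while the choice R = 150 mm is a hypothetical stand-in, since the text only says R is roughly the strip length 3L:

```python
import math

# Hypothetical illustration of the strip geometry in the 1st region 31:
# each band-shaped light emitting portion sits on a circle of radius R about
# the central axis ax, at the stated angle from the vertical center plane S1.
R_MM = 150.0  # assumed value for illustration; the text only says R ~ 3L
ANGLES_DEG = {
    "31a": +8, "31b": +17, "31c": +26, "31d": +35, "31e": +44,   # +x side
    "31f": -8, "31g": -17, "31h": -26, "31i": -35, "31j": -44,   # -x side
}

def strip_position(angle_deg, radius=R_MM):
    """(x, z) of a strip: x is the offset from S1, z the height above axis ax."""
    a = math.radians(angle_deg)
    return (radius * math.sin(a), radius * math.cos(a))

for name, ang in ANGLES_DEG.items():
    x, z = strip_position(ang)
    print(f"{name}: x = {x:+6.1f} mm, z = {z:5.1f} mm")
```

The mirror symmetry of the ±angle pairs reflects the symmetric placement of strips 31a-31e and 31f-31j about the center plane S1.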
The band-shaped light emitting sections 31a to 31j have cylindrical lenses 30c provided on the optical axis. The cylindrical lens 30c is formed by cutting a rod material made of acrylic along a central axis and polishing a cut surface. The cylindrical lens 30c is provided between the light emitting section 30a and the central axis ax, and condenses the light emitted from the light emitting section 30a to the vicinity of the central axis ax.
Fig. 5 is a schematic plan view of the stereoscopic illumination portion 30. In fig. 5, the visual field position 13c of the 3 rd camera 13 is shown in dotted lines. In fig. 5, a light-emitting block 30d (described in detail later) is shown by a broken line.
The 2 nd region 32 and the 3 rd region 33 have band-shaped light emitting parts 32a to 32i, and 33a to 33i, which are formed by arranging the light emitting parts 30a in a row. In the present embodiment, each of the nine band-shaped light-emitting portions 32a to 32i and 33a to 33i is provided, but the number of the band-shaped light-emitting portions is not limited to this.
The strip-shaped light emitting portions 32a and 33a are provided on the center surface S1. The band-shaped light emitting parts 32b to 32e, and 33b to 33e are provided on extension lines of the band-shaped light emitting parts 31a to 31d, respectively. The band-shaped light emitting units 32f to 32i and 33f to 33i are provided on extension lines of the band-shaped light emitting units 31f to 31i, respectively.
Fig. 6 is a schematic diagram showing details of the band-shaped light emitting unit 32 a. Since the band-shaped light emitting units 32a to 32i and 33a to 33i have substantially the same configuration, only the band-shaped light emitting unit 32a will be described, and the description of the band-shaped light emitting units 32b to 32i and 33a to 33i will be omitted.
The strip-shaped light emitting section 32a has a plurality of light emitting blocks 30d in which the light emitting sections 30a are arranged in a row. In the present embodiment, since the length l of a light emitting block 30d is approximately 18.5 mm and the distance between adjacent light emitting portions 30a is approximately 0.2 mm, about five light emitting portions 30a are arranged in one light emitting block 30d. The light emitting blocks 30d are arranged adjacent to each other so that the light emitting portions 30a lie on a substantially cylindrical surface of radius R (see the dotted lines in fig. 6).
Returning to the description of fig. 5. The number of light-emitting blocks 30d in the band-shaped light-emitting portions 32a to 32c, 32f, 32g, 33a to 33c, 33f, 33g is five, the number of light-emitting blocks 30d in the band-shaped light-emitting portions 32d, 32h, 33d, 33h is four, and the number of light-emitting blocks 30d in the band-shaped light-emitting portions 32e, 32i, 33e, 33i is two, but the number of light-emitting blocks 30d in the band-shaped light-emitting portions 32a to 32i, 33a to 33i is not limited thereto.
The optical axes of the light emitting blocks 30d in the band-shaped light emitting portions 32a and 33a face the center points O2 and O3 of the 2nd region 32 and the 3rd region 33, respectively. The optical axes of the light emitting blocks 30d in the band-shaped light emitting portions 32b to 32i and 33b to 33i are directed toward the center point O1 of the 1st region 31. The orientation of the optical axes of the light emitting blocks 30d in the band-shaped light emitting portions 32a to 32i and 33a to 33i is not limited to the arrangement shown in fig. 5.
Fig. 7 is a block diagram showing an electrical configuration of the optical inspection apparatus 1. The optical inspection apparatus 1 includes: an integrated circuit 71, an input section 72, an output section 73, a power supply section 74, and a communication interface (I/F) 75.
The integrated circuit 71 is, for example, an FPGA (field-programmable gate array), and operates based on a program to control each part. The integrated circuit 71 functions as a control unit (including the control unit and the 2nd control unit of the present invention) that controls each unit of the optical inspection apparatus 1. Specifically, the integrated circuit 71 acquires signals from the input unit 72, the output unit 73, and the like, and generates the signals output from the output unit 73 based on the acquired signals. In the present embodiment, the program realizing each function is stored in the FPGA serving as the integrated circuit 71, but the integrated circuit 71 is not limited to an FPGA, and the method of executing the program is not limited thereto.
Signals are input to the input unit 72 from various sensors such as the position detection sensors 81 and 82. The input unit 72 includes switches for setting the output mode of each channel of the output unit 73, setting the imaging frequency of the imaging unit 10, and the like.
The output unit 73 has a plurality of channels and outputs signals to the imaging unit 10, the coaxial illumination unit 20, the stereoscopic illumination unit 30, and the like from different channels. For example, the output unit 73 outputs drive motor pulses to the conveying unit 50 and outputs a horizontal synchronization signal, a vertical synchronization signal, and the like to the imaging unit 10. The output unit 73 also includes display elements, such as LEDs and seven-segment displays, that indicate errors such as communication errors and timeouts, travel of the cover glass G, waiting for capture by the imaging unit 10, and lighting of the coaxial illumination unit 20 and the stereoscopic illumination unit 30.
Fig. 8 is a block diagram illustrating the electrical connections between the output unit 73 and the components of the optical inspection apparatus 1. The output unit 73 outputs signals to the imaging unit 10 (the 1st camera 11, the 2nd camera 12, and the 3rd camera 13), the coaxial illumination unit 20 (the upper coaxial illumination 21 and the lower coaxial illumination 22), the stereoscopic illumination unit 30 (the band-shaped light emitting units 31a to 31j, 32a to 32i, and 33a to 33i), and the conveying unit 50.
The signal output from the output section 73 is generated by the integrated circuit 71 based on a horizontal synchronization signal or the like input from a Personal Computer (PC)100 via a communication I/F75.
The explanation returns to fig. 7. The power supply unit 74 receives a voltage of AC100V, for example, and includes therein a switching power supply that converts the voltage to a desired voltage. The power supply unit 74 supplies power to the coaxial illumination unit 20 and the stereoscopic illumination unit 30.
The communication I/F 75 receives data from external devices and transmits it to the integrated circuit 71, and transmits data generated by the integrated circuit 71 to other devices. The communication I/F 75 has a connector for programming and debugging the integrated circuit 71. The communication I/F 75 acquires a horizontal synchronization signal, a vertical synchronization signal, a drive motor start signal, and the like from the PC 100 and outputs them to the integrated circuit 71. Further, the communication I/F 75 outputs the image data captured by the imaging unit 10 to the PC 100 and the like.
The position detection sensors 81 and 82 detect the position of the cover glass G. The position detection sensor 81 detects that the cover glass G is conveyed under the 1 st camera 11 and the 2 nd camera 12 and that the passage under the 1 st camera 11 and the 2 nd camera 12 is completed. The position detection sensor 82 detects that the cover glass G is conveyed under the 3 rd camera 13 and that the passage under the 3 rd camera 13 is completed.
The PC 100 has a CPU (Central Processing Unit) 101, a RAM (Random Access Memory) 102, a ROM (Read Only Memory) 103, an input/output interface (I/F) 104, a communication interface (I/F) 105, and a media interface (I/F) 106.
The CPU101 operates based on programs stored in the RAM102 and the ROM103, and controls each unit. The signal output from the CPU101 is output to the optical inspection apparatus 1 via the communication I/F105.
The RAM 102 is a volatile memory that stores the programs executed by the CPU 101, data used by the CPU 101, and the like. The ROM 103 is a nonvolatile memory that stores various control programs, the boot program executed by the CPU 101 when the PC 100 is started, programs dependent on the hardware of the PC 100, and the like.
The CPU101 controls an input device 111 such as a keyboard and a mouse, and an output device 112 such as a display device via the input/output I/F104. The CPU101 acquires data from the optical inspection apparatus 1 or other devices via a network or the like via the communication I/F105, and outputs the generated data to the optical inspection apparatus 1.
Based on input from the input device 111, the CPU 101 performs various settings, such as the output settings of the channels of the output unit 73, the output order of the lighting signals in the integrated circuit 71, the processing cycle, and the distance the cover glass G travels (idle distance) from detection of its edge to the start of imaging, and generates setting data for these settings. The communication I/F 105 outputs the setting data generated by the CPU 101 to the optical inspection apparatus 1. Further, the CPU 101 acquires the images captured by the imaging unit 10 from the optical inspection apparatus 1 and generates images for inspection. The image generation processing is described in detail later.
The media I/F106 reads the program or data saved in the storage medium 113 and saves it in the RAM 102. The storage medium 113 is, for example, an IC card, an SD card, a DVD, or the like.
The program for realizing each function is read out from the storage medium 113, installed in the optical inspection apparatus 1 via the RAM102, and executed by the CPU 101.
The configurations of the optical inspection apparatus 1 and the PC100 shown in fig. 7 are the main configurations described in the description of the features of the present embodiment, and for example, the configurations provided in a general information processing apparatus are not excluded. The components of the optical inspection apparatus 1 may be further classified into a plurality of components according to the processing content, or the processing of a plurality of components may be executed by one component. In fig. 7, the optical inspection apparatus 1 and the PC100 are provided as separate apparatuses, but the components of the PC100 may be included in the optical inspection apparatus 1.
The processing performed by the optical inspection apparatus 1 configured as described above will be described. The following processing is mainly performed by the integrated circuit 71.
The integrated circuit 71 generates a drive motor pulse for driving the roller 40a of the mounting unit 40, and the output unit 73 outputs the drive motor pulse to the conveying unit 50. Thereby, the cover glass G moves above the mounting portion 40 at a constant speed in the conveying direction F.
When the position detection sensor 81 detects that the cover glass G has been conveyed under the 1st camera 11 and the 2nd camera 12, a detection signal is input from the position detection sensor 81 to the integrated circuit 71 via the input unit 72. When the detection signal is input, the integrated circuit 71 starts the process of capturing the transmission image and the specular reflection image with the 1st camera 11 and the 2nd camera 12. This process is described below with reference to figs. 9 and 10.
< Process of capturing the transmission image and the specular reflection image >
Fig. 9 is a diagram explaining the signals output from the output unit 73 to the 1st camera 11, the 2nd camera 12, and the coaxial illumination unit 20. The channels (hereinafter "ch") are part of the channels of the output unit 73; ch 1-3 are outputs to the upper coaxial illumination 21, and ch 4-6 are outputs to the lower coaxial illumination 22. ROOP is a signal indicating repetition of the processing and is output to the integrated circuit 71. The numerical values for ch1 to ch6 in fig. 9 indicate the irradiation times of the upper coaxial illumination 21 and the lower coaxial illumination 22, in μsec (microseconds).
Fig. 10 is a timing chart for the process shown in fig. 9. The imaging signal drives the 1st camera 11 and the 2nd camera 12 and is generated in the integrated circuit 71 based on the horizontal synchronization signal input at a constant cycle. In the present embodiment, the frequency of the imaging signal is 3 kHz, so the interval T of the imaging signal is approximately 330 μsec. The imaging signal is High during the imaging period T1 and Low during the blanking period T2, and the light amounts (the brightness of the captured images) of the 1st camera 11 and the 2nd camera 12 are adjusted by adjusting the imaging period T1. The signals for the coaxial illumination are generated in synchronization with the imaging signal. In the present embodiment, the imaging period T1 is set to 300 μsec, but T1 is not limited thereto.
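The relationship among these timing figures can be checked with a couple of lines (values from the text; purely illustrative arithmetic):

```python
# Illustrative arithmetic for the timing figures quoted above.
FREQ_HZ = 3000                      # imaging signal frequency (3 kHz)
interval_us = 1_000_000 / FREQ_HZ   # interval T between imaging signals
T1_us = 300                         # imaging (High) period
T2_us = interval_us - T1_us         # blanking (Low) period
print(round(interval_us, 1), round(T2_us, 1))  # 333.3 33.3 (T ~ 330 usec)
```

The exact interval at 3 kHz is 333.3 μsec, consistent with the "approximately 330 μsec" in the text, leaving about 33 μsec of blanking per line.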
The processing in the sequence 1 to 3 shown in fig. 9 and 10 will be described in detail below.
(sequence 1) The integrated circuit 71 generates a signal that irradiates the upper coaxial illumination 21 for 5 μsec, and the output unit 73 outputs the signal to the upper coaxial illumination 21. At the same time, the integrated circuit 71 generates an imaging signal, which the output unit 73 outputs to the 2nd camera 12. Thus, the light transmitted through the cover glass G enters the 2nd camera 12, and the 2nd camera 12 captures a transmission image. In the transmission image, defects in opaque portions, for example scratches in the printed portion or chips in the printed edge, can be detected.
In sequence 1, the signal for the upper coaxial illumination 21 goes High at the same time as the imaging signal and goes Low after 5 μsec. This 5 μsec irradiation time is very short compared with the 300 μsec imaging period. Glass reflects only about 4% of incident light and transmits the remaining roughly 96%. Therefore, by shortening the irradiation time of the upper coaxial illumination 21, a transmission image with appropriate brightness can be captured.
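The balance between the two irradiation times can be illustrated numerically. The 4%/96% split is from the text; treating the sensor exposure as simply irradiation time times transmitted or reflected fraction is a simplification introduced here:

```python
# Rough illustration: why 5 usec suffices for the transmission image while
# the specular reflection images (sequences 2 and 3) use 100 usec.
# Assumption: exposure ~ irradiation time x fraction of light reaching the camera.
TRANSMITTANCE = 0.96   # ~96% of the light passes through the glass
REFLECTANCE = 0.04     # ~4% is reflected

transmission_exposure = 5 * TRANSMITTANCE    # sequence 1, 2nd camera 12
reflection_exposure = 100 * REFLECTANCE      # sequences 2-3, 1st/2nd cameras
print(round(transmission_exposure, 2), round(reflection_exposure, 2))  # 4.8 4.0
```

Under this simplification the two exposures come out comparable, which is consistent with the text's point that the much brighter transmitted light calls for a much shorter flash.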
In sequence 1, the upper coaxial illumination 21 is irradiated for 5 μsec and the 2nd camera 12 captures the transmission image; alternatively, the lower coaxial illumination 22 may be irradiated for 5 μsec and the 1st camera 11 may capture the transmission image.
(sequence 2) The integrated circuit 71 generates a signal that irradiates the upper coaxial illumination 21 for 100 μsec, and the output unit 73 outputs the signal to the upper coaxial illumination 21. At the same time, the integrated circuit 71 generates an imaging signal, which the output unit 73 outputs to the 1st camera 11. Thus, light specularly reflected by the surface of the cover glass G enters the 1st camera 11, and the 1st camera 11 captures a specular reflection image.
The upper coaxial illumination 21 irradiates the cover glass G with light in the vertical direction. At portions of the surface of the cover glass G free of scratches, foreign matter, and the like, the light is specularly reflected into the 1st camera 11, so those portions appear bright in the captured image. At portions where scratches, foreign matter, and the like are present, the light is diffusely reflected and does not enter the 1st camera 11, so those portions appear dark. In this way, scratches, foreign matter, and the like on the surface of the cover glass G can be detected. Note that detecting surface scratches, foreign matter, and the like using specularly reflected light is effective only on surfaces with high reflectance, such as mirror surfaces.
(sequence 3) The integrated circuit 71 generates a signal that irradiates the lower coaxial illumination 22 for 100 μsec, and the output unit 73 outputs the signal to the lower coaxial illumination 22. At the same time, the integrated circuit 71 generates an imaging signal, which the output unit 73 outputs to the 2nd camera 12. Thus, light specularly reflected by the back surface of the cover glass G enters the 2nd camera 12, and the 2nd camera 12 captures a specular reflection image. In this way, scratches, foreign matter, and the like on the back surface of the cover glass G can be detected.
Further, the integrated circuit 71 generates a signal indicating the repetitive processing while performing the output of the sequence 3. The output unit 73 outputs a signal indicating the repetitive processing to the integrated circuit 71 while outputting the signal to the lower coaxial illumination 22. The integrated circuit 71 receives a signal indicating the repetitive processing, returns the processing to the first stage, and repeats the processing of sequentially outputting the signals shown in the sequences 1 to 3.
In sequences 2 and 3, the signals for the upper coaxial illumination 21 and the lower coaxial illumination 22 go High at the same time as the imaging signal and go Low after 100 μsec. The 100 μsec irradiation time of the upper coaxial illumination 21 in sequence 2 and of the lower coaxial illumination 22 in sequence 3 is about 1/3 of the imaging period T1 (300 μsec), much longer than the 5 μsec irradiation time in sequence 1. Thus, when capturing a specular reflection image, an image of appropriate brightness can be obtained by lengthening the irradiation time of the upper coaxial illumination 21 or the lower coaxial illumination 22.
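In the device this schedule is produced by the integrated circuit 71 in hardware; purely as a restatement, the repeating three-sequence cycle can be sketched in software (the function and variable names here are hypothetical):

```python
# Hypothetical software restatement of the repeating capture schedule of
# sequences 1-3 (in the actual device this logic runs in the FPGA, IC 71).
SEQUENCES = [
    # (illumination, irradiation time in usec, camera, resulting image)
    ("upper coaxial 21",   5, "2nd camera 12", "transmission image"),
    ("upper coaxial 21", 100, "1st camera 11", "specular reflection image"),
    ("lower coaxial 22", 100, "2nd camera 12", "specular reflection image"),
]

def run_one_cycle(capture):
    """Run sequences 1-3 once; `capture` stands in for the synchronized
    illumination flash plus line-sensor exposure of one line."""
    for step in SEQUENCES:
        capture(*step)

log = []
run_one_cycle(lambda *step: log.append(step))
print(len(log), log[0][1], log[1][1])  # 3 5 100: three captures per cycle
```

Each pass over `SEQUENCES` corresponds to one ROOP iteration; the cycle repeats line by line while the cover glass G is conveyed past the cameras.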
The integrated circuit 71 outputs the drive motor pulses to the conveying unit 50 via the output unit 73 while outputting the signals of the irradiation patterns of sequences 1 to 3. Accordingly, the cover glass G is conveyed over the mounting portion 40 at a constant speed, and the 1st camera 11 or the 2nd camera 12 performs imaging while the position of the cover glass G changes relative to them.
This makes it possible to capture both the transmission image and the specular reflection image, each at its own timing, while the object is conveyed in a single pass. The process of capturing the transmission image and the specular reflection image of the cover glass G with the 1st camera 11 and the 2nd camera 12 continues until the cover glass G has passed the 1st camera 11 and the 2nd camera 12. When the cover glass G has passed them, a detection signal from the position detection sensor 81 is input to the integrated circuit 71, and the integrated circuit 71 terminates the signal output to the 1st camera 11, the 2nd camera 12, and the coaxial illumination unit 20.
The integrated circuit 71 continues to output the drive motor pulses to the conveying unit 50 via the output unit 73. When the position detection sensor 82 detects that the cover glass G has been conveyed below the 3rd camera 13, a detection signal is input from the position detection sensor 82 to the integrated circuit 71 via the input unit 72. When the detection signal is input, the integrated circuit 71 starts the process of capturing a reflection image with the 3rd camera 13. This process is described below with reference to figs. 11 to 14.
< processing for capturing reflection image >
Fig. 11 is a diagram for explaining the signals output from the output unit 73 to the 3rd camera 13 and the stereoscopic illumination unit 30. ch11 to ch46 are some of the channels of the output unit 73. The numerical values under ch11 to ch46 are the irradiation times of the stereoscopic illumination unit 30, in μsec. Some channels are omitted from fig. 11. Further, fig. 12 is a diagram showing the correspondence between the signal outputs shown in fig. 11 and the defects included in the images captured by the 3rd camera 13. Fig. 13 is a diagram showing the state of light at the end face of the cover glass G, with the light paths shown by two-dot chain lines. Fig. 14 is a timing chart of the process shown in fig. 11.
In fig. 11, ch11 to ch40 represent the outputs to the band-shaped light emitting parts 31a to 31j. In the present embodiment, one channel is allocated to each light emitting block 30b. For example, ch11 to 13 are the outputs to the band-shaped light emitting part 31a (that is, to the three light emitting blocks 30d constituting it; the same applies hereinafter), ch14 to 16 are the outputs to the band-shaped light emitting part 31b, ch17 to 19 to the band-shaped light emitting part 31c, ch20 to 22 to the band-shaped light emitting part 31d, ch23 to 25 to the band-shaped light emitting part 31e, ch26 to 28 to the band-shaped light emitting part 31f, ch29 to 31 to the band-shaped light emitting part 31g, ch32 to 34 to the band-shaped light emitting part 31h, ch35 to 37 to the band-shaped light emitting part 31i, and ch38 to 40 to the band-shaped light emitting part 31j.
Further, ch41 is the output to the band-shaped light emitting part 32a, ch42 the output to the band-shaped light emitting parts 32b to 32e, and ch43 the output to the band-shaped light emitting parts 32f to 32i. Similarly, ch44 is the output to the band-shaped light emitting part 33a, ch45 the output to the band-shaped light emitting parts 33b to 33e, and ch46 the output to the band-shaped light emitting parts 33f to 33i.
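The channel assignment above can be summarized as a small lookup table. The sketch below is a hypothetical helper — the function name and dictionary layout are my own; only the channel numbers and part labels come from the text:

```python
# Hypothetical helper summarizing the channel assignment described above.

def build_channel_map():
    """Map each output channel of the output unit 73 to the band-shaped
    light emitting part (and block index) it drives."""
    channel_map = {}
    ch = 11
    # ch11-ch40: three channels per band-shaped part 31a-31j,
    # one channel per light emitting block.
    for part in [f"31{suffix}" for suffix in "abcdefghij"]:
        for block in range(3):
            channel_map[ch] = (part, block)
            ch += 1
    # ch41-ch43 drive the 2nd-region parts, ch44-ch46 the 3rd-region parts.
    channel_map[41] = ("32a", None)
    channel_map[42] = ("32b-32e", None)
    channel_map[43] = ("32f-32i", None)
    channel_map[44] = ("33a", None)
    channel_map[45] = ("33b-33e", None)
    channel_map[46] = ("33f-33i", None)
    return channel_map
```

The loop reproduces the "three channels per part" rule stated in the text, so, for instance, ch29 falls on part 31g.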
The processing in sequences 1 to 9 shown in figs. 11 and 13 will be described in detail below.
(Sequence 1) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 31f for 100 μsec, the band-shaped light emitting part 31g for 120 μsec, the band-shaped light emitting part 31h for 150 μsec, the band-shaped light emitting part 31i for 180 μsec, and the band-shaped light emitting part 31j for 210 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 31f to 31j (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thus, light irradiated from the -x direction and reflected by the inner side of the P surface (hereinafter, the P inner surface) at the front end (+x-side end) of the cover glass G enters the 3rd camera 13 (see fig. 13). The image captured by the 3rd camera 13 in sequence 1 is an image in which defects (printing unevenness, polishing unevenness, scratches, etc.) in the P inner surface at the front end of the cover glass G appear shiny (see fig. 12).
(Sequence 2) When the printed portion of the cover glass G has a pearl coating, the integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 31b for 300 μsec and the band-shaped light emitting part 31g for 300 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 31b and 31g (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from 17° on the +x side and 17° on the -x side and reflected by the printed portion of the cover glass G enters the 3rd camera 13.
When the printed portion of the cover glass G carries printing other than pearl coating (such as monochrome printing), the integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 31a for 300 μsec and the band-shaped light emitting part 31f for 300 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 31a and 31f. At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from 8° on the +x side and 8° on the -x side and reflected by the printed portion of the cover glass G enters the 3rd camera 13.
The image captured by the 3rd camera 13 in sequence 2 is an image in which a defective portion of the print has a contrast different from the rest of the printed portion (for example, an image in which the defective portion appears darker or brighter than the surrounding print) (see fig. 12).
(Sequence 3) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 31a for 100 μsec, the band-shaped light emitting part 31b for 120 μsec, the band-shaped light emitting part 31c for 150 μsec, the band-shaped light emitting part 31d for 180 μsec, and the band-shaped light emitting part 31e for 210 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 31a to 31e (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from the +x direction and reflected by the P inner surface at the rear end (-x-side end) of the cover glass G enters the 3rd camera 13 (see fig. 13). The image captured by the 3rd camera 13 in sequence 3 is an image in which defects in the P inner surface at the rear end of the cover glass G appear shiny (see fig. 12).
(Sequence 4) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 32a for 200 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting part 32a (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from the -y direction and reflected by the P inner surface at the left end (+y-side end) of the cover glass G enters the 3rd camera 13. The image captured by the 3rd camera 13 in sequence 4 is an image in which defects in the P inner surface at the left end of the cover glass G appear shiny (see fig. 12).
(Sequence 5) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting parts 33a, 33f to 33i, and 31f to 31j for 3 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 33a, 33f to 33i, and 31f to 31j (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from the +y direction and from between the +y direction and the -x direction, and reflected by the front side of the P surface (hereinafter, simply the P surface) on the left end side and the left rear end side of the cover glass G, enters the 3rd camera 13 (see fig. 13). The image captured by the 3rd camera 13 in sequence 5 is an image in which defects in the P surface on the left end side and the left rear end side of the cover glass G are reflected (see fig. 12). In this image, most defects on the P surface, such as scratches and foreign matter, appear dark.
(Sequence 6) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting parts 33b to 33e for 3 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 33b to 33e (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from between the +y direction and the +x direction and reflected by the P surface from the left end to the front end side of the cover glass G enters the 3rd camera 13. The image captured by the 3rd camera 13 in sequence 6 is an image in which defects in the P surface from the left end to the front end side of the cover glass G are reflected (see fig. 12).
(Sequence 7) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting part 33a for 200 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting part 33a (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from the +y direction and reflected by the P inner surface at the right end (-y-side end) of the cover glass G enters the 3rd camera 13. The image captured by the 3rd camera 13 in sequence 7 is an image in which defects in the P inner surface at the right end of the cover glass G appear shiny (see fig. 12).
(Sequence 8) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting parts 32f to 32i for 3 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 32f to 32i (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from between the -y direction and the -x direction and reflected by the P surface from the right end to the rear end side of the cover glass G enters the 3rd camera 13. The image captured by the 3rd camera 13 in sequence 8 is an image in which defects in the P surface from the right end to the rear end side of the cover glass G are reflected (see fig. 12).
(Sequence 9) The integrated circuit 71 generates a signal for irradiating the band-shaped light emitting parts 32a to 32e and 31a to 31e for 3 μsec, and the output unit 73 outputs the signal to the band-shaped light emitting parts 32a to 32e and 31a to 31e (see fig. 11). At the same time, the integrated circuit 71 generates an imaging signal, and the output unit 73 outputs it to the 3rd camera 13. Thereby, light irradiated from the -y direction and from between the -y direction and the +x direction, and reflected by the P surface on the right end side and the right front end side of the cover glass G, enters the 3rd camera 13. The image captured by the 3rd camera 13 in sequence 9 is an image in which defects in the P surface on the right end side and the right front end side of the cover glass G are reflected (see fig. 12).
Further, while performing the output of sequence 9, the integrated circuit 71 generates a signal indicating repetition, and the output unit 73 feeds this signal back to the integrated circuit 71. On receiving the signal indicating repetition, the integrated circuit 71 returns to the beginning and repeats the process of sequentially outputting the signals of sequences 1 to 9.
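The repeating nine-sequence schedule can be sketched as a simple cyclic table. The data structure and function below are my own illustration — the part labels and irradiation times are taken from sequences 1 to 9 above, with sequence 2 shown in its pearl-coating variant:

```python
# Each entry: (sequence number, irradiated band-shaped parts, time in usec each).
SEQUENCES = [
    (1, ["31f", "31g", "31h", "31i", "31j"], [100, 120, 150, 180, 210]),
    (2, ["31b", "31g"], [300, 300]),
    (3, ["31a", "31b", "31c", "31d", "31e"], [100, 120, 150, 180, 210]),
    (4, ["32a"], [200]),
    (5, ["33a", "33f-33i", "31f-31j"], [3, 3, 3]),
    (6, ["33b-33e"], [3]),
    (7, ["33a"], [200]),
    (8, ["32f-32i"], [3]),
    (9, ["32a-32e", "31a-31e"], [3, 3]),
]

def sequence_for_frame(frame_index):
    """Return the illumination sequence used for a captured frame (0-based):
    the schedule repeats every nine frames."""
    return SEQUENCES[frame_index % len(SEQUENCES)]
```

The modulo lookup mirrors the fact that frame n of the 3rd camera is always lit by sequence (n mod 9) + 1, which is also what the frame-extraction step later relies on.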
Here, when the P inner surfaces at the left and right ends are inspected in sequences 4 and 7, the band-shaped light emitting parts 32a and 33a are irradiated for a much longer time than when the P surfaces are inspected in sequences 5, 6, 8, and 9. As a result, in sequences 4 and 7, the P surface of the cover glass G appears in the images captured by the 3rd camera 13 as a saturated, pure-white region.
The sequences 1 to 9 shown in figs. 11 and 12 and the like are examples, and the order of the inspected parts and the inspection contents can be set arbitrarily. Further, the inspection of the P inner surface at the front end of the cover glass G shown in sequence 1 and the inspection of the P surface from the left end to the front end side shown in sequence 6 are not required at the central portion and the rear end portion of the cover glass G, and the inspection of the P inner surface at the rear end shown in sequence 3 and the inspection of the P surface from the right end to the rear end side shown in sequence 8 are not required at the front end portion and the central portion. The integrated circuit 71 may therefore calculate the number of drive motor pulses from information indicating the length of the cover glass G, determine the position of the cover glass G from the number of pulses output, and omit sequences 1, 3, 6, and 8 depending on that position.
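A minimal sketch of that position-based skipping, assuming a hypothetical pulse-to-distance calibration and edge-zone width (none of these numbers appear in the text):

```python
def glass_position_mm(pulse_count, mm_per_pulse=0.01):
    """Estimate how far the glass has advanced from the number of drive
    motor pulses output (mm_per_pulse is an assumed calibration value)."""
    return pulse_count * mm_per_pulse

def sequences_to_run(position_mm, glass_length_mm, edge_zone_mm=10.0):
    """Sequences 1 and 6 are needed only near the front end, 3 and 8 only
    near the rear end; elsewhere they can be omitted."""
    seqs = {2, 4, 5, 7, 9}  # always performed
    if position_mm < edge_zone_mm:
        seqs |= {1, 6}      # front end portion under the camera
    if position_mm > glass_length_mm - edge_zone_mm:
        seqs |= {3, 8}      # rear end portion under the camera
    return sorted(seqs)
```

Skipping the end-face sequences in the central portion shortens the repetition cycle there, which is the practical benefit the text hints at.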
Fig. 14 is a timing chart of the process shown in fig. 11. For sequence 1, only the signal with the 100 μsec irradiation time is shown. The imaging signal is the same as in the case shown in fig. 10. Each signal for irradiating the stereoscopic illumination unit 30 goes High at the same time as the imaging signal goes High, and goes Low after its irradiation time has elapsed.
The light amount required when the light reflected by the P surface is made incident on the 3rd camera 13 and that required when the light reflected by the printed portion is made incident differ by a ratio of about 1:100. Therefore, the irradiation time of the stereoscopic illumination unit 30 is made different between these two cases.
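That ratio is visible in the schedule itself: the P-surface sequences use 3 μsec while the printed-portion sequence uses 300 μsec. A tiny sketch (the function and key names are my own illustration):

```python
def irradiation_time_usec(base_time_usec, target):
    """Scale a base irradiation time by the roughly 1:100 light-amount
    ratio between P-surface reflection and printed-portion reflection."""
    ratio = {"p_surface": 1, "printed_portion": 100}
    return base_time_usec * ratio[target]
```

With a 3 μsec base, this reproduces the 3 μsec / 300 μsec split used in sequences 5 to 9 versus sequence 2.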
While outputting the signals of sequences 1 to 9 described above, the integrated circuit 71 outputs the drive motor pulses to the conveying unit 50 via the output unit 73. Thus, the 3rd camera 13 performs imaging while the cover glass G is conveyed on the mounting portion 40 at a constant speed and the positions of the cover glass G and the 3rd camera 13 change relative to each other.
Thus, in a single conveyance of the cover glass G, an image in which defects in the front-end P inner surface appear bright, an image in which defects in the rear-end P inner surface appear bright, images in which defects in the left- and right-end P inner surfaces appear bright, images in which defects in the P surface are reflected, and an image in which defects in the printed portion appear darker or brighter than the rest of the print can each be captured at its own timing. The process of capturing the reflected images of the cover glass G with the 3rd camera 13 continues until the cover glass G has passed the 3rd camera 13. When the cover glass G has passed the 3rd camera 13, a detection signal from the position detection sensor 82 is input to the integrated circuit 71. On receiving the detection signal, the integrated circuit 71 stops the signal output to the 3rd camera 13 and the stereoscopic illumination unit 30.
After that, the integrated circuit 71 outputs a driving motor pulse to the transfer unit 50 by a predetermined number of pulses, and moves the cover glass G to the processing end position. Then, the integrated circuit 71 ends a series of processes.
When the series of processes is completed, the images captured by the 1 st camera 11, the 2 nd camera 12, and the 3 rd camera 13 are output to the PC100 via the output unit 73. The CPU101 generates an image for inspection from the images captured by the 1 st camera 11, the 2 nd camera 12, and the 3 rd camera 13. In the present embodiment, the CPU101 extracts images captured in the same illumination mode from among the images captured by the 1 st camera 11, the 2 nd camera 12, and the 3 rd camera 13, and connects them to generate a planar image. The image generation process will be described below.
The CPU101 generates the transmission image and the regular reflection images from the images captured by the 1st camera 11 and the 2nd camera 12. In the process of capturing the transmission image and the specular reflection image, the irradiation patterns of sequences 1 to 3 are repeated as shown in figs. 9 and 10. Therefore, the CPU101 extracts every third frame (the 1st frame, the 4th frame, …) from the images captured by the 2nd camera 12, with the 1st frame as the reference, and generates a two-dimensional image by connecting these frames. Thereby, the transmission image (plane image) of the cover glass G is generated. As shown in fig. 15, the transmission image is an image in which the opaque printed portion appears dark, and the print defect portion (the dotted-circle portion in fig. 15) can be confirmed.
The CPU101 extracts every third frame (the 2nd frame, the 5th frame, …) from the images captured by the 1st camera 11, with the 2nd frame as the reference, and generates a two-dimensional image by connecting these frames. This produces the upper regular reflection image, that is, the regular reflection image (plane image) of the front surface of the cover glass G. As shown in fig. 16, the regular reflection image is an image in which defect-free portions appear bright and flaws appear dark (a scratch is illustrated in fig. 16). In addition, even where a scratch overlaps the printed portion, the scratch appears darker than its surroundings because the reflectance of the glass surface is higher than that of the printed portion.
Further, the CPU101 extracts every third frame (the 3 rd frame and the 6 th frame …) from the images captured by the 2 nd camera 12 with reference to the 3 rd frame, and generates a two-dimensional image by connecting these images. This generates a lower regular reflection image, that is, a regular reflection image (planar image) on the back surface of the cover glass G.
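The every-third-frame extraction in the three passages above is a strided selection over the line-camera frame stream. A minimal pure-Python sketch — each frame is treated as one scan line, and the function name is my own:

```python
def extract_plane_image(frames, first_frame, period=3):
    """Collect every `period`-th frame starting from `first_frame`
    (1-based, as in the text) and stack the scan lines into a plane image."""
    return frames[first_frame - 1::period]
```

With period=3, a first_frame of 1 yields the transmission image, 2 the upper regular reflection image, and 3 the lower regular reflection image.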
Further, the CPU101 generates a reflected image from the image captured by the 3 rd camera 13. In the process of capturing the reflected image, the irradiation in the irradiation patterns of the order 1 to 9 is repeated as shown in fig. 11 to 13. Therefore, the CPU101 extracts every ninth frame (1 st frame, 10 th frame …) from the images captured by the 3 rd camera 13 with reference to the 1 st frame, and generates a two-dimensional image by connecting these images. As a result, as shown in fig. 17, a plane image of the cover glass G in which the defect in the inner surface P of the front end of the cover glass G is bright is generated.
The CPU101 extracts every ninth frame (2 nd frame, 11 th frame …) from the images captured by the 3 rd camera 13 with reference to the 2 nd frame, and generates a two-dimensional image by connecting these images. As a result, as shown in fig. 18, a planar image of the cover glass G is generated in which the defects of the printed portion of the cover glass G are darker than those of the other printed portions.
For convenience of explanation, fig. 17 and 18 show a part of the plane image enlarged and black lines around the defect.
The CPU101 extracts every ninth frame (the 3 rd frame and the 12 th frame …) from the images captured by the 3 rd camera 13 with reference to the 3 rd frame, and generates a two-dimensional image by connecting these images. Thereby, a plane image of the cover glass G in which the defect in the inner surface P of the rear end of the cover glass G is shiny is generated.
The CPU101 extracts every ninth frame (the 4th frame, the 13th frame, …) from the images captured by the 3rd camera 13, with the 4th frame as the reference, and generates a two-dimensional image by connecting these frames. Thereby, a plane image of the cover glass G in which defects in the P inner surface at the left end of the cover glass G appear shiny is generated.
The CPU101 extracts every ninth frame (the 5 th frame and the 14 th frame …) from the images captured by the 3 rd camera 13 with reference to the 5 th frame, and generates a two-dimensional image by connecting these images. Thereby, a planar image of the cover glass G in which the defects in the P surface on the left end side and the rear end side of the cover glass G are reflected is generated.
The CPU101 extracts every ninth frame (6 th frame, 15 th frame …) from the images captured by the 3 rd camera 13 with reference to the 6 th frame, and generates a two-dimensional image by connecting these images. Thereby, a planar image of the cover glass G is generated in which the defect in the P surface on the tip side is reflected from the left end of the cover glass G.
The CPU101 extracts every ninth frame (7 th frame, 16 th frame …) from the images captured by the 3 rd camera 13 with reference to the 7 th frame, and generates a two-dimensional image by connecting these images. Thereby, a plane image of the cover glass G in which the defect in the inner surface P of the right end of the cover glass G is shiny is generated.
The CPU101 extracts every ninth frame (8 th frame, 17 th frame …) from the images captured by the 3 rd camera 13 with reference to the 8 th frame, and generates a two-dimensional image by connecting these images. Thereby, a planar image of the cover glass G is generated in which the defects in the surface P on the rear end side are reflected from the right end of the cover glass G.
The CPU101 extracts every ninth frame (9 th frame, 18 th frame …) from the images captured by the 3 rd camera 13 with reference to the 9 th frame, and generates a two-dimensional image by connecting these images. Thereby, a planar image of the cover glass G is generated in which defects are reflected in the surface P on the right end side and the front end side of the cover glass G.
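The nine reflected-image extractions above generalize to one strided selection per illumination mode. A compact sketch (naming is my own illustration):

```python
def build_plane_images(frames, period=9):
    """Split the 3rd camera's frame stream into one plane image per
    illumination sequence: sequence n collects frames n, n+9, n+18, ...
    (1-based, matching the text)."""
    return {seq: frames[seq - 1::period] for seq in range(1, period + 1)}
```

One pass over the stream thus yields all nine plane images, each equivalent to what a dedicated camera under that single illumination mode would have captured.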
According to the present embodiment, since the cover glass G is irradiated with light from various directions by the stereoscopic illumination unit 30 during imaging, defects can be inspected in a single inspection with a single optical inspection apparatus 1, regardless of where on the end face of the cover glass G a defect is located.
In particular, in the 1 st region 31, the light emitting portions 30a are arranged in the y direction in regions other than the vicinity of the center plane S1. Therefore, the cover glass G can be irradiated with a band-shaped light, and the region near the central axis ax can be inspected in the same manner regardless of the position in the y direction.
Further, since the plurality of band-shaped light emitting parts 31a to 31j are provided, light can be irradiated onto the region near the central axis ax from various angles. To capture the entire image of a defect on the P surface, it is important to irradiate light from light sources whose optical axes form angles with the center plane S1 ranging from about 8 degrees to about 44 degrees. Providing the plurality of band-shaped light emitting parts 31a to 31j is therefore important for capturing the entire image of defects on the P surfaces at the front end and the rear end. In the 2nd region 32 and the 3rd region 33, since the band-shaped light emitting parts 32a and 33a are provided at positions on the center plane S1, the entire image of defects can be captured on the left and right P surfaces.
Further, according to the present embodiment, by arranging the band-shaped light emitting parts 31b and 31g such that the angle θ2 between their optical axes and the center plane S1 is substantially 17 degrees, and by capturing with the 3rd camera 13 the light scattered when it is reflected by the pearl print, it is possible to capture an image in which the contrast of a defect is greater than the luster of the pearl material.
For example, when imaging is performed with the band-shaped light emitting parts 31a and 31f, whose optical axes form an angle θ1 of substantially 8 degrees with the center plane S1, the luster of the pearl material is captured strongly. Conversely, when imaging is performed with the band-shaped light emitting parts 31c and 31h (angle θ3 of substantially 26 degrees) or with the band-shaped light emitting parts 31d and 31i (angle θ4 of substantially 35 degrees), the contrast of defective portions in the pearl coating is reduced.
In contrast, as in the present embodiment, by irradiating the cover glass G with light at an angle of 17 degrees from the normal direction, it is possible to capture an image in which the contrast of a defective portion of the pearl coating is lower or higher than that of the other portions, while the luster of the pearl material hardly reaches the threshold used for defect detection. Therefore, defects in the pearl coating can easily be detected from the captured image.
Further, according to the present embodiment, by arranging the band-shaped light emitting parts 31a and 31f such that the angle θ1 between their optical axes and the center plane S1 is substantially 8 degrees, and irradiating the cover glass G with light at an angle of 8 degrees from the normal direction, it is possible to capture an image in which the contrast of a defective portion of monochrome printing is lower or higher than that of the other portions.
In the present embodiment, the light amount can be varied according to what is being imaged by changing the irradiation times of the coaxial illumination unit 20 and the stereoscopic illumination unit 30, rather than by changing the imaging time of the imaging unit 10.
In the present embodiment, the stereoscopic illumination unit 30 is provided in correspondence with the 3rd camera 13, images captured in the same illumination mode are selected from among the images captured by the 3rd camera 13, and a separate plane image is generated for each illumination mode. Images equivalent to those obtained with a plurality of cameras can thus be realized with a minimum number of cameras.
In the present embodiment, since the optical inspection apparatus 1 includes both the band-shaped light emitting portions 31a and 31f and the band-shaped light emitting portions 31b and 31g, it is possible to capture an image in which color unevenness of the printed portion can be detected by the same apparatus in both the case of using the pearlescent pigment and the case of not using the pearlescent pigment.
In the present embodiment, the angle θ 1 formed by the optical axis of the band-shaped light emitting parts 31a and 31f and the central plane S1 is substantially 8 degrees, but the angle θ 1 may be substantially 8 to 10 degrees. The angles θ 2 to θ 5 are not limited to the illustrated angles.
In the present embodiment, the PC100 generates the images for defect detection, but the PC100 may also detect defects of the cover glass G based on the generated images. For example, in the image shown in fig. 17, a defect can be detected by comparing pixel values with a threshold value; in the image shown in fig. 18, a defect can be detected by calculating the average value of a plurality of pixels and comparing that average with a threshold value.
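Both detection rules can be sketched in a few lines. The helpers below are illustrative only — the thresholds, window size, and names are assumptions, not values from the text:

```python
def detect_bright_defects(image, threshold):
    """Return (row, col) positions of pixels brighter than a threshold,
    e.g. shiny defects in the P-inner-surface plane images."""
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > threshold]

def detect_dark_defects_averaged(image, threshold, window=3):
    """Average over a small window before thresholding, as suggested for
    lower-contrast defects such as print unevenness; returns the top-left
    (row, col) of each window whose mean falls below the threshold."""
    h, w = len(image), len(image[0])
    hits = []
    for r in range(h - window + 1):
        for c in range(w - window + 1):
            mean = sum(image[r + i][c + j]
                       for i in range(window)
                       for j in range(window)) / window ** 2
            if mean < threshold:
                hits.append((r, c))
    return hits
```

Per-pixel thresholding suits bright, high-contrast defects, while windowed averaging suppresses pixel noise when the defect differs from its surroundings only slightly.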
In the present embodiment, the cylindrical lens 30c is provided on the optical axis of the band-shaped light emitting portions 31a to 31j, but the cylindrical lens 30c is not essential. However, when the cylindrical lens 30c is provided, the light emitted from the light emitting section 30a can be condensed by the cylindrical lens 30c to the vicinity of the central axis ax, and the light amount directed to the 3 rd camera 13 can be increased. Therefore, the imaging frequency of the imaging unit 10 can be increased (the imaging time can be shortened).
In the present embodiment, as shown in fig. 11, the band-shaped light-emitting portions 32a, 32b to 32e, and 32f to 32i are connected to different channels, and the band-shaped light-emitting portions 33a, 33b to 33e, and 33f to 33i are connected to different channels, but the relationship between the band-shaped light-emitting portions 32a to 32i, and 33a to 33i and the channels is not limited to this. For example, the band-shaped light emitting units 32b to 32i and 33b to 33i may be connected to one channel, or the band-shaped light emitting units 32a to 32i and 33a to 33i may be connected to different channels.
In the present embodiment, irradiation times are given for the respective irradiation patterns shown in figs. 9, 10, 11, and 13, but these values are merely examples and the irradiation times are not limited to them. Also, although the irradiation times in these irradiation patterns are specified in μsec, they may instead be specified as a ratio to the imaging time.
In the present embodiment, the light emitting blocks 30b and 30d include the light emitting portions 30a arranged in a row, but the light emitting blocks 30b and 30d may further include a light diffusion plate. Fig. 19 is a diagram schematically showing a light emitting block 30b-1 according to a modification: fig. 19(A) is a side view, and fig. 19(B) shows the state of fig. 19(A) as viewed from below.
In the light emitting block 30b-1, a lenticular lens 30e serving as a light diffusion plate is provided adjacent to the light emitting portions 30a. The lenticular lens 30e is provided so as to cover the plurality of light emitting portions 30a. The lenticular lens 30e is formed by arranging many elongated convex lenses of arch-shaped cross section at a uniform pitch, and it diffuses the light component in the direction of that arrangement (the direction orthogonal to the longitudinal direction of the convex lenses). In the light emitting block 30b-1, the arrangement direction of the convex lenses is the same as that of the light emitting portions 30a. Therefore, the light emitted from the light emitting portions 30a is diffused by the lenticular lens 30e, and the light from the plurality of light emitting portions 30a can be turned into a single elongated surface light source.
For example, if the lenticular lens 30e were not provided, then when inspection is performed based on light regularly reflected by the front-end P surface and/or the rear-end P surface of the cover glass G, the point-like lights of the individual light emitting portions 30a could be reflected on those surfaces; providing the lenticular lens 30e so as to cover the light emitting portions 30a prevents this problem.
For the inspection of the front-end P surface and the rear-end P surface, a method is also conceivable in which the band-shaped light emitting parts 31b and 31g are made to emit weak light and the reflections of that light on the front-end P surface and the rear-end P surface are imaged. In this case, without the lenticular lens 30e on the band-shaped light emitting parts 31b and 31g, point-like lights are projected on the front-end P surface and the rear-end P surface, and the inspection cannot be performed smoothly. When the lenticular lens 30e is provided on the band-shaped light emitting parts 31b and 31g, on the other hand, a continuous band of light is projected on the front-end P surface and the rear-end P surface, so defects related to the polishing of these surfaces can be inspected by checking whether the projected light is bent.
In fig. 19, a single lenticular lens 30e is provided in one light emitting block 30b-1, but the number of lenticular lenses 30e is not limited to this. The plurality of light emitting portions 30a may instead be covered by an array of a plurality of lenticular lenses. Further, the light diffusion plate is not limited to a lenticular lens.
< embodiment 2 >
In embodiment 1 of the present invention, an image using coaxial illumination and an image using stereoscopic illumination are captured by different cameras, but an image using coaxial illumination and an image using stereoscopic illumination may be captured by the same camera.
Embodiment 2 of the present invention is a mode in which two cameras perform imaging. The optical inspection apparatus 2 according to embodiment 2 will be described below. The same portions as those of the optical inspection apparatus 1 are denoted by the same reference numerals, and description thereof is omitted.
Fig. 20 is a front view showing an outline of the optical inspection apparatus 2 according to embodiment 2. The optical inspection apparatus 2 mainly includes: an imaging unit 10A, a coaxial illumination unit 20, a stereoscopic illumination unit 30, a mounting unit 40, and a conveying unit 50 (not shown).
The imaging unit 10A includes: a 1st camera 11 (corresponding in the present embodiment to the 1st imaging unit of the present invention) and a 2nd camera 12 (corresponding to the 2nd imaging unit of the present invention). The 1st camera 11 and the 2nd camera 12 are arranged in the same manner as in the optical inspection apparatus 1.
The stereoscopic illumination unit 30 is provided between the 1st camera 11 and the mounting unit 40, at a position where the central axis ax intersects the optical axis of the 1st camera 11. Since the angle θ1 (see Fig. 4) formed between the optical axes of the band-shaped light emitting parts 31a and 31f and the central plane S1 is substantially 8 degrees, the light from the upper coaxial illumination 21 is not blocked by the stereoscopic illumination unit 30.
The processing performed by the optical inspection apparatus 2 configured as described above will now be described. The integrated circuit 71 (corresponding to the 3rd control unit of the present invention) generates drive motor pulses for driving the rollers 40a of the mounting unit 40, and the output unit 73 outputs these pulses to the conveying unit 50. Thereby, the cover glass G moves on the mounting unit 40 at a constant speed in the conveying direction F.
When the position detection sensor 81 detects that the cover glass G has passed under the 1 st camera 11 and the 2 nd camera 12, a detection signal is input from the position detection sensor 81 to the integrated circuit 71 via the input unit 72. When the detection signal is input, the integrated circuit 71 starts the process of capturing the reflected image by the 1 st camera 11 and the process of capturing the transmitted image and the specular reflected image by the 1 st camera 11 and the 2 nd camera 12.
Fig. 21 is a diagram showing the correspondence between the order of shooting processing and images shot by the 1 st camera 11 and the 2 nd camera 12.
The processing for capturing the reflected image by the 1st camera 11 (sequences 1 to 9 in Fig. 21) is the same as the processing performed in the optical inspection apparatus 1 (Figs. 11, 12, and 14), except that the imaging signal generated by the integrated circuit 71 is output to the 1st camera 11; a detailed description is therefore omitted.
After finishing the processes of sequences 1 to 9, the integrated circuit 71 starts the process of capturing the transmitted image and the specular reflected image shown in sequences 10 to 12. The processing of sequences 10 to 12 is the same as the processing shown in Figs. 9 and 10 (sequences 1 to 3 there), and a detailed description is therefore omitted.
The integrated circuit 71 generates a signal indicating repetition of the processing while performing the output of sequence 12, and outputs it via the output unit 73. On receiving this signal, the integrated circuit 71 returns the processing to the beginning and repeats the output of the signals of sequences 1 to 12 in order.
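The repeating schedule of Fig. 21 can be sketched as a simple control loop. The sequence numbers and camera assignments follow the text (sequences 1 to 9 capture the reflected image with the 1st camera; sequences 10 to 12 capture the transmitted and specular reflected images with the 1st and 2nd cameras), while the function and callback names are hypothetical:

```python
# Sequences 1-9: reflected-image captures by the 1st camera.
# Sequences 10-12: transmitted / specular-reflected captures by both cameras.
SCHEDULE = ([(seq, ("camera1",)) for seq in range(1, 10)]
            + [(seq, ("camera1", "camera2")) for seq in range(10, 13)])

def run_cycles(n_cycles, trigger):
    """Emit imaging triggers for n_cycles full passes over sequences 1-12.

    trigger(seq, cameras) stands in for the integrated circuit 71
    outputting an imaging signal to the camera(s) (hypothetical callback).
    After sequence 12 the processing returns to sequence 1, as in the text.
    """
    for _ in range(n_cycles):
        for seq, cameras in SCHEDULE:
            trigger(seq, cameras)
```

For example, running two cycles produces 24 triggers, with sequence 10 being the first to address both cameras.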
According to the present embodiment, defects can be inspected by one inspection using one optical inspection apparatus 2. Further, the reflected image, the transmitted image, and the specular reflected image can be captured with the minimum number of cameras (two cameras).
In the present embodiment, since the half mirror 21h is provided on the optical axis of the 1st camera 11, the amount of light irradiated from the stereoscopic illumination unit 30 that enters the 1st camera 11 is substantially half the amount that enters the 3rd camera 13 in the optical inspection apparatus 1. Therefore, the light emitted from the stereoscopic illumination unit 30 is preferably made brighter. Further, since the number of images captured by the 1st camera 11 is larger than in embodiment 1, brightening the light from the stereoscopic illumination unit 30 is also preferable in that it allows the imaging frequency to be increased.
< embodiment 3 >
In embodiment 1 of the present invention, the cover glass G having the circular-arc-shaped P-surface formed around its periphery is inspected, but the form of the cover glass is not limited to this. Recently, cover glasses in which the peripheral curved surface is deep and the curved portion has a partial cylindrical or elliptic cylindrical shape have come into use.
Embodiment 3 of the present invention is a form for inspecting a cover glass whose periphery has a partial cylindrical or elliptic cylindrical shape. The optical inspection apparatus 3 according to embodiment 3 will be described below. The same portions as those of the optical inspection apparatus 1 are denoted by the same reference numerals, and their description is omitted.
Fig. 22 is a front view showing an outline of the optical inspection apparatus 3 according to embodiment 3. The optical inspection apparatus 3 mainly includes: an imaging unit 10, a coaxial illumination unit 20, a stereoscopic illumination unit 30, a mounting unit 40, a conveying unit 50 (not shown), a side surface inspection unit 60, and a height obtaining unit 90.
Fig. 23 is a perspective view showing a part of the optical inspection apparatus 3 in an enlarged manner. The side surface inspection unit 60 mainly includes optical elements 61 and 62 for focal length adjustment and mirrors 63 and 64. These are positioned on the substantially vertical central plane S1 that includes the 3rd camera 13.
The optical elements 61 and 62 for focal length adjustment are optical elements for adjusting the focal length of the 3 rd camera 13. In the present embodiment, thick plate-shaped glass plates are used as the optical elements 61 and 62 for focal length adjustment. The optical elements 61 and 62 for focal length adjustment are disposed so that both end surfaces substantially orthogonal to the plate thickness direction are horizontal.
The focal length adjusting optical elements 61 and 62 are substantially at the same position in the x and z directions, and are disposed so as to face each other across a line ax1 extending in the z direction through the center of the 3rd camera 13. The optical elements 61 and 62 for focal length adjustment are provided above (on the +z side of) the stereoscopic illumination unit 30.
The mirrors 63 and 64 are members that reflect the image of the side surface of the cover glass G1 and guide it to the 3rd camera 13. The mirrors 63 and 64 are substantially plate-shaped and are provided adjacent to the mounting portion 40; in the present embodiment, each is disposed between adjacent rollers 40a.
The mirror 63 and the mirror 64 are substantially at the same position in the x and z directions, and are provided so as to face each other across the line ax1 extending in the z direction through the center of the 3rd camera 13. In addition, in a plan view, the mirrors 63 and 64 are each provided outside the mounting area and adjacent to it in the direction substantially orthogonal to the conveying direction F. Here, the mounting area is the area on the mounting portion 40 on which the cover glass G1 is mounted, and includes the area vertically below the 3rd camera 13. Fig. 23 illustrates a state in which the cover glass G1 is mounted in the mounting area.
Fig. 24 is a diagram showing a schematic configuration in a state where the optical inspection apparatus 3 is cut off at the center plane S1. Fig. 24 shows a state viewed from the downstream side (+ x direction) of the conveying direction F. The two-dot chain line in fig. 24 schematically shows the path of light incident to the 3 rd camera 13.
The reflecting surfaces 63a and 64a of the mirrors 63 and 64 are substantially flat, and are provided so as to extend substantially in the conveying direction F such that the line on which each intersects the central plane S1 (the lines indicating the reflecting surfaces 63a and 64a in Fig. 24) is inclined with respect to the horizontal plane.
On the central plane S1, the mirror 63 and the mirror 64 are located on either side of the cover glass G1. In a state of being mounted on the mounting portion 40, the cover glass G1 has a plane Ga parallel to the horizontal direction and side surfaces Gb inclined with respect to the horizontal direction. Each side surface Gb is a partial cylindrical or elliptic cylindrical surface, and its inclination with respect to the horizontal direction is approximately 30 to 45 degrees. The ends Ge of the side surfaces Gb on both sides rest on the rollers 40a.
The image of the plane Ga is guided to the imaging lens 13a without passing through the optical elements 61 and 62 for focal length adjustment. In other words, the optical elements 61 and 62 for focal length adjustment are not present on a line connecting the imaging lens 13a and the plane Ga.
On the other hand, the image of the side surface Gb is reflected by the reflection surfaces 63a and 64a, passes through the optical elements 61 and 62 for focal length adjustment, and is guided to the imaging lens 13a. In other words, the optical elements 61 and 62 for focal length adjustment are arranged so as to overlap a line connecting the imaging lens 13a and the reflection surfaces 63a and 64a.
In the present embodiment, the optical elements 61 and 62 for focal length adjustment have inner portions thereof partially cut away so that the optical elements 61 and 62 for focal length adjustment do not lie on a line connecting the imaging lens 13a and the plane Ga.
The focal position F1 when light does not pass through the optical elements 61 and 62 for focal length adjustment is the position of the plane Ga of the cover glass G1. The plate thickness of the cover glass G1 is approximately 0.5 mm, and the focal position F1 is preferably located near the center of the plane Ga in the plate thickness direction.
When light passes through the optical elements 61 and 62 for focal length adjustment, it is refracted both on entering and on exiting them, so the focal position F2 is located farther away than the focal position F1. When the plate thickness of the optical elements 61 and 62 for focal length adjustment is T and the refractive index of the glass is n, the extension of the focal position caused by the optical elements 61 and 62 (see the black arrows in Fig. 24) can be expressed as T - T/n. For example, when T = 12 mm and n = 1.5, passing through the optical elements 61 and 62 extends the focal position by 4 mm. That is, the focal position F2 is located 4 mm on the -z side of the focal position F1.
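The relation T - T/n can be checked numerically; a minimal sketch, in which only the function name is an invention of this example:

```python
def focal_shift(plate_thickness_mm, refractive_index):
    """Extension of the focal position caused by inserting a plane-parallel
    glass plate of thickness T and refractive index n into the imaging
    path: shift = T - T/n, the relation given in the text."""
    return plate_thickness_mm - plate_thickness_mm / refractive_index

# Worked example from the text: T = 12 mm, n = 1.5 gives a 4 mm shift.
print(focal_shift(12, 1.5))  # 4.0
```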
Fig. 25 is a diagram showing the relationship between the position of the cover glass G1 and the image captured by the 3rd camera 13: (A) is an enlarged view of a side portion of the cover glass G1, and (B) shows a part of the image captured by the 3rd camera 13. In Fig. 25(A), the reflecting surface 63a is shown by a dotted line, and the path of light reflected by the reflecting surface 63a and incident on the 3rd camera 13 is shown by a two-dot chain line.
The image of the region (region I) consisting of the plane Ga of the cover glass G1 and a part of the side surface Gb is guided to the imaging lens 13a without passing through the optical elements 61 and 62 for focal length adjustment. Since the focal position F1 is at the plane Ga, the image of region I is sharp on the plane Ga, while the part showing the side surface Gb is slightly out of focus and blurred (see the hatched portion in Fig. 25).
The image of the side surface Gb and of a region (region II) of the plane Ga near the side surface Gb is reflected by the reflection surface 63a (the same applies to the reflection surface 64a) and guided to the imaging lens 13a. Therefore, the image of region II is inverted left and right: the end Ge of the side surface Gb appears on the inner side, and the plane-Ga side of the side surface Gb appears on the outer side.
The image of region II is guided to the imaging lens 13a through the optical elements 61 and 62 for focal length adjustment. Since these elements place the focal position F2 lower than the focal position F1, most of the side surface Gb forms a sharp, in-focus image. In the present embodiment, by making the depth of focus of the 3rd camera 13 substantially equal to the height of the side surface Gb, a sharp, in-focus image can be obtained over the entire side surface Gb. The boundary between the plane Ga and the side surface Gb is slightly out of focus and blurred (see the hatched portion in Fig. 25). The innermost side of the image of region II is a portion where the cover glass G1 is not present, so the 3rd camera 13 captures a black image there.
The regions I and II are preferably set so as to partially overlap. In this way, two images of the boundary portion between the plane Ga and the side surface Gb are obtained, one passing through the optical elements 61 and 62 for focal length adjustment and one not, so that even when one image is out of focus, a defect at the boundary portion can be detected.
The explanation returns to Fig. 23. The optical inspection apparatus 3 includes a height obtaining unit 90 for obtaining the height of the cover glass G1. The height obtaining unit 90 is provided upstream (on the -x side) of the side surface inspection unit 60 in the conveying direction F, and mainly includes: a surface light source 91, a camera 92, and a reflecting mirror 93.
Figs. 26 and 27 are diagrams schematically showing a state in which the height of the cover glass G1 is measured; Fig. 26 is viewed from a direction substantially orthogonal to the conveying direction F (here, the -y direction), and Fig. 27 is viewed along the conveying direction F (here, from the +x direction). The two-dot chain line in Fig. 27 shows the path of light irradiated from the surface light source 91.
The surface light source 91 on one side, and the camera 92 and the reflecting mirror 93 on the other, are provided so as to sandwich the mounting portion 40. The surface light source 91 irradiates light in a direction substantially orthogonal to the conveying direction F (here, the +y direction). Light irradiated from the surface light source 91, passing the cover glass G1 and then reflected by the reflecting mirror 93, is incident on the camera 92. The camera 92 thus obtains a shadow image in which part of the light is blocked by the cover glass G1 and the rest is bright, so the height of the cover glass G1 can be accurately obtained. The reflecting mirror 93 is not essential; it is sufficient that the light irradiated from the surface light source 91 and passing the cover glass G1 enters the camera 92.
The front end Gc and the rear end Gd of the cover glass G1 are lower than the plane Ga. Therefore, the changes in height of the front end Gc and the rear end Gd can be obtained by irradiating light from the surface light source 91 in the y direction while the mounting unit 40 moves the cover glass G1 in the conveying direction F, and imaging the light that has passed the cover glass G1 with the camera 92.
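Extracting the glass height from such a backlit shadow image can be sketched as follows: in each image column, the topmost dark pixel marks the top edge of the glass silhouette, and the height is the pixel distance down to the mounting surface times the pixel pitch. This is a hypothetical illustration; the function name, threshold, and scaling are not from the source.

```python
def height_profile(frame, mount_row, mm_per_pixel, dark_threshold=50):
    """Per-column height of the cover glass from one backlit shadow frame.

    frame: 2D list of grayscale values with row 0 at the top; the glass
    appears as a dark silhouette against the bright surface light source.
    mount_row: image row corresponding to the mounting surface.
    Returns one height in mm per column (0.0 where no shadow is found).
    """
    heights = []
    for col in range(len(frame[0])):
        # Topmost dark pixel in this column marks the top edge of the glass.
        top = next((row for row in range(mount_row)
                    if frame[row][col] < dark_threshold), mount_row)
        heights.append((mount_row - top) * mm_per_pixel)
    return heights
```

Scanning successive frames while the glass is conveyed would then yield the height changes of the front end Gc and the rear end Gd.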
The explanation returns to Fig. 23. The 3rd camera 13 is provided with a moving unit 95 for moving the 3rd camera 13 in the vertical direction. The moving unit 95 includes an actuator (not shown) as a driving source and a moving mechanism (not shown) that transmits the drive of the actuator to move the 3rd camera 13 in the vertical direction. Various known mechanisms such as a feed screw can be used as the moving mechanism.
The CPU 101 (see Fig. 7) controls the moving unit 95 based on the image captured by the camera 92, moving the 3rd camera 13 in the vertical direction in accordance with the height changes of the front end Gc and the rear end Gd. The ROM 103 (see Fig. 7) stores the number of pulses required to convey the cover glass G1 in the conveying direction F from the point where it is imaged by the camera 92 to the point directly below the 3rd camera 13. The ROM 103 also holds the relationship between the drive amount of the actuator and the movement amount of the 3rd camera 13. Based on the information stored in the ROM 103, the CPU 101 drives the actuator of the moving unit 95, thereby moving the 3rd camera 13 up and down in accordance with the height change of the cover glass G1 passing under the 3rd camera 13.
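The hand-off between the upstream measurement at the camera 92 and the later movement of the 3rd camera 13 can be sketched with a small pulse-counted FIFO. The class, the pulse offset, and the drive gain below are hypothetical stand-ins for the values the text says are held in the ROM 103.

```python
from collections import deque

class CameraTracker:
    """Replays heights measured upstream (camera 92) once the glass has
    been conveyed under the 3rd camera, then converts the height change
    into an actuator drive amount. All numeric values are hypothetical."""

    def __init__(self, delay_pulses, steps_per_mm):
        self.delay = delay_pulses    # pulses from camera 92 to 3rd camera (ROM value)
        self.gain = steps_per_mm     # actuator steps per mm of camera travel (ROM value)
        self.queue = deque()         # (pulse_count_at_measurement, height_mm)
        self.pulse = 0
        self.camera_height = 0.0

    def measure(self, height_mm):
        """Record one height measured by the camera 92."""
        self.queue.append((self.pulse, height_mm))

    def on_pulse(self):
        """Called once per conveyor drive pulse; returns the actuator
        drive amount (steps) issued on this pulse, or 0."""
        self.pulse += 1
        if self.queue and self.pulse - self.queue[0][0] >= self.delay:
            _, h = self.queue.popleft()
            steps = round((h - self.camera_height) * self.gain)
            self.camera_height = h
            return steps
        return 0
```

Each conveyor pulse advances the count; once a queued measurement is `delay_pulses` old, the glass is below the 3rd camera and the height difference is converted into a drive amount.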
According to the present embodiment, by extending the focal length of the 3rd camera 13 using the optical elements 61 and 62 for focal length adjustment and by reflecting the light incident on the 3rd camera 13 using the mirrors 63 and 64, defects on the side surface Gb can be inspected in a single inspection even for the cover glass G1 having the side surface Gb.
Further, according to the present embodiment, various kinds of cover glass can be handled by changing the positions and inclinations of the mirrors 63 and 64 according to the shape of the cover glass. For example, by setting the inclination of the reflecting surfaces 63a and 64a with respect to the horizontal direction according to the inclination of the side surface Gb, defects of the side surface Gb can be inspected regardless of its inclination with respect to the horizontal direction. Likewise, by changing the y-direction positions of the mirrors 63 and 64 according to the width of the cover glass, defects of the side surfaces Gb can be inspected regardless of the width of the cover glass.
Further, according to the present embodiment, by moving the 3 rd camera 13 in the vertical direction in accordance with the change in the height of the cover glass G1 below the 3 rd camera 13, even when the heights of the front end Gc and the rear end Gd change, it is possible to capture a sharp image in focus with respect to the front end Gc and the rear end Gd by the 3 rd camera 13. Therefore, the defect can be reliably inspected also for the front end Gc and the rear end Gd.
In the present embodiment, thick plate-shaped glass plates are used as the optical elements 61 and 62 for focal length adjustment, but their form is not limited to this. For example, concave lenses may be used as the focal length adjusting optical elements 61 and 62. Although the thickness of a glass plate or the shape of a concave lens is chosen according to the shape of the cover glass, a concave lens extends the focal length more readily; therefore, when the height of the side surface Gb is low, glass plates are preferably used as the optical elements 61 and 62 for focal length adjustment.
In the present embodiment, the CPU 101 controls the moving unit 95 based on the image captured by the camera 92 so as to move the 3rd camera 13 up and down in accordance with the height change of the cover glass G1 passing under the 3rd camera 13, but the method of doing so is not limited to this. For example, information on the height changes of the front end Gc and the rear end Gd may be stored in the ROM 103 in advance, and the CPU 101 may control the moving unit 95 based on that information. Alternatively, the height of the cover glass G1 may be measured by a laser displacement meter, and the CPU 101 may control the moving unit 95 based on the measurement result.
< embodiment 4 >
In embodiment 1 of the present invention, the stereoscopic illumination unit 30 includes the 1 st area 31, the 2 nd area 32, and the 3 rd area 33, and the light emitting unit 30a is disposed on the substantially cylindrical surface in the 2 nd area 32 and the 3 rd area 33, but the form of the stereoscopic illumination unit is not limited to this.
Embodiment 4 of the present invention is a form in which all the belt-shaped light emitting sections constituting the stereoscopic illumination section have light emitting blocks in which the light emitting sections 30a are linearly arranged. The optical inspection apparatus 4 according to embodiment 4 will be described below. Since the optical inspection apparatus 4 according to embodiment 4 differs from the optical inspection apparatus 1 only in the stereoscopic illumination unit, the stereoscopic illumination unit 30A provided in the optical inspection apparatus 4 according to embodiment 4 will be described, and descriptions of the same parts as those of the optical inspection apparatus 1 will be omitted.
Fig. 28 is a perspective view showing an outline of the stereoscopic illumination unit 30A provided in the optical inspection apparatus 4. The cover glass G is irradiated with light from a plurality of directions by the stereoscopic illumination unit 30A. The light irradiated from the stereoscopic illumination unit 30A and reflected by the cover glass G enters the 3 rd camera 13 (see fig. 1).
The stereoscopic lighting unit 30A includes a 1st region 31A having a substantially semicircular-arc surface, and a 2nd region 32A and a 3rd region 33A each having a substantially partial spherical surface. The 1st area 31A, the 2nd area 32A, and the 3rd area 33A in the stereoscopic illumination unit 30A correspond to the 1st area 31, the 2nd area 32, and the 3rd area 33 in the stereoscopic illumination unit 30, respectively.
The 1st region 31A has band-shaped light-emitting parts 31a-1, 31b-1, 31c-1, 31d-1, 31e-1, 31f-1, 31g-1, 31h-1, 31i-1, and 31j-1. The band-shaped light emitting parts 31a-1 to 31j-1 in the stereoscopic illumination part 30A correspond to the band-shaped light emitting parts 31a to 31j in the stereoscopic illumination part 30.
The 2nd region 32A has band-shaped light-emitting parts 32a-1, 32b-1, 32c-1, 32d-1, 32f-1, 32g-1, and 32h-1, and the 3rd region 33A has band-shaped light-emitting parts 33a-1, 33b-1, 33c-1, 33d-1, 33f-1, 33g-1, and 33h-1. The band-shaped light emitting parts 32a-1 to 32d-1 and 32f-1 to 32h-1 in the stereoscopic illumination part 30A correspond to the band-shaped light emitting parts 32a to 32d and 32f to 32h in the stereoscopic illumination part 30. The band-shaped light emitting parts 33a-1 to 33d-1 and 33f-1 to 33h-1 in the stereoscopic illumination part 30A correspond to the band-shaped light emitting parts 33a to 33d and 33f to 33h in the stereoscopic illumination part 30.
The stereoscopic illumination unit 30A has a frame 34 that integrates the 1 st area 31A, the 2 nd area 32A, and the 3 rd area 33A. The frame 34 is made of a metal having excellent thermal conductivity such as aluminum. The frame 34 has two plates 34a, and the 1 st region 31A is provided between the two plates 34 a. The 2 nd and 3 rd regions 32A and 33A are provided outside the plate 34 a.
FIG. 29 is a schematic diagram showing details of the band-shaped light-emitting unit 31 a-1. Since the band-shaped light emitting parts 31a-1 to 31j-1 have the same structure, only the band-shaped light emitting part 31a-1 will be described, and the description of the band-shaped light emitting parts 31b-1 to 31j-1 will be omitted.
The band-shaped light emitting section 31a-1 has: light-emitting blocks 30b in which the light-emitting portions 30a are arranged in a row so that the length in the longitudinal direction is L; a cylindrical lens 30c-1; and a lenticular lens 30e-1. The cylindrical lens 30c-1 differs from the cylindrical lens 30c only in the length in the longitudinal direction, and likewise the lenticular lens 30e-1 differs from the lenticular lens 30e only in the length in the longitudinal direction.
The band-shaped light emitting section 31a-1 has two light-emitting blocks 30b arranged in a straight line. The two light-emitting blocks 30b are mounted on the plate 34a via the mounting members 34b, and the cylindrical lens 30c-1 is mounted on the plate 34a via a mounting member 34c.
Between the light-emitting block 30b and the cylindrical lens 30c-1, a lenticular lens 30e-1 is disposed. The lenticular lens 30e-1 is mounted on the light-emitting block 30b via a support member 30 f.
The light-emitting block 30b and the cylindrical lens 30c-1 are mounted on the board 34a such that the front end surface p1 of the light-emitting block 30b on which the light-emitting portion 30a is provided is substantially parallel to the upper surface p2 of the cylindrical lens 30 c-1. In other words, the extending direction of the light-emitting block 30b is substantially parallel to the extending direction of the cylindrical lens 30 c-1.
FIG. 30 is a schematic view showing the details of the band-shaped light-emitting part 32 a-1. Since the band-shaped light emitting parts 32a-1 to 32d-1, 32f-1 to 32h-1, 33a-1 to 33d-1 and 33f-1 to 33h-1 have the same structure, only the band-shaped light emitting part 32a-1 will be described, and the description of the band-shaped light emitting parts 32b-1 to 32d-1, 32f-1 to 32h-1, 33a-1 to 33d-1 and 33f-1 to 33h-1 will be omitted.
The band-shaped light emitting section 32a-1 has: a light-emitting block 30b in which the light-emitting portions 30a are arranged in a row so that the length in the longitudinal direction is L; a cylindrical lens 30c-2; and a plate 30g. The cylindrical lens 30c-2 differs from the cylindrical lens 30c only in the length in the longitudinal direction.
The light-emitting block 30b and the cylindrical lens 30c-2 are mounted on the board 30 g. The plate 30g is made of a metal having excellent thermal conductivity such as aluminum. A bent portion 30h for attaching the plate 30g to the plate 34a is formed in the plate 30 g.
The light-emitting block 30b and the cylindrical lens 30c-2 are disposed on the plate 30g such that the upper surface p4 of the cylindrical lens 30c-2 is inclined with respect to the front end surface p3 of the light-emitting block 30b on which the light-emitting portions 30a are disposed. In other words, the extending direction of the cylindrical lens 30c-2 is inclined with respect to the extending direction of the light-emitting block 30b.
The band-shaped light emitting parts 31a-1 to 31j-1 have a lenticular lens 30e-1, whereas the band-shaped light emitting parts 32a-1 to 32d-1, 32f-1 to 32h-1, 33a-1 to 33d-1, and 33f-1 to 33h-1 do not. This is because the point-like light emitted from the light emitting portions 30a is not imaged on the P-surfaces formed at the left and right ends of the cover glass G.
The explanation returns to Fig. 28. The band-shaped light emitting parts 31a-1 to 31e-1 are located on the +x side of the central plane S1, and the band-shaped light emitting parts 31f-1 to 31j-1 on the -x side; none of the band-shaped light emitting parts 31a-1 to 31j-1 is arranged on the central plane S1.
The band-shaped light emitting parts 31a-1 to 31j-1 are provided such that their optical axes intersect the central axis ax (see Fig. 31, described later), which is the intersection line of the central plane S1 and the mounting part 40 (not shown in Fig. 28).
The band-shaped light emitting parts 32a-1 and 33a-1 are provided on the central plane S1. The band-shaped light emitting parts 32b-1 to 32d-1 and 33b-1 to 33d-1 are located on the +x side of the central plane S1, and the band-shaped light emitting parts 32f-1 to 32h-1 and 33f-1 to 33h-1 on the -x side.
The central axes of the band-shaped light emitting units 32a-1 and 33a-1 face the intersection (center point O1, see Fig. 31) of the central axis ax1 of the stereoscopic illumination unit 30A with the mounting unit 40. The band-shaped light emitting parts 32b-1 to 32d-1, 32f-1 to 32h-1, 33b-1 to 33d-1, and 33f-1 to 33h-1 are provided substantially parallel to the band-shaped light emitting parts 32a-1 and 33a-1.
Fig. 31 is a diagram illustrating a path of light emitted from the stereoscopic illumination unit 30A. Fig. 31 shows a schematic configuration when the stereoscopic illumination unit 30A is cut at the center plane S1, and the path of light is shown by a two-dot chain line.
The band-shaped light emitting section 31f-1 is provided so that its optical axis intersects the central axis ax. In the band-shaped light emitting section 31f-1, the cylindrical lens 30c-1 is provided between the light-emitting block 30b and the central axis ax and condenses the light irradiated from the light-emitting portions 30a to the vicinity of the central axis ax. In the 1st region 31A, the light-emitting portions 30a are arranged substantially horizontally and the front end surfaces p1 of the light-emitting blocks 30b are substantially parallel to the upper surfaces p2 of the cylindrical lenses 30c-1, so the light irradiated from the light-emitting blocks 30b is focused on the central axis ax by the cylindrical lenses 30c-1.
The center axes of the band-shaped light emitting parts 32a-1 and 33a-1 face the center point O1. The cylindrical lens 30c-2 is provided between the light-emitting block 30b and the center point O1 (central axis ax), and condenses the light emitted from the light-emitting portion 30a to the vicinity of the central axis ax.
In the 2nd and 3rd regions 32A and 33A, the light emitting portions 30a are not arranged horizontally; the extending direction of the light-emitting block 30b is inclined with respect to the horizontal direction. If the front end surface p3 of the light-emitting block 30b were substantially parallel to the upper surface p4 of the cylindrical lens 30c-2, the light emitted from the light emitting portions 30a would be focused on a line substantially parallel to the front end surface p3, not on the central axis ax. Therefore, in the 2nd region 32A and the 3rd region 33A, the upper surface p4 is inclined with respect to the front end surface p3 so that the light emitted from the light emitting portions 30a is focused on the central axis ax.
Thus, in the 2nd region 32A and the 3rd region 33A, the light emitted from the light emitting section 30a passes through the cylindrical lens 30c-2 and is then irradiated onto the P surfaces formed at the left and right ends of the cover glass G, and the focal point of the light is formed on the P surfaces. In addition, in the 2nd region 32A and the 3rd region 33A, light is irradiated onto a region a2 outside the range a1 that the 1st region 31A can irradiate. The 2nd region 32A and the 3rd region 33A thus complement the 1st region 31A.
According to the present embodiment, the cover glass G can be irradiated with light from various directions using the stereoscopic illumination unit 30A. In particular, in the 2nd region 32A and the 3rd region 33A, inclining the upper surface p4 with respect to the front end surface p3 allows the light emitted from the light emitting section 30a to be focused on the P surfaces formed at the left and right ends of the cover glass G. Furthermore, inclining the upper surface p4 with respect to the front end surface p3 allows the 2nd region 32A and the 3rd region 33A to complement the 1st region 31A. As a result, the number of light emitting blocks 30b included in the band-shaped light emitting sections 31a-1 to 31j-1 can be reduced.
Further, according to the present embodiment, since the frame 34 and the plates 30g, which are formed of a material having high thermal conductivity, are integrated to constitute a heat radiating member, heat generated by the light emitting sections 30a and the like can be dissipated efficiently.
Further, as shown in Fig. 28, it is preferable to provide an air blowing unit 35 in the vicinity of the stereoscopic illumination unit 30A. Blowing air from the air blowing unit 35 onto the stereoscopic illumination unit 30A improves the cooling of the heat radiating member (the frame 34 and the plates 30g). Since the plates 30g extend in the y direction, the air blowing unit 35 preferably sends air to the stereoscopic illumination unit 30A from the side (the +y or -y direction), along the extending direction of the plates 30g (see the thick arrow in Fig. 28). In Fig. 28 the air blowing unit 35 is placed in the -y direction of the stereoscopic illumination unit 30A, but its position is not limited to this; likewise, one air blowing unit 35 is shown, but the number of air blowing units 35 is not limited to one.
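The benefit of the air blowing unit can be sketched with a simple convection model. None of this is from the patent: the heat load, surface area, and heat transfer coefficients below are invented, illustrative values, and a real design would use measured data.

```python
# Hedged sketch: steady-state temperature rise of a heat radiating member,
# modelled as T_rise = P / (h * A). All numbers are invented examples.

def temperature_rise(power_w: float, h_w_per_m2_k: float, area_m2: float) -> float:
    """Temperature rise above ambient for dissipated power P and surface area A."""
    return power_w / (h_w_per_m2_k * area_m2)

# Typical textbook ranges in air: h ~ 5-25 W/(m^2 K) for natural convection,
# h ~ 25-250 W/(m^2 K) for forced convection.
still_air = temperature_rise(20.0, 10.0, 0.05)   # no air blowing unit
forced_air = temperature_rise(20.0, 50.0, 0.05)  # with air blowing unit 35
print(still_air, forced_air)  # 40.0 8.0
```

The larger heat transfer coefficient under forced airflow directly lowers the temperature rise of the fins, which is why directing the flow along the plates 30g (maximizing swept surface area) is preferred.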
< embodiment 5 >
In embodiment 1 of the present invention, the coaxial illumination section 20 has the upper coaxial illumination 21 as the coaxial illumination for the 1st camera 11 and the lower coaxial illumination 22 as the coaxial illumination for the 2nd camera 12, but the form of the coaxial illumination section is not limited to this.
Embodiment 5 of the present invention is a mode in which the coaxial illumination section has C-PL filters. The optical inspection apparatus 5 according to embodiment 5 is described below. Since the optical inspection apparatus 5 differs from the optical inspection apparatus 1 only in the coaxial illumination section, only the coaxial illumination section 20A provided in the optical inspection apparatus 5 according to embodiment 5 is described, and descriptions of the parts shared with the optical inspection apparatus 1 are omitted.
Fig. 32 is a front view showing an outline of the optical inspection apparatus 5 according to embodiment 5. The coaxial illumination section 20A includes: the upper coaxial illumination 21 as the coaxial illumination for the 1st camera 11; the lower coaxial illumination 22 as the coaxial illumination for the 2nd camera 12; and C-PL filters 23a and 23b. The C-PL filter 23a is provided on the lower side of the 1st camera 11, and the C-PL filter 23b is provided on the upper side of the 2nd camera 12.
The C-PL filters 23a and 23b are circular polarization filters, each having a polarizing plate and a 1/4λ phase difference plate that gives a phase difference of 1/4λ to light transmitted through the polarizing plate. The 1/4λ phase difference plate converts linearly polarized light into circularly polarized light. The C-PL filters 23a and 23b are arranged such that the polarizing plate faces the half mirrors 21h and 22h and the 1/4λ phase difference plate faces away from the half mirrors 21h and 22h. The C-PL filters 23a and 23b are well known, and a detailed description of them is therefore omitted.
The C-PL filters 23a and 23b are disposed adjacent to the 1st camera 11 and the 2nd camera 12, respectively. Each of the C-PL filters 23a and 23b is provided so that its plane substantially orthogonal to the thickness direction is slightly inclined with respect to a plane substantially orthogonal to the optical axis oax.
Some of the light emitted from the light source 21a and reflected downward by the half mirror 21h may pass through the cover glass G and be reflected by the surface of the imaging lens 12a. If this reflected light enters the 1st camera 11, a bright line may appear at the center of the image captured by the 1st camera 11. The C-PL filter 23b prevents light that has passed through the cover glass G and been reflected by the surface of the imaging lens 12a from entering the 1st camera 11. This prevents a bright line from appearing at the center of the image captured by the 1st camera 11.
Similarly, the C-PL filter 23a prevents light that has been emitted from the light source 22a, reflected upward by the half mirror 22h, transmitted through the cover glass G, and reflected by the surface of the imaging lens 11a from entering the 2nd camera 12. This prevents a bright line from appearing at the center of the image captured by the 2nd camera 12.
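The blocking action of the C-PL filters can be illustrated with Jones calculus. The sketch below is a simplified model, not taken from the patent: normal incidence, a fixed transverse basis, the imaging-lens surface treated as a plain mirror of unit reflectance, and an ideal polarizer and quarter-wave plate at 45 degrees. The double pass through the quarter-wave plate acts as a half-wave plate, rotating the returning linear polarization by 90 degrees so the polarizer extinguishes it.

```python
import numpy as np

# Jones matrices for an idealized C-PL filter (simplified normal-incidence
# model in a fixed transverse basis; illustrative, not from the patent).
POL_X = np.array([[1, 0],
                  [0, 0]], dtype=complex)          # polarizer, half-mirror side
QWP_45 = (1 / np.sqrt(2)) * np.array([[1, -1j],
                                      [-1j, 1]])   # quarter-wave plate at 45 deg

E_in = np.array([1, 0], dtype=complex)             # light from the half mirror

# Outbound pass: polarizer then quarter-wave plate -> circular polarization.
E_circ = QWP_45 @ (POL_X @ E_in)

# Reflection at the imaging-lens surface (modelled as an ideal mirror) and
# return pass through the quarter-wave plate: the double pass rotates the
# linear polarization by 90 degrees.
E_back = QWP_45 @ E_circ

# The x polarizer now extinguishes the returning, y-polarized light, so it
# cannot reach the camera and form a bright line.
E_out = POL_X @ E_back

print(abs(E_circ[0]), abs(E_circ[1]))  # equal magnitudes -> circular
print(np.linalg.norm(E_out))           # ~0: back-reflection blocked
```

The same round-trip argument applies symmetrically to the filter 23a and the imaging lens 11a; light that transmits the filter only once (the wanted inspection light) is merely polarized, not blocked.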
In the present embodiment, both the C-PL filters 23a and 23b are provided, but the C-PL filter 23a is not essential.
While the embodiments of the present invention have been described above with reference to the drawings, the specific configuration is not limited to these embodiments, and design changes and the like that do not depart from the scope of the present invention are also included. Those skilled in the art can change, add to, or replace the elements of the embodiments as appropriate. For example, the side surface inspection unit 60 and the height acquisition unit 90 may be applied to the optical inspection apparatuses 4 and 5 according to embodiments 4 and 5. As further examples, the optical inspection apparatus 2 according to embodiment 2 may be combined with the optical inspection apparatus 5 according to embodiment 5, and the optical inspection apparatus 4 according to embodiment 4 may be combined with the optical inspection apparatus 5 according to embodiment 5.
In the above-described embodiments, the object to be inspected (inspection object) of the optical inspection apparatuses 1 to 5 is the cover glass G, G1, but the inspection object of the optical inspection apparatuses 1 to 5 is not limited to the cover glass. For example, the inspection target of the optical inspection apparatuses 1 to 5 may be glass of a touch panel used in a portable personal computer.
In the present invention, the term "substantially" covers not only the exactly identical case but also errors and deviations to an extent that identity is not lost. For example, "substantially horizontal" is not limited to a strictly horizontal state and includes, for example, errors of about several degrees. Likewise, where something is described simply as parallel or orthogonal, this includes not only the strictly parallel or orthogonal case but also the substantially parallel or substantially orthogonal case. In the present invention, the term "vicinity" means a region including a certain range (which can be set arbitrarily) around a reference position.
Description of the symbols
1,2: optical inspection device
10, 10A: image pickup unit
11: 1 st vidicon
12: 2 nd camera
13: 3 rd camera
11a, 12a, 13 a: camera lens
11b, 12b, 13 b: line sensor
13 c: 3 rd camera view position
20, 20A: coaxial lighting unit
21, 21A: upside coaxial lighting
22, 22A: lower side coaxial illumination
21a, 22 a: light source
21b, 22 b: integrator
21c, 22 c: condensing lens
21d, 22 d: aperture
21e, 22 e: collimating lens
21f, 22 f: mirror
21g, 22 g: fresnel lens
21h, 22 h: half-reflecting mirror
23a, 23 b: C-PL filter
30, 30A: three-dimensional lighting part
30 a: light emitting part
30 b: luminous block
30c, 30c-1, 30 c-2: cylindrical lens
30 d: luminous block
30e, 30 e-1: biconvex lens
30g of: board
30 h: a bent part
31, 31A: region 1
31a to 31j, 31a-1 to 31 j-1: band-shaped light emitting part
32, 32A: region 2
32 a-32 i, 32 a-1-32 d-1, 32 f-1-32 h-1: band-shaped light emitting part
33, 33A: region 3
33 a-33 i, 33 a-1-33 d-1, 33 f-1-33 h-1: band-shaped light emitting part
34: frame
34 a: board
34b, 34 c: mounting member
35: air supply part
40: mounting part
40 a: roller
50: conveying part
60: side inspection part
61, 62: optical element for adjusting focal length
63, 64: reflecting mirror
63a, 64 a: reflecting surface
71: integrated circuit with a plurality of transistors
72: input unit
73: output unit
74: power supply unit
75: communication I/F
81: position detection sensor
82: position detection sensor
90: height acquisition unit
91: area light source
92: video camera
93: reflecting mirror
95: moving part
100: personal computer
101:CPU
102:RAM
103:ROM
104: input/output interface
105: communication I/F
106: medium I/F
111: input device
112: output device
113: storage medium

Claims (17)

1. An optical inspection apparatus, comprising:
a mounting portion on which an object to be inspected is mounted horizontally;
a transport unit that moves the object to be inspected mounted on the mounting portion in a transport direction;
a one-dimensional imaging unit that images the object to be inspected from substantially vertically above and is arranged such that its longitudinal direction is substantially orthogonal to the transport direction; and
a light irradiation section having a plurality of light emitting units that irradiate the object with light,
wherein the light irradiation section includes: a 1st region having a substantially semi-cylindrical surface whose central axis is located on a central plane, the central plane being a substantially vertical plane including the one-dimensional imaging unit; and a 2nd region and a 3rd region each having a substantially hemispherical or substantially semi-ellipsoidal surface and formed at both ends of the 1st region,
in the 1st region, a plurality of band-shaped light emitting sections, in each of which the light emitting units are arranged in a direction substantially orthogonal to the transport direction, are provided,
the band-shaped light emitting sections are provided in a region of the 1st region other than the vicinity of the central plane, and
in the 2nd region and the 3rd region, the light emitting units are arranged on the central plane.
2. Optical inspection apparatus according to claim 1,
the band-shaped light emitting sections are arranged such that their optical axes intersect the line of intersection of the central plane and the upper surface of the mounting portion,
and the band-shaped light emitting sections include a 1st band-shaped light emitting section whose optical axis forms an angle of 8 degrees or 17 degrees with the central plane.
3. Optical inspection apparatus according to claim 2,
the optical inspection device includes: and a control unit that controls the conveying unit to convey the object at a constant speed, drives the one-dimensional imaging unit to capture images at constant intervals, and, in synchronization with the capturing by the one-dimensional imaging unit, causes the 1 st light-emitting region of a half region of the 1 st region divided by the center plane, the 2 nd light-emitting region other than the 1 st light-emitting region in the 1 st region, the 3 rd light-emitting region on the center plane in the 2 nd region, the 4 th light-emitting region on the center plane in the 3 rd region, and the 1 st strip-shaped light-emitting region to irradiate, respectively.
4. An optical inspection apparatus according to any one of claims 1 to 3,
the light irradiation section includes a cylindrical lens disposed between the band-shaped light emitting section and the line of intersection of the central plane and the upper surface of the mounting portion.
5. Optical inspection apparatus according to claim 1,
the optical inspection device includes:
a 1st image pickup unit provided on the upper side or the lower side of the mounting portion;
a 2nd image pickup unit provided on the opposite side of the mounting portion from the 1st image pickup unit so that an optical axis of the 2nd image pickup unit coincides with an optical axis of the 1st image pickup unit;
a 1st coaxial illumination that irradiates the object to be inspected with parallel light from a normal direction and serves as coaxial illumination for the 1st image pickup unit; and
a 2nd coaxial illumination that is provided on the opposite side of the mounting portion from the 1st coaxial illumination and serves as coaxial illumination for the 2nd image pickup unit,
wherein light irradiated from the 1st coaxial illumination and regularly reflected by the object to be inspected is incident on the 1st image pickup unit, and
light irradiated from the 2nd coaxial illumination and regularly reflected by the object to be inspected, and light irradiated from the 1st coaxial illumination and transmitted through the object to be inspected, are incident on the 2nd image pickup unit.
6. Optical inspection apparatus according to claim 5,
the optical inspection device includes: a2 nd control unit for controlling the conveying unit to convey the object at a constant speed and driving the 1 st image pickup unit and the 2 nd image pickup unit, so as to illuminate the 1 st coaxial illumination or the 2 nd coaxial illumination in three illumination modes of the 1 st mode, the 2 nd mode and the 3 rd mode, and, the image is acquired by the 1 st image pickup means in accordance with the irradiation of the 1 st aspect, the image is acquired by the 2 nd image pickup means in accordance with the irradiation of the 2 nd aspect, and the image is acquired by the 2 nd image pickup means in accordance with the irradiation of the 3 rd aspect, wherein in the 1 st mode, the 1 st coaxial illumination is illuminated at a1 st intensity, in the 2 nd mode, the 2 nd coaxial illumination is irradiated with the 1 st intensity, and in the 3 rd mode, the 1 st coaxial illumination is irradiated with the 2 nd intensity.
7. Optical inspection apparatus according to claim 1,
the optical inspection device includes:
a 2nd imaging unit provided on the opposite side of the mounting portion from the one-dimensional imaging unit so that an optical axis of the 2nd imaging unit coincides with an optical axis of the one-dimensional imaging unit;
a 1st coaxial illumination that irradiates the object to be inspected with parallel light from a normal direction and serves as coaxial illumination for the one-dimensional imaging unit; and
a 2nd coaxial illumination that is provided on the opposite side of the mounting portion from the 1st coaxial illumination and serves as coaxial illumination for the 2nd imaging unit,
wherein the light irradiation section is provided between the one-dimensional imaging unit and the transport unit,
light irradiated from the light irradiation section or the 1st coaxial illumination and regularly reflected by the object to be inspected is incident on the one-dimensional imaging unit, and
light irradiated from the 2nd coaxial illumination and regularly reflected by the object to be inspected, and light irradiated from the 1st coaxial illumination and transmitted through the object to be inspected, are incident on the 2nd imaging unit.
8. Optical inspection apparatus according to claim 7,
the optical inspection device includes: a 3 rd control unit for controlling the conveying unit to convey the object at a constant speed and driving the one-dimensional imaging unit and the 2 nd imaging unit, so as to illuminate the 1 st coaxial illumination or the 2 nd coaxial illumination in three illumination modes of the 1 st mode, the 2 nd mode and the 3 rd mode, and, the one-dimensional imaging means acquires an image in accordance with the irradiation of the 1 st aspect, the 2 nd imaging means acquires an image in accordance with the irradiation of the 2 nd aspect, and the 2 nd imaging means acquires an image in accordance with the irradiation of the 3 rd aspect, wherein in the 1 st mode, the 1 st coaxial illumination is irradiated at a1 st intensity, in the 2 nd mode, the 2 nd coaxial illumination is irradiated with the 1 st intensity, and in the 3 rd mode, the 1 st coaxial illumination is irradiated with the 2 nd intensity.
9. Optical inspection apparatus according to claim 1 or 2,
the light irradiation section has a light diffusion plate that is provided adjacent to the light emission section and diffuses light irradiated from the light emission section.
10. Optical inspection apparatus according to claim 1 or 2,
the optical inspection device includes:
a focal length adjusting optical element that adjusts a focal length of the one-dimensional imaging unit; and
a reflector provided adjacent to the mounting portion,
the optical element for focal length adjustment and the mirror are provided on the central plane,
a mounting region, which is a region on the mounting portion on which the object to be inspected is mounted, is located below the one-dimensional imaging unit in a vertical direction in a plan view,
the reflecting mirror is provided outside the mounting area and adjacent to the mounting area in a direction substantially orthogonal to the transport direction in a plan view,
the reflective surface of the mirror is substantially planar,
the reflecting surface extends substantially in the transport direction such that its line of intersection with the central plane is inclined with respect to a horizontal plane,
the optical element for focal length adjustment is disposed so as to overlap a line connecting the one-dimensional imaging unit and the mirror.
11. Optical inspection apparatus according to claim 10,
the optical element for focal length adjustment is a glass plate, and is provided so that both end surfaces substantially orthogonal to the plate thickness direction are horizontal.
12. Optical inspection apparatus according to claim 1 or 2,
the optical inspection device includes:
a moving unit that moves the one-dimensional imaging unit in an up-down direction;
a height acquisition unit for acquiring the height of the object; and
and a movement control unit that controls the movement unit based on the information acquired by the height acquisition unit, and moves the one-dimensional imaging unit in the vertical direction in accordance with a change in height of the object passing under the one-dimensional imaging unit.
13. Optical inspection apparatus according to claim 12,
the height acquisition unit includes:
a surface light source that irradiates light in a direction substantially orthogonal to the conveyance direction; and
and a side imaging unit that receives light that has been irradiated from the surface light source and passed through the object.
14. Optical inspection apparatus according to claim 1 or 2,
the light irradiation section includes, in the 2nd region and the 3rd region: a light emitting block in which the light emitting units are arranged in a row; and a 2nd cylindrical lens through which light irradiated from the light emitting units passes,
in the 2nd region and the 3rd region, an extending direction of the light emitting block is inclined with respect to a horizontal direction,
in the 2nd region and the 3rd region, an extending direction of the 2nd cylindrical lens is inclined with respect to the extending direction of the light emitting block.
15. Optical inspection apparatus according to claim 1 or 2,
the light irradiation section has a heat radiation member formed of a material having high thermal conductivity,
the light emitting portion is provided to the heat dissipating member.
16. Optical inspection apparatus according to claim 15,
the heat radiation member is provided with an air supply part for supplying air,
the heat dissipating member has a plurality of plates on which the light emitting portions are provided,
the plate is disposed to extend in a direction substantially orthogonal to the conveying direction,
the air supply unit supplies air in a direction along the extending direction of the plate.
17. An optical inspection apparatus according to any one of claims 5 to 8,
the 2 nd image pickup unit is arranged at the lower side of the carrying part,
a circular polarization filter is arranged on the upper side of the 2 nd camera unit,
the circular polarization filter is provided such that a plane in a direction substantially orthogonal to the thickness direction is slightly inclined with respect to a direction substantially orthogonal to the optical axis of the 2 nd imaging unit.
CN201780062143.7A 2016-11-09 2017-11-08 Optical inspection device Expired - Fee Related CN109804238B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-218616 2016-11-09
JP2016218616 2016-11-09
PCT/JP2017/040231 WO2018088423A1 (en) 2016-11-09 2017-11-08 Optical inspection device

Publications (2)

Publication Number Publication Date
CN109804238A CN109804238A (en) 2019-05-24
CN109804238B true CN109804238B (en) 2021-12-28

Family

ID=62110631

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780062143.7A Expired - Fee Related CN109804238B (en) 2016-11-09 2017-11-08 Optical inspection device

Country Status (4)

Country Link
JP (1) JP6912824B2 (en)
KR (1) KR102339677B1 (en)
CN (1) CN109804238B (en)
WO (1) WO2018088423A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108535265A (en) * 2018-04-10 2018-09-14 深圳市纳研科技有限公司 A kind of multi-angle polisher and acquisition system
JP7167641B2 (en) * 2018-11-08 2022-11-09 日本電気硝子株式会社 Work manufacturing method and work inspection method
CN110208290A (en) * 2019-06-19 2019-09-06 海南梯易易智能科技有限公司 A kind of 3D bend glass defect detecting device based on line scan camera
US11340284B2 (en) * 2019-07-23 2022-05-24 Kla Corporation Combined transmitted and reflected light imaging of internal cracks in semiconductor devices
JPWO2021090827A1 (en) * 2019-11-05 2021-05-14
JP2021096112A (en) * 2019-12-16 2021-06-24 コニカミノルタ株式会社 Inspection device for transparent body
KR102216999B1 (en) * 2020-09-28 2021-02-18 주식회사 하이브비젼 Non-Lambertian Surface Inspecting System For Line Scan
KR102535869B1 (en) * 2021-03-02 2023-05-26 주식회사 디쌤 Visual inspection assembly
WO2022245195A1 (en) * 2021-05-18 2022-11-24 엘지전자 주식회사 Thickness measurement device
TWI782695B (en) * 2021-09-06 2022-11-01 致茂電子股份有限公司 Dual sided optical detection system with fluorescence detection function
KR102368707B1 (en) * 2021-09-07 2022-02-28 주식회사 하이브비젼 Non-Lambertian Surface Inspecting System For Line Scan
TWI781840B (en) * 2021-12-02 2022-10-21 友達光電股份有限公司 A light control device, testing system and method for dark field photography
CN218726727U (en) * 2022-12-12 2023-03-24 宁德时代新能源科技股份有限公司 Appearance detection device and battery cell manufacturing equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3639837B1 (en) * 2004-03-22 2005-04-20 株式会社メガトレード Lighting device
CN2867100Y (en) * 2005-07-13 2007-02-07 王锦峰 On-line light supply generation device
CN103293162A (en) * 2013-06-17 2013-09-11 浙江大学 Lighting system and method used for dark field detection of defect in spherical optical element surface
JP5538707B2 (en) * 2008-11-19 2014-07-02 株式会社メガトレード Lighting device
CN104040287A (en) * 2012-01-05 2014-09-10 合欧米成像公司 Arrangement for optical measurements and related method
CN104897691A (en) * 2014-03-06 2015-09-09 欧姆龙株式会社 Inspection apparatus
US9250197B2 (en) * 2008-09-12 2016-02-02 Gp Inspect Gmbh Lighting device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1075051A (en) * 1996-07-05 1998-03-17 Toyota Motor Corp Visual inspection device
JP3316501B2 (en) * 2000-03-28 2002-08-19 科学技術振興事業団 Sensor head, luminance distribution measuring device including the same, and display unevenness evaluation device
JP4709375B2 (en) * 2000-12-22 2011-06-22 東芝モバイルディスプレイ株式会社 Liquid crystal display element
US7113313B2 (en) * 2001-06-04 2006-09-26 Agilent Technologies, Inc. Dome-shaped apparatus for inspecting a component or a printed circuit board device
JP2003224353A (en) * 2002-01-30 2003-08-08 Hitachi Ltd Method for mounting substrate of electronic component
JP5806808B2 (en) 2010-08-18 2015-11-10 倉敷紡績株式会社 Imaging optical inspection device
US9885671B2 (en) * 2014-06-09 2018-02-06 Kla-Tencor Corporation Miniaturized imaging apparatus for wafer edge

Also Published As

Publication number Publication date
KR102339677B1 (en) 2021-12-14
CN109804238A (en) 2019-05-24
JP6912824B2 (en) 2021-08-04
JPWO2018088423A1 (en) 2019-10-03
KR20190082198A (en) 2019-07-09
WO2018088423A1 (en) 2018-05-17

Similar Documents

Publication Publication Date Title
CN109804238B (en) Optical inspection device
JP6628753B2 (en) Lighting device and image sensor device
US7382457B2 (en) Illumination system for material inspection
TWI557434B (en) Lighting system
CN110166702B (en) Camera and method for capturing image data
CN110248056B (en) Image inspection apparatus
JP5755144B2 (en) Work inspection device
JP2011242379A (en) Image inspection device and image forming device
CN109540899B (en) Inspection apparatus and inspection method
JP5197712B2 (en) Imaging device
CN112888936A (en) Multi-modal multiplexed illumination for optical inspection systems
JP2008004284A (en) Lighting system and object surface examination device using same
JP2006275836A (en) Substrate inspection device
JP2005098926A (en) Illuminating device
JP6175819B2 (en) Image inspection apparatus and image inspection method
WO2008102339A1 (en) Led illumination for line scan camera
KR20120086333A (en) High speed optical inspection system with adaptive focusing
JP6341821B2 (en) Appearance inspection system
JP5541646B2 (en) Line lighting device
JP2010190820A (en) Device for inspecting quality of printed matter, and method for optical arrangement of the same
JP2013081075A (en) Reading apparatus
JP2024065760A (en) Line light irradiation device and inspection system
JP2014178205A (en) Image inspection device and image inspection method
JP2014053106A (en) Illumination device and image inspection device
JP2024084488A (en) Illumination device and imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211228