CN111323369A - Optical detection device and correction method - Google Patents


Info

Publication number
CN111323369A
CN111323369A (application CN201811524912.3A; granted as CN111323369B)
Authority
CN
China
Prior art keywords
image
light source
gray
target
scale value
Prior art date
Legal status
Granted
Application number
CN201811524912.3A
Other languages
Chinese (zh)
Other versions
CN111323369B (en)
Inventor
刘育鑫
詹凱劭
薛名凱
Current Assignee
To Mao Electronics Suzhou Co ltd
Original Assignee
To Mao Electronics Suzhou Co ltd
Priority date
Filing date
Publication date
Application filed by To Mao Electronics Suzhou Co ltd filed Critical To Mao Electronics Suzhou Co ltd
Priority to CN201811524912.3A priority Critical patent/CN111323369B/en
Publication of CN111323369A publication Critical patent/CN111323369A/en
Application granted granted Critical
Publication of CN111323369B publication Critical patent/CN111323369B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01 — Arrangements or apparatus for facilitating the optical investigation
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 — Systems specially adapted for particular applications
    • G01N21/88 — Investigating the presence of flaws or contamination
    • G01N21/95 — Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Input (AREA)

Abstract

The present disclosure provides an optical inspection apparatus and a calibration method. The optical detection device comprises an image capturing device and a processor. The processor is coupled to the light source and the image capturing device. The processor is used for adjusting the light intensity with which the light source irradiates a calibration object so that the gray scale value of at least one image block captured by the image capturing device for the calibration object matches a target calibration value, and for recording the target light intensity at which the match occurs. The processor is also used for controlling the light source to irradiate an object to be detected with the target light intensity and controlling the image capturing device to capture an image of the object to be detected. The processor is further configured to calculate the ratio of the target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected to obtain a mapping table, which enables rapid detection of whether an image is defective.

Description

Optical detection device and correction method
Technical Field
The present disclosure relates to a detection apparatus and a method thereof, and more particularly, to an optical detection apparatus and a calibration method thereof.
Background
With the development of technology, image recognition has gradually replaced manual judgment in identifying whether an article has a defect. An image sensor captures an image of the object, and image processing techniques and algorithms can quickly report whether the object is defective.
However, the accuracy of automatic optical inspection depends largely on the design of the algorithm and on how the image processing operations are configured, both of which affect the output result. If the image processing parameters are poorly controlled, or the operation is performed carelessly, the inspection may misjudge whether an article is defective.
Disclosure of Invention
This summary is provided to provide a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and is intended to neither identify key/critical elements of the embodiments nor delineate the scope of the embodiments.
According to an embodiment of the present disclosure, an optical inspection apparatus is disclosed. The optical detection device comprises an image capturing device and a processor. The processor is coupled to the light source and the image capturing device. The processor is used for adjusting the light intensity with which the light source irradiates the calibration object so that the gray scale value of at least one block captured by the image capturing device for the calibration object matches the target calibration value, and for recording the target light intensity at which the match occurs. The processor is also used for controlling the light source to irradiate the object to be detected with the target light intensity and controlling the image capturing device to capture an image of the object to be detected. The processor is further configured to calculate the ratio of the target gray-scale value to the gray-scale values of a plurality of pixels of the object image to obtain a mapping table.
In one embodiment, the image capturing device further captures a smooth area of the object to be measured to obtain an area image, and the processor copies the area image and stitches the copies into the object image, wherein the area image is smaller than the object image.
In an embodiment, the light source includes a red light emitting unit, a green light emitting unit, and a blue light emitting unit, wherein the processor controls the red light emitting unit, the green light emitting unit, and the blue light emitting unit to illuminate the object with the corresponding target light intensities, so that the image capturing device captures the corresponding object images, wherein the object images include a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit, and a third light source image corresponding to the blue light emitting unit.
In one embodiment, the target gray scale values include a red target gray scale value, a green target gray scale value, and a blue target gray scale value, wherein the processor is further configured to: respectively calculating the ratio of the red light target gray-scale value to the gray-scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table; respectively calculating the ratio of the green light target gray-scale value to the gray-scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and respectively calculating the ratio of the blue light target gray-scale value to the gray-scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image to obtain the corresponding mapping table.
In one embodiment, the processor is further configured to: controlling the light source to irradiate a detection object with the target light intensity and the image capturing device to capture the detection object to generate a detection object image; and adjusting the gray-scale values of a plurality of pixels of the detection object image according to the mapping table to make the gray-scale value of at least one image block of the detection object image accord with the target gray-scale value.
According to another embodiment, a calibration method is disclosed, comprising the steps of: adjusting the light intensity with which the light source irradiates the calibration object so that the gray scale value of at least one block captured by the image capturing device for the calibration object matches the target calibration value, and recording the target light intensity at which the match occurs; controlling the light source to irradiate the object to be detected with the target light intensity, and controlling the image capturing device to capture an image of the object to be detected; and calculating the ratio of the target gray-scale value to the gray-scale values of the pixels of the image of the object to be measured to obtain the mapping table.
In one embodiment, the image capturing device captures a smooth area of the object to be measured to obtain an area image, and copies the area image and splices the copied area images to obtain the image of the object to be measured, wherein the area image is smaller than the image of the object to be measured.
In one embodiment, the light source further includes a red light emitting unit, a green light emitting unit, and a blue light emitting unit, wherein the calibration method further includes: controlling the red light emitting unit, the green light emitting unit, and the blue light emitting unit to irradiate the object to be detected with their respective corresponding target light intensities, so that the image capturing device captures the corresponding images of the object to be detected, wherein the images of the object to be detected comprise a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit, and a third light source image corresponding to the blue light emitting unit.
In one embodiment, the target gray-scale values include a red target gray-scale value, a green target gray-scale value, and a blue target gray-scale value, wherein the calibration method further includes: respectively calculating the ratio of the red light target gray-scale value to the gray-scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table; respectively calculating the ratio of the green light target gray-scale value to the gray-scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and respectively calculating the ratio of the blue light target gray-scale value to the gray-scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image to obtain the corresponding mapping table.
In one embodiment, the light source is controlled to irradiate a detection object with the target light intensity and the image capturing device captures the detection object to generate a detection object image; and adjusting the gray-scale values of a plurality of pixels of the detection object image according to the mapping table to make the gray-scale value of at least one image block of the detection object image accord with the target gray-scale value.
Drawings
FIG. 1 is a functional block diagram of an optical inspection device according to some embodiments of the present disclosure.
FIG. 2 is a flow chart illustrating steps of a calibration method according to some embodiments of the present disclosure.
Detailed Description
The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. Specific examples of components and arrangements are described below to simplify the present disclosure. Of course, these examples are merely illustrative and are not intended to be limiting. For example, forming a first feature over or on a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features such that the first and second features may not be in direct contact. Additionally, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as "under," "below," "lower," "above," "higher," and the like, may be used herein for ease of description to describe one element or feature's relationship to another element (or elements) or feature (or features) as illustrated in the figures. Spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
Generally, an optical inspection technology inspects an article for defects by capturing an image of the article and then using image processing techniques to examine the pixels or related parameters of the image for abnormalities. Whether a product is defective can therefore be determined from whether its image exhibits an abnormality.
Referring to FIG. 1, a functional block diagram of an optical inspection apparatus 100 according to some embodiments of the present disclosure is shown. The optical inspection apparatus 100 includes an image capturing device 110 and a processor 120. The image capturing device 110 is coupled to the processor 120. The image capturing device 110 is used for capturing an image of an object (not shown) to be tested to generate an image of the object to be tested. The processor 120 may control the light source 500. The light source 500 includes a red light emitting unit 510, a green light emitting unit 520, and a blue light emitting unit 530. The light source 500 is used to irradiate the object to be measured, so that the object to be measured reflects light for the image capturing device 110 to obtain an image of the object to be measured.
Referring to FIG. 2, a flowchart illustrating steps of a calibration method according to some embodiments of the present disclosure is shown. The steps of the calibration method of fig. 2 will be described below together with the elements of the optical inspection device 100 of fig. 1. As shown in fig. 2, in step S210, the processor 120 activates the light source 500 to irradiate the calibration object, and controls the image capturing device 110 to capture an image of the calibration object. The calibration object may be a gray card, for example the 18% gray card commonly used for light metering; any card from which an accurate exposure value can be obtained may be used in the present disclosure.
In one embodiment, the processor 120 activates the red light emitting unit 510 of the light source 500 to generate the calibration object image corresponding to red light, activates the green light emitting unit 520 to generate the calibration object image corresponding to green light, and activates the blue light emitting unit 530 to generate the calibration object image corresponding to blue light. For simplicity of description, only the calibration object image corresponding to red light is described below; the calibration object images corresponding to green light and blue light are handled in the same way. The calibration object image may be a gray scale image having a plurality of pixels. For example, if the image size of the calibration object image is 100 pixels × 100 pixels, there are 10000 pixels, and each pixel records a gray scale value, so 10000 gray scale values are recorded in total. Because processing every pixel individually is computationally expensive, the processor 120 may instead take an image block (e.g., 10 pixels × 10 pixels) as the processing unit and operate on the gray scale values block by block.
In one embodiment, the gray level value may vary with the light intensity of the light source 500. In step S220, the processor 120 adjusts the light intensity of the light source 500 when illuminating the calibration object, so that the gray scale value of at least one image block of the calibration object image matches the target calibration value. For example, the processor 120 controls a current value, which is related to the light intensity of the illumination light of the light source 500. The calibration object generates reflected light when being illuminated. The image capturing device 110 senses the reflected light and captures an image of the calibration object.
In one embodiment, the image capture device 110 determines whether the gray scale value of the whole calibration object image, or of at least one image area (e.g., 10 pixels × 10 pixels), matches the target gray scale value. If not, the current through the light source 500 is adjusted to change the light intensity of the light source 500, until the gray scale value of at least one image area of the image sensed by the image capture device 110 matches the target gray scale value.
In an embodiment, the processor 120 controls the light intensities of the red light emitting unit 510, the green light emitting unit 520, and the blue light emitting unit 530 of the light source 500 respectively, and determines the gray level value of the image or of the at least one image block in each case, until the gray level value matches the target gray level value. For brevity, only the red light emitting unit 510 is described as the illumination source; the green light emitting unit 520 and the blue light emitting unit 530 are handled in the same manner.
In step S230, when the gray-scale value of all or at least one image block of the calibration object image matches the target calibration value, the processor 120 records the light intensity (e.g., a current value or a Pulse Width Modulation (PWM) signal) used by the light source 500, and records the light intensity as the target light intensity. For example, the processor 120 records the light intensity used by the red light emitting unit 510 for subsequent use. In other words, the gray scale value of all or at least one image block of the calibration object image can be matched with the target calibration value by using the light intensity. In this way, each time the image capturing device 110 captures an image of the gray card, it is ensured that the image gray scale value of at least one image block is maintained at the target correction value.
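The adjust-and-check loop of steps S210–S230 can be sketched as a simple feedback search over the drive current. This is a hypothetical Python illustration, not part of the disclosure: `set_current` and `capture` stand in for hardware hooks the patent does not specify, and binary search is just one reasonable way to converge on the target intensity.

```python
import numpy as np

def find_target_intensity(set_current, capture, target=140, tol=1,
                          lo=0.0, hi=1.0, max_iter=20):
    """Search the light-source drive current until the mean gray-scale
    value of the captured calibration image matches the target value.

    set_current(c) -- hypothetical hook driving the light source (0..1)
    capture()      -- hypothetical hook returning a gray-scale image (ndarray)
    """
    for _ in range(max_iter):
        mid = (lo + hi) / 2
        set_current(mid)
        gray = float(np.mean(capture()))
        if abs(gray - target) <= tol:
            return mid          # target light intensity found; record it
        if gray < target:
            lo = mid            # image too dark: raise the current
        else:
            hi = mid            # image too bright: lower the current
    return (lo + hi) / 2
```

The returned current is what step S230 records as the target light intensity; the same loop would be run once per light emitting unit.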
In step S240, the processor 120 controls the light source 500 to irradiate an object to be measured (not shown) with the target light intensity. In one embodiment, the processor 120 controls the red light emitting unit 510, the green light emitting unit 520, and the blue light emitting unit 530 to irradiate the object with their corresponding target light intensities, so that the image capturing device 110 captures an object image corresponding to red light, one corresponding to green light, and one corresponding to blue light. For brevity, only the object image corresponding to red light is described below; the object images corresponding to green light and blue light are obtained in the same manner. In one embodiment, the object to be measured may be a solar cell (panel).
Then, the image capturing device 110 captures an image of the object. The image of the object to be measured is provided with a plurality of pixels, and each pixel has a corresponding gray-scale value. Since the pixels of the object image to be measured and the corresponding gray scale values are similar to the pixels of the corrected object image and the corresponding gray scale values, the description is omitted.
Then, in step S250, the processor 120 calculates the ratio of the target gray-scale value to the gray-scale value of each of a plurality of pixels of the object image to obtain a mapping table. Taking an object image of 3 × 3 pixels as an example, let its gray-scale values, from left to right and from top to bottom, be a1, b1, c1, d1, e1, f1, g1, h1, i1 (Table 1-1), and let the target gray-scale value be T (e.g., gray-scale value 140). The contents of the mapping table, from left to right and from top to bottom, are then T/a1, T/b1, T/c1, T/d1, T/e1, T/f1, T/g1, T/h1, T/i1, as shown in Table 1-2. The tables are shown in symbolic form for clarity; in practice the entries are the numerical values computed from an actual gray-scale image.
TABLE 1-1 image Gray Scale values (3 × 3 pixels image)
a1 b1 c1
d1 e1 f1
g1 h1 i1
TABLE 1-2 mapping table (for 3 × 3 pixel image)
T/a1 T/b1 T/c1
T/d1 T/e1 T/f1
T/g1 T/h1 T/i1
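A minimal sketch of the per-pixel computation in step S250, assuming the image is held in a NumPy array. The function name and the zero-protection clip are illustrative additions, not from the disclosure:

```python
import numpy as np

def build_mapping_table(object_image: np.ndarray, target: float) -> np.ndarray:
    """Per-pixel mapping table: target gray value / measured gray value.

    object_image -- gray-scale image of the (defect-free) object to be measured
    target       -- target gray-scale value T (e.g. 140)
    """
    # Guard against division by zero on dead pixels by clipping to at least 1.
    measured = np.clip(object_image.astype(np.float64), 1, None)
    return target / measured

# Illustrative 3x3 image in the spirit of Table 1-1:
img = np.array([[140.0,  70.0, 200.0],
                [140.0, 140.0, 140.0],
                [ 70.0, 140.0, 200.0]])
table = build_mapping_table(img, target=140)
# table[0, 0] == 1.0, table[0, 1] == 2.0, table[0, 2] == 0.7
```

Each entry plays the role of T/a1, T/b1, and so on in Table 1-2.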
In another embodiment, the object image to be measured has 90 pixels × 90 pixels. If each image block has a size of 30 pixels × 30 pixels, the image blocks are numbered B1-B9 from left to right and from top to bottom. As shown in Table 2-1 below, the object image then has 9 image blocks (i.e., 3 × 3), each image block including 30 pixels × 30 pixels.
Table 2-1: numbering of image blocks
B1 B2 B3
B4 B5 B6
B7 B8 B9
Table 2-2 shows the mapping table for the image blocks. In this embodiment, when the gray scale values of the pixels within an image block are close to one another, operating on image blocks instead of individual pixels simplifies the computation performed by the processor 120. The mapping table may be implemented as a flat-field correction table (FFC table).
TABLE 2-2 mapping table (for 90 × 90 pixel image)
T/B1 T/B2 T/B3
T/B4 T/B5 T/B6
T/B7 T/B8 T/B9
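The block variant of Tables 2-1 and 2-2 can be sketched the same way, averaging each 30 × 30 block before taking the ratio. This is a hypothetical illustration; using the block mean as the representative gray value is an assumption consistent with the "similar gray values within a block" condition above:

```python
import numpy as np

def block_mapping_table(image: np.ndarray, target: float, block: int) -> np.ndarray:
    """One mapping ratio per image block: target / mean gray value of the block."""
    h, w = image.shape
    rows, cols = h // block, w // block
    # Average each block, e.g. 30x30-pixel blocks of a 90x90 image -> 3x3 means.
    means = (image[:rows * block, :cols * block]
             .reshape(rows, block, cols, block)
             .mean(axis=(1, 3)))
    return target / np.clip(means, 1, None)

tbl = block_mapping_table(np.full((90, 90), 70.0), target=140, block=30)
# tbl has shape (3, 3); every entry is 140/70 == 2.0
```

The resulting 3 × 3 table corresponds to T/B1 through T/B9 and needs 9 entries instead of 8100.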
In some embodiments, the light source 500 illuminates the object with the red light emitting unit 510, the green light emitting unit 520, and the blue light emitting unit 530, respectively. The image capturing device 110 captures a first light source image corresponding to red light, a second light source image corresponding to green light, and a third light source image corresponding to blue light, respectively. The first light source image, the second light source image and the third light source image are gray scale images respectively. The processor 120 generates a first mapping table according to the gray scale value of the first light source image and the target gray scale value corresponding to the first light source, generates a second mapping table according to the gray scale value of the second light source image and the target gray scale value corresponding to the second light source, and generates a third mapping table according to the gray scale value of the third light source image and the target gray scale value corresponding to the third light source. The first mapping table, the second mapping table, and the third mapping table are calculated as described above, and tables 3 to 5 below are calculated results.
Table 3: the gray scale value of the first light source image, the target gray scale value and the first mapping table are as follows:
[Table 3 was rendered as an image in the original publication; representative entries are described in the following paragraph.]
For example, the pixel coordinates of the first light source image, from left to right and from top to bottom, are (1,1), (1,2), (1,3), (2,1), (2,2), …, (3,3). The gray-scale value at pixel coordinate (1,1) in the first light source image is 10, and the red target gray-scale value corresponding to pixel coordinate (1,1) is 20; dividing 20 by 10 gives 2, and this value is stored in the first mapping table at the position corresponding to pixel coordinate (1,1). The gray-scale value at pixel coordinate (2,2) in the first light source image is 20, and the red target gray-scale value corresponding to pixel coordinate (2,2) is 20; dividing 20 by 20 gives 1, and this value is stored in the first mapping table at the position corresponding to pixel coordinate (2,2). The remaining entries of the first mapping table are calculated in the same way.
Table 4: the gray scale value of the second light source image, the target gray scale value and the second mapping table are as follows:
[Table 4 was rendered as an image in the original publication; representative entries are described in the following paragraph.]
Similar to the description of Table 3, from Table 4 above, the gray-scale value of the second light source image at pixel coordinate (1,1) is 20, and the target gray-scale value corresponding to pixel coordinate (1,1) is 25, so dividing 25 by 20 gives 1.25, and this value is stored in the second mapping table at the position corresponding to pixel coordinate (1,1). The gray-scale value at pixel coordinate (2,2) in the second light source image is 25, and the target gray-scale value corresponding to pixel coordinate (2,2) is 25, so dividing 25 by 25 gives 1, and this value is stored at the position corresponding to pixel coordinate (2,2) in the second mapping table. The remaining entries of the second mapping table are calculated in the same way.
Table 5: the gray scale value of the third light source image, the target gray scale value and the third mapping table are as follows:
[Table 5 was rendered as an image in the original publication; representative entries are described in the following paragraph.]
Similar to the description of Table 3, from Table 5 above, the gray-scale value at pixel coordinate (1,1) in the third light source image is 25, and the target gray-scale value corresponding to pixel coordinate (1,1) is 40, so dividing 40 by 25 gives 1.6, and this value is stored in the third mapping table at the position corresponding to pixel coordinate (1,1). The gray-scale value at pixel coordinate (2,2) in the third light source image is 40, and the target gray-scale value corresponding to pixel coordinate (2,2) is 40, so dividing 40 by 40 gives 1, and this value is stored in the third mapping table at the position corresponding to pixel coordinate (2,2).
In the pixel coordinates shown in tables 3 to 5, the red target gray scale value, the green target gray scale value and the blue target gray scale value may be different values. In fact, due to the characteristics of different objects to be measured, there are different degrees of absorption or reflection for different light waves. For example, the solar sheet is a dark blue rigid material, so blue light has a higher reflectivity when irradiated on the solar sheet than red light and green light, and thus the gray scale value (e.g. 40) of the third light source image is higher than the gray scale value (e.g. 20) of the first light source image and the gray scale value (e.g. 25) of the second light source image.
In some embodiments, the image capturing device 110 is used to capture a smooth region of the object to be measured to obtain a region image. For example, the surface of the solar cell may be slightly uneven, so that when an image of the object to be measured is generated, a smooth area can be extracted from the whole image, and then the image of the smooth area is copied to one or more areas. The processor 120 stitches the one or more area images to generate a stitched image as the object image. Therefore, the gray scale value of the generated image of the object to be measured is ensured to be uniform and close to the ideal flawless image as a whole. The size of the area image can be smaller than that of the object image to be detected.
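The copy-and-stitch of the smooth region can be sketched with NumPy tiling. This is a hypothetical illustration; the region coordinates are assumed inputs, since the disclosure does not specify how the smooth area is located:

```python
import numpy as np

def stitch_reference(full_image: np.ndarray, y: int, x: int,
                     h: int, w: int) -> np.ndarray:
    """Copy a smooth region of the object image and tile the copies into a
    reference image of the same size as the full image.

    (y, x, h, w) locate the smooth region; assumed to be known in advance.
    """
    patch = full_image[y:y + h, x:x + w]
    reps_y = -(-full_image.shape[0] // h)   # ceiling division
    reps_x = -(-full_image.shape[1] // w)
    tiled = np.tile(patch, (reps_y, reps_x))
    # Crop back to the original image size.
    return tiled[:full_image.shape[0], :full_image.shape[1]]
```

The tiled result serves as the uniform, ideally flawless object image from which the mapping table is computed.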
Then, the light source 500 is activated: the red light emitting unit 510 is made to irradiate the calibration object (e.g., a gray card), the light intensity is controlled so that the gray-scale value of the calibration object image equals the first target correction value (e.g., gray-scale value 140), and the target light intensity corresponding to red light is recorded; the green light emitting unit 520 is made to irradiate the calibration object, the light intensity is controlled so that the gray-scale value of the calibration object image equals the second target correction value, and the target light intensity corresponding to green light is recorded; and the blue light emitting unit 530 is made to irradiate the calibration object, the light intensity is controlled so that the gray-scale value of the calibration object image equals the third target correction value, and the target light intensity corresponding to blue light is recorded, wherein the first, second, and third target correction values are the same value. In one embodiment, once the image capturing device 110 detects an image with a gray-scale value of 140 under illumination by each of the red light emitting unit 510, the green light emitting unit 520, and the blue light emitting unit 530, a white balance correction procedure is performed.
Then, in step S260, the processor 120 causes the light source 500 to illuminate a detection object (not shown) with the aforementioned target light intensity, and the image capturing device 110 captures an image of the detection object and obtains its gray-scale values. The processor 120 then adjusts the gray-scale values of the detection object image with the mapping table by multiplying the gray-scale value of each pixel by the corresponding mapping-table entry. For example, the detection object image generated under illumination by the red light emitting unit 510 is multiplied by the contents of Table 3, the image generated under the green light emitting unit 520 by the contents of Table 4, and the image generated under the blue light emitting unit 530 by the contents of Table 5. Taking an image of 3 × 3 pixels as an example, if the gray-scale value of the detection object image at pixel coordinate (1,1) is a2, then a2 is multiplied by the mapping-table entry T/a1 at (1,1) to obtain the adjusted gray-scale value. In this way, the gray-scale values across the whole detection object image are brought toward the target gray-scale value.
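The adjustment in step S260 amounts to an element-wise multiplication of the detection object image by the mapping table. A hypothetical sketch follows; the rounding and the clip back to the 8-bit gray-scale range are assumptions, not stated in the disclosure:

```python
import numpy as np

def apply_mapping(detect_image: np.ndarray, mapping: np.ndarray) -> np.ndarray:
    """Flat-field correct a detection image: multiply each pixel's gray-scale
    value by the corresponding mapping-table entry, then round and clip back
    into the valid 8-bit gray-scale range."""
    corrected = detect_image.astype(np.float64) * mapping
    return np.clip(np.rint(corrected), 0, 255).astype(np.uint8)

# A pixel measured at 70 with mapping entry 2.0 is corrected to 140.
```

In the three-channel embodiment, this function would be called once per light source image with the matching first, second, or third mapping table.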
Therefore, to inspect articles of the same or a similar type for defects (e.g. a solar panel or a circuit board of another specification), the target light intensity can first be established with the calibration object, and the mapping table for that article type can then be established under the recorded light intensity. By adjusting images of articles of the same type with the recorded light intensity and the established mapping table, the images can be adjusted toward the ideal gray-scale value of 140, and the variation among the gray-scale values of the pixels across the whole image can be reduced.
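The per-type reuse described above can be sketched as a small calibration registry. All names and numbers here (`register_type`, `"solar_panel_A"`, the intensity 0.55, the sample gray values) are illustrative assumptions, not identifiers from the disclosure.

```python
import numpy as np

TARGET_GRAY = 140.0

# Hypothetical per-article-type registry: each entry stores the target
# light intensity and the mapping table established once for that type
# (e.g. a given solar-panel or circuit-board specification).
calibration = {}

def register_type(name, target_intensity, reference_image):
    """Build and store the mapping table for one article type."""
    calibration[name] = {
        "intensity": target_intensity,
        "table": TARGET_GRAY / reference_image.astype(float),
    }

def adjust(name, image):
    """Adjust an image of a registered article type toward the ideal gray value."""
    return image.astype(float) * calibration[name]["table"]

ref = np.array([[120., 140., 160.],
                [130., 140., 150.]])
register_type("solar_panel_A", 0.55, ref)
flat = adjust("solar_panel_A", ref)
```

Once registered, every further article of that type is illuminated at the stored intensity and multiplied by the stored table, so calibration is paid only once per specification.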
In summary, compared with correcting only the exposure value with a gray card, the optical inspection apparatus 100 and the correction method disclosed herein can establish a mapping table for the particular characteristics of each object. Moreover, through adjustment by the mapping table and the target light intensity of the light source 500, the gray-scale values across the whole surface, or along a cross section through the center, of the image of the object to be detected can be adjusted to present a linear distribution. The adjusted image of the object to be detected therefore reduces the probability that optical inspection is misjudged because of large gray-scale differences, improving the accuracy with which the optical inspection apparatus 100 detects defects.
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.

Claims (10)

1. An optical inspection apparatus, comprising:
an image capturing device; and
a processor coupled to a light source and the image capturing device, wherein the processor is configured to:
adjusting the light intensity with which the light source irradiates a calibration object, so that the gray-scale value of at least one image block captured by the image capturing device for the calibration object accords with a target calibration value, and recording the target light intensity at which the target calibration value is met;
controlling the light source to irradiate an object to be detected with the target light intensity, and controlling the image capturing device to capture an image of the object to be detected; and
calculating a target gray-scale value and a ratio of gray-scale values of a plurality of pixels of the image of the object to be measured to obtain a mapping table.
2. The optical inspection apparatus of claim 1, wherein the image capturing device is further configured to capture a smooth region of the object to be detected to obtain a region image, and the processor copies the region image and stitches the copied region images into the image of the object to be detected, wherein the region image is smaller than the image of the object to be detected.
3. The optical inspection apparatus of claim 1, wherein the light source comprises a red light emitting unit, a green light emitting unit, and a blue light emitting unit, and wherein the processor controls the red light emitting unit, the green light emitting unit, and the blue light emitting unit to respectively irradiate the object to be detected with the corresponding target light intensity, so that the image capturing device captures corresponding images of the object to be detected, wherein the image of the object to be detected comprises a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit, and a third light source image corresponding to the blue light emitting unit.
4. The optical inspection device of claim 3, wherein the target gray scale values include a red target gray scale value, a green target gray scale value, and a blue target gray scale value, and wherein the processor is further configured to:
respectively calculating the ratio of the red light target gray-scale value to the gray-scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table;
respectively calculating the ratio of the green light target gray-scale value to the gray-scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and
respectively calculating the ratio of the blue light target gray-scale value to the gray-scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image to obtain the corresponding mapping table.
5. The optical inspection device of claim 1, wherein the processor is further configured to:
controlling the light source to irradiate a detection object with the target light intensity and the image capturing device to capture the detection object to generate a detection object image; and
adjusting the gray-scale values of a plurality of pixels of the detection object image according to the mapping table to make the gray-scale value of at least one image block of the detection object image accord with the target gray-scale value.
6. A calibration method, comprising:
adjusting the light intensity with which a light source irradiates a calibration object, so that the gray-scale value of at least one image block captured by an image capturing device for the calibration object accords with a target calibration value, and recording the target light intensity at which the target calibration value is met;
controlling the light source to irradiate an object to be detected with the target light intensity, and controlling the image capturing device to capture an image of the object to be detected; and
calculating a target gray-scale value and a ratio of gray-scale values of a plurality of pixels of the image of the object to be measured to obtain a mapping table.
7. The calibration method according to claim 6, further comprising:
the image capturing device captures a smooth area of the object to be detected to obtain an area image, copies the area image and splices the copied area images to obtain an image of the object to be detected, wherein the area image is smaller than the image of the object to be detected.
8. The calibration method of claim 6, wherein the light source further comprises a red light emitting unit, a green light emitting unit, and a blue light emitting unit, and wherein the calibration method further comprises:
controlling the red light emitting unit, the green light emitting unit, and the blue light emitting unit to respectively irradiate the object to be detected with the corresponding target light intensity, so that the image capturing device captures corresponding images of the object to be detected, wherein the image of the object to be detected comprises a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit, and a third light source image corresponding to the blue light emitting unit.
9. The method of claim 8, wherein the target gray-scale values comprise a red target gray-scale value, a green target gray-scale value, and a blue target gray-scale value, and wherein the method further comprises:
respectively calculating the ratio of the red light target gray-scale value to the gray-scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table;
respectively calculating the ratio of the green light target gray-scale value to the gray-scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and
respectively calculating the ratio of the blue light target gray-scale value to the gray-scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image to obtain the corresponding mapping table.
10. The calibration method of claim 6, further comprising:
controlling the light source to irradiate a detection object with the target light intensity and the image capturing device to capture the detection object to generate a detection object image; and
adjusting the gray-scale values of a plurality of pixels of the detection object image according to the mapping table to make the gray-scale value of at least one image block of the detection object image accord with the target gray-scale value.
CN201811524912.3A 2018-12-13 2018-12-13 Optical detection device and correction method Active CN111323369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811524912.3A CN111323369B (en) 2018-12-13 2018-12-13 Optical detection device and correction method

Publications (2)

Publication Number Publication Date
CN111323369A true CN111323369A (en) 2020-06-23
CN111323369B CN111323369B (en) 2023-08-29

Family

ID=71170100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811524912.3A Active CN111323369B (en) 2018-12-13 2018-12-13 Optical detection device and correction method

Country Status (1)

Country Link
CN (1) CN111323369B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1805546A (en) * 2005-01-14 2006-07-19 Lg电子株式会社 Apparatus and method for compensating images in display device
CN101242476A (en) * 2008-03-13 2008-08-13 北京中星微电子有限公司 Automatic correction method of image color and digital camera system
CN101556381A (en) * 2008-04-10 2009-10-14 东捷科技股份有限公司 Detection device and image illumination level compensation method
CN101604509A (en) * 2008-06-13 2009-12-16 胜华科技股份有限公司 Image-displaying method
TW201621297A (en) * 2014-12-04 2016-06-16 致茂電子股份有限公司 Light source calibration detecting system and light source calibration method using the same
US20180146175A1 (en) * 2013-09-11 2018-05-24 Color Match, LLC Color measurement and calibration
WO2018159317A1 (en) * 2017-02-28 2018-09-07 日本精機株式会社 Display device, head-up display

Also Published As

Publication number Publication date
CN111323369B (en) 2023-08-29

Similar Documents

Publication Publication Date Title
EP3531114B1 (en) Visual inspection device and illumination condition setting method of visual inspection device
US20050207655A1 (en) Inspection system and method for providing feedback
KR20120068128A (en) Method of detecting defect in pattern and apparatus for performing the method
JP5174540B2 (en) Wood defect detection device
US20080175466A1 (en) Inspection apparatus and inspection method
TWI394428B (en) Image reading method and image reading device
JP5424659B2 (en) Inspection device for inspection object
TW502111B (en) Inspection method for foreign matters inside through hole
JP2015068668A (en) Appearance inspection device
TWI703509B (en) Optical detecting device and calibrating method
JP4675436B1 (en) Surface inspection illumination / imaging system and data structure
JP6822494B2 (en) Defect inspection equipment and defect inspection method for steel sheets
JP5424660B2 (en) Inspection device for inspection object
CN111323369A (en) Optical detection device and correction method
US20010048765A1 (en) Color characterization for inspection of a product having nonuniform color characteristics
JP2009002669A (en) Visual inspection device
JP2006266685A (en) Inspection method and inspection device for mounted substrate
CN115908328A (en) Method for detecting material placement condition in packaging box
JP2002310939A (en) Air-bubble inspection system
KR20220088154A (en) Examination apparatus and method of the same
KR20150024122A (en) Method for inspecting vehicle component
JPH0843048A (en) Method and apparatus for compensating inspection accuracy of inspecting system for appearance of article
KR102636920B1 (en) An apparatus for inspecting presence of liquid in a translucent container
JP2015155868A (en) Inspection device and inspection method
JP2010223914A (en) Defect inspection method and defect inspection device of surface of object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant