WO2022113867A1 - Analysis device, inspection system, and learning device (解析装置、検査システム、および学習装置) - Google Patents
Analysis device, inspection system, and learning device
- Publication number
- WO2022113867A1 (PCT/JP2021/042388)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- inspected
- image
- area
- unit
- analysis
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N2021/845—Objects on a conveyor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30156—Vehicle coating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/06—Recognition of objects for industrial automation
Definitions
- The present invention relates to an analysis device, an inspection system, and a learning device.
- Defects may occur on the painted surface of an automobile or the like, and inspection systems for detecting such defects have been developed (for example, Patent Document 1). In such an inspection system, defects are detected, for example, by sequentially imaging portions of an object such as an automobile while irradiating it with light.
- The present invention has been made to solve such a problem; that is, an object of the present invention is to provide an analysis device, an inspection system, and a learning device capable of improving the accuracy of inspection.
- An analysis device including: an acquisition unit that acquires image information of each of a plurality of images captured while a target is irradiated with light; an extraction unit that extracts, based on the image information, an image in which an irradiation region where the target is irradiated with the light and an inspected region of the target have a predetermined relationship; and an analysis unit that analyzes the state of the inspected region based on the image information of the image extracted by the extraction unit.
- The irradiation region has a central portion and a peripheral portion outside the central portion, and the extraction unit extracts the image in which the peripheral portion of the irradiation region overlaps at least a part of the inspected region.
- The analysis device according to (7) above, further including: a specifying unit that specifies the inspected region in the image based on the image information acquired by the acquisition unit; and a tracking unit that tracks the inspected region in a plurality of the images based on the inspected region specified by the specifying unit, wherein the extraction unit extracts the image based on the inspected region tracked by the tracking unit.
- The analysis device according to any one of (1) to (8) above, wherein the extraction unit extracts a plurality of the images in which the irradiation region and one inspected region have a predetermined relationship, and the analysis unit analyzes the state of that inspected region based on the image information of the plurality of images.
- The analysis device according to (10) above, wherein the trained model has been trained in advance using training data that combines the inspected region of the image extracted by the extraction unit with a correct label for the state of the inspected region.
- An inspection system including: a light source device that irradiates a target with light; an imaging device that captures images of the target irradiated with light from the light source device; and the analysis device according to any one of (1) to (14) above.
- A learning device including: an acquisition unit that acquires image information of each of a plurality of images captured while a target is irradiated with light; an extraction unit that extracts, based on the image information, an image in which an irradiation region where the target is irradiated with the light and an inspected region of the target have a predetermined relationship; an analysis unit that analyzes the state of the inspected region by machine learning based on the image information of the image extracted by the extraction unit; and a learning unit that trains the analysis unit.
- With the above configurations, an image in which the irradiation region and the inspected region of the target have a predetermined relationship is extracted from a plurality of images. This makes it possible to extract images in which the state of the inspected region is easier to analyze, and therefore to improve the accuracy of inspection.
- FIG. 4A is a side view showing an example of the positional relationship among the inspected region of the target, the light source, and the camera shown in FIG. 2.
- FIG. 4B is a side view showing another example of the positional relationship among the inspected region of the target, the light source, and the camera.
- FIG. 4C is a side view showing yet another example of the positional relationship among the inspected region of the target, the light source, and the camera.
- FIG. 5 is a block diagram showing an example of the schematic configuration of the analysis device shown in FIG. 1.
- FIG. 6 is a block diagram showing an example of the functional configuration of the CPU shown in FIG. 5.
- FIG. 7 is a flowchart showing an example of processing by the analysis device shown in FIG. 1.
- FIG. 8 is a block diagram showing the functional configuration of the analysis device according to Modification 1.
- FIG. 9 is a block diagram showing the functional configuration of the analysis device according to Modification 2.
- FIG. 1 is a diagram showing the schematic configuration of an inspection system 1 according to an embodiment of the present invention, and FIG. 2 shows an example of a predetermined object (target T) inspected by the inspection system 1.
- The inspection system 1 includes, for example, a transport device 100, a light source device 200, an image pickup device 300, and an analysis device 400 (FIG. 1).
- The transport device 100, the light source device 200, the image pickup device 300, and the analysis device 400 are connected so as to be able to communicate with one another, by wire or wirelessly, via a network such as a LAN (Local Area Network), a telephone network, or a data communication network.
- In the inspection system 1, the shape of the surface of the target T is analyzed and surface defects are inspected. Here, the target T is a vehicle body, and the inspection system 1 inspects for defects on the painted surface of the vehicle body. The surface of the vehicle body is subjected to surface treatment, metallic coating, and clear coating, and has a multilayer structure.
- The inspection system 1 detects a region that is a candidate for a surface defect (hereinafter referred to as an inspected region Rt) and analyzes the shape of the defect in the inspected region Rt (FIG. 2). FIG. 2 illustrates an inspected region Rt having a concave defect, but the inspected region Rt may instead have a convex defect or a defect of another shape.
- The target T may be something other than a vehicle body, and the inspection system 1 may inspect a surface other than a painted surface. The inspection system 1 may also be used to inspect a portion other than the surface of the target T, but it is particularly suitable for inspecting the surface of the target T.
- The transport device 100 transports the target T along a predetermined direction (for example, the transport direction C indicated by the arrow in FIG. 2) at a predetermined speed. The transport device 100 includes, for example, a mounting portion on which the target T is mounted and a drive unit that moves the mounting portion.
- The light source device 200 irradiates the target T with light and has a light source 20 (FIG. 2). The light source 20 is, for example, a linear light source. When the light from the light source 20 strikes the target T, a region in which the target T is irradiated with the light (hereinafter referred to as an irradiation region Ri) is formed.
- The light source 20 is fixed, for example, at a predetermined position and irradiates with light the target T moving past that position in the transport direction C. As a result, the position of the irradiation region Ri relative to the inspected region Rt of the target T changes.
- The light source device 200 has, for example, a plurality of light sources 20, and the plurality of light sources 20 are arranged at predetermined intervals along the transport direction of the target T. Alternatively, the plurality of light sources 20 may simply be arranged at predetermined positions along the transport direction of the target T.
- FIG. 3 shows an image of the brightness of the irradiation region Ri formed by the light source 20. The irradiation region Ri has, for example, a substantially circular or substantially elliptical planar shape.
- The brightness of the irradiation region Ri is highest in its central portion (central portion Ric), where it is substantially uniform, and gradually decreases with increasing distance from the central portion Ric. That is, in the irradiation region Ri, the change in brightness in the peripheral portion outside the central portion Ric (peripheral portion Rie) is larger than the change in brightness in the central portion Ric.
- In particular, the change in illumination intensity is large near the peripheral edge of the irradiation region Ri. Therefore, by imaging a region overlapping the peripheral portion Rie of the irradiation region Ri, preferably a region overlapping the vicinity of its peripheral edge, it is possible to obtain images comparable to images captured at various illumination intensities.
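As a non-authoritative illustration of this brightness structure, the sketch below shows one way the central portion Ric and the peripheral portion Rie could be separated from a captured frame by simple thresholding; the function name and the two threshold values are assumptions made for illustration, not values given in the patent.

```python
import numpy as np

def split_irradiation_region(gray: np.ndarray,
                             outer_thresh: int = 40,
                             inner_thresh: int = 200):
    """Return boolean masks (central portion Ric, peripheral portion Rie).

    gray         : H x W uint8 frame captured while the light source 20 is on.
    outer_thresh : brightness above which a pixel is counted as irradiated (region Ri).
    inner_thresh : brightness above which a pixel belongs to the nearly uniform center Ric.
    """
    irradiated = gray >= outer_thresh      # whole irradiation region Ri
    central = gray >= inner_thresh         # bright, nearly uniform central portion Ric
    peripheral = irradiated & ~central     # fall-off zone outside the center: peripheral portion Rie
    return central, peripheral
```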
- The image pickup device 300 captures images of the target T and has a camera 30 (FIG. 2). The image pickup device 300 has, for example, a plurality of cameras 30, each fixed at a predetermined position. Each of the plurality of cameras 30 continuously or discontinuously images a part of the surface of the transported target T.
- In the inspection system 1, the position of the irradiation region Ri relative to the target T changes; more specifically, the positions of the central portion Ric and the peripheral portion Rie relative to the inspected region Rt of the target T change. The surface of the target T is imaged while these positions change.
- FIGS. 4A to 4C show the positional relationship of the light source 20 (irradiation region Ri) and the camera 30 with the inspected region Rt of the target T transported along the transport direction C. As the target T is transported, the inspected region Rt overlaps, in turn, the central portion Ric (FIG. 4A), the peripheral portion Rie (FIG. 4B), and the vicinity of the peripheral edge of the irradiation region Ri (FIG. 4C). The camera 30 fixed at a predetermined position captures images of the inspected region Rt of the target T overlapping each of the central portion Ric, the peripheral portion Rie, and the vicinity of the peripheral edge of the irradiation region Ri. The image pickup device 300 outputs the plurality of images thus captured by the camera 30.
- The analysis device 400 mainly sends and receives various information and instructions to and from the image pickup device 300. The analysis device 400 acquires the image information of each of the plurality of images captured by the image pickup device 300 and analyzes defects on the surface of the target T.
- The analysis device 400 is a computer such as a server or a PC. The analysis device 400 may be configured by a plurality of devices, or may be configured virtually as a cloud server by, for example, a large number of servers.
- FIG. 5 is a block diagram showing a schematic configuration of the analysis device 400.
- The analysis device 400 includes a CPU (Central Processing Unit) 410, a ROM (Read Only Memory) 420, a RAM (Random Access Memory) 430, a storage 440, a communication interface 450, and an operation display unit 460.
- The CPU 410 controls each of the above components and performs various arithmetic processes according to the programs recorded in the ROM 420 and the storage 440.
- The ROM 420 stores various programs and various data. The RAM 430, as a work area, temporarily stores programs and data.
- The storage 440 stores various programs, including the operating system, and various data. For example, an application is installed in the storage 440 for transmitting and receiving various information to and from other devices and for determining, based on the information acquired from those devices, the analysis result to be output. The storage 440 also stores candidates for the analysis result to be output and the information necessary to determine which analysis result to output. When a machine learning model is used to determine the analysis result, the trained model and other data required for machine learning may also be stored.
- The communication interface 450 is an interface for communicating with other devices. As the communication interface 450, an interface conforming to various wired or wireless standards is used.
- The operation display unit 460 is, for example, a touch-panel display; it displays various information and receives various inputs from the user.
- FIG. 6 is a block diagram showing the functional configuration of the analysis device 400.
- The CPU 410 reads the programs stored in the storage 440 and executes processing, thereby functioning as, for example, an acquisition unit 411, a specifying unit 412, a tracking unit 413, an extraction unit 414, an analysis unit 415, and an output unit 416.
- The acquisition unit 411 acquires the image information of each of the plurality of images captured by the image pickup device 300. The plurality of images captured by the image pickup device 300 include a plurality of images captured while the inspected region Rt of the target T is irradiated with light from the light source 20 (see FIGS. 4A to 4C).
- The specifying unit 412 specifies the inspected region Rt in an image based on the image information acquired by the acquisition unit 411. The specifying unit 412 may specify a plurality of inspected regions Rt.
- The specifying unit 412 specifies the inspected region Rt in an image captured by the image pickup device 300 by using, for example, machine learning. For example, the specifying unit 412 specifies the inspected region Rt using a trained model. This trained model is trained in advance using, for example, images in which only defect-free areas appear, labeled as non-defective, and images in which defects appear, labeled as defective together with correct labels for the positions of the defects. The inspected region Rt may also be specified by scanning the image with a trained model that has learned to classify defective and non-defective regions.
- The specifying unit 412 may also specify the inspected region Rt without using machine learning, for example by using shape features. In that case, the inspected region Rt is specified based on image features derived from the luminance difference between the inspected region Rt and its surroundings.
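The following is a minimal sketch of the non-learning variant just described, in which candidate inspected regions are taken from the luminance difference between a pixel and its surroundings; the function name, window size, and thresholds are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np
from scipy import ndimage

def find_candidate_regions(gray: np.ndarray,
                           window: int = 31,
                           diff_thresh: float = 12.0,
                           min_area: int = 9):
    """Return bounding boxes (top, left, bottom, right) of candidate inspected regions Rt."""
    img = gray.astype(np.float32)
    local_mean = ndimage.uniform_filter(img, size=window)     # brightness of the surroundings
    contrast = np.abs(img - local_mean)                       # luminance difference vs. surroundings
    labels, _ = ndimage.label(contrast > diff_thresh)         # group contiguous high-contrast pixels
    boxes = []
    for idx, obj in enumerate(ndimage.find_objects(labels), start=1):
        if obj is None:
            continue
        area = int((labels[obj] == idx).sum())
        if area >= min_area:                                   # ignore isolated noisy pixels
            boxes.append((obj[0].start, obj[1].start, obj[0].stop, obj[1].stop))
    return boxes
```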
- The tracking unit 413 tracks the inspected region Rt across the plurality of images based on the inspected region Rt specified by the specifying unit 412. The tracking unit 413 may track each of a plurality of inspected regions Rt. That is, the position of the inspected region Rt in another image is estimated based on the position of the inspected region Rt in a given image and the imaging position of that image.
- Specific methods for tracking include the following. For example, tracking is performed by estimating the amount of movement of the inspected region Rt based on the speed information of the transport device 100. Alternatively, tracking may be performed on the premise that the amount of movement of the inspected region Rt between different images is limited.
- Tracking of the inspected region Rt may also be performed by running the specifying process again in another image, using as a reference the position of the inspected region Rt specified by the specifying unit 412. The process of specifying the inspected region Rt may also be performed after estimating its amount of movement based on the speed information of the transport device 100. In addition, the tracking unit 413 may cut out the inspected region Rt from the image.
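A hedged sketch of the speed-based tracking idea follows: the displacement of the tracked region between frames is predicted from the transport speed, and the prediction is widened by a bound on the allowed movement. All names, units, and the assumption that the target moves along the image x-axis are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Box:
    top: int
    left: int
    bottom: int
    right: int

def predict_box(prev: Box,
                transport_speed_mm_s: float,
                frame_interval_s: float,
                pixels_per_mm: float,
                max_jitter_px: int = 20) -> Box:
    """Predict where a region Rt tracked in the previous frame should appear in the next frame.

    The displacement is estimated from the speed information of the transport device;
    the box is widened by max_jitter_px, reflecting the premise that the movement of
    the region between images is limited.
    """
    shift_px = int(round(transport_speed_mm_s * frame_interval_s * pixels_per_mm))
    return Box(prev.top,
               prev.left + shift_px - max_jitter_px,
               prev.bottom,
               prev.right + shift_px + max_jitter_px)
```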
- The extraction unit 414 extracts an image in which the irradiation region Ri and the inspected region Rt tracked by the tracking unit 413 have a predetermined relationship. The position of the irradiation region Ri in each image is determined, for example, based on brightness. An image in which the inspected region Rt and the irradiation region Ri have a predetermined relationship is an image whose positional relationship with the irradiation region Ri makes the state of the inspected region Rt easy to analyze.
- The extraction unit 414 extracts, for example, an image in which the brightness of the inspected region Rt has a predetermined non-uniformity from the images including the inspected region Rt. Specifically, the extraction unit 414 extracts an image in which the peripheral portion Rie of the irradiation region Ri overlaps at least a part of the inspected region Rt. Alternatively, the extraction unit 414 extracts an image in which the vicinity of the peripheral edge of the irradiation region Ri overlaps at least a part of the inspected region Rt.
- The relationship between the irradiation region Ri and the inspected region Rt is determined, for example, based on the difference between the maximum and minimum brightness of the inspected region Rt in the image and on its average brightness. The extraction unit 414 extracts, for example, an image in which the difference between the maximum and minimum brightness of the inspected region Rt is within a predetermined range and the average brightness of the inspected region Rt is within a predetermined range. These predetermined ranges are determined, for example, by experiment.
- The relationship between the irradiation region Ri and the inspected region Rt may also be determined based on at least one of the variance and the histogram of the brightness in the vicinity of the inspected region Rt in the image. In that case, the extraction unit 414 extracts, for example, an image in which the degree of dispersion and the median of the histogram of the brightness in the vicinity of the inspected region Rt are within predetermined ranges. These predetermined ranges are likewise determined, for example, by experiment.
- The extraction unit 414 extracts a plurality of images for one inspected region Rt. Extracting a plurality of images for one inspected region Rt improves the accuracy of the analysis by the analysis unit 415.
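A minimal sketch of this extraction criterion, assuming hand-tuned ranges for the max-min brightness difference and the mean brightness of the tracked region, is given below; the function name, the box format, and the numeric ranges are assumptions for illustration.

```python
import numpy as np

def has_predetermined_relationship(gray: np.ndarray,
                                   box,
                                   diff_range=(40, 200),
                                   mean_range=(60, 180)) -> bool:
    """gray: full frame (uint8); box: (top, left, bottom, right) of the tracked region Rt."""
    top, left, bottom, right = box
    roi = gray[top:bottom, left:right].astype(np.float32)
    brightness_diff = float(roi.max() - roi.min())   # non-uniformity inside Rt
    mean_brightness = float(roi.mean())              # overall exposure of Rt
    return (diff_range[0] <= brightness_diff <= diff_range[1]
            and mean_range[0] <= mean_brightness <= mean_range[1])

# The extraction step would apply this test to every frame in which Rt is tracked
# and keep the (typically several) frames that pass.
```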
- The analysis unit 415 analyzes the state of the inspected region Rt based on the image information of the images extracted by the extraction unit 414. Specifically, the analysis unit 415 analyzes the shape of the defect in the inspected region Rt; for example, it determines whether the defect in the inspected region Rt of the target T is concave or convex. The analysis unit 415 may determine the shape of the defect for the image information of each of the plurality of images extracted by the extraction unit 414 and then integrate these determination results to derive the analysis result.
- The analysis unit 415 analyzes the state of the inspected region Rt of the target T by using, for example, machine learning. This machine learning includes deep learning using a neural network. The analysis unit 415 analyzes the state of the inspected region Rt using, for example, a trained model. This trained model is trained in advance using, for example, training data that combines images in which defects appear with correct labels for the shapes of the defects. The images used as the training data are, for example, images extracted by the extraction unit 414.
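Purely as an illustration of the kind of deep-learning analyzer described here, the sketch below defines a small convolutional classifier that labels a cropped inspected region as a concave or a convex defect; the architecture, sizes, and class encoding are assumptions, not details from the patent.

```python
import torch
import torch.nn as nn

class DefectShapeClassifier(nn.Module):
    """Tiny CNN that classifies a grayscale crop of region Rt as concave (0) or convex (1)."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, 1, H, W) crops of the inspected region Rt from the extracted images
        return self.classifier(self.features(x).flatten(1))

# Predictions from the several extracted images of one region Rt could then be
# integrated, e.g. by averaging the per-image class probabilities, to derive the result.
```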
- The output unit 416 outputs the analysis result of the state of the inspected region Rt of the target T analyzed by the analysis unit 415, for example by displaying it on the operation display unit 460. The output unit 416 may also output the analysis result by transmitting it to an external device via the communication interface 450.
- FIG. 7 is a flowchart showing a procedure of processing executed by the analysis device 400.
- The processing of the analysis device 400 shown in the flowchart of FIG. 7 is stored as a program in the storage 440 of the analysis device 400 and is executed by the CPU 410 controlling each unit.
- First, the analysis device 400 acquires the image information of each of the plurality of images of the target T captured by the image pickup device 300 (step S101). Next, the analysis device 400 specifies the inspected region Rt of the target T in an image based on the image information acquired in step S101 (step S102).
- Next, the analysis device 400 tracks the inspected region Rt across the plurality of images based on the inspected region Rt specified in step S102 (step S103).
- Next, the analysis device 400 extracts an image in which the inspected region Rt and the irradiation region Ri have a predetermined relationship, based on the inspected region Rt tracked in step S103 (step S104). In step S104, for example, an image in which the peripheral portion Rie of the irradiation region Ri overlaps at least a part of the inspected region Rt, or an image in which the vicinity of the peripheral edge of the irradiation region Ri overlaps at least a part of the inspected region Rt, is extracted.
- Next, the analysis device 400 analyzes the state of the inspected region Rt based on the image information of the image extracted in step S104 (step S105). For example, the analysis device 400 analyzes the shape of the defect in the inspected region Rt and determines whether the defect in the inspected region Rt of the target T is concave or convex.
- Finally, the analysis device 400 outputs the analysis result of the state of the inspected region Rt of the target T obtained in step S105 (step S106), and ends the processing.
- As described above, the extraction unit 414 extracts an image in which the irradiation region Ri and the inspected region Rt of the target T have a predetermined relationship. This makes it possible to extract images in which the state of the inspected region Rt is easier to analyze. This effect is described below.
- The analysis device 400 can extract, from all the images captured by the image pickup device 300, only those images in which the relationship with the irradiation region Ri makes the state of the inspected region Rt easy to analyze. Specifically, the analysis device 400 extracts an image in which the brightness in the vicinity of the inspected region Rt has a predetermined non-uniformity, an image in which the peripheral portion Rie of the irradiation region Ri overlaps at least a part of the inspected region Rt, or an image in which the vicinity of the peripheral edge of the irradiation region Ri overlaps at least a part of the inspected region Rt.
- The analysis device 400 then analyzes the state of the inspected region Rt based on images in which that state is easier to analyze; images in which the state of the inspected region Rt is difficult to analyze are not used for the analysis. Images in which the state of the inspected region Rt is difficult to analyze include, for example, images in which the entire inspected region Rt overlaps the central portion Ric of the irradiation region Ri and images in which the entire inspected region Rt lies outside the irradiation region Ri.
- The analysis device 400 can therefore improve the accuracy of the inspection compared with a case in which the state of the inspected region Rt is analyzed using a set of images that includes images in which the state of the inspected region Rt of the target T is difficult to analyze. The inspection system 1 including such an analysis device 400 does not require complicated control of lighting conditions, multiple kinds of light sources, or the like, and can improve the accuracy of inspection while suppressing cost.
- In particular, when deep learning is used, the accuracy of inspection can be improved effectively, for the following reason. In deep learning, features are extracted from an image by convolution operations. Therefore, if the images used for analysis include images in which the shape of the defect is difficult to analyze, noise increases and the accuracy of the analysis tends to decrease. By extracting in advance images in which the state of the inspected region Rt of the target T is easy to analyze and performing the deep-learning analysis on those images, noise is reduced and the inspection accuracy can be improved effectively.
- In this way, in the present embodiment, an image in which the irradiation region Ri and the inspected region Rt of the target T have a predetermined relationship is extracted. This makes it possible to extract images in which the state of the inspected region Rt is easier to analyze, and therefore to improve the accuracy of inspection by the inspection system 1.
- The analysis device 400 extracts images based on the difference between the maximum and minimum brightness of the inspected region Rt and on its average brightness. As a result, images in which the shape of the inspected region Rt is easier to analyze can be extracted more reliably, and the accuracy of the inspection can be further improved.
- The analysis device 400 extracts a plurality of images for one inspected region Rt and analyzes the state of that inspected region Rt based on the image information of each of the plurality of images. This further improves the accuracy of the analysis compared with analyzing the state of one inspected region Rt based on the image information of a single image.
- The analysis device 400 also has the specifying unit 412 and the tracking unit 413, so the extraction unit 414 can extract images based on the tracked inspected region Rt. Compared with extracting from all the images captured by the image pickup device 300, this makes the extraction easier and improves the accuracy of the analysis. The accuracy of the analysis is improved effectively for the following reason as well: the tracking unit 413 tracks each of a plurality of inspected regions Rt, so confusion between the plurality of inspected regions Rt is less likely to occur.
- FIG. 8 shows an example of the functional configuration of the analysis device 400 according to Modification 1. This analysis device 400 has a learning unit 417 in addition to the acquisition unit 411, the specifying unit 412, the tracking unit 413, the extraction unit 414, the analysis unit 415, and the output unit 416. Except for this point, the analysis device 400 according to Modification 1 has the same configuration as the analysis device 400 according to the above embodiment and provides the same effects.
- The learning unit 417 trains the analysis unit 415. Specifically, the learning unit 417 makes the analysis unit 415 learn combinations of the inspected region Rt in an image extracted by the extraction unit 414 and the state of that inspected region Rt. Here, the learning unit 417 uses for learning only the images extracted by the extraction unit 414, that is, images in which the irradiation region Ri and the inspected region Rt have a predetermined relationship. Since only images in which the state of the inspected region Rt is easier to analyze are learned, noise is reduced and the learning accuracy can be improved. In other words, the accuracy of the analysis unit 415 trained by the learning unit 417 can be improved compared with the case where the learning unit 417 trains it on a set of images that includes images that are difficult to analyze.
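As a hedged sketch of what such a learning unit might do, the code below trains an arbitrary classifier (for example, the one sketched earlier) only on crops already selected by the extraction step, paired with correct shape labels; the data layout, optimizer, and hyperparameters are assumptions for illustration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train_analyzer(model: nn.Module,
                   extracted_crops: torch.Tensor,   # (N, 1, H, W) crops selected by the extraction step
                   shape_labels: torch.Tensor,      # (N,) long tensor, e.g. 0: concave, 1: convex
                   epochs: int = 20,
                   lr: float = 1e-3) -> nn.Module:
    """Train the analyzer only on extracted images paired with correct shape labels."""
    loader = DataLoader(TensorDataset(extracted_crops, shape_labels),
                        batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for crops, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(crops), labels)
            loss.backward()
            optimizer.step()
    return model
```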
- In this way, the analysis device 400 may also function as a learning device. Alternatively, a learning device may be provided separately from the analysis device 400.
- FIG. 9 shows an example of the functional configuration of the analysis device 400 according to Modification 2. This analysis device 400 has the acquisition unit 411, the extraction unit 414, the analysis unit 415, and the output unit 416; that is, it is not provided with a specifying unit or a tracking unit (the specifying unit 412 and the tracking unit 413 in FIG. 6). Except for this point, the analysis device 400 according to Modification 2 has the same configuration as the analysis device 400 according to the above embodiment and provides the same effects.
- In Modification 2, the extraction unit 414 extracts an image in which the inspected region Rt of the target T and the irradiation region Ri have a predetermined relationship based on the image information acquired by the acquisition unit 411. In this way, the analysis device 400 may extract images directly from the image information acquired by the acquisition unit 411.
- The configuration of the inspection system 1 described above is the main configuration used to explain the features of the above embodiment and modifications; it is not limited to that configuration and can be variously modified within the scope of the claims. Configurations provided in a general inspection system are also not excluded.
- In the above embodiment, the target T is transported in a predetermined direction while the light source 20 and the camera 30 are fixed at predetermined positions. However, the target T may instead be fixed at a predetermined position while the light source 20 or the camera 30 is moved. That is, in the inspection system 1, the target T may be imaged while any one of the target T, the irradiation region Ri, and the imaging position is moving.
- The transport device 100, the light source device 200, the image pickup device 300, and the analysis device 400 may each be configured by a plurality of devices, or they may be configured as a single device. Each configuration may also be realized by another configuration; for example, the light source device 200 and the image pickup device 300 may be integrated into the analysis device 400, and part or all of their functions may be realized by the analysis device 400.
- In the above embodiment, the inspection system 1 analyzes the state of the inspected region Rt of the target T by machine learning; however, the state of the inspected region Rt of the target T may be analyzed by another method, such as statistical processing.
- The means and methods for performing the various processes in the inspection system 1 described above can be realized either by a dedicated hardware circuit or by a programmed computer. The program may be provided on a computer-readable recording medium such as a USB memory or a DVD (Digital Versatile Disc)-ROM, or may be provided online via a network such as the Internet. In that case, the program recorded on the computer-readable recording medium is usually transferred to and stored in a storage unit such as a hard disk. The program may be provided as standalone application software, or may be incorporated as a function into the software of a device such as a detection unit.
- 1 inspection system, 100 transport device, 200 light source device, 20 light source, 300 image pickup device, 30 camera, 400 analysis device, 410 CPU, 420 ROM, 430 RAM, 440 storage, 450 communication interface, 460 operation display unit, T target, Rt inspected region, Ri irradiation region.
Abstract
Description
[Configuration of the inspection system]
FIG. 1 is a diagram showing the schematic configuration of an inspection system 1 according to an embodiment of the present invention, and FIG. 2 shows an example of a predetermined object (target T) inspected by the inspection system 1. The inspection system 1 includes, for example, a transport device 100, a light source device 200, an imaging device 300, and an analysis device 400 (FIG. 1). The transport device 100, the light source device 200, the imaging device 300, and the analysis device 400 are connected so as to be able to communicate with one another, by wire or wirelessly, via a network such as a LAN (Local Area Network), a telephone network, or a data communication network.
FIG. 7 is a flowchart showing the procedure of processing executed in the analysis device 400. The processing of the analysis device 400 shown in the flowchart of FIG. 7 is stored as a program in the storage 440 of the analysis device 400 and is executed by the CPU 410 controlling each unit.
As described above, according to the analysis device 400 of the present embodiment, the extraction unit 414 extracts an image in which the irradiation region Ri and the inspected region Rt of the target T have a predetermined relationship. This makes it possible to extract images in which the state of the inspected region Rt is easier to analyze. This effect is described below.
FIG. 8 shows an example of the functional configuration of the analysis device 400 according to Modification 1. This analysis device 400 has a learning unit 417 in addition to the acquisition unit 411, the specifying unit 412, the tracking unit 413, the extraction unit 414, the analysis unit 415, and the output unit 416. Except for this point, the analysis device 400 according to Modification 1 has the same configuration as the analysis device 400 according to the above embodiment and provides the same effects.
FIG. 9 shows an example of the functional configuration of the analysis device 400 according to Modification 2. This analysis device 400 has the acquisition unit 411, the extraction unit 414, the analysis unit 415, and the output unit 416; that is, it is not provided with a specifying unit or a tracking unit (the specifying unit 412 and the tracking unit 413 in FIG. 6). Except for this point, the analysis device 400 according to Modification 2 has the same configuration as the analysis device 400 according to the above embodiment and provides the same effects.
100 transport device,
200 light source device,
20 light source,
300 imaging device,
30 camera,
400 analysis device,
410 CPU,
420 ROM,
430 RAM,
440 storage,
450 communication interface,
460 operation display unit,
T target,
Rt inspected region,
Ri irradiation region.
Claims (16)
- 1. An analysis device comprising: an acquisition unit that acquires image information of each of a plurality of images captured while a target is irradiated with light; an extraction unit that extracts, based on the image information, an image in which an irradiation region where the target is irradiated with the light and an inspected region of the target have a predetermined relationship; and an analysis unit that analyzes a state of the inspected region based on the image information of the image extracted by the extraction unit.
- 2. The analysis device according to claim 1, wherein the irradiation region has a central portion and a peripheral portion outside the central portion, and the extraction unit extracts the image in which the peripheral portion of the irradiation region overlaps at least a part of the inspected region.
- 3. The analysis device according to claim 1 or 2, wherein the extraction unit extracts the image in which a peripheral edge of the irradiation region overlaps the inspected region.
- 4. The analysis device according to any one of claims 1 to 3, wherein the extraction unit extracts the image in which the brightness of the inspected region has a predetermined non-uniformity.
- 5. The analysis device according to any one of claims 1 to 4, wherein the extraction unit extracts the image based on a difference between the maximum and minimum brightness of the inspected region and on an average brightness of the inspected region.
- 6. The analysis device according to any one of claims 1 to 5, wherein the extraction unit extracts the image based on at least one of the variance and the histogram of the brightness of the inspected region.
- 7. The analysis device according to any one of claims 1 to 6, wherein the acquisition unit acquires the image information of each of the plurality of images captured while any one of the target, the irradiation region, and an imaging position is moving.
- 8. The analysis device according to claim 7, further comprising: a specifying unit that specifies the inspected region in the image based on the image information acquired by the acquisition unit; and a tracking unit that tracks the inspected region in the plurality of images based on the inspected region specified by the specifying unit, wherein the extraction unit extracts the image based on the inspected region tracked by the tracking unit.
- 9. The analysis device according to any one of claims 1 to 8, wherein the extraction unit extracts a plurality of the images in which the irradiation region and one inspected region have a predetermined relationship, and the analysis unit analyzes the state of the one inspected region based on the image information of the plurality of images.
- 10. The analysis device according to any one of claims 1 to 9, wherein the analysis unit analyzes the state of the inspected region using a trained model.
- 11. The analysis device according to claim 10, wherein the trained model has been trained in advance using training data of combinations of the inspected region of the image extracted by the extraction unit and a correct label for the state of the inspected region.
- 12. The analysis device according to any one of claims 1 to 11, wherein the analysis unit analyzes the state of the inspected region using deep learning.
- 13. The analysis device according to any one of claims 1 to 12, wherein the inspected region is a candidate region of a defect in the target, and the analysis unit analyzes the shape of the defect.
- 14. The analysis device according to claim 13, wherein the shape is a concave shape or a convex shape.
- 15. An inspection system comprising: a light source device that irradiates a target with light; an imaging device that captures an image of the target irradiated with light from the light source device; and the analysis device according to any one of claims 1 to 14.
- 16. A learning device comprising: an acquisition unit that acquires image information of each of a plurality of images captured while a target is irradiated with light; an extraction unit that extracts, based on the image information, an image in which an irradiation region where the target is irradiated with the light and an inspected region of the target have a predetermined relationship; an analysis unit that analyzes a state of the inspected region by machine learning based on the image information of the image extracted by the extraction unit; and a learning unit that trains the analysis unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022562670A JP7248201B2 (ja) | 2020-11-30 | 2021-11-18 | 解析装置、検査システム、および学習装置 |
CN202180079661.6A CN116507907A (zh) | 2020-11-30 | 2021-11-18 | 分析装置、检查***及学习装置 |
US18/252,599 US20240005473A1 (en) | 2020-11-30 | 2021-11-18 | Analysis apparatus, inspection system, and learning apparatus |
EP21897841.9A EP4253943A4 (en) | 2020-11-30 | 2021-11-18 | ANALYSIS DEVICE, INSPECTION SYSTEM AND LEARNING DEVICE |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020198500 | 2020-11-30 | ||
JP2020-198500 | 2020-11-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022113867A1 true WO2022113867A1 (ja) | 2022-06-02 |
Family
ID=81754292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/042388 WO2022113867A1 (ja) | 2020-11-30 | 2021-11-18 | 解析装置、検査システム、および学習装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240005473A1 (ja) |
EP (1) | EP4253943A4 (ja) |
JP (1) | JP7248201B2 (ja) |
CN (1) | CN116507907A (ja) |
WO (1) | WO2022113867A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000172845A (ja) | 1998-12-04 | 2000-06-23 | Suzuki Motor Corp | 表面欠陥検査装置、表面欠陥検査方法及び表面欠陥検査用プログラムを記録した記録媒体 |
US20140079311A1 (en) * | 2012-09-20 | 2014-03-20 | Applied Materials Israel Ltd. | System, method and computer program product for classification |
JP2018204063A (ja) * | 2017-05-31 | 2018-12-27 | 日立造船株式会社 | 監視装置および監視方法 |
JP2020070494A (ja) * | 2018-11-02 | 2020-05-07 | 株式会社アイヴィワークス | 薄膜蒸着工程を制御するための装置、方法及び命令を記録した記録媒体 |
JP2020198500A (ja) | 2019-05-31 | 2020-12-10 | 株式会社Pfu | 画像読取装置、制御方法及び制御プログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4726983B2 (ja) | 2009-10-30 | 2011-07-20 | 住友化学株式会社 | 欠陥検査システム、並びに、それに用いる、欠陥検査用撮影装置、欠陥検査用画像処理装置、欠陥検査用画像処理プログラム、記録媒体、および欠陥検査用画像処理方法 |
WO2016121878A1 (ja) | 2015-01-29 | 2016-08-04 | 株式会社デクシス | 光学式外観検査装置、及びこれを用いた光学式外観検査システム |
US20190096057A1 (en) * | 2017-05-11 | 2019-03-28 | Jacob Nathaniel Allen | Object inspection system and method for inspecting an object |
JP7021667B2 (ja) * | 2017-05-29 | 2022-02-17 | コニカミノルタ株式会社 | 表面欠陥検査装置および該方法 |
JP2020187657A (ja) * | 2019-05-16 | 2020-11-19 | 株式会社キーエンス | 画像検査装置 |
-
2021
- 2021-11-18 US US18/252,599 patent/US20240005473A1/en active Pending
- 2021-11-18 CN CN202180079661.6A patent/CN116507907A/zh active Pending
- 2021-11-18 JP JP2022562670A patent/JP7248201B2/ja active Active
- 2021-11-18 WO PCT/JP2021/042388 patent/WO2022113867A1/ja active Application Filing
- 2021-11-18 EP EP21897841.9A patent/EP4253943A4/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000172845A (ja) | 1998-12-04 | 2000-06-23 | Suzuki Motor Corp | 表面欠陥検査装置、表面欠陥検査方法及び表面欠陥検査用プログラムを記録した記録媒体 |
US20140079311A1 (en) * | 2012-09-20 | 2014-03-20 | Applied Materials Israel Ltd. | System, method and computer program product for classification |
JP2018204063A (ja) * | 2017-05-31 | 2018-12-27 | 日立造船株式会社 | 監視装置および監視方法 |
JP2020070494A (ja) * | 2018-11-02 | 2020-05-07 | 株式会社アイヴィワークス | 薄膜蒸着工程を制御するための装置、方法及び命令を記録した記録媒体 |
JP2020198500A (ja) | 2019-05-31 | 2020-12-10 | 株式会社Pfu | 画像読取装置、制御方法及び制御プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP4253943A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4253943A1 (en) | 2023-10-04 |
JP7248201B2 (ja) | 2023-03-29 |
US20240005473A1 (en) | 2024-01-04 |
EP4253943A4 (en) | 2024-05-01 |
CN116507907A (zh) | 2023-07-28 |
JPWO2022113867A1 (ja) | 2022-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220084183A1 (en) | Defect detection device, defect detection method, and program | |
EP3176751B1 (en) | Information processing device, information processing method, computer-readable recording medium, and inspection system | |
JP5920994B2 (ja) | マイクロプレートのウェル壁境界を識別するための方法及びシステム | |
JP2013526717A5 (ja) | ||
KR100742003B1 (ko) | 표면 결함 검사 방법 및 장치 | |
US20230053085A1 (en) | Part inspection system having generative training model | |
Tang et al. | Anomaly detection of core failures in die casting X-ray inspection images using a convolutional autoencoder | |
JP4322230B2 (ja) | 表面欠陥検査装置及び表面欠陥検査方法 | |
KR20180115645A (ko) | 2d 영상 기반의 용접 비드 인식 장치 및 그것을 이용한 그을음 제거 방법 | |
JP6812118B2 (ja) | 欠陥検出装置、欠陥検出方法およびプログラム | |
JP4318579B2 (ja) | 表面欠陥検査装置 | |
WO2022113867A1 (ja) | 解析装置、検査システム、および学習装置 | |
Castejón-Limas et al. | Texture descriptors for automatic estimation of workpiece quality in milling | |
CN115375610A (zh) | 检测方法及装置、检测设备和存储介质 | |
CN111833350A (zh) | 机器视觉检测方法与*** | |
JP2004296592A (ja) | 欠陥分類装置、欠陥分類方法およびプログラム | |
US10241000B2 (en) | Method for checking the position of characteristic points in light distributions | |
JP6811540B2 (ja) | 欠陥検出装置、欠陥検出方法およびプログラム | |
JP7469740B2 (ja) | ベルト検査システムおよびベルト検査プログラム | |
KR20190119801A (ko) | 차량 헤드라이트 얼라인먼트 보정 및 분류 방법 및 이를 이용한 차량 헤드라이트 불량검사 방법 | |
Dias et al. | Identification of marks on tires using artificial vision for quality control | |
JP4017585B2 (ja) | 塗装面の検査装置 | |
JPH109835A (ja) | 表面欠陥検査装置 | |
CN111610205A (zh) | 一种金属零部件x光图像缺陷检测装置 | |
JP2005291844A (ja) | 表面欠陥検査装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21897841 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022562670 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18252599 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180079661.6 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021897841 Country of ref document: EP Effective date: 20230630 |