WO2023162142A1 - Image confirmation device and image confirmation method - Google Patents

Image confirmation device and image confirmation method

Info

Publication number
WO2023162142A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
board
teacher
feature amount
component
Prior art date
Application number
PCT/JP2022/007896
Other languages
French (fr)
Japanese (ja)
Inventor
恵市 小野
勇太 横井
賢志郎 西田
智也 藤本
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji filed Critical 株式会社Fuji
Priority to PCT/JP2022/007896 priority Critical patent/WO2023162142A1/en
Publication of WO2023162142A1 publication Critical patent/WO2023162142A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • This specification discloses a technique related to an image confirmation device and an image confirmation method.
  • the pass/fail determination device described in Patent Document 1 includes a distribution acquisition section and a pass/fail determination section.
  • the distribution acquisition unit extracts a reference feature amount that is a feature amount of the entire image for each image of the reference image, and acquires a feature amount distribution that is a distribution of the plurality of extracted reference feature amounts.
  • the pass/fail judgment unit extracts a target feature quantity that is a feature quantity of the entire image for each target image, and judges the quality of the board work performed when the target image was acquired, based on the degree of deviation of the target feature quantity from the feature region defined by the feature quantity distribution.
  • like the pass/fail determination device described in Patent Document 1, a board work machine can prepare teacher images of a board product by machine learning before production and inspect the board product using the teacher images during production.
  • however, not all teacher images are suitable images. If an inappropriate teacher image is included, the inspection accuracy for board products may be degraded. Moreover, teacher images exist in large numbers, and it is difficult for an operator to check them one by one to find an inappropriate one.
  • in view of such circumstances, this specification discloses an image confirmation device and an image confirmation method that can guide inappropriate teacher images.
  • an image confirmation device that includes an acquisition unit and a guide unit.
  • the acquisition unit extracts, for each of a plurality of teacher images, an image feature quantity that is a feature quantity of the entire image, and acquires a feature quantity distribution that is the distribution of the extracted image feature quantities.
  • the teacher images are images used for machine learning in which an object provided on a board is captured by a board work machine that performs predetermined board work on the board.
  • the guide unit guides the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution acquired by the acquisition unit.
  • the acquisition step extracts, for each of a plurality of teacher images, an image feature quantity that is a feature quantity of the entire image, and acquires a feature quantity distribution that is the distribution of the extracted image feature quantities; the teacher images are images used for machine learning in which an object provided on a board is captured by a board work machine that performs predetermined board work on the board.
  • the guiding step guides the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution acquired by the acquisition step.
  • according to the image confirmation device described above, the acquisition unit and the guide unit are provided.
  • as a result, the image confirmation device can guide the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution. What has been described above for the image confirmation device applies equally to the image confirmation method.
  • FIG. 3 is a schematic diagram showing an example of a teacher image; FIG. 4 is a block diagram showing an example of the control blocks of the image confirmation device; FIG. 5 is a flowchart showing an example of a control procedure performed by the image confirmation device; FIG. 6 is a schematic diagram showing an example of a feature quantity distribution.
  • FIG. 7 is a schematic diagram showing an example of guidance of teacher images by the image confirmation device.
  • Embodiment 1-1 Configuration Example of Production Line WL0
  • FIG. 1 shows an example of a production line WL0 to which the image confirmation device 50 is applied.
  • the board-related work machine WM0 performs a predetermined board-related work on the board 90.
  • the type and number of board-oriented work machines WM0 are not limited.
  • the production line WL0 of the embodiment includes a plurality of (five) board work machines WM0: a printer WM1, a print inspection machine WM2, a component mounting machine WM3, a reflow furnace WM4, and an appearance inspection machine WM5.
  • the substrate 90 is transported in the above order by the substrate transport device.
  • the printing machine WM1 prints solder on the mounting position of the component 91 on the board 90 .
  • the print inspection machine WM2 inspects the printed state of the solder printed by the printer WM1.
  • the component mounting machine WM3 mounts a plurality of components 91 on the board 90 on which solder has been printed by the printing machine WM1.
  • the number of component mounting machines WM3 may be one or more. When a plurality of component mounters WM3 are provided, they can share the mounting of the plurality of components 91.
  • the reflow furnace WM4 heats the board 90 on which the component 91 is mounted by the component mounter WM3, melts the solder, and performs soldering.
  • the appearance inspection machine WM5 inspects the mounting state of the components 91 mounted by the component mounting machine WM3. In this manner, the production line WL0 can use the plurality of (five) board work machines WM0 to transport the boards 90 in sequence and perform production processes, including inspection processes, to produce board products 900.
  • the production line WL0 may also include board work machines WM0 such as, for example, a function inspection machine, a buffer device, a board supply device, a board reversing device, a shield mounting device, an adhesive application device, and an ultraviolet irradiation device, as necessary.
  • the plurality of (five) board work machines WM0 and the management device HC0 are communicably connected by a wired or wireless communication unit.
  • in the embodiment, a local area network (LAN) is configured by the plurality of (five) board work machines WM0 and the management device HC0.
  • the plurality of (five) board work machines WM0 can communicate with each other via the communication unit.
  • the plurality of (five) board work machines WM0 can also communicate with the management device HC0 via the communication unit.
  • the management device HC0 controls the plurality of (five) board work machines WM0 that make up the production line WL0 and monitors the operation status of the production line WL0.
  • the management device HC0 stores various control data for controlling the plurality of (five) board work machines WM0.
  • the management device HC0 transmits the control data to each of the plurality of (five) board work machines WM0. Further, each of the plurality of (five) board work machines WM0 transmits its operation status and production status to the management device HC0.
  • a data server 70 is provided in the management device HC0.
  • the data server 70 can store, for example, acquired data relating to board-related work acquired by the board-related work machine WM0.
  • image data of an image captured by the board-oriented work machine WM0 is included in the acquired data.
  • a teacher image 40 which will be described later, is included in the acquired data.
  • a record (log data) of the operation status acquired by the board-oriented work machine WM0 is included in the acquired data.
  • the data server 70 can also store various production information related to the production of the board product 900.
  • part data such as information on the shape of each type of part 91, information on electrical characteristics of the part 91, and information on how to handle the part 91 are included in the production information.
  • inspection results obtained by inspection machines such as the print inspection machine WM2 and the appearance inspection machine WM5 are included in the production information.
  • the component mounting machine WM3 mounts a component 91 on a board 90 .
  • the component mounting machine WM3 of the embodiment includes a substrate conveying device 11, a component supply device 12, a component transfer device 13, a component camera 14, a substrate camera 15 and a control device 16.
  • the substrate transport device 11 is configured by, for example, a belt conveyor, etc., and transports the substrate 90 in the transport direction (X-axis direction).
  • the substrate 90 is a circuit board on which electronic circuits, electric circuits, magnetic circuits, and the like are formed.
  • the board transfer device 11 carries the board 90 into the component mounting machine WM3 and positions the board 90 at a predetermined position inside the machine. After the component mounting machine WM3 completes the mounting process of the component 91, the board transfer device 11 carries the board 90 out of the component mounting machine WM3.
  • the component supply device 12 supplies components 91 to be mounted on the board 90 .
  • the component supply device 12 includes a feeder 12a provided along the transport direction (X-axis direction) of the substrate 90 .
  • the feeder 12a pitch-feeds a carrier tape containing a plurality of components 91, and supplies the components 91 so as to be picked up at a supply position located on the leading end side of the feeder 12a.
  • the component supply device 12 can also supply relatively large electronic components (for example, lead components) compared to chip components in a state of being arranged on a tray.
  • the component transfer device 13 includes a head driving device 13a and a moving table 13b.
  • the head driving device 13a is configured such that a moving table 13b can be moved in the X-axis direction and the Y-axis direction (a direction perpendicular to the X-axis direction in the horizontal plane) by a linear motion mechanism.
  • a mounting head 20 is detachably (exchangeably) provided on the moving table 13b by a clamp member.
  • the mounting head 20 uses at least one holding member 30 to pick up and hold the component 91 supplied by the component supply device 12 , and mounts the component 91 on the substrate 90 positioned by the substrate transfer device 11 .
  • a suction nozzle, a chuck, or the like can be used as the holding member 30 .
  • the component camera 14 is fixed to the base of the component mounting machine WM3 so that the optical axis faces upward in the Z-axis direction (vertical direction orthogonal to the X-axis direction and the Y-axis direction).
  • the component camera 14 can image the component 91 held by the holding member 30 from below.
  • the substrate camera 15 is provided on the moving table 13b of the component transfer device 13 so that the optical axis faces downward in the Z-axis direction.
  • the substrate camera 15 can image the substrate 90 from above.
  • the component camera 14 and the substrate camera 15 can use known imaging devices, and perform imaging based on control signals sent from the control device 16 . Image data of images captured by the component camera 14 and the board camera 15 are transmitted to the control device 16 .
  • the control device 16 includes a known arithmetic device and storage device, and constitutes a control circuit. Information, image data, and the like output from various sensors provided in the component mounting machine WM3 are input to the control device 16 .
  • the control device 16 sends a control signal to each device based on the control program and predetermined mounting conditions set in advance.
  • control device 16 causes the substrate camera 15 to image the substrate 90 positioned by the substrate transport device 11 .
  • the control device 16 processes the image captured by the board camera 15 and recognizes the positioning state of the board 90 .
  • the control device 16 causes the holding member 30 to collect and hold the component 91 supplied by the component supply device 12 , and causes the component camera 14 to image the component 91 held by the holding member 30 .
  • the control device 16 performs image processing on the image captured by the component camera 14 to recognize the presence/absence of the component 91, the suitability of the component 91, the holding posture of the component 91, and the like.
  • the control device 16 moves the holding member 30 toward a position above the planned mounting position preset by the control program or the like. Further, the control device 16 corrects the planned mounting position based on the positioning state of the substrate 90, the holding posture of the component 91, and the like, and sets the mounting position where the component 91 is actually mounted.
  • the planned mounting position and mounting position include the position (X-axis coordinate and Y-axis coordinate) as well as the rotation angle.
  • the control device 16 corrects the target position (X-axis coordinate and Y-axis coordinate) and rotation angle of the holding member 30 according to the mounting position.
  • the controller 16 lowers the holding member 30 at the corrected rotation angle at the corrected target position to mount the component 91 on the substrate 90 .
  • the control device 16 repeats the pick-and-place cycle described above to perform a mounting process of mounting a plurality of components 91 on the board 90 .
  • FIG. 3 shows an example of the teacher image 40. In the teacher image 40 shown in the figure, one component 91 (an object 91t provided on the board 90) among the plurality of components 91 mounted on the board 90 by the component mounter WM3 is imaged.
  • the teacher image 40 shown in the figure can be captured from above the board 90 by an imaging device 80 such as the board camera 15, the appearance inspection machine WM5, or a camera provided outside the board work machine WM0.
  • the operator determines whether the image captured by the imaging device 80 is appropriate as the teacher image 40 .
  • in this case, the operator may mistakenly register an inappropriate image as the teacher image 40 (for example, an image in which an appropriate component 91 is not properly mounted on the predetermined area 90t of the board 90).
  • an image confirmation device 50 is provided.
  • the image confirmation device 50 guides inappropriate teacher images.
  • the image confirmation device 50 includes an acquisition unit 51 and a guide unit 52 as control blocks.
  • Acquisition unit 51 and guide unit 52 can be provided in various arithmetic devices, control devices, and the like.
  • at least one of the acquiring unit 51 and the guiding unit 52 can be provided in the management device HC0.
  • At least one of the acquisition unit 51 and the guide unit 52 can also be formed on the cloud.
  • the image confirmation device 50 of the embodiment is provided in the management device HC0. Further, the image confirmation device 50 of the embodiment executes control according to the flowchart shown in FIG. 5. The acquisition unit 51 performs the determination and processing shown in steps S11 and S12. The guide unit 52 performs the processing shown in step S13.
  • the acquisition unit 51 extracts an image feature quantity BF1 that is a feature quantity of the entire image for each of the teacher images 40, and acquires a feature quantity distribution FD1 that is the distribution of the extracted image feature quantity BF1.
  • the teacher images 40 are a plurality of images in which an object 91t provided on the board 90 is captured by a board work machine WM0 that performs predetermined board work on the board 90, and are images used for machine learning.
  • the image shown in FIG. 3 is included in the teacher image 40.
  • the board-oriented work machine WM0 is a component mounting machine WM3 that mounts the component 91 on the board 90, and the component 91 mounted on the board 90 is included in the object 91t.
  • the teacher image 40 can be used for various known machine learning.
  • the teacher image 40 can be used for various machine learning such as support vector machines and regression analysis. Note that, in the embodiment, the teacher images 40 are stored in the data server 70 shown in FIG. 1.
  • a component 91 mounted on a substrate 90 is imaged in the teacher image 40 shown in FIG.
  • the component 91 is a chip component such as a chip resistor or a chip capacitor
  • the component 91 includes an electrode region AR11 and an electrode region AR12, which are electrode regions, and a body region AR13, which is a body region.
  • the electrode area AR11 and the electrode area AR12 are silver (metallic). Further, it is assumed that the main body area AR13 on the front side of the component 91 (the side visible when the component 91 is properly mounted on the board 90) is black, and that the main body area AR13 on the back (bottom) side of the component 91 is white. When the component 91 is properly mounted on the board 90, the imaging device 80 images the electrode area AR11 and the electrode area AR12 (silver) and the main body area AR13 (black) on the front side of the component 91.
  • when the component 91 is not properly mounted on the board 90 (for example, when it is mounted with its back side facing up), the imaging device 80 images the electrode area AR11 and the electrode area AR12 (silver) and the main body area AR13 (white) on the back side of the component 91.
  • in this case, the difference in brightness between the electrode areas AR11 and AR12 and the main body area AR13 of the component 91 is smaller than when the component 91 is properly mounted on the board 90.
  • in this way, an image that is inappropriate as the teacher image 40 (for example, an image in which an appropriate component 91 is not properly mounted on the predetermined area 90t of the board 90) differs in image feature from an image in which the component 91 is properly mounted on the board 90.
  • the acquiring unit 51 extracts the image feature quantity BF1 that is the feature quantity of the entire image for each of the teacher images 40, and acquires the feature quantity distribution FD1 that is the distribution of the extracted image feature quantity BF1.
  • the acquisition unit 51 can take various forms as long as it can acquire the feature quantity distribution FD1.
  • the acquiring unit 51 can acquire the feature quantity distribution FD1 by, for example, a method known in multivariate analysis (for example, principal component analysis).
  • FIG. 6 shows an example of the feature quantity distribution FD1.
  • the figure shows an example of the feature quantity distribution FD1 when the brightness of a plurality of pixels forming the teacher image 40 shown in FIG. 3 is the image feature quantity BF1.
  • a plurality of points shown in the feature region FR1 of FIG. 6 schematically show the image feature amount BF1 of the teacher image 40.
  • the feature quantity distribution FD1 can be represented by a two-dimensional feature region FR1.
  • the feature quantity distribution FD1 can also be represented by a three-dimensional or higher feature region FR1.
  • the feature region FR1 indicates the outer edges of the plurality of image feature amounts BF1, and is also called a unit space.
  • the predetermined area 90t corresponds to the target area on the substrate 90 where the component 91 is to be mounted.
  • the target area 91s corresponds to a mounting area on the substrate 90 where the component 91 is actually mounted.
  • the teacher image 40 in which the substrate 90 is provided with an object 91t different from the object 91t that should be provided on the substrate 90 is inappropriate.
  • the object 91t to be provided on the board 90 corresponds to the component 91 to be mounted on the board 90.
  • in this case, the area of the target region 91s may increase or decrease compared to the case where the object 91t that should be provided is provided on the board 90.
  • likewise, an index indicating the shape of the object 91t (for example, circularity) may increase or decrease compared to the case where the object 91t that should be provided is provided on the board 90.
  • for example, a square component 91 has a higher circularity than an elongated rectangular component 91, so circularity is one index for determining whether the object on the board 90 is the object 91t that should be provided.
  • the image feature quantity BF1 is at least one of the following indices: the luminance of the plurality of pixels forming the image, the center of gravity of the target region 91s where the object 91t is provided on the board 90, the area of the target region 91s, and an index indicating the shape of the object 91t.
  • the acquisition unit 51 can extract the at least one image feature amount BF1 to acquire the feature amount distribution FD1.
  • the acquisition unit 51 can acquire the feature quantity distribution FD1 including the image feature quantity BF1 for checking the degree of deviation.
  • the acquiring unit 51 can also acquire the feature quantity distribution FD1 by excluding the image feature quantity BF1 for checking the degree of deviation.
  • for example, assume that the main body area AR13 shown in FIG. 3 is black (with a luminance value of 0) in most of the teacher images 40.
  • also assume that the main body area AR13 is white (with a luminance value of 255 when one pixel is represented by 8-bit information) in some of the teacher images 40.
  • in other words, a small number of teacher images 40 with a white main body area AR13 (with a luminance value of 255) are included.
  • in this case, the small number of teacher images 40 in which the main body area AR13 is white (with a luminance value of 255) have a conspicuous degree of deviation in the image feature quantity BF1.
  • the degree of deviation of the image feature quantity BF1 can be represented by the Mahalanobis distance MD1.
  • the Mahalanobis distance MD1 is the distance from the center of the feature region FR1 to the image feature quantity BF1.
  • the image feature quantities BF1 of the small number of teacher images 40 in which the main body area AR13 is white (with a luminance value of 255) are likely to be positioned at the outer edge of the plurality of image feature quantities BF1 in the feature quantity distribution FD1, and their Mahalanobis distance MD1 becomes notably large. What has been described above regarding luminance can be similarly applied to the other image feature quantities BF1. A computational sketch of this Mahalanobis-distance ranking is given at the end of this section.
  • the guiding unit 52 guides the teacher image 40 in descending order of the deviation degree of the image feature quantity BF1 in the feature quantity distribution FD1 acquired by the acquiring unit 51.
  • the guiding unit 52 may take various forms as long as it can guide the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1.
  • the guiding unit 52 can guide the teacher images 40 in descending order of the Mahalanobis distance MD1.
  • the guidance unit 52 can guide the teacher image 40 by various methods such as displaying the teacher image 40 and voice guidance of information (for example, file name of image data) that can specify the teacher image 40 .
  • the guidance unit 52 causes the display device 60 to display the teacher images 40 in descending order of the degree of deviation of the image feature amount BF1.
  • the guide unit 52 causes the display device 60 to display the teacher images 40 in descending order of the Mahalanobis distance MD1.
  • the display device 60 only needs to be able to display the teacher image 40 as described above, and various known display devices can be used.
  • FIG. 7 shows an example of guidance for the teacher image 40 by the image confirmation device 50.
  • as shown in the figure, the guide unit 52 displays the teacher images 40 on the display screen from the upper position to the lower position in descending order of the degree of deviation of the image feature quantity BF1 (that is, in descending order of the Mahalanobis distance MD1). Thereby, the operator can check the teacher images 40 on the display device 60 in order, starting from the teacher images 40 that are most likely to be inappropriate.
  • the guidance unit 52 can allow the operator to select a teacher image 40 that is not used for machine learning on the display screen of the display device 60.
  • the display screen shown in the figure is configured by a touch panel. The operator selects the teacher image 40 by touching the teacher image 40 not used for machine learning on the display screen. Then, the operator can touch the operation part (operation part BA1 to operation part BA4 in the figure) corresponding to the selected teacher image 40 to exclude the teacher image 40 that is not used for machine learning.
  • for example, the operator touches the first teacher image 40 on the display screen to select it, and then touches the operation part BA1.
  • as a result, the first teacher image 40 is no longer used for machine learning.
  • a teacher image 40 that is not used for machine learning can be deleted from the data server 70 and reflected in the board-oriented work machine WM0 that uses the teacher image 40 .
  • the operator can select all the teacher images 40 displayed on the display screen by touching the operation part BB1. Further, the operator can cancel the selection of the teacher image 40 by touching the operation part BB2. Furthermore, the operator can display the next display screen (the teacher image 40 whose degree of deviation of the image feature quantity BF1 is smaller than the currently displayed teacher image 40) by touching the operation part BC1. By touching the operation part BC2, the operator can display the previous display screen (the teacher image 40 whose degree of deviation of the image feature quantity BF1 is greater than the currently displayed teacher image 40).
  • the guide unit 52 can guide, together with the teacher image 40, at least the degree of deviation of the image feature quantity BF1 among the degree of deviation of the image feature quantity BF1, the inspection result of the board product 900 produced by the board work machine WM0, and the image information related to the teacher image 40.
  • the inspection result of the board product 900 can be obtained from an inspection machine such as the print inspection machine WM2 and the visual inspection machine WM5.
  • the inspection results of the board product 900 can also be obtained from the board-to-board working machines WM0 such as the printing machine WM1 and the component mounting machine WM3, for example.
  • the image information regarding the teacher image 40 is not limited.
  • the image information about the teacher image 40 includes information about the date and time when the teacher image 40 was acquired, information about the imaging device 80, imaging conditions (for example, exposure time, aperture value, and light source when the imaging device 80 captured the teacher image 40). It may contain various information such as information on type, irradiation direction of light).
  • the degree of deviation (Mahalanobis distance MD1) of the image feature amount BF1 of the first teacher image 40 is indicated by the distance L1.
  • the inspection result of the board product 900 for which the first teacher image 40 was captured is indicated as OK, meaning that the board product 900 is a non-defective product.
  • the date and time when the first teacher image 40 was acquired is indicated by date and time T1. What has been described above for the first teacher image 40 can be similarly applied to the second and subsequent teacher images 40 .
  • the image confirmation device 50 can acquire the feature quantity distribution FD1 by the acquisition unit 51 and guide the teacher image 40 by the guide unit 52 in various situations.
  • for example, the image confirmation device 50 can acquire the feature quantity distribution FD1 through the acquisition unit 51 and guide the teacher images 40 through the guide unit 52 in at least one of the situations before, during, and after production of the board product 900.
  • for example, when the board product 900 produced by the board work machine WM0 is a non-defective product but the inspection result of the board product 900 by machine learning is defective, the teacher images 40 used to inspect the board product 900 may be inappropriate. In that case, the acquisition unit 51 can acquire the feature quantity distribution FD1 for the teacher images 40 used in the inspection of the board product 900, and the guide unit 52 can guide the teacher images 40 (steps S11 to S13 shown in FIG. 5).
  • specifically, the acquisition unit 51 determines whether the board product 900 is a non-defective product and whether the inspection result of the board product 900 by machine learning is defective (step S11). For example, when the inspection result of the board product 900 by machine learning is defective, the operator can determine whether the board product 900 is actually a non-defective product and input the determination result into the device that performed the inspection or the like. Alternatively, when the inspection result of the board product 900 by machine learning is defective, an inspection machine that does not use machine learning can determine whether the board product 900 is a non-defective product.
  • the acquiring unit 51 acquires the feature quantity distribution FD1 for the teacher image 40 used in the inspection of the circuit board product 900 (step S12). Then, the guide unit 52 guides the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1 in the feature quantity distribution FD1 acquired by the acquisition unit 51 (step S13).
  • the operator confirms the teacher images 40 guided by the guide unit 52 and selects the teacher images 40 that are not used for machine learning.
  • at this time, the operator can refer to the degree of deviation of the image feature quantity BF1, the inspection result of the board product 900, and the image information related to the teacher image 40, and can select the teacher images 40 that are not to be used for machine learning.
  • the control by the image confirmation device 50 is once terminated. If the above condition is not satisfied (No in step S11), the control by the image confirmation device 50 is temporarily terminated without executing the processing shown in steps S12 and S13.
  • the board-facing work machine WM0 is a component mounting machine WM3 that mounts the component 91, which is the target object 91t, on the board 90.
  • the inspection of the substrate product 900 is a component presence/absence inspection for inspecting whether or not the component 91 is mounted on the predetermined area 90t of the substrate 90 by the component mounting machine WM3. Therefore, the guide section 52 can guide the teacher image 40 in which the appropriate component 91 is not properly attached to the predetermined area 90 t of the substrate 90 .
  • the teacher image 40 is shown for one component 91 among the plurality of components 91 mounted on the board 90 by the component mounting machine WM3.
  • the guiding unit 52 can similarly guide the teacher image 40 for other parts 91 as well.
  • the degree of deviation of the image feature quantity BF1 is represented by the Mahalanobis distance MD1.
  • the degree of deviation of the image feature amount BF1 can also be represented by a method other than the Mahalanobis distance MD1.
  • the outlier degree of the image feature quantity BF1 can also be represented by outlier information such as the Smirnov-Grubbs test.
  • the board-facing work machine WM0 is not limited to the component mounting machine WM3.
  • the board-to-board working machine WM0 may be a printing machine WM1 that prints solder on the board 90 .
  • the solder printed on the board 90 is included in the object 91t.
  • a predetermined area 90t on the substrate 90 where the object 91t is to be provided corresponds to a target area on the substrate 90 on which solder is to be printed.
  • a target region 91s on which the target object 91t is provided on the substrate 90 corresponds to a printing region on the substrate 90 where solder is actually printed.
  • the inspection of the board product 900 is not limited to the component presence/absence inspection.
  • the inspection of the board product 900 may be a solder presence/absence inspection.
  • the inspection of the board product 900 may also be an inspection of the deviation between the center of gravity of the predetermined region 90t on the board 90 where the object 91t should be provided and the center of gravity of the target region 91s on the board 90 where the object 91t is actually provided.
  • the image confirmation method includes an acquisition step and a guidance step.
  • the obtaining step corresponds to control performed by the obtaining unit 51 .
  • the guiding process corresponds to control performed by the guiding section 52 .
  • as described above, the image confirmation device 50 includes the acquisition unit 51 and the guide unit 52.
  • as a result, the image confirmation device 50 can guide the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1 in the feature quantity distribution FD1. What has been described above for the image confirmation device 50 also applies to the image confirmation method.
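The ranking described in the items above can be illustrated with a short computational sketch. This is not the implementation disclosed in this publication; it is a minimal example that assumes each teacher image has already been reduced to a numeric feature vector (its image feature quantity BF1) and that uses the Mahalanobis distance from the center of the feature quantity distribution as the deviation degree. The function name rank_teacher_images and the synthetic data are assumptions made for the example; an outlier test such as the Smirnov-Grubbs test mentioned above could be substituted as the deviation measure.

```python
import numpy as np

def rank_teacher_images(features, names):
    """Return image names sorted by descending Mahalanobis distance.

    features: (n_images, n_features) array of image feature quantities (BF1).
    names:    identifiers (e.g., file names) of the teacher images.
    """
    mean = features.mean(axis=0)              # center of the feature region (unit space)
    cov = np.cov(features, rowvar=False)      # feature covariance
    cov_inv = np.linalg.pinv(cov)             # pseudo-inverse tolerates degenerate covariance
    diffs = features - mean
    # Mahalanobis distance of each image feature vector from the distribution center
    md = np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))
    order = np.argsort(-md)                   # largest deviation first
    return [(names[i], float(md[i])) for i in order]

# Example: 200 ordinary images plus a few outliers
# (e.g., a white body region instead of a black one raises the mean luminance).
rng = np.random.default_rng(0)
normal = rng.normal(loc=[40.0, 0.2], scale=[3.0, 0.05], size=(200, 2))
outliers = rng.normal(loc=[200.0, 0.2], scale=[3.0, 0.05], size=(3, 2))
feats = np.vstack([normal, outliers])
names = [f"teacher_{i:03d}.png" for i in range(len(feats))]

for name, distance in rank_teacher_images(feats, names)[:5]:
    print(f"{name}: MD = {distance:.1f}")
```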

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

This image confirmation device comprises an acquiring unit and a guiding unit. The acquiring unit extracts, for each of a plurality of teaching images, an image feature amount that is a feature amount of the entire image, and acquires a feature amount distribution that is a distribution of the extracted image feature amounts; the teaching images are images used for machine learning in which an object provided on a substrate is imaged by a substrate working machine that performs predetermined substrate work on the substrate. The guiding unit guides the teaching images in descending order of the degree of deviation of the image feature amounts in the feature amount distribution acquired by the acquiring unit.

Description

Image confirmation device and image confirmation method
 This specification discloses a technique related to an image confirmation device and an image confirmation method.
 The pass/fail determination device described in Patent Document 1 includes a distribution acquisition unit and a pass/fail judgment unit. The distribution acquisition unit extracts, for each reference image, a reference feature quantity that is a feature quantity of the entire image, and acquires a feature quantity distribution that is the distribution of the plurality of extracted reference feature quantities. The pass/fail judgment unit extracts, for each target image, a target feature quantity that is a feature quantity of the entire image, and judges the quality of the board work performed when the target image was acquired, based on the degree of deviation of the target feature quantity from the feature region defined by the feature quantity distribution.
WO 2020/183734
 As in the pass/fail determination device described in Patent Document 1, a board work machine can prepare teacher images of a board product by machine learning before production and inspect the board product using the teacher images during production. However, not all teacher images are suitable images. If an inappropriate teacher image is included, the inspection accuracy for board products may be degraded. Moreover, teacher images exist in large numbers, and it is difficult for an operator to check them one by one to find an inappropriate one.
 In view of such circumstances, this specification discloses an image confirmation device and an image confirmation method that can guide inappropriate teacher images.
 This specification discloses an image confirmation device including an acquisition unit and a guide unit. The acquisition unit extracts, for each of a plurality of teacher images, an image feature quantity that is a feature quantity of the entire image, and acquires a feature quantity distribution that is the distribution of the extracted image feature quantities; the teacher images are images used for machine learning in which an object provided on a board is captured by a board work machine that performs predetermined board work on the board. The guide unit guides the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution acquired by the acquisition unit.
 This specification also discloses an image confirmation method including an acquisition step and a guiding step. The acquisition step extracts, for each of a plurality of teacher images, an image feature quantity that is a feature quantity of the entire image, and acquires a feature quantity distribution that is the distribution of the extracted image feature quantities; the teacher images are images used for machine learning in which an object provided on a board is captured by a board work machine that performs predetermined board work on the board. The guiding step guides the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution acquired by the acquisition step.
 According to the above image confirmation device, the acquisition unit and the guide unit are provided. As a result, the image confirmation device can guide the teacher images in descending order of the degree of deviation of the image feature quantity in the feature quantity distribution. What has been described above for the image confirmation device applies equally to the image confirmation method.
FIG. 1 is a configuration diagram showing an example of a production line. FIG. 2 is a plan view showing an example of a component mounting machine. FIG. 3 is a schematic diagram showing an example of a teacher image. FIG. 4 is a block diagram showing an example of the control blocks of the image confirmation device. FIG. 5 is a flowchart showing an example of a control procedure performed by the image confirmation device. FIG. 6 is a schematic diagram showing an example of a feature quantity distribution. FIG. 7 is a schematic diagram showing an example of guidance of teacher images by the image confirmation device.
 1. Embodiment
 1-1. Configuration Example of Production Line WL0
 FIG. 1 shows an example of a production line WL0 to which the image confirmation device 50 is applied. In the production line WL0, board work machines WM0 perform predetermined board work on a board 90. The type and number of board work machines WM0 are not limited. As shown in FIG. 1, the production line WL0 of the embodiment includes a plurality of (five) board work machines WM0, namely a printer WM1, a print inspection machine WM2, a component mounting machine WM3, a reflow furnace WM4, and an appearance inspection machine WM5, and the board 90 is transported in this order by the board transport devices.
 The printer WM1 prints solder on the mounting positions of the components 91 on the board 90. The print inspection machine WM2 inspects the printed state of the solder printed by the printer WM1. The component mounting machine WM3 mounts a plurality of components 91 on the board 90 on which solder has been printed by the printer WM1. The number of component mounting machines WM3 may be one or more. When a plurality of component mounters WM3 are provided, they can share the mounting of the plurality of components 91.
 The reflow furnace WM4 heats the board 90 on which the components 91 have been mounted by the component mounter WM3, melts the solder, and performs soldering. The appearance inspection machine WM5 inspects the mounting state of the components 91 mounted by the component mounter WM3. In this manner, the production line WL0 can use the plurality of (five) board work machines WM0 to transport the boards 90 in sequence and perform production processes, including inspection processes, to produce board products 900. The production line WL0 may also include board work machines WM0 such as a function inspection machine, a buffer device, a board supply device, a board reversing device, a shield mounting device, an adhesive application device, and an ultraviolet irradiation device, as necessary.
 The plurality of (five) board work machines WM0 and the management device HC0 are communicably connected by a wired or wireless communication unit. In the embodiment, a local area network (LAN) is configured by the plurality of (five) board work machines WM0 and the management device HC0. Thereby, the board work machines WM0 can communicate with each other via the communication unit, and each board work machine WM0 can communicate with the management device HC0 via the communication unit.
 The management device HC0 controls the plurality of (five) board work machines WM0 that make up the production line WL0 and monitors the operation status of the production line WL0. The management device HC0 stores various control data for controlling the board work machines WM0 and transmits the control data to each of them. Each board work machine WM0 in turn transmits its operation status and production status to the management device HC0.
 A data server 70 is provided in the management device HC0. The data server 70 can store, for example, acquired data relating to board work acquired by the board work machines WM0. For example, image data of images captured by a board work machine WM0 is included in the acquired data, as are the teacher images 40 described later and records (log data) of the operation status acquired by the board work machines WM0.
 The data server 70 can also store various production information relating to the production of the board products 900. For example, component data such as information on the shape of each type of component 91, information on the electrical characteristics of the components 91, and information on how to handle the components 91 are included in the production information, as are the inspection results obtained by inspection machines such as the print inspection machine WM2 and the appearance inspection machine WM5.
 1-2. Configuration Example of Component Mounting Machine WM3
 The component mounting machine WM3 mounts components 91 on the board 90. As shown in FIG. 2, the component mounting machine WM3 of the embodiment includes a board transport device 11, a component supply device 12, a component transfer device 13, a component camera 14, a board camera 15, and a control device 16.
 The board transport device 11 is configured by, for example, a belt conveyor, and transports the board 90 in the transport direction (X-axis direction). The board 90 is a circuit board on which electronic circuits, electric circuits, magnetic circuits, and the like are formed. The board transport device 11 carries the board 90 into the component mounting machine WM3 and positions it at a predetermined position inside the machine. After the component mounting machine WM3 completes the mounting process of the components 91, the board transport device 11 carries the board 90 out of the machine.
 The component supply device 12 supplies the components 91 to be mounted on the board 90. The component supply device 12 includes feeders 12a provided along the transport direction (X-axis direction) of the board 90. Each feeder 12a pitch-feeds a carrier tape containing a plurality of components 91 and supplies the components 91 so that they can be picked up at a supply position located on the leading end side of the feeder 12a. The component supply device 12 can also supply electronic components that are relatively large compared with chip components (for example, lead components) arranged on a tray.
 The component transfer device 13 includes a head driving device 13a and a moving table 13b. The head driving device 13a is configured so that the moving table 13b can be moved in the X-axis direction and the Y-axis direction (the direction perpendicular to the X-axis direction in the horizontal plane) by a linear motion mechanism. A mounting head 20 is detachably (exchangeably) attached to the moving table 13b by a clamp member. The mounting head 20 uses at least one holding member 30 to pick up and hold a component 91 supplied by the component supply device 12 and mounts the component 91 on the board 90 positioned by the board transport device 11. A suction nozzle, a chuck, or the like can be used as the holding member 30.
 The component camera 14 is fixed to the base of the component mounting machine WM3 so that its optical axis faces upward in the Z-axis direction (the vertical direction orthogonal to the X-axis direction and the Y-axis direction). The component camera 14 can image the component 91 held by the holding member 30 from below. The board camera 15 is provided on the moving table 13b of the component transfer device 13 so that its optical axis faces downward in the Z-axis direction. The board camera 15 can image the board 90 from above. Known imaging devices can be used as the component camera 14 and the board camera 15, and they perform imaging based on control signals sent from the control device 16. Image data of the images captured by the component camera 14 and the board camera 15 are transmitted to the control device 16.
 The control device 16 includes a known arithmetic device and a known storage device, and constitutes a control circuit. Information, image data, and the like output from various sensors provided in the component mounting machine WM3 are input to the control device 16. The control device 16 sends control signals to each device based on a control program and predetermined mounting conditions set in advance.
 For example, the control device 16 causes the board camera 15 to image the board 90 positioned by the board transport device 11. The control device 16 processes the image captured by the board camera 15 and recognizes the positioning state of the board 90. The control device 16 also causes the holding member 30 to pick up and hold a component 91 supplied by the component supply device 12, and causes the component camera 14 to image the component 91 held by the holding member 30. The control device 16 processes the image captured by the component camera 14 and recognizes the presence or absence of the component 91, the suitability of the component 91, the holding posture of the component 91, and the like.
 The control device 16 moves the holding member 30 toward a position above a planned mounting position preset by the control program or the like. Further, the control device 16 corrects the planned mounting position based on the positioning state of the board 90, the holding posture of the component 91, and the like, and sets the mounting position where the component 91 is actually mounted. The planned mounting position and the mounting position include a rotation angle in addition to the position (X-axis coordinate and Y-axis coordinate).
 The control device 16 corrects the target position (X-axis coordinate and Y-axis coordinate) and the rotation angle of the holding member 30 according to the mounting position. The control device 16 lowers the holding member 30 at the corrected target position with the corrected rotation angle to mount the component 91 on the board 90. By repeating the above pick-and-place cycle, the control device 16 performs a mounting process of mounting a plurality of components 91 on the board 90.
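The position and rotation-angle correction described in the preceding paragraphs can be sketched roughly as follows. This is only an illustration under assumed conventions, not the machine's actual control logic: the Pose type, the function corrected_mounting_pose, and the way the board positioning offset and the component holding posture are combined are all invented for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # X-axis coordinate (mm)
    y: float      # Y-axis coordinate (mm)
    theta: float  # rotation angle (degrees)

def corrected_mounting_pose(planned: Pose, board_offset: Pose, holding_offset: Pose) -> Pose:
    """Correct a planned mounting position using the recognized board positioning
    state and the recognized component holding posture (illustrative only)."""
    # Rotate and translate the planned coordinates by the board's misalignment
    rad = math.radians(board_offset.theta)
    x = planned.x * math.cos(rad) - planned.y * math.sin(rad) + board_offset.x
    y = planned.x * math.sin(rad) + planned.y * math.cos(rad) + board_offset.y
    # Compensate for how the component sits on the holding member (e.g., suction nozzle)
    theta = planned.theta + board_offset.theta - holding_offset.theta
    return Pose(x - holding_offset.x, y - holding_offset.y, theta)

pose = corrected_mounting_pose(
    planned=Pose(120.0, 45.0, 90.0),
    board_offset=Pose(0.15, -0.08, 0.3),    # from board camera 15 image processing
    holding_offset=Pose(0.02, 0.01, -0.5),  # from component camera 14 image processing
)
print(pose)
```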
 1-3.画像確認装置50の構成例
 対基板作業機WM0では、生産前に機械学習によって基板製品900の教師画像40を用意しておき、生産時に教師画像40を用いて基板製品900を検査することができる。しかしながら、全ての教師画像40が適切な画像であるとは限らない。図3は、教師画像40の一例を示している。同図に示す教師画像40には、部品装着機WM3によって基板90に装着された複数の部品91のうちの一つの部品91(基板90に設けられた対象物91t)が撮像されている。
1-3. Configuration Example of Image Checking Device 50 In the board-to-board work machine WM0, the teacher image 40 of the board product 900 is prepared by machine learning before production, and the board product 900 can be inspected using the teacher image 40 during production. . However, not all teacher images 40 are suitable images. FIG. 3 shows an example of the teacher image 40. As shown in FIG. In the teacher image 40 shown in the figure, one component 91 (an object 91t provided on the board 90) among a plurality of components 91 mounted on the board 90 by the component mounter WM3 is imaged.
 同図に示す教師画像40は、基板カメラ15、外観検査機WM5、対基板作業機WM0の外部に設けられるカメラなどの撮像装置80によって、基板90の上方から撮像することができる。例えば、撮像装置80によって撮像された画像が教師画像40として適切であるか否かを作業者が判断する場合を想定する。この場合、教師画像40として不適切な画像(例えば、適切な部品91が基板90の所定領域90tに適正に装着されていない画像など)を作業者が誤って教師画像40に登録する可能性がある。 The teacher image 40 shown in the figure can be captured from above the board 90 by an imaging device 80 such as a camera provided outside the board camera 15, the visual inspection machine WM5, and the work machine for board WM0. For example, it is assumed that the operator determines whether the image captured by the imaging device 80 is appropriate as the teacher image 40 . In this case, the operator may mistakenly register an inappropriate image as the teacher image 40 (for example, an image in which the appropriate component 91 is not properly mounted on the predetermined area 90t of the substrate 90). be.
 If unsuitable teacher images 40 are included in this way, the inspection accuracy of the board product 900 may be degraded. Moreover, a large number of teacher images 40 exist, and it is difficult for the operator to check the teacher images 40 one by one to find an unsuitable teacher image 40. Therefore, in the embodiment, an image confirmation device 50 is provided. The image confirmation device 50 guides unsuitable teacher images.
 Viewed in terms of control blocks, the image confirmation device 50 includes an acquisition unit 51 and a guide unit 52. The acquisition unit 51 and the guide unit 52 can be provided in various arithmetic devices, control devices, and the like. For example, at least one of the acquisition unit 51 and the guide unit 52 can be provided in the management device HC0. At least one of the acquisition unit 51 and the guide unit 52 can also be formed on a cloud.
 As shown in FIG. 4, the image confirmation device 50 of the embodiment is provided in the management device HC0. The image confirmation device 50 of the embodiment executes control according to the flowchart shown in FIG. 5. The acquisition unit 51 performs the determination and processing shown in steps S11 and S12, and the guide unit 52 performs the processing shown in step S13.
 1-3-1. Acquisition Unit 51
 The acquisition unit 51 extracts, for each of the teacher images 40, an image feature quantity BF1 that is a feature quantity of the entire image, and acquires a feature quantity distribution FD1 that is the distribution of the extracted image feature quantities BF1. The teacher images 40 are a plurality of images, used for machine learning, in which an object 91t provided on the board 90 is imaged by the board-oriented work machine WM0 that performs a predetermined board-oriented work on the board 90.
 The image shown in FIG. 3 is included in the teacher images 40. In this case, the board-oriented work machine WM0 is the component mounting machine WM3 that mounts the component 91 on the board 90, and the component 91 mounted on the board 90 is included in the object 91t. The teacher images 40 can be used for various known types of machine learning, for example, support vector machines and regression analysis. In the embodiment, the teacher images 40 are stored in the data server 70 shown in FIG. 1.
 In the teacher image 40 shown in FIG. 3, a component 91 mounted on the board 90 is imaged. For example, when the component 91 is a chip component such as a chip resistor or a chip capacitor, the component 91 has an electrode area AR11 and an electrode area AR12, which are the areas of the electrode portions, and a body area AR13, which is the area of the body portion.
 The electrode area AR11 and the electrode area AR12 are silver (metallic) in color. It is also assumed that the body area AR13 on the front side of the component 91 (the surface visible when the component 91 is properly mounted on the board 90) is black, and that the body area AR13 on the back (bottom) side of the component 91 is white. When the component 91 is properly mounted on the board 90, the imaging device 80 images the electrode area AR11 and the electrode area AR12 (silver) and the body area AR13 on the front side of the component 91 (black).
 For example, when the holding member 30 erroneously picks up the back side of the component 91 and the component 91 is mounted on the board 90 with its front and back sides reversed, the imaging device 80 images the electrode area AR11 and the electrode area AR12 (silver) and the body area AR13 on the back side of the component 91 (white). When the component 91 is mounted on the board 90 with its front and back sides reversed, the difference in luminance between the electrode areas AR11 and AR12 and the body area AR13 of the component 91 becomes smaller than when the component 91 is properly mounted on the board 90.
 In this way, an image that is unsuitable as a teacher image 40 (for example, an image in which a suitable component 91 is not properly mounted in the predetermined area 90t of the board 90) differs in the feature quantity of the entire image from an image in which the component 91 is properly mounted on the board 90. Therefore, the acquisition unit 51 extracts, for each of the teacher images 40, the image feature quantity BF1 that is the feature quantity of the entire image, and acquires the feature quantity distribution FD1 that is the distribution of the extracted image feature quantities BF1.
 The acquisition unit 51 may take various forms as long as it can acquire the feature quantity distribution FD1. For example, the acquisition unit 51 can acquire the feature quantity distribution FD1 by a method known in multivariate analysis (for example, principal component analysis). FIG. 6 shows an example of the feature quantity distribution FD1, in this case the distribution obtained when the luminance of the plurality of pixels constituting the teacher image 40 shown in FIG. 3 is used as the image feature quantity BF1.
 The plurality of points illustrated in the feature region FR1 of FIG. 6 schematically represent the image feature quantities BF1 of the teacher images 40. As shown in the figure, the feature quantity distribution FD1 can be represented by a two-dimensional feature region FR1. The feature quantity distribution FD1 can also be represented by a feature region FR1 of three or more dimensions. The feature region FR1 indicates the outer edge of the plurality of image feature quantities BF1 and is also called a unit space.
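 As a concrete illustration of one way such a feature quantity distribution could be built, the sketch below flattens each teacher image into a luminance vector and projects the set onto its first principal components, yielding one point per image in a low-dimensional feature region. This is only a minimal sketch under stated assumptions (NumPy, luminance-only features, a two-dimensional projection); the function names are illustrative and are not taken from the embodiment.

```python
import numpy as np

def image_features(image: np.ndarray) -> np.ndarray:
    """Feature vector of the entire image; here simply the flattened pixel
    luminances (one assumed choice of image feature quantity BF1)."""
    return image.astype(np.float64).ravel()

def feature_distribution(teacher_images: list[np.ndarray], dims: int = 2):
    """Build a feature quantity distribution (FD1) by projecting the per-image
    feature vectors onto their first principal components."""
    X = np.stack([image_features(img) for img in teacher_images])
    mean = X.mean(axis=0)
    Xc = X - mean                       # center the feature vectors
    # Principal component analysis via singular value decomposition.
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    components = vt[:dims]              # axes spanning the feature region FR1
    scores = Xc @ components.T          # per-image coordinates in FR1
    return scores, mean, components

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 16x16 "teacher images": mostly dark bodies plus a few bright ones.
    images = [rng.normal(30, 5, (16, 16)) for _ in range(50)]
    images += [rng.normal(220, 5, (16, 16)) for _ in range(3)]
    scores, _, _ = feature_distribution(images)
    print(scores.shape)                 # (53, 2): one 2-D point per image
```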
 Note that the larger the deviation between the center of gravity of the predetermined area 90t of the board 90 in which the object 91t should be provided and the center of gravity of the target area 91s of the board 90 in which the object 91t is actually provided, the more unsuitable the teacher image 40 is. In the example described above, the predetermined area 90t corresponds to the target area of the board 90 in which the component 91 should be mounted, and the target area 91s corresponds to the mounting area of the board 90 in which the component 91 is actually mounted.
 A teacher image 40 in which an object 91t different from the object 91t that should be provided on the board 90 is provided on the board 90 is also unsuitable. In the example described above, the object 91t that should be provided on the board 90 corresponds to the component 91 that should be mounted on the board 90. In this case, the area of the target area 91s may increase or decrease compared with the case where the object 91t that should be provided is provided on the board 90.
 Likewise, an index indicating the shape of the object 91t (for example, circularity) may increase or decrease compared with the case where the object 91t that should be provided is provided on the board 90. For example, a square component 91 has a higher circularity than a rectangular component 91, so circularity serves as one index for determining whether the imaged object is the object 91t that should be provided on the board 90.
 Accordingly, the image feature quantity BF1 is preferably at least one of the luminance of the plurality of pixels constituting the image, the center of gravity of the target area 91s of the board 90 in which the object 91t is provided, the area of the target area 91s, and an index indicating the shape of the object 91t. In this case, the acquisition unit 51 can extract at least one of these image feature quantities BF1 and acquire the feature quantity distribution FD1.
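 The sketch below computes these candidate feature quantities for a single teacher image, assuming the target area has already been segmented as a binary mask; how that mask is obtained (thresholding, template matching, and so on) is outside the sketch, and the perimeter-based circularity estimate and all names are illustrative assumptions rather than the embodiment's implementation.

```python
import numpy as np

def candidate_features(image: np.ndarray, target_mask: np.ndarray) -> dict:
    """Candidate image feature quantities for one teacher image.

    `image` is a grayscale image; `target_mask` is a boolean mask of the
    target area in which the object appears.
    """
    ys, xs = np.nonzero(target_mask)
    area = float(len(xs))                          # area of the target area
    centroid = (float(xs.mean()), float(ys.mean())) if area else (np.nan, np.nan)

    # Rough perimeter: mask pixels with at least one 4-neighbor outside the mask.
    padded = np.pad(target_mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = float(np.count_nonzero(target_mask & ~interior))

    # Shape index: larger for compact, disc-like regions, smaller for elongated ones.
    circularity = 4.0 * np.pi * area / (perimeter ** 2) if perimeter else 0.0

    return {
        "mean_luminance": float(image.mean()),     # luminance of all pixels
        "centroid": centroid,                      # center of gravity of target area
        "area": area,
        "circularity": circularity,
    }
```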
 The acquisition unit 51 can acquire the feature quantity distribution FD1 either including or excluding the image feature quantity BF1 whose degree of deviation is to be checked. Furthermore, the acquisition unit 51 can include the image feature quantity BF1 whose degree of deviation is to be checked when the number of teacher images 40 is larger than a predetermined number, and can exclude it when the number of teacher images 40 is smaller than the predetermined number.
 1-3-2. Guide Unit 52
 As described above, for example, in a teacher image 40 in which the component 91 is properly mounted on the board 90, the body area AR13 shown in FIG. 3 is black (luminance value 0). In a teacher image 40 in which the component 91 is mounted on the board 90 with its front and back sides reversed, the body area AR13 is white (for example, a luminance value of 255 when one pixel is represented by 8-bit information).
 Assume that, while the body area AR13 is black (luminance value 0) in the large majority of the teacher images 40, a small number of teacher images 40 in which the body area AR13 is white (luminance value 255) are included. In this case, the degree of deviation of the image feature quantity BF1 becomes conspicuous for the small number of teacher images 40 in which the body area AR13 is white (luminance value 255). The degree of deviation of the image feature quantity BF1 can be represented, for example, by the Mahalanobis distance MD1.
 As shown in FIG. 6, the Mahalanobis distance MD1 is the distance from the center of the feature region FR1 to an image feature quantity BF1. In the feature quantity distribution FD1, the image feature quantities BF1 of the small number of teacher images 40 in which the body area AR13 is white (luminance value 255) tend to lie at the outer edge of the plurality of image feature quantities BF1, and their Mahalanobis distances MD1 become conspicuously large. What is described above for luminance applies equally to the other image feature quantities BF1.
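 A minimal way to compute this degree of deviation, assuming the per-image feature points (for example, the principal-component scores from the earlier sketch) are stacked into a matrix, is shown below. The small ridge added to the covariance matrix is an assumption made only to keep it invertible and is not part of the described embodiment.

```python
import numpy as np

def mahalanobis_distances(scores: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Mahalanobis distance of each feature point from the center of the
    feature region (the mean of the distribution).

    `scores` has shape (n_images, n_features). A small ridge `eps` keeps the
    covariance matrix invertible when features are nearly collinear.
    """
    mean = scores.mean(axis=0)
    diff = scores - mean
    cov = np.cov(scores, rowvar=False) + eps * np.eye(scores.shape[1])
    cov_inv = np.linalg.inv(cov)
    # d_i = sqrt((x_i - mu)^T Sigma^{-1} (x_i - mu))
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
```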
 The guide unit 52 guides the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1 in the feature quantity distribution FD1 acquired by the acquisition unit 51. The guide unit 52 may take various forms as long as it can guide the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1. For example, the guide unit 52 can guide the teacher images 40 in descending order of the Mahalanobis distance MD1.
 The guide unit 52 can guide the teacher images 40 by various methods, such as displaying the teacher images 40 or giving voice guidance of information that identifies a teacher image 40 (for example, the file name of the image data). In the embodiment, the guide unit 52 causes the display device 60 to display the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1, that is, in descending order of the Mahalanobis distance MD1. Any of various known display devices can be used as the display device 60 as long as it can display the teacher images 40 as described above.
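 One possible shape for the guided list is sketched below: each teacher image is paired with the information that may be shown alongside it (outlier degree, inspection result, acquisition date and time), and the records are sorted in descending order of outlier degree. The record fields and file names are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TeacherImageRecord:
    file_name: str           # identifies the teacher image (e.g. image data file name)
    outlier_degree: float    # e.g. the Mahalanobis distance MD1
    inspection_result: str   # e.g. "OK" / "NG" from an inspection machine
    acquired_at: datetime    # date and time the image was acquired

def guide_order(records: list[TeacherImageRecord]) -> list[TeacherImageRecord]:
    """Return the records in descending order of outlier degree, i.e. the
    order in which the teacher images would be presented for review."""
    return sorted(records, key=lambda r: r.outlier_degree, reverse=True)

if __name__ == "__main__":
    records = [
        TeacherImageRecord("img_0001.png", 1.2, "OK", datetime(2022, 2, 1, 9, 0)),
        TeacherImageRecord("img_0002.png", 8.7, "OK", datetime(2022, 2, 1, 9, 5)),
        TeacherImageRecord("img_0003.png", 0.9, "NG", datetime(2022, 2, 1, 9, 7)),
    ]
    for rank, rec in enumerate(guide_order(records), start=1):
        print(rank, rec.file_name, rec.outlier_degree, rec.inspection_result)
```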
 FIG. 7 shows an example of guidance of the teacher images 40 by the image confirmation device 50. Specifically, the figure shows an example of the display screen of the display device 60. For example, the guide unit 52 displays the teacher images 40 so that they are arranged from the upper position to the lower position of the display screen in descending order of the degree of deviation of the image feature quantity BF1 (that is, descending order of the Mahalanobis distance MD1). This allows the operator to check the teacher images 40 on the display device 60 in order, starting from the teacher image 40 that is most likely to be unsuitable.
 The guide unit 52 can also allow the operator to select, on the display screen of the display device 60, teacher images 40 that are not to be used for machine learning. For example, the display screen shown in the figure is configured as a touch panel. The operator touches a teacher image 40 that is not to be used for machine learning on the display screen to select it. The operator can then touch the operation part corresponding to the selected teacher image 40 (operation parts BA1 to BA4 in the figure) to exclude the teacher image 40 from machine learning.
 For example, when the first teacher image 40, which has the largest degree of deviation of the image feature quantity BF1 (the largest Mahalanobis distance MD1), is not to be used for machine learning, the operator touches the first teacher image 40 to select it and then touches the operation part BA1. As a result, the first teacher image 40 is no longer used for machine learning. What is described above for the first teacher image 40 applies equally to the second and subsequent teacher images 40. A teacher image 40 that is not to be used for machine learning can be deleted from the data server 70, and the deletion is reflected in the board-oriented work machine WM0 that uses that teacher image 40.
 The operator can select all of the teacher images 40 displayed on the display screen by touching the operation part BB1, and can cancel the selection of the teacher images 40 by touching the operation part BB2. Furthermore, by touching the operation part BC1, the operator can display the next display screen (teacher images 40 whose degree of deviation of the image feature quantity BF1 is smaller than that of the currently displayed teacher images 40). By touching the operation part BC2, the operator can display the previous display screen (teacher images 40 whose degree of deviation of the image feature quantity BF1 is larger than that of the currently displayed teacher images 40).
 The guide unit 52 can also guide, together with the teacher image 40, at least the degree of deviation of the image feature quantity BF1 among the degree of deviation of the image feature quantity BF1, the inspection result of the board product 900 produced by the board-oriented work machine WM0, and the image information related to the teacher image 40. The inspection result of the board product 900 can be acquired, for example, from an inspection machine such as the print inspection machine WM2 or the visual inspection machine WM5.
 The inspection result of the board product 900 can also be acquired, for example, from a board-oriented work machine WM0 such as the printing machine WM1 or the component mounting machine WM3. The image information related to the teacher image 40 is not limited and may include various kinds of information, such as information on the date and time when the teacher image 40 was acquired, information on the imaging device 80, and information on the imaging conditions (for example, the exposure time, the aperture value, the type of light source, and the light irradiation direction when the imaging device 80 captured the teacher image 40).
 On the display screen shown in FIG. 7, all of the above information is guided together with the teacher images 40. For example, the degree of deviation (Mahalanobis distance MD1) of the image feature quantity BF1 of the first teacher image 40 is indicated by the distance L1. The inspection result of the board product 900 in which the first teacher image 40 was captured is indicated by the result OK, meaning a non-defective product. Furthermore, the date and time when the first teacher image 40 was acquired is indicated by the date and time T1. What is described above for the first teacher image 40 applies equally to the second and subsequent teacher images 40.
 In the image confirmation device 50, the acquisition unit 51 can acquire the feature quantity distribution FD1 and the guide unit 52 can guide the teacher images 40 in various situations. For example, the image confirmation device 50 can do so in at least one of the situations before, during, and after production of the board product 900.
 In particular, when the board product 900 is a non-defective product and yet the inspection result of the board product 900 by machine learning is defective, the teacher images 40 used for the inspection of the board product 900 may be unsuitable. Therefore, when the board product 900 produced by the board-oriented work machine WM0 is a non-defective product and the inspection result of the board product 900 by machine learning is defective, the acquisition unit 51 can acquire the feature quantity distribution FD1 for the teacher images 40 used for the inspection of the board product 900, and the guide unit 52 can guide the teacher images 40 (steps S11 to S13 shown in FIG. 5).
 Specifically, the acquisition unit 51 determines whether the board product 900 is a non-defective product and the inspection result of the board product 900 by machine learning is defective (step S11). For example, when the inspection result of the board product 900 by machine learning is defective, the operator determines whether the board product 900 is a non-defective product and can input the determination result, for example, at the image confirmation device 50, at a device capable of communicating with the image confirmation device 50, or at the device that performed the inspection. An inspection machine that does not use machine learning can also determine whether the board product 900 is a non-defective product when the inspection result of the board product 900 by machine learning is defective. When the above condition is satisfied (Yes in step S11), the acquisition unit 51 acquires the feature quantity distribution FD1 for the teacher images 40 used for the inspection of the board product 900 (step S12). The guide unit 52 then guides the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1 in the feature quantity distribution FD1 acquired by the acquisition unit 51 (step S13).
 The operator checks the teacher images 40 guided by the guide unit 52 and selects teacher images 40 that are not to be used for machine learning. When selecting a teacher image 40, the operator can refer to the degree of deviation of the image feature quantity BF1, the inspection result of the board product 900, and the image information related to the teacher image 40. The control by the image confirmation device 50 then ends for the time being. When the above condition is not satisfied (No in step S11), the control by the image confirmation device 50 ends for the time being without executing the processing shown in steps S12 and S13.
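 Putting steps S11 to S13 together, a compact sketch of the control flow might look as follows. The mean/standard-deviation luminance features, the `show` callback standing in for the display device, and all names are assumptions for illustration only.

```python
import numpy as np

def run_image_confirmation(board_ok: bool, ml_result_ng: bool,
                           teacher_images: list[np.ndarray], show) -> None:
    """Sketch of the flow in FIG. 5 (S11 -> S12 -> S13).

    `show(rank, index, distance)` stands in for the display device.
    """
    # S11: proceed only when the board product is good but the
    # machine-learning inspection judged it defective.
    if not (board_ok and ml_result_ng):
        return  # No in S11: end for now without S12 or S13

    # S12: acquire the feature quantity distribution of the teacher images
    # (here, mean luminance and luminance spread of each whole image).
    feats = np.array([[img.mean(), img.std()] for img in teacher_images])
    mean = feats.mean(axis=0)
    cov = np.cov(feats, rowvar=False) + 1e-9 * np.eye(feats.shape[1])
    diff = feats - mean
    md = np.sqrt(np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff))

    # S13: guide the teacher images in descending order of outlier degree.
    for rank, idx in enumerate(np.argsort(-md), start=1):
        show(rank, int(idx), float(md[idx]))
```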
 In the embodiment, the board-oriented work machine WM0 is the component mounting machine WM3 that mounts the component 91, which is the object 91t, on the board 90. The inspection of the board product 900 is a component presence/absence inspection that inspects whether the component 91 is mounted in the predetermined area 90t of the board 90 by the component mounting machine WM3. The guide unit 52 can therefore guide teacher images 40 in which a suitable component 91 is not properly mounted in the predetermined area 90t of the board 90. In FIG. 7, the teacher images 40 are guided for one component 91 among the plurality of components 91 mounted on the board 90 by the component mounting machine WM3. The guide unit 52 can guide the teacher images 40 for the other components 91 in the same manner.
 2. Other Embodiments
 In the embodiment, the degree of deviation of the image feature quantity BF1 is represented by the Mahalanobis distance MD1. However, the degree of deviation of the image feature quantity BF1 can also be represented by a method other than the Mahalanobis distance MD1. For example, the degree of deviation of the image feature quantity BF1 can be represented by outlier information such as the result of the Smirnov-Grubbs test.
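 As one example of such outlier information, the sketch below applies a single round of the two-sided Smirnov-Grubbs test to a list of feature values (for instance, one image feature quantity per teacher image). The SciPy dependency and the 5% significance level are assumptions; the full procedure would normally repeat the test after removing each detected outlier.

```python
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha: float = 0.05):
    """One round of the two-sided Smirnov-Grubbs test.

    Returns the index of the most extreme value and whether it is judged an
    outlier at significance level `alpha`.
    """
    values = np.asarray(values, dtype=float)
    n = values.size
    if n < 3:
        return None, False                       # test is undefined for n < 3
    mean, std = values.mean(), values.std(ddof=1)
    if std == 0.0:
        return None, False                       # all values identical
    idx = int(np.argmax(np.abs(values - mean)))  # most extreme observation
    g = abs(values[idx] - mean) / std            # Grubbs statistic

    # Critical value from the t distribution.
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t * t / (n - 2 + t * t))
    return idx, bool(g > g_crit)

if __name__ == "__main__":
    luminances = [30.0, 31.5, 29.8, 30.4, 28.9, 220.0]
    print(grubbs_outlier(luminances))            # index 5 is flagged as an outlier
```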
 The board-oriented work machine WM0 is not limited to the component mounting machine WM3. For example, the board-oriented work machine WM0 may be the printing machine WM1 that prints solder on the board 90. In this case, the solder printed on the board 90 is included in the object 91t. The predetermined area 90t of the board 90 in which the object 91t should be provided corresponds to the target area of the board 90 in which the solder should be printed, and the target area 91s of the board 90 in which the object 91t is provided corresponds to the printing area of the board 90 in which the solder is actually printed.
 Furthermore, the inspection of the board product 900 is not limited to the component presence/absence inspection. For example, the inspection of the board product 900 may be a solder presence/absence inspection. The inspection of the board product 900 may also be a positional deviation inspection that inspects the deviation between the center of gravity of the predetermined area 90t of the board 90 in which the object 91t should be provided and the center of gravity of the target area 91s of the board 90 in which the object 91t is provided.
 3. Image Confirmation Method
 What has been described above for the image confirmation device 50 applies equally to the image confirmation method. Specifically, the image confirmation method includes an acquisition step and a guide step. The acquisition step corresponds to the control performed by the acquisition unit 51, and the guide step corresponds to the control performed by the guide unit 52.
 4. Example of Effects of the Embodiment
 The image confirmation device 50 includes the acquisition unit 51 and the guide unit 52. The image confirmation device 50 can thereby guide the teacher images 40 in descending order of the degree of deviation of the image feature quantity BF1 in the feature quantity distribution FD1. What has been described above for the image confirmation device 50 applies equally to the image confirmation method.
 40: teacher image, 50: image confirmation device, 51: acquisition unit, 52: guide unit,
 60: display device, 90: board, 90t: predetermined area, 91: component,
 91s: target area, 91t: object, 900: board product,
 BF1: image feature quantity, FD1: feature quantity distribution, MD1: Mahalanobis distance,
 WM0: board-oriented work machine, WM3: component mounting machine.

Claims (9)

  1.  An image confirmation device comprising:
      an acquisition unit configured to extract, for each of teacher images that are a plurality of images used for machine learning and in which an object provided on a board is imaged by a board-oriented work machine that performs a predetermined board-oriented work on the board, an image feature quantity that is a feature quantity of the entire image, and to acquire a feature quantity distribution that is a distribution of the extracted image feature quantities; and
      a guide unit configured to guide the teacher images in descending order of a degree of deviation of the image feature quantity in the feature quantity distribution acquired by the acquisition unit.
  2.  The image confirmation device according to claim 1, wherein the guide unit causes a display device to display the teacher images in descending order of the degree of deviation of the image feature quantity.
  3.  The image confirmation device according to claim 2, wherein the guide unit allows an operator to select, on a display screen of the display device, a teacher image that is not to be used for the machine learning.
  4.  The image confirmation device according to any one of claims 1 to 3, wherein the guide unit guides, together with the teacher image, at least the degree of deviation of the image feature quantity among the degree of deviation of the image feature quantity, an inspection result of a board product produced by the board-oriented work machine, and image information related to the teacher image.
  5.  The image confirmation device according to any one of claims 1 to 4, wherein
      the degree of deviation of the image feature quantity is represented by a Mahalanobis distance, and
      the guide unit guides the teacher images in descending order of the Mahalanobis distance.
  6.  The image confirmation device according to any one of claims 1 to 5, wherein
      the acquisition unit acquires the feature quantity distribution for the teacher images used for inspection of a board product when the board product produced by the board-oriented work machine is a non-defective product and an inspection result of the board product by the machine learning is defective, and
      the guide unit guides the teacher images.
  7.  The image confirmation device according to claim 6, wherein
      the board-oriented work machine is a component mounting machine that mounts a component, which is the object, on the board, and
      the inspection of the board product is a component presence/absence inspection that inspects whether the component is mounted in a predetermined area of the board by the component mounting machine.
  8.  The image confirmation device according to any one of claims 1 to 7, wherein the image feature quantity is at least one of a luminance of a plurality of pixels constituting an image, a center of gravity of a target area of the board in which the object is provided, an area of the target area, and an index indicating a shape of the object.
  9.  An image confirmation method comprising:
      an acquisition step of extracting, for each of teacher images that are a plurality of images used for machine learning and in which an object provided on a board is imaged by a board-oriented work machine that performs a predetermined board-oriented work on the board, an image feature quantity that is a feature quantity of the entire image, and acquiring a feature quantity distribution that is a distribution of the extracted image feature quantities; and
      a guide step of guiding the teacher images in descending order of a degree of deviation of the image feature quantity in the feature quantity distribution acquired in the acquisition step.
PCT/JP2022/007896 2022-02-25 2022-02-25 Image confirmation device and image confirmation method WO2023162142A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/007896 WO2023162142A1 (en) 2022-02-25 2022-02-25 Image confirmation device and image confirmation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/007896 WO2023162142A1 (en) 2022-02-25 2022-02-25 Image confirmation device and image confirmation method

Publications (1)

Publication Number Publication Date
WO2023162142A1 true WO2023162142A1 (en) 2023-08-31

Family

ID=87765112

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/007896 WO2023162142A1 (en) 2022-02-25 2022-02-25 Image confirmation device and image confirmation method

Country Status (1)

Country Link
WO (1) WO2023162142A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07220058A (en) * 1991-07-12 1995-08-18 Omron Corp Method and device for supporting illumination condition setting
JP2004361145A (en) * 2003-06-02 2004-12-24 Omron Corp Display method, quality control apparatus, and quality control system
JP2013247228A (en) * 2012-05-25 2013-12-09 Fuji Mach Mfg Co Ltd Substrate inspection method, inspection program, and inspection device
JP2020052554A (en) * 2018-09-25 2020-04-02 キヤノン株式会社 Information processing apparatus and information processing method
JP2020187071A (en) * 2019-05-16 2020-11-19 株式会社キーエンス Image inspection device and method for setting image inspection device
JP2021131831A (en) * 2020-02-21 2021-09-09 オムロン株式会社 Information processing device, information processing method and program
JP2021152489A (en) * 2020-03-24 2021-09-30 株式会社 システムスクエア Teacher data generation device, inspection device, and program

Similar Documents

Publication Publication Date Title
JP3965288B2 (en) Substrate work result inspection device
US8849442B2 (en) Component mounting line and component mounting method
JP4767995B2 (en) Component mounting method, component mounting machine, mounting condition determining method, mounting condition determining apparatus, and program
JP4629584B2 (en) Mounting system and electronic component mounting method
US7543259B2 (en) Method and device for deciding support portion position in a backup device
JP2013221766A (en) Visual inspection device and visual inspection method
JP2013222740A (en) Visual inspection device and visual inspection method
JP2012160627A (en) Substrate processing apparatus
JP7149723B2 (en) Image management method and image management device
CN113498634B (en) Correction amount calculation device and correction amount calculation method
JP2007189029A (en) Mounting system, mounting machine, mounting method of printer and electronic component
WO2023162142A1 (en) Image confirmation device and image confirmation method
KR20110023330A (en) Method of adjusting work position automatically by reference value and automatic apparatus for the same
JP5830652B2 (en) Calibration jig and calibration method for visual inspection
JP4388423B2 (en) Electronic component mounting device
JP6904978B2 (en) Parts mounting machine
WO2023175831A1 (en) Image confirmation device and image confirmation method
US20220404816A1 (en) Operation state display device and operation state display method
WO2024062635A1 (en) Testing device and testing method
JP7473735B2 (en) Foreign object detection device and foreign object detection method
JP6064168B2 (en) Mark imaging method and component mounting line
WO2024033961A1 (en) Foreign matter detection device and foreign matter detection method
JP7261935B2 (en) Data management device and data management method
JP2004301620A (en) Inspection method for automatic part mounting device
JP7466746B2 (en) Countermeasure information presentation device and countermeasure information presentation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22928669

Country of ref document: EP

Kind code of ref document: A1