CN109792860B - Imaging device and mounting device - Google Patents

Imaging device and mounting device

Info

Publication number
CN109792860B
CN109792860B (application CN201680089241.5A)
Authority
CN
China
Prior art keywords
light
light source
epi
amount
imaging
Prior art date
Legal status
Active
Application number
CN201680089241.5A
Other languages
Chinese (zh)
Other versions
CN109792860A (en)
Inventor
大木秀晃
大崎聪士
深谷芳行
Current Assignee
Fuji Corp
Original Assignee
Fuji Corp
Priority date
Filing date
Publication date
Application filed by Fuji Corp filed Critical Fuji Corp
Publication of CN109792860A
Application granted
Publication of CN109792860B

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05K PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00 Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08 Monitoring manufacture of assemblages
    • H05K13/081 Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0813 Controlling of single components prior to mounting, e.g. orientation, component geometry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956 Inspecting patterns on the surface of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Supply And Installment Of Electrical Components (AREA)
  • Image Input (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

An imaging control unit (52) of the part camera (40) causes the epi-illumination light source (44) and the side-emission light source (47) to emit light when imaging an element (90) having wiring patterns (95, 96) and terminals (93, 94) with flat surfaces, such that the epi-light amount is smaller than the side-light amount. The epi-light amount is the amount of light emitted from the epi-illumination light source (44), reflected by the element (90), and received by the imaging unit (51). Similarly, the side-light amount is the amount of light emitted from the side-emission light source (47), reflected by the element (90), and received by the imaging unit (51).

Description

Imaging device and mounting device
Technical Field
The invention relates to an imaging device and a mounting device.
Background
Conventionally, as an imaging device, a device that performs imaging by irradiating an electronic component to be imaged with light from an illumination unit is known. For example, patent document 1 describes an illumination unit having three stages of light emitting diode groups that obliquely irradiate an element with light. The three stages of light emitting diode groups are mounted so that their irradiation directions form different angles. It is described that an image corresponding to the type and characteristics of the electronic component can be obtained by controlling the light amount independently for each light emitting diode group.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2005-260268
Disclosure of Invention
Problems to be solved by the invention
However, for an element having a wiring pattern and terminals with flat surfaces, there are cases where it is desired to capture an image showing the outer shape portion of the element and the terminals. When such an element is imaged by irradiating it with light, there is a problem that the unneeded wiring pattern is also displayed, which lowers, for example, the accuracy of terminal detection based on the captured image. Patent document 1 does not describe control of the light amount suitable for imaging such an element.
The present invention has been made to solve the above problems, and a main object of the present invention is to capture an image in which a wiring pattern is hardly displayed and an outer shape portion of an element and a terminal are displayed.
Means for solving the problems
The present invention adopts the following means to achieve the above main object.
An imaging device according to the present invention is an imaging device for imaging an element having a wiring pattern and a terminal with a flat surface, the imaging device including: an imaging unit that images the element based on the received light; an epi-illumination light source that irradiates the element with light in a direction along an optical axis of the imaging unit; a side-emission light source that irradiates the element with light in a direction inclined from the optical axis of the imaging unit; and a light emission control unit configured to cause the epi-illumination light source and the side-emission light source to emit light when the element is imaged such that an epi-light amount, which is the amount of light emitted from the epi-illumination light source and received by the imaging unit after being reflected by the element, is smaller than a side-light amount, which is the amount of light emitted from the side-emission light source and received by the imaging unit after being reflected by the element.
In this imaging device, the light emission control unit causes the epi-illumination light source and the side-emission light source to emit light so that the epi-light amount is smaller than the side-light amount at the time of imaging the element, and the imaging unit images the element based on the received light. Here, the light from the epi-illumination light source is irradiated in a direction along the optical axis of the imaging unit. Therefore, light irradiated from the epi-illumination light source and reflected by the flat surface of a terminal reaches the imaging unit more easily than light irradiated from the epi-illumination light source and reflected by the outer shape portion of the element. That is, by using the epi-illumination light source, it is easy to capture an image showing a terminal with a flat surface. On the other hand, light irradiated from the epi-illumination light source and reflected by the wiring pattern also reaches the imaging unit relatively easily. However, the surface of the wiring pattern is covered with an insulating film, for example, and thus its reflectance tends to be lower than that of the terminal. Therefore, by making the epi-light amount smaller than the side-light amount, that is, by preventing the amount of light from the epi-illumination light source from becoming excessively large, it is possible to capture an image in which the wiring pattern is difficult to display while the terminals are displayed. Further, light from the side-emission light source is emitted in a direction inclined from the optical axis of the imaging unit. Therefore, light irradiated from the side-emission light source and reflected by the outer shape portion of the element reaches the imaging unit relatively easily compared with light irradiated from the side-emission light source and reflected by the wiring pattern or the terminals.
That is, by using the side-emission light source, an image showing the outer shape portion of the element can easily be captured. Therefore, by making the side-light amount larger than the epi-light amount, that is, by making the amount of light from the side-emission light source relatively large, it is possible to capture an image in which the wiring pattern is difficult to display while the outer shape portion of the element is displayed. By contrast, if, for example, the epi-light amount and the side-light amount are made equal and both large at the time of imaging, the outer shape portion of the element and the terminals can be displayed, but the wiring pattern is also easily displayed. If, for example, the epi-light amount and the side-light amount are made equal and both small, the wiring pattern is difficult to display, but the outer shape portion of the element and the terminals also tend to become difficult to display. As described above, by making the epi-light amount smaller than the side-light amount, the imaging device of the present invention can capture an image in which the wiring pattern is difficult to display, the terminals are displayed mainly by light from the epi-illumination light source, and the outer shape portion of the element is displayed mainly by light from the side-emission light source. Here, "the wiring pattern is difficult to display" means that the luminance values of the pixels corresponding to the wiring pattern in the image become small.
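The reasoning above can be made concrete with a minimal sketch, not taken from the patent: approximate each region's pixel brightness as reflectance times light amount. The reflectance values below are hypothetical numbers chosen only to mirror the text (terminals reflect epi-light well, the outer shape portion reflects side-light well, and the film-covered wiring pattern reflects both weakly).

```python
# Hypothetical per-region reflectances to epi-light (along the optical axis)
# and side-light (oblique); chosen to mirror the patent's qualitative claims.
EPI_REFLECTANCE = {"terminal": 0.9, "wiring": 0.4, "outline": 0.1}
SIDE_REFLECTANCE = {"terminal": 0.1, "wiring": 0.1, "outline": 0.8}

def brightness(region: str, epi_amount: float, side_amount: float) -> float:
    """Relative brightness of a region for given epi/side light amounts."""
    return (EPI_REFLECTANCE[region] * epi_amount
            + SIDE_REFLECTANCE[region] * side_amount)

# With the epi-light amount smaller than the side-light amount, the terminals
# and the outline stay bright while the wiring pattern stays dim.
epi, side = 0.5, 1.0
for region in ("terminal", "wiring", "outline"):
    print(region, round(brightness(region, epi, side), 2))
```

With these assumed numbers the wiring region comes out darkest, which is the effect the patent attributes to the condition epi-light amount < side-light amount.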
In the imaging device according to the present invention, the light emission control unit may control light emission times of the epi-light source and the side-light source at the time of imaging the element so that the amount of the epi-light is smaller than the amount of the side-light. In this way, for example, the amount of the epi-light and the amount of the side-light can be controlled relatively easily as compared with the case where the power supplied to the epi-light source and the side-light source is controlled.
In the imaging device according to the present invention, the light emission control unit may cause the epi-illumination light source and the side-emission light source to emit light so that the epi-light amount is 0.9 times or less the side-light amount when the element is imaged. In this way, an image in which the wiring pattern is difficult to display can be captured more reliably. The epi-light amount is not limited to zero; for example, it may be 0.1 times or more the side-light amount.
The mounting device of the present invention comprises: the imaging device according to any one of the above aspects of the invention; an element holding portion capable of holding the element; a moving section that moves the element holding section; and a mounting control unit that performs processing based on the image obtained by the imaging and processing for mounting the component on the substrate by controlling the component holding unit and the moving unit.
The mounting device of the present invention includes the imaging device according to any one of the above aspects. Therefore, the mounting device can obtain the same effect as the above-described imaging device of the present invention, for example, can image an image in which the wiring pattern is made difficult to display and the outer shape portion of the element and the terminal are displayed.
Drawings
Fig. 1 is a perspective view of a mounting device 10.
Fig. 2 is a schematic explanatory view of the structure of the part camera 40.
Fig. 3 is a block diagram showing a configuration related to control of the mounting apparatus 10.
Fig. 4 is a flowchart of a component mounting processing routine.
Fig. 5 is an explanatory diagram of the binary image P1 obtained when the element 90 is imaged with the epi-light amount < the side-light amount.
Fig. 6 is an explanatory diagram of the binary image P2 obtained when the element 90 is imaged with the epi-light amount and the side-light amount equal and both large.
Fig. 7 is an explanatory diagram of the binary image P3 obtained when the element 90 is imaged with the epi-light amount and the side-light amount equal and both small.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings. Fig. 1 is a perspective view of the mounting apparatus 10, fig. 2 is a schematic explanatory view of a configuration of a component camera 40 provided in the mounting apparatus 10, and fig. 3 is a block diagram showing a configuration related to control of the mounting apparatus 10. In the present embodiment, the left-right direction (X axis), the front-rear direction (Y axis), and the up-down direction (Z axis) are as shown in fig. 1.
The mounting device 10 includes a base 12, a mounting device body 14 provided on the base 12, and a reel unit 70 as a component supply device attached to the mounting device body 14.
The mounting device body 14 is provided to be replaceable with respect to the base 12. The mounting device body 14 includes a substrate transport device 18 that transports and holds the substrate 16, a head 24 that is movable in the XY plane, a mechanical chuck 37 that is attached to the head 24 and is movable along the Z axis, a part camera 40 that images components held by the mechanical chuck 37, and a controller 60 that executes various controls.
The substrate transport device 18 includes support plates 20 and 20 provided at a distance from each other in the front-rear direction of fig. 1 and extending in the left-right direction, and conveyors 22 and 22 (only one is shown in fig. 1) provided on the surfaces of the two support plates 20 and 20 facing each other. The conveyor belts 22, 22 are looped around drive wheels and driven wheels provided on the left and right of the support plates 20, 20. The substrate 16 is placed on the upper surfaces of the pair of conveyors 22, 22 and conveyed from left to right. The substrate 16 is supported from the back side by a plurality of support pins 23 provided upright.
The head 24 is mounted to the front surface of the X-axis slider 26. The X-axis slider 26 is attached slidably in the left-right direction to the front surface of a Y-axis slider 30 slidable in the front-rear direction. The Y-axis slider 30 is slidably attached to a pair of right and left guide rails 32, 32 extending in the front-rear direction. A pair of upper and lower guide rails 28, 28 extending in the left-right direction are provided on the front surface of the Y-axis slider 30, and the X-axis slider 26 is attached to the guide rails 28, 28 so as to be slidable in the left-right direction. The head 24 moves in the left-right direction as the X-axis slider 26 moves in the left-right direction, and the head 24 moves in the front-back direction as the Y-axis slider 30 moves in the front-back direction. The sliders 26 and 30 are driven by drive motors 26a and 30a (see fig. 3), respectively. The head 24 incorporates a Z-axis motor 34, and the height of a mechanical chuck 37 attached to a ball screw 35 extending along the Z-axis is adjusted by the Z-axis motor 34. Further, the head 24 incorporates a Q-axis motor 36 (see fig. 3) for rotating the mechanical chuck 37.
The mechanical chuck 37 is a mechanism capable of holding an element by gripping the element. The mechanical chuck 37 includes a pair of front and rear gripping portions 38 projecting downward from a bottom surface of a main body of the mechanical chuck 37, a slider (not shown) that can slide the pair of gripping portions 38 in directions to approach and separate from each other, and a drive motor 39 (see fig. 3) that drives the slider. The holding portions 38 are moved closer to each other by driving the motor 39, so that the holding portions 38 hold the element. Further, the mechanical chuck 37 is moved up and down in the Z-axis direction by the Z-axis motor 34, thereby adjusting the height of the element gripped by the gripping portion 38. The orientation of the element held by the holding portion 38 is adjusted by rotating the mechanical chuck 37 by the Q-axis motor 36.
The component camera 40 is disposed in front of the support plate 20 on the front side of the substrate transfer device 18. The part camera 40 takes an image of the component held by the mechanical chuck 37 from below with the upper side of the part camera 40 as an imaging range, and generates an image. As shown in fig. 2, the part camera 40 includes an illumination unit 41 that irradiates light to an element to be imaged, an imaging unit 51 that captures an image of the element based on the received light, and an imaging control unit 52 that controls the whole part camera 40.
The illumination unit 41 includes a housing 42, a coupling unit 43, an epi-illumination light source 44, a half mirror 46, and a side-emission light source 47. The housing 42 is a bowl-shaped member having an octagonal opening on the upper surface and the lower surface (bottom surface). The housing 42 is formed in a shape in which the opening of the upper surface is larger than the opening of the lower surface, and the internal space widens from the lower surface toward the upper surface. The coupling unit 43 is a tubular member that couples the housing 42 and the imaging unit 51. The light emitted from the epi-illumination light source 44 and the light received by the imaging unit 51 pass through the internal space of the coupling unit 43. The epi-illumination light source 44 is a light source for irradiating the element held by the mechanical chuck 37 with light in a direction along the optical axis 51a of the imaging unit 51. The epi-illumination light source 44 includes a plurality of LEDs 45 facing the half mirror 46 and emitting light in a direction perpendicular to the optical axis 51a. The LEDs 45 are attached to the inner peripheral surface of the coupling unit 43. In the present embodiment, the optical axis 51a is oriented in the vertical direction, and the light from the LEDs 45 is irradiated in the horizontal direction (for example, the left-right direction). The half mirror 46 is disposed inside the coupling unit 43 so as to be inclined from the optical axis 51a (for example, at an inclination angle of 45°). The half mirror 46 reflects the horizontal light from the epi-illumination light source 44 upward. Therefore, the light from the LEDs 45 of the epi-illumination light source 44 is reflected by the half mirror 46 and then irradiated in a direction along the optical axis 51a of the imaging unit 51 (here, upward). The half mirror 46 transmits light coming from above toward the imaging unit 51.
The side-emission light source 47 is a light source for irradiating the element held by the mechanical chuck 37 with light in a direction inclined from the optical axis 51a. The side-emission light source 47 includes an upper stage light source 47a having a plurality of LEDs 48a, a middle stage light source 47b disposed below the upper stage light source 47a and having a plurality of LEDs 48b, and a lower stage light source 47c disposed below the middle stage light source 47b and having a plurality of LEDs 48c. The LEDs 48a to 48c are attached to the inner peripheral surface of the housing 42. Each of the LEDs 48a to 48c irradiates light in a direction inclined from the optical axis 51a (at an inclination angle of more than 0° and less than 90° with respect to the optical axis 51a). Among the inclination angles of the irradiation directions of the LEDs 48a to 48c with respect to the optical axis 51a, that of the LED 48a is the largest, and the LED 48a irradiates light in a nearly horizontal direction; that of the LED 48c is the smallest.
The imaging unit 51 includes an imaging element and an optical system such as a lens (not shown). When light emitted from the epi-illumination light source 44 and the side-emission light source 47 and reflected by the element being imaged passes through the half mirror 46 and reaches the imaging unit 51, the imaging unit 51 receives the light. The imaging unit 51 photoelectrically converts the received light to generate electric charges corresponding to the respective pixels of the image, and generates digital data including the information of each pixel, that is, image data, based on the generated charges.
The imaging control unit 52 outputs a control signal to the illumination unit 41 to control the illumination of light from the illumination unit 41, outputs a control signal to the imaging unit 51 to capture an image, or outputs image data generated by the imaging unit 51 to the controller 60. The imaging control unit 52 controls the value and the energization time of the current to be energized to each of the LEDs 45, 48a to 48c of the illumination unit 41, and can independently control the light emission amounts and the light emission times per unit time of the epi-light source 44 and the side-light source 47. The imaging control unit 52 can independently control the light emission amounts and the light emission times per unit time of the upper stage light source 47a, the middle stage light source 47b, and the lower stage light source 47c of the side-emission light source 47.
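The independent per-group control described above can be sketched as follows. All names, units, and numeric values are hypothetical illustrations, not taken from the patent: each LED group is modeled with a light output per unit time (set by its drive current) and an energization time, and its emitted amount is their product.

```python
from dataclasses import dataclass

@dataclass
class LedGroup:
    """One independently controlled LED group of the illumination unit."""
    name: str
    output_per_ms: float  # light output per millisecond at the set current
    on_time_ms: float     # energization time during one exposure

    def emitted_amount(self) -> float:
        # Integrated emission = per-unit-time output x emission time.
        return self.output_per_ms * self.on_time_ms

# Hypothetical settings: epi group plus the three side-emission stages.
groups = [
    LedGroup("epi (LEDs 45)", output_per_ms=1.0, on_time_ms=2.0),
    LedGroup("side upper (LEDs 48a)", output_per_ms=1.0, on_time_ms=5.0),
    LedGroup("side middle (LEDs 48b)", output_per_ms=1.0, on_time_ms=0.0),
    LedGroup("side lower (LEDs 48c)", output_per_ms=1.0, on_time_ms=0.0),
]
for g in groups:
    print(g.name, g.emitted_amount())
```

The zero on-times for the middle and lower stages reflect the embodiment's choice of lighting only the upper stage light source 47a when imaging the element 90.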
Here, an example of an element imaged by the part camera 40 will be described with reference to fig. 2. Of the views of the element 90 shown in fig. 2, the upper-stage view is a bottom view of the element 90, and the lower-stage view is a front view of the element 90. The element 90 includes a main body 91 and a terminal plate 92 for connection to the substrate when the main body 91 is disposed on the substrate. The terminal plate 92 is a plate-like member having a substantially planar lower surface. The terminal plate 92 is formed in a quadrangular shape in which two corner portions are cut off in a bottom view. The portion of the cut-out portion located at the lower left in the upper stage of fig. 2 is referred to as a cut-out portion 92a, and the portion located at the lower right is referred to as a cut-out portion 92 b. Terminals 93 and 94, a wiring pattern 95 connected to terminal 93, and a wiring pattern 96 connected to terminal 94 are disposed on the lower surface of terminal plate 92. The terminals 93 and 94 are electrodes connected to other elements or wiring patterns on a substrate, for example, and have flat surfaces (lower surfaces in this case) and exposed conductors such as metals. The terminals 93 and 94 have polarities, and the terminals 93 and 94 can be recognized by positional relationships with the notches 92a and 92 b. The wiring patterns 95 and 96 are wirings for electrically connecting the terminals 93 and 94 and the inside of the main body 91, and have flat lower surfaces and are mainly composed of a conductor such as a metal. In order to prevent short-circuiting, the surfaces (lower surfaces in this case) of the wiring patterns 95 and 96 are covered with an insulating film. Since the wiring patterns 95 and 96 are covered with the insulating film, the reflectance is lower than that of the terminals 93 and 94. 
Terminal plate 92 is made of a material having a lower reflectance than terminals 93 and 94 and wiring patterns 95 and 96, such as black resin. The component camera 40 irradiates the lower surface of the terminal plate 92 with light from the illumination unit 41, and the imaging unit 51 receives the reflected light of the light to image the lower surface of the terminal plate 92. As will be described in detail later, the part camera 40 performs imaging by changing the amount of light emitted from the illumination unit 41 between the case where the imaging target is the element 90 and the case where the imaging target is other than the element.
As shown in fig. 3, the controller 60 is configured as a microprocessor centered on a CPU 61, and includes a ROM 62 storing processing programs, an HDD 63 storing various data, a RAM 64 used as a work area, an input/output interface 65 for exchanging electric signals with external devices, and the like, which are connected via a bus 66. The controller 60 outputs drive signals to the substrate transport device 18, the drive motor 26a of the X-axis slider 26, the drive motor 30a of the Y-axis slider 30, the Z-axis motor 34, the Q-axis motor 36, and the drive motor 39 of the mechanical chuck 37. The controller 60 also outputs information on imaging conditions, including the control amounts of the epi-illumination light source 44 and the side-emission light source 47 during imaging, to the part camera 40, and receives image data from the part camera 40. Further, position sensors (not shown) are provided on the sliders 26 and 30, and the controller 60 controls the drive motors 26a and 30a of the sliders 26 and 30 while receiving position information from these position sensors.
The reel unit 70 includes a plurality of reels 72 and is detachably attached to the front side of the mounting device body 14. A tape is wound around each reel 72, and components are held on the surface of the tape along its longitudinal direction. These components are protected by a film covering the surface of the tape. The tape is unwound rearward from the reel, and the film is peeled off by the feeder unit 74 to expose the components. An exposed component is gripped by the gripping portions 38 of the mechanical chuck 37 and can then move together with the head 24.
The management computer 80 is a computer that manages the production jobs of the mounting apparatus 10, and is communicably connected to the controller 60 of the mounting apparatus 10. A production job is information that determines which components are to be mounted on which substrates 16 in what order in the mounting apparatus 10, and how many substrates 16 are to be mounted with components. The management computer 80 stores the production job and outputs information included in it to the mounting apparatus 10 as necessary.
Next, the operation of the mounting apparatus 10 according to the present embodiment, in particular, the process of mounting the component on the substrate 16 in association with the imaging of the component using the component camera 40 will be described. Upon receiving the instruction from the management computer 80, the CPU61 of the controller 60 of the mounting apparatus 10 first acquires information on the component to be mounted from the management computer 80. Next, the CPU61 performs component mounting processing including processing for picking up an image of a component to be mounted and processing for arranging the component on a substrate. Then, the CPU61 repeats the process of acquiring information on the component to be mounted next and the component mounting process until all components are mounted on the board 16. Hereinafter, the component mounting process in the case where the component 90 shown in fig. 2 is a component to be mounted will be described in particular. Fig. 4 is a flowchart showing an example of the component mounting processing routine. The component mounting processing routine of fig. 4 is stored in the HDD 63. When the component type of the component of the next mounting object acquired from the management computer 80 is the component 90, the controller 60 starts the component mounting processing routine of fig. 4.
When the component mounting processing routine of fig. 4 is started, the CPU 61 sets the control amounts of the epi-illumination light source 44 and the side-emission light source 47 at the time of imaging so that the epi-light amount is smaller than the side-light amount (step S100). Here, the epi-light amount is the amount of light emitted from the epi-illumination light source 44, reflected by the element 90, and received by the imaging unit 51. Similarly, the side-light amount is the amount of light emitted from the side-emission light source 47, reflected by the element 90, and received by the imaging unit 51. Both are integrated values of the amount of light received during imaging (for example, the product of the amount of light received per unit time and the light receiving time). The condition "epi-light amount < side-light amount" means, in other words, that the average luminance value (for example, in 256 gradations) of the pixels of an image captured with only the epi-illumination light source 44 emitting light is smaller than the average luminance value of the pixels of an image captured with only the side-emission light source 47 emitting light. In the present embodiment, only the upper stage light source 47a of the side-emission light source 47 is caused to emit light when the element 90 is imaged.
In the present embodiment, the values of the currents supplied to the LED 45 and the LED 48a are determined in advance by experiment so that the epi-light amount equals the side-light amount when the emission time of the epi-illumination light source 44 at the time of imaging (referred to as the epi-emission time t1) equals the emission time of the side-emission light source 47 (here, only the upper stage light source 47a) (referred to as the side-emission time t2). Therefore, in step S100, the CPU 61 sets the epi-emission time t1 and the side-emission time t2 so that t1 < t2 as the control amounts of the epi-illumination light source 44 and the side-emission light source 47. The values of the epi-emission time t1 and the side-emission time t2 may be stored in the HDD 63 in advance in association with the element 90, may be included in the production job, or may be derived by the CPU 61 based on information included in the production job, for example.
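The time-based setting of step S100 can be illustrated with a small sketch: assuming drive currents pre-calibrated so that equal emission times yield equal received amounts (as stated above), the received-amount ratio reduces to the ratio of emission times, and t1 < t2 follows from choosing a target ratio below 1. The function name, units, and example values are assumptions for illustration.

```python
def set_emission_times(side_time_ms: float, target_ratio: float):
    """Return (t1, t2): epi- and side-emission times such that
    epi_amount / side_amount == target_ratio, under the calibration
    assumption that equal times give equal received amounts."""
    if not 0.0 <= target_ratio < 1.0:
        raise ValueError("target_ratio must satisfy 0 <= ratio < 1")
    t2 = side_time_ms
    t1 = target_ratio * t2  # received amount is proportional to time
    return t1, t2

t1, t2 = set_emission_times(side_time_ms=10.0, target_ratio=0.5)
print(t1, t2)  # t1 = 5.0 < t2 = 10.0
```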
When the epi-emission time t1 and the side-emission time t2 are set in step S100, the CPU 61 moves the head 24 so that the gripping portions 38 of the mechanical chuck 37 grip the component to be mounted supplied from the reel unit 70 (step S110). Next, the CPU 61 moves the head 24 so that the component held by the mechanical chuck 37 is positioned above the part camera 40 (step S120). Then, the CPU 61 outputs a control signal to the imaging control unit 52 so that imaging is performed with the illumination unit 41 controlled by the control amounts set in step S100 (here, the epi-emission time t1 and the side-emission time t2) (step S130). The imaging control unit 52 causes the epi-illumination light source 44 and the side-emission light source 47 to emit light based on the input epi-emission time t1 and side-emission time t2, and generates an image of the component held by the mechanical chuck 37 based on the light received by the imaging unit 51. The image of the element 90 is thus captured under the condition epi-light amount < side-light amount. In the present embodiment, the imaging unit 51 generates grayscale image data representing the luminance value of each pixel in, for example, 256 gradations based on the received light. The imaging control unit 52 outputs the image data obtained by imaging to the controller 60.
When the image data obtained by imaging is input from the imaging control unit 52, the CPU61 performs predetermined processing based on the image (steps S140 to S180). Specifically, first, the CPU61 binarizes the luminance value of each pixel of the obtained image data to obtain a binary image (step S140). The binarization can be performed by a known method such as the discriminant analysis method (Otsu's binarization).
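The discriminant analysis method (Otsu's binarization) referred to above is a standard technique; a generic sketch (not the patent's own implementation) is:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the threshold maximizing between-class variance for a
    256-level grayscale image (the discriminant analysis method)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def binarize(gray):
    """0/1 image: pixels above the Otsu threshold become white (1)."""
    return (gray > otsu_threshold(gray)).astype(np.uint8)
```

Pixels at or below the returned threshold (such as the dimly reflecting, insulated wiring patterns) come out black, and pixels above it (such as the bright terminals) come out white.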
Here, an example of a binary image obtained from image data captured of the element 90 will be described. Fig. 5 is an explanatory diagram of a binary image P1 obtained by imaging the element 90 with the epi-light amount smaller than the side-light amount. Fig. 6 is an explanatory diagram of a binary image P2 in a case, given as a comparative example unlike the present embodiment, where the element 90 is imaged with the epi-light amount and the side-light amount equal and both large. Fig. 7 is an explanatory diagram of a binary image P3 in a case, given as a comparative example unlike the present embodiment, where the element 90 is imaged with the epi-light amount and the side-light amount equal and both small. In figs. 5 to 7, pixels whose luminance value falls on the smaller side after binarization are represented in black, and pixels whose luminance value falls on the larger side are represented in white.
As is clear from fig. 5, in the binary image P1, the pixels corresponding to the outline portion of the element 90 (here, the edge portion of the terminal plate 92 when viewed from the bottom) and to the terminals 93 and 94 are white. On the other hand, in the binary image P1, the pixels corresponding to the portions of the terminal plate 92 other than its edge, to the wiring patterns 95 and 96, and to the portions other than the element 90 (the background) are black, so the wiring patterns 95 and 96 are not displayed. By imaging the element 90 with the epi-light amount smaller than the side-light amount in this way, a binary image can be obtained in which the outline of the element 90 and the terminals 93 and 94 are displayed without displaying the wiring patterns 95 and 96. The reason for this will now be described.
First, the light from the epi-light source 44 is irradiated in a direction along the optical axis 51a of the imaging unit 51 (here, upward). Thus, light emitted upward from the epi-light source 44 and reflected by a planar portion of the element 90 perpendicular to the optical axis 51a travels back in a direction along the optical axis 51a (here, downward). Therefore, the light irradiated from the epi-light source 44 and reflected by the flat-surface terminals 93 and 94 reaches the imaging unit 51 relatively easily compared with the light irradiated from the epi-light source 44 and reflected by the outline portion of the element 90. That is, by using the epi-light source 44, it is easy to capture an image showing the flat-surface terminals 93 and 94. On the other hand, since the surfaces of the wiring patterns 95 and 96 are also flat, the light irradiated from the epi-light source 44 and reflected by the wiring patterns 95 and 96 can likewise reach the imaging unit 51 relatively easily. However, as described above, the surfaces of the wiring patterns 95 and 96 are covered with an insulating film, so their reflectance is lower than that of the terminals 93 and 94. Therefore, by making the epi-light amount smaller than the side-light amount, i.e., by preventing the amount of light from the epi-light source 44 from becoming excessively large, it is possible to capture an image in which the wiring patterns 95 and 96 are difficult to display while the terminals 93 and 94 are displayed. In the present embodiment, the epi-light amount is set to a value small enough that, in the image captured by the imaging unit 51 (here, a 256-tone grayscale image), the luminance value of the pixels corresponding to the terminals 93 and 94 is larger than the threshold value used when binarizing the image, while the luminance value of the pixels corresponding to the wiring patterns 95 and 96 is smaller than that threshold value.
Thus, based on the image captured by the imaging unit 51, an image in which the terminals 93 and 94 are displayed and the wiring patterns 95 and 96 are not displayed can be obtained, as in the binary image P1 of fig. 5. The value of the epi-light amount and the control amount of the epi-light source 44 for realizing that value (here, the epi-emission time t1) can be determined in advance by experiments, for example.
Light from the side-light source 47 (here, the upper stage light source 47a) is emitted in a direction inclined from the optical axis 51a of the imaging unit 51. Therefore, the light irradiated from the side-light source 47 and reflected by the outline portion of the element 90 (here, the edge portion of the terminal plate 92 in a bottom view) reaches the imaging unit 51 relatively easily compared with the light irradiated from the side-light source 47 and reflected by the wiring patterns 95 and 96 or the terminals 93 and 94. That is, by using the side-light source 47, it is easy to capture an image showing the outline portion of the element 90. Therefore, by making the side-light amount larger than the epi-light amount, i.e., by making the amount of light from the side-light source 47 larger, it is possible to capture an image showing the outline portion of the element 90 while maintaining the state in which the wiring patterns 95 and 96 are difficult to display. In the present embodiment, the side-light amount is set to a value large enough that, in the image captured by the imaging unit 51 (here, a 256-tone grayscale image), the luminance value of the pixels corresponding to the outline portion of the element 90 is larger than the threshold value used when binarizing the image. Thus, an image showing the outline of the element 90 as in the binary image P1 of fig. 5 can be obtained based on the image captured by the imaging unit 51. The value of the side-light amount and the control amount of the side-light source 47 for realizing that value (here, the side-emission time t2) can be determined in advance by experiments, for example.
On the other hand, for example, when the epi-light amount and the side-light amount are equal and both large during imaging, the outline portion of the element 90 and the terminals 93 and 94 can be displayed, but the wiring patterns 95 and 96 are also easily displayed. That is, the luminance values of the pixels corresponding to the wiring patterns 95 and 96 in the image captured by the imaging unit 51 tend to be large. Therefore, as in the binary image P2 of fig. 6, the pixels corresponding to the wiring patterns 95 and 96 easily become white after binarization. Conversely, when the epi-light amount and the side-light amount are equal and both small, the wiring patterns 95 and 96 can be made difficult to display, but the outline portion of the element 90 and the terminals 93 and 94 also become difficult to display. That is, the luminance values of the pixels corresponding to the outline portion of the element 90 and to the terminals 93 and 94 in the image captured by the imaging unit 51 tend to be small. Therefore, as in the binary image P3 of fig. 7, the pixels corresponding to at least one of the outline portion of the element 90 and the terminals 93 and 94 (in fig. 7, the pixels corresponding to the outline portion of the element 90) easily become black after binarization. Note that, although not shown, when imaging is performed with the epi-light amount larger than the side-light amount, contrary to the present embodiment, the outline portion of the element 90 becomes difficult to display while the wiring patterns 95 and 96 are easily displayed. Therefore, binarizing such a captured image easily yields an image in which the terminals 93 and 94 and the wiring patterns 95 and 96 are displayed without the outline portion of the element 90 being displayed.
As described above, by making the epi-light amount smaller than the side-light amount, the part camera 40 according to the present embodiment can capture an image in which the wiring patterns 95 and 96 are difficult to display, the terminals 93 and 94 are displayed mainly by light from the epi-light source 44, and the outline portion of the element 90 is displayed mainly by light from the side-light source 47. By binarizing this image, a binary image in which the outline portion of the element 90 and the terminals 93 and 94 are displayed without displaying the wiring patterns 95 and 96 can be obtained, as in the binary image P1 of fig. 5.
When the binary image of the element 90 is obtained in step S140, the CPU61 acquires information on the element 90 based on the binary image (step S150). In the present embodiment, the CPU61 detects the outer shape of the element 90, the terminals 93 and 94, the center of the element 90, and the orientation of the element 90. Specifically, first, the CPU61 detects the outer shape of the element 90 and the terminals 93 and 94 by pattern matching or the like based on the binary image. Next, the CPU61 detects the center position of the element 90 based on the detected outer shape. The CPU61 also detects the notches 92a and 92b based on the detected outer shape, and detects the orientation of the element 90 based on the detected notches. In the present embodiment, as shown in fig. 5, the wiring patterns 95 and 96 are not displayed in the binary image. Therefore, in step S150, the CPU61 can be prevented from erroneously detecting the wiring patterns 95 and 96 as the outline portion of the element 90 or the terminals 93 and 94, as compared with the case where the wiring patterns 95 and 96 are displayed as in fig. 6.
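A minimal stand-in for the center and orientation detection of step S150, using image moments on the binary image (the patent itself uses pattern matching and notch detection; this simplified moment-based version is an assumption for illustration only):

```python
import numpy as np

def center_and_orientation(binary):
    """Centroid (x, y) and major-axis angle (radians) of the white
    pixels in a 0/1 binary image, via first and second image moments.
    A simplified stand-in for the detection performed in step S150."""
    ys, xs = np.nonzero(binary)
    cx, cy = xs.mean(), ys.mean()          # centroid (component center)
    mu20 = ((xs - cx) ** 2).mean()         # central second moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # major-axis angle
    return (cx, cy), theta
```

For example, a horizontal bar of white pixels yields an angle near 0, while a vertical bar yields an angle near pi/2, which is the kind of orientation cue the mounting correction needs.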
Next, the CPU61 determines whether or not there is an abnormality in the component 90 held by the holding portion 38 based on the information acquired in step S150 (step S160); if there is an abnormality, the CPU61 discards the component 90 held by the holding portion 38 (step S170) and performs the processing from step S110 onward. The CPU61 determines whether or not there is an abnormality in the element 90 based on, for example, the detected outer shape of the element 90 and the shapes of the terminals 93 and 94. When there is no abnormality in step S160, the CPU61 derives correction amounts for the mounting position and orientation of the component 90 based on the information acquired in step S150 (step S180). For example, the CPU61 derives the correction amount of the mounting position of the component 90 on the substrate 16 based on the detected center position (coordinates) of the component 90. The CPU61 derives, as the correction amount of the orientation, the driving amount of the Q-axis motor 36 (the amount of rotation of the element 90) required when mounting the element 90 on the substrate 16, based on the detected orientation of the element 90. Then, the CPU61 places the component 90 on the substrate 16 with the derived correction amounts of the mounting position and orientation applied (step S190), and ends the component mounting processing routine. As described above, in step S150 the CPU61 can be prevented from erroneously detecting the wiring patterns 95 and 96 as the outline portion of the element 90 or the terminals 93 and 94, so the CPU61 can also accurately perform the processing in steps S160 and S180.
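The correction-amount derivation of step S180 can be sketched as the difference between the detected and nominal pose (the function name, nominal values, and units are assumptions; the real routine further converts the angle error into a Q-axis motor driving amount):

```python
def mounting_corrections(detected_center, detected_angle,
                         nominal_center=(0.0, 0.0), nominal_angle=0.0):
    """Position and orientation offsets to apply when placing the
    component (a sketch of step S180). The angle offset corresponds
    to the amount of rotation commanded to the Q-axis motor."""
    dx = nominal_center[0] - detected_center[0]
    dy = nominal_center[1] - detected_center[1]
    dtheta = nominal_angle - detected_angle
    return dx, dy, dtheta

# A component detected 1.0 mm right, 0.5 mm low, rotated +0.1 rad:
dx, dy, dtheta = mounting_corrections((1.0, -0.5), 0.1)
```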
The CPU61 performs component mounting processing for component types other than the element 90 in the same manner as in the component mounting processing routine of fig. 4. However, the control amounts of the epi-light source and the side-light source in step S100 are predetermined for each component type, and the CPU61 sets the control amounts according to the component type to be mounted. For example, for a component type that, unlike the element 90, has no wiring pattern and only terminals, the CPU61 may set t1 = t2 in step S100, making the epi-light amount equal to the side-light amount. The contents of the processing in steps S150 to S180 are also predetermined for each component type, and the CPU61 performs processing according to the component type to be mounted. These pieces of information predetermined for each component type may be stored in advance in the HDD63, for example, or may be included in the production job.
Here, the correspondence relationship between the components of the present embodiment and the components of the present invention is clarified. The component camera 40 of the present embodiment corresponds to the imaging device of the present invention, the imaging unit 51 corresponds to the imaging unit, the epi-light source 44 corresponds to the epi-light source, the side-light source 47 corresponds to the side-light source, and the imaging control unit 52 corresponds to the light emission control unit. The mechanical chuck 37 corresponds to an element holding portion, the X-axis slider 26 and the Y-axis slider 30 correspond to a moving portion, and the controller 60 corresponds to an attachment control portion.
According to the part camera 40 of the mounting apparatus 10 of the present embodiment described in detail above, when imaging the element 90 having the wiring patterns 95 and 96 and the flat-surface terminals 93 and 94, the imaging control unit 52 causes the epi-light source 44 and the side-light source 47 to emit light so that the epi-light amount is smaller than the side-light amount. This makes the wiring patterns 95 and 96 difficult to display, and allows an image in which the outline portion of the element 90 and the terminals 93 and 94 are displayed to be captured.
The imaging control unit 52 controls the light emission times (t1, t2) of the epi-light source 44 and the side-light source 47 at the time of imaging the element 90 so that the epi-light amount is smaller than the side-light amount. Therefore, the epi-light amount and the side-light amount can be controlled relatively easily compared with the case where, for example, the power supplied to the epi-light source 44 and the side-light source 47 is controlled.
The imaging control unit 52 causes the epi-light source 44 and the side-light source 47 to emit light so as to form predetermined epi-light and side-light amounts such that, when the luminance values of the pixels are binarized, an image is obtained in which the pixels corresponding to the wiring patterns 95 and 96 fall on the lower-luminance side and the pixels corresponding to the terminals 93 and 94 and to the outline portion of the element 90 fall on the higher-luminance side (for example, the binary image P1 of fig. 5). Therefore, by performing binarization, an image in which the wiring patterns 95 and 96 are not displayed can be obtained.
The present invention is not limited to the above-described embodiments, and can be implemented in various forms as long as the technical scope of the present invention is covered.
For example, in the above-described embodiment, the epi-light amount and the side-light amount are predetermined so that, when the luminance values of the pixels are binarized, an image is obtained in which the pixels corresponding to the wiring patterns 95 and 96 fall on the lower-luminance side and the pixels corresponding to the terminals 93 and 94 and to the outline portion of the element 90 fall on the higher-luminance side. Alternatively, for example, the imaging control unit 52 may cause the epi-light source 44 and the side-light source 47 to emit light so that the epi-light amount is 0.9 times or less the side-light amount at the time of imaging the element 90. In this way, the wiring patterns 95 and 96 can be more reliably made difficult to display. The epi-light amount may also be 0.8 times or less, or 0.7 times or less, the side-light amount. The epi-light amount need only exceed 0, but may be 0.1 times or more the side-light amount. The epi-light amount may also be 0.3 times or more, or 0.5 times or more, the side-light amount.
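The ratio windows described above (epi-light amount at most 0.9x, 0.8x, or 0.7x, and at least 0.1x, 0.3x, or 0.5x the side-light amount) amount to a simple bounds check; a sketch with hypothetical names and the 0.1x–0.9x window as the default:

```python
def epi_to_side_ratio_ok(epi_amount, side_amount, lo=0.1, hi=0.9):
    """True if the epi-light amount lies within the suggested window:
    at least lo times and at most hi times the side-light amount.
    Names and the check itself are illustrative, not from the patent."""
    return lo * side_amount <= epi_amount <= hi * side_amount

assert epi_to_side_ratio_ok(50, 100)       # 0.5x: inside the window
assert not epi_to_side_ratio_ok(95, 100)   # 0.95x: epi-light too large
```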
Alternatively, the imaging control unit 52 may cause the epi-light source 44 and the side-light source 47 to emit light so as to form predetermined epi-light and side-light amounts such that an image is obtained in which the maximum luminance value of the pixels corresponding to the wiring patterns 95 and 96 is smaller than the minimum luminance value of the pixels corresponding to the outline portion of the element 90. In this way, images can be captured in which the wiring patterns 95 and 96 are difficult to display, to the extent that the outline portion of the element 90 and the wiring patterns 95 and 96 can easily be distinguished based on their luminance values. In this case, the imaging control unit 52 may cause the epi-light source 44 and the side-light source 47 to emit light so as to form predetermined epi-light and side-light amounts such that an image is obtained in which the luminance values of the pixels corresponding to the wiring patterns 95 and 96 are the smallest among all the pixels. In this way, an image in which the wiring patterns 95 and 96 are even more difficult to display can be captured.
Alternatively, the imaging control unit 52 may cause the epi-light source 44 and the side-light source 47 to emit light so as to form predetermined epi-light and side-light amounts such that an image is obtained in which the outline portion of the element 90 and the terminals 93 and 94 are displayed without the wiring patterns 95 and 96 being displayed. In this way, an image can be obtained in which the wiring patterns 95 and 96, which do not need to be imaged, are not displayed. Note that the image not displaying the wiring patterns 95 and 96 may be, for example: a binary image in which the pixels corresponding to the wiring patterns 95 and 96 fall on the lower-luminance side, as in the above-described embodiment; an image in which the luminance values of the pixels corresponding to the wiring patterns 95 and 96 are substantially the same as those of the surrounding pixels (that is, an image in which the pixels of the wiring patterns 95 and 96 cannot be identified even by image processing); an image in which the luminance values of the pixels corresponding to the wiring patterns 95 and 96 are the smallest among all the pixels; or an image in which the luminance values of the pixels corresponding to the wiring patterns 95 and 96 are 0 (black).
In the above embodiment, the imaging control unit 52 controls the epi-emission time t1 and the side-emission time t2 so that the epi-light amount is smaller than the side-light amount, but the control is not limited to this. For example, instead of controlling the continuous emission times t1 and t2, the imaging control unit 52 may control the duty ratio between the emission time and the non-emission time of the epi-light source 44 and the side-light source 47 so that the epi-light amount is smaller than the side-light amount. That is, the epi-light source 44 and the side-light source 47 may repeatedly alternate between emitting and not emitting light during one exposure, and the imaging control unit 52 may control the proportion of emission time within one emission/non-emission cycle (i.e., the duty ratio) so that the epi-light amount is smaller than the side-light amount. Alternatively, the imaging control unit 52 may control the currents supplied to the epi-light source 44 and the side-light source 47 so that the epi-light amount is smaller than the side-light amount.
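Under the duty-ratio alternative, the received light amount scales with the fraction of the exposure during which each source is lit; a simplified linear model (the names, the linear scaling, and the per-source rate factors standing in for the pre-tuned LED currents are all assumptions):

```python
def light_amounts(duty_epi, duty_side, exposure_us,
                  epi_rate=1.0, side_rate=1.0):
    """Relative received light amounts when both sources are pulsed
    over the same exposure, under a simple linear duty-ratio model."""
    epi = duty_epi * epi_rate * exposure_us
    side = duty_side * side_rate * exposure_us
    return epi, side

# Same exposure, but the epi-light source is lit 30% of the time and
# the side-light source 80%: the epi-light amount comes out smaller.
epi, side = light_amounts(duty_epi=0.3, duty_side=0.8, exposure_us=1000)
assert epi < side
```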
In the above embodiment, the image captured by the imaging unit 51 is a grayscale image, and the CPU61 binarizes the image in step S140 and acquires information on the element 90 in step S150, but the present invention is not limited to this. For example, the CPU61 may acquire the information on the element 90 (for example, the outer shape of the element 90 and the shapes of the terminals 93 and 94) by performing pattern matching, edge detection, or the like directly on the grayscale image. In this case as well, setting the epi-light amount smaller than the side-light amount makes the wiring patterns 95 and 96 difficult to display in the grayscale image (the luminance values of the pixels corresponding to the wiring patterns 95 and 96 are reduced). Therefore, as in the above-described embodiment, erroneous detection when the CPU61 acquires information on the element 90 can be suppressed. Further, the image captured by the imaging unit 51 may be a color image or a binary image.
In the above embodiment, the imaging control unit 52 causes only the upper stage light source 47a of the side-light source 47 to emit light when imaging the element 90, but this is not a particular limitation; at least one of the upper stage light source 47a, the middle stage light source 47b, and the lower stage light source 47c may be caused to emit light. However, there is a tendency that the closer the inclination angle of the light source used is to 90° with respect to the optical axis 51a, the more readily an image can be captured that displays the outline portion of the element 90 while maintaining the state in which the wiring patterns 95 and 96 are difficult to display. Therefore, it is preferable to cause the light source whose inclination angle is closest to 90° (the upper stage light source 47a in the above embodiment) to emit light.
In the above-described embodiment, the element 90 is given as an example of the object imaged with the epi-light amount smaller than the side-light amount, but the object to be imaged is not limited to this. As long as the element has a wiring pattern and flat-surface terminals, and its outline portion and terminals need to be detected while the wiring pattern does not, the same effect can be obtained by setting the epi-light amount smaller than the side-light amount in the same manner as in the above embodiment. For example, the element to be imaged may be an element whose terminals, unlike those of the element 90, have no polarity. Further, although the outline of the element 90 (the edge of the terminal plate 92 when viewed from the bottom) is formed as a right-angled corner, the present invention is not limited to this; the outline of the element to be imaged may be chamfered or the like, so that the outline is formed as a slope inclined from the optical axis 51a.
In the above embodiment, the mounting device 10 includes the mechanical chuck 37 for gripping the component, but is not limited thereto as long as the component can be held. For example, the mounting device 10 may include a suction nozzle for sucking and holding a component instead of the mechanical chuck 37.
In the above embodiment, the controller 60 sets the control amounts of the epi-light source 44 and the side-light source 47, but this is not a limitation; the control amounts may be determined by the imaging control unit 52, for example.
Industrial applicability
The present invention can be applied to a mounting apparatus for mounting a component on a substrate.
Description of the reference numerals
10 mounting device, 12 base, 14 mounting device body, 16 substrate, 18 substrate conveying device, 20 support plate, 22 conveyor belt, 23 support pin, 24 head, 26X axis slider, 26a drive motor, 28 guide rail, 30Y axis slider, 30a drive motor, 32 guide rail, 34Z axis motor, 35 ball screw, 36Q axis motor, 37 mechanical chuck, 38 holding part, 39 drive motor, 40 part camera, 41 illumination part, 42 housing, 43 connection part, 44 light source for down-projection, 45 LED, 46 half mirror, 47 side light source, 47a upper stage light source, 47b middle stage light source, 47c lower stage light source, 48a to 48c LED, 51 imaging part, 51a optical axis, 52 imaging control part, 60 controller, 61 CPU, 62 ROM, 63 HDD, 64 RAM, 65 input/output interface, 66 bus, 70 tape reel unit, 72, 74 tape reel part, and tape reel, 80 management computer, 90 elements, 91 main body, 92 terminal board, 92a, 92b notch, 93, 94 terminal, 95, 96 wiring pattern, P1-P3 binary image.

Claims (4)

1. An imaging device for imaging an element having a wiring pattern and a flat-surface terminal,
the imaging device is provided with:
an imaging unit that images the component based on the received light;
an epi-illumination light source for irradiating the element with light in a direction along an optical axis of the imaging unit;
a side-emitting light source for irradiating the element with light in a direction inclined from an optical axis of the imaging unit; and
a light emission control unit that, when the element is imaged, controls light emission of the epi-light source and the side-light source so that the epi-light amount is smaller than the side-light amount, whereby an image is captured in which the terminal is displayed and the wiring pattern is difficult to display owing to the amount of light from the epi-light source, and in which the outer shape of the element is displayed and the wiring pattern remains difficult to display owing to the amount of light from the side-light source.
2. The imaging device according to claim 1, wherein,
the light emission control unit controls the light emission times of the epi-light source and the side-light source when the element is imaged, so that the epi-light amount is smaller than the side-light amount.
3. The imaging device according to claim 1 or 2, wherein,
the light emission control unit causes the epi-light source and the side-light source to emit light so that the epi-light amount is 0.9 times or less the side-light amount when the element is imaged.
4. A mounting device is provided with:
the imaging device according to any one of claims 1 to 3;
an element holding portion capable of holding the element;
a moving section that moves the element holding section; and
a mounting control unit that performs processing based on the image obtained by the imaging, and mounts the component on the substrate by controlling the component holding unit and the moving unit.
CN201680089241.5A 2016-09-20 2016-09-20 Imaging device and mounting device Active CN109792860B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/077647 WO2018055663A1 (en) 2016-09-20 2016-09-20 Imaging device and mounting device

Publications (2)

Publication Number Publication Date
CN109792860A CN109792860A (en) 2019-05-21
CN109792860B true CN109792860B (en) 2020-08-25

Family

ID=61690217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680089241.5A Active CN109792860B (en) 2016-09-20 2016-09-20 Imaging device and mounting device

Country Status (3)

Country Link
JP (1) JP6865227B2 (en)
CN (1) CN109792860B (en)
WO (1) WO2018055663A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3863391A4 (en) * 2018-10-04 2021-10-06 Fuji Corporation Camera for capturing component images, and component mounting machine

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009135266A (en) * 2007-11-30 2009-06-18 Hitachi High-Tech Instruments Co Ltd Electronic component mounting apparatus
JP2012174888A (en) * 2011-02-22 2012-09-10 Fuji Mach Mfg Co Ltd Electronic component illumination device
CN102781212A (en) * 2011-05-12 2012-11-14 雅马哈发动机株式会社 Attraction state inspection device, surface mounting apparatus, and part test device
CN103098580A (en) * 2011-08-29 2013-05-08 松下电器产业株式会社 Component-mounting device, head, and component-orientation recognition method
JP2015106603A (en) * 2013-11-29 2015-06-08 スタンレー電気株式会社 Component attachment light irradiation imaging device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000124683A (en) * 1998-10-12 2000-04-28 Tenryu Technics:Kk Image pickup of electronic component and electronic component mounting equipment
JP4363721B2 (en) * 1999-11-12 2009-11-11 オリンパス株式会社 Electronic device manufacturing method and electronic device


Also Published As

Publication number Publication date
CN109792860A (en) 2019-05-21
JP6865227B2 (en) 2021-04-28
WO2018055663A1 (en) 2018-03-29
JPWO2018055663A1 (en) 2019-07-04

Similar Documents

Publication Publication Date Title
CN114128417B (en) Substrate alignment operation system
CN109792860B (en) Imaging device and mounting device
CN108702867B (en) Mounting device and shooting processing method
CN114788436B (en) Component mounting machine
CN111434103B (en) Imaging unit and component mounting machine
JP6475165B2 (en) Mounting device
CN110710343B (en) Component mounting apparatus
CN109983859B (en) Surface mounting machine, component recognition device, and component recognition method
WO2018055757A1 (en) Illumination condition specifying device and illumination condition specifying method
CN110651538B (en) Working machine and calculation method
JP7116524B2 (en) Image recognition device
CN110024507B (en) Working machine
CN114128418B (en) Inspection apparatus
CN112840752B (en) Camera for shooting component and component mounting machine
CN115088402B (en) Component assembling machine
JP6728501B2 (en) Image processing system and component mounter
CN111543126B (en) Working machine and polarity determination method
CN113966650B (en) Lighting unit
CN112262621A (en) Working machine
EP3767938B1 (en) Image-capturing unit and component-mounting device
JP7271738B2 (en) Imaging unit
US20240149363A1 (en) Quality determination device and quality determination method
WO2024038527A1 (en) Image processing apparatus and image processing system
WO2023276059A1 (en) Component mounting machine
WO2021181592A1 (en) Identification device and parts holding system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant