CN115939904A - Component mounting device and component mounting method - Google Patents


Info

Publication number: CN115939904A
Application number: CN202211206617.XA
Authority: CN (China)
Prior art keywords: lead; point group; dimensional; data; group data
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 佐野孝浩; 山田友美
Current Assignee: Juki Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Juki Corp
Application filed by Juki Corp
Publication of CN115939904A

Landscapes

  • Supply And Installment Of Electrical Components (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a component mounting device and a component mounting method capable of suppressing a decrease in the productivity of the component mounting device. The component mounting device includes: a robot manipulator; a robot hand provided at the distal end portion of the robot manipulator and holding the main body of a lead member; a projection device that irradiates the lead member with stripe pattern light in a state where the main body is held by the robot hand; an imaging device that images the lead member irradiated with the stripe pattern light from a predetermined viewpoint; an arithmetic device that performs image processing on the imaging data of the lead member captured by the imaging device; and a control device that controls the robot manipulator based on the image processing result of the arithmetic device so that the lead of the lead member is inserted into a hole of a substrate.

Description

Component mounting device and component mounting method
Technical Field
The present invention relates to a component mounting apparatus and a component mounting method.
Background
In the production of electronic devices, component mounting apparatuses that mount components on substrates are used. Patent document 1 discloses a component mounting apparatus that mounts a lead member on a substrate.
A lead member is mounted on a substrate by inserting its leads into holes of the substrate. Depending on the state of a lead, it may be difficult to insert the lead into the hole, which reduces the productivity of the component mounting apparatus. For example, when a lead is bent, insertion may fail, and mounting of the lead member may have to be abandoned. To suppress such a decrease in productivity, a technique is desired that recognizes the state of the lead and inserts the lead into the hole of the substrate in accordance with that state.
Patent document 1: Japanese Patent Laid-Open Publication No. 2021-093560
Disclosure of Invention
The present specification discloses a component mounting apparatus. The component mounting apparatus includes: a robot manipulator; a robot hand provided at a distal end portion of the robot manipulator and holding a main body of a lead member; a projection device that irradiates the lead member with stripe pattern light in a state where the main body is held by the robot hand; an imaging device that images the lead member irradiated with the stripe pattern light from a predetermined viewpoint; an arithmetic device that performs image processing on the imaging data of the lead member captured by the imaging device; and a control device that controls the robot manipulator based on an image processing result of the arithmetic device so that a lead of the lead member is inserted into a hole of a substrate. The arithmetic device includes: a three-dimensional image generation unit that generates three-dimensional image data by performing arithmetic processing on the imaging data of the lead member based on a phase shift method; a three-dimensional point group conversion unit that converts the three-dimensional image data into three-dimensional point group data; a dividing unit that divides the three-dimensional point group data and extracts, from the three-dimensional point group data, lead point group data indicating the three-dimensional shape of the surface of the lead; and a lead state calculation unit that performs principal component analysis on the lead point group data to calculate three-dimensional data of the lead.
According to the component mounting device disclosed in the present specification, a decrease in productivity can be suppressed.
Drawings
Fig. 1 is a perspective view showing a component mounting apparatus according to an embodiment.
Fig. 2 is a side view showing a component mounting apparatus of the embodiment.
Fig. 3 is a perspective view showing a robot hand according to the embodiment.
Fig. 4 is a side view showing a lead member held by a robot of the embodiment.
Fig. 5 is a view of the lead member of the embodiment as viewed from below.
Fig. 6 is a perspective view showing a three-dimensional measuring apparatus according to an embodiment.
Fig. 7 is a diagram for explaining the operation of the robot manipulator according to the embodiment.
Fig. 8 is a block diagram showing a component mounting apparatus according to an embodiment.
Fig. 9 is a flowchart showing an image processing method of the arithmetic device according to the embodiment.
Fig. 10 is a diagram schematically showing an example of three-dimensional point group data according to the embodiment.
Fig. 11 is a diagram schematically showing an example of the integrated point group data according to the embodiment.
Fig. 12 is a diagram schematically showing an example of the lead point group data according to the embodiment.
Fig. 13 is a diagram for explaining the division point group according to the embodiment.
Fig. 14 is a diagram for explaining an operation of inserting the lead of the lead member of the embodiment into the hole of the substrate.
Fig. 15 is a diagram for explaining an operation of inserting the lead of the lead member of the embodiment into the hole of the substrate.
Fig. 16 is a block diagram showing a computer system according to an embodiment.
Description of the reference numerals
1: component mounting device; 2: base; 3: component supply member; 4: substrate support member; 5: robot hand; 5A: coupling member; 5B: rotating member; 5C: moving member; 5D: gripping portion; 6: robot manipulator; 6A: base member; 6B: revolving member; 6C: first arm; 6D: second arm; 6E: third arm; 7: three-dimensional measuring device; 7A: projection device; 7B: imaging device; 7C: arithmetic device; 7D: housing; 7E: transparent member; 8: force sensor; 9: control device; 11: three-dimensional image generation unit; 12: three-dimensional point group conversion unit; 13: three-dimensional point group integrating unit; 14: dividing unit; 15: candidate point group generating unit; 16: lead state calculating unit; 17: output unit; 100: lead member; 101: main body; 110: lead; 111: first lead; 112: second lead; 200: substrate; 210: hole; 211: first hole; 212: second hole; 1000: computer system; 1001: processor; 1002: main memory; 1003: memory; 1004: interface; Aa: smallest face; Ab: smallest face; AX1: first rotation axis; AX2: second rotation axis; AX3: third rotation axis; Ba: axis-aligned bounding box; Bb: axis-aligned bounding box; Da: three-dimensional image data; Db: three-dimensional point group data; Db1: three-dimensional point group data; Db2: three-dimensional point group data; Db3: three-dimensional point group data; Dc: integrated point group data; Dd: lead point group data; De: division point group; De1: first division point group; De2: second division point group; De3: third division point group; De4: fourth division point group; Df: candidate point group data; RX: rotation axis; TX: rotation axis.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings, but the present invention is not limited to the embodiments. The constituent elements of the embodiments described below may be combined as appropriate. In addition, some of the components may not be used.
In the embodiment, a local coordinate system is set for the component mounting device 1, and the positional relationship of each part is described with reference to this local coordinate system. An XYZ rectangular coordinate system is set as the local coordinate system. A direction parallel to the X axis in a predetermined plane is defined as the X-axis direction. A direction parallel to the Y axis orthogonal to the X axis in the predetermined plane is defined as the Y-axis direction. A direction parallel to the Z axis orthogonal to the X axis and the Y axis is defined as the Z-axis direction. The rotation or tilt direction about the X axis is defined as the θX direction. The rotation or tilt direction about the Y axis is defined as the θY direction. The rotation or tilt direction about the Z axis is defined as the θZ direction. The predetermined plane is the XY plane. The Z axis is orthogonal to the predetermined plane. In the embodiment, the predetermined plane is parallel to a horizontal plane, and the Z-axis direction is the vertical direction. The predetermined plane may also be inclined with respect to the horizontal plane.
(component mounting device)
Fig. 1 is a perspective view showing the component mounting apparatus 1 of the embodiment. Fig. 2 is a side view showing the component mounting apparatus 1 of the embodiment. As shown in figs. 1 and 2, the component mounting apparatus 1 includes a base 2, a component supply member 3, a substrate support member 4, a robot hand 5, a robot manipulator 6, and a three-dimensional measuring device 7.
The base 2 supports a component supply member 3, a substrate support member 4, a robot manipulator 6, and a three-dimensional measuring device 7, respectively.
The component supply member 3 supplies lead members 100. In the embodiment, the component supply member 3 includes a tray on which a plurality of lead members 100 are arranged. The plurality of lead members 100 may be of the same type or of different types.
The substrate support member 4 supports the substrate 200 on which the lead component 100 is mounted. The substrate support member 4 supports the substrate 200 such that the upper surface of the substrate 200 is parallel to the XY plane.
The robot hand 5 holds the lead member 100. The robot hand 5 is provided at the distal end portion of the robot manipulator 6.
The robot manipulator 6 moves the robot hand 5. The robot manipulator 6 includes an articulated robot. In the embodiment, the robot manipulator 6 is a vertical articulated robot; it may also be a horizontal articulated robot. The robot manipulator 6 includes: a base member 6A fixed to the base 2; a revolving member 6B supported by the base member 6A; a first arm 6C connected to the revolving member 6B; a second arm 6D connected to the first arm 6C; and a third arm 6E connected to the second arm 6D.
The revolving member 6B is supported by the base member 6A so as to be rotatable about a rotation axis TX. The rotation axis TX is parallel to the Z axis. The first arm 6C is coupled to the revolving member 6B so as to be rotatable about the first rotation axis AX1. The first rotation axis AX1 is orthogonal to the Z axis. The second arm 6D is coupled to the first arm 6C so as to be rotatable about the second rotation axis AX2. The second rotation axis AX2 is parallel to the first rotation axis AX1. The third arm 6E is coupled to the second arm 6D so as to be rotatable about the third rotation axis AX3. The third rotation axis AX3 is parallel to the second rotation axis AX2. The robot hand 5 is attached to the third arm 6E.
The robot manipulator 6 includes: a revolving actuator that revolves the revolving member 6B; a first rotation actuator that rotates the first arm 6C; a second rotation actuator that rotates the second arm 6D; and a third rotation actuator that rotates the third arm 6E.
The three-dimensional measuring device 7 measures the lead member 100 held by the robot hand 5. The three-dimensional measuring device 7 detects the position of the lead member 100 in the local coordinate system based on the phase shift method.
(robot hand)
Fig. 3 is a perspective view showing the robot hand 5 according to the embodiment. The robot hand 5 includes: a coupling member 5A attached to the third arm 6E; a rotating member 5B supported by the coupling member 5A; and a pair of moving members 5C supported by the rotating member 5B.
The rotating member 5B is supported by the coupling member 5A so as to be rotatable about the rotation axis RX. The rotation axis RX is orthogonal to the third rotation axis AX3. The pair of moving members 5C move toward and away from each other. A gripping portion 5D is provided at the lower end of each moving member 5C. The pair of gripping portions 5D approach or separate from each other.
The robot hand 5 includes a rotation actuator that rotates the rotating member 5B and a gripping actuator that moves the pair of moving members 5C toward and away from each other.
In a state where the lead member 100 is disposed between the pair of gripping portions 5D, the pair of gripping portions 5D approach each other, whereby the lead member 100 is held by the gripping portions 5D. When the pair of gripping portions 5D separate from each other, the lead member 100 is released from the gripping portions 5D.
A force sensor 8 is disposed on one of the moving members 5C. The force sensor 8 can detect the load applied to the gripping portions 5D.
(lead member)
Fig. 4 is a side view showing the lead member 100 held by the robot hand 5 according to the embodiment. Fig. 5 is a view of the lead member 100 of the embodiment as viewed from below.
The lead member 100 has a body 101 and a plurality of leads 110 protruding from the body 101.
The main body 101 includes a synthetic resin case. An element such as a coil is disposed in the internal space of the main body 101. The lead 110 is a metal protrusion. The lead 110 is connected to, for example, an element disposed in the internal space of the main body 101.
The lead 110 protrudes downward from the lower surface of the body 101. In a state where the lead member 100 is mounted on the substrate 200, the lower surface of the body 101 faces the upper surface of the substrate 200.
The robot hand 5 holds the main body 101 of the lead member 100. The pair of gripping portions 5D hold the lead member 100 by gripping the main body 101.
(three-dimensional measuring device)
Fig. 6 is a perspective view showing the three-dimensional measuring device 7 according to the embodiment. As shown in fig. 6, the three-dimensional measuring device 7 measures the three-dimensional shape of the lead member 100 in a state where the main body 101 is held by the robot hand 5.
The three-dimensional measuring device 7 includes a projection device 7A, an imaging device 7B, and an arithmetic device 7C. In the embodiment, the projection device 7A and the imaging device 7B are housed in a housing 7D and are each fixed to the housing 7D. A transparent member 7E, for example a glass plate, is disposed in an opening at the upper end of the housing 7D.
The projection device 7A irradiates the lead member 100 with stripe pattern light in a state where the main body 101 is held by the robot hand 5. The projection device 7A includes: a light source; a light modulation element that generates the stripe pattern light by modulating the light emitted from the light source; and an emission optical system that emits the stripe pattern light generated by the light modulation element. Examples of the light modulation element include a digital micromirror device (DMD), a transmissive liquid crystal panel, and a reflective liquid crystal panel.
The imaging device 7B images the lead member 100 irradiated with the stripe pattern light from a predetermined viewpoint. The viewpoint of the imaging device 7B refers to the relative imaging position and imaging angle of the imaging device 7B with respect to the lead member 100. The imaging device 7B includes an imaging optical system that forms an image of the stripe pattern light reflected by the lead member 100, and an imaging element that acquires imaging data of the lead member 100 via the imaging optical system. Examples of the imaging element include a CMOS (complementary metal oxide semiconductor) image sensor and a CCD (charge coupled device) image sensor.
The arithmetic device 7C performs image processing on the image data of the lead member 100 captured by the imaging device 7B. The arithmetic device 7C includes a computer system. The arithmetic device 7C includes a processor such as a CPU (Central Processing Unit), a Memory such as a ROM (Read Only Memory) or a RAM (Random Access Memory), and an input/output interface including an input/output circuit capable of inputting/outputting signals and data.
The three-dimensional measuring device 7 measures the three-dimensional shape of the lead member 100 held by the robot 5 based on the phase shift method.
The projection device 7A irradiates the lead member 100 with stripe pattern light having, for example, a sinusoidal luminance distribution, while shifting the phase of the pattern. The lead member 100 held by the robot hand 5 is disposed above the transparent member 7E. The stripe pattern light emitted from the projection device 7A is irradiated to the lead member 100 via the transparent member 7E.
The imaging device 7B images the lead member 100 irradiated with the stripe pattern light via the transparent member 7E, from a position below the lead member 100. The viewpoint of the imaging device 7B is thus defined below the lead member 100. By operating the robot manipulator 6 to change the position and posture of the lead member 100 held by the robot hand 5, the relative position and relative angle between the lead member 100 and the imaging device 7B are changed, and the viewpoint of the imaging device 7B with respect to the lead member 100 changes accordingly.
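As a concrete illustration of the phase shift method, the wrapped phase of the stripe pattern can be recovered from four captures whose pattern phase is advanced by 90° each time. The sketch below is illustrative only: the patent does not specify the number of phase steps or the exact formula, and all names and values are assumptions.

```python
import numpy as np

def wrapped_phase(images):
    """Four-step phase-shift recovery: given captures
    I_k = A + B*cos(phi + 2*pi*k/4) for k = 0..3, the wrapped phase is
    phi = atan2(I3 - I1, I0 - I2)."""
    i0, i1, i2, i3 = (np.asarray(im, dtype=float) for im in images)
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic check against a known phase field
h, w = 4, 5
phi_true = np.linspace(-1.0, 1.0, h * w).reshape(h, w)
captures = [128 + 100 * np.cos(phi_true + 2 * np.pi * k / 4) for k in range(4)]
phi = wrapped_phase(captures)
```

In a real measurement the wrapped phase would then be unwrapped and converted to height via the device's calibration, which is not described in the text.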
The arithmetic device 7C performs image processing on the imaging data of the lead member 100 captured by the imaging device 7B based on the phase shift method, and calculates three-dimensional data of the lead 110. The three-dimensional data of the lead 110 includes the amount of bending of the lead 110 in the three-dimensional space defined by the local coordinate system and the coordinates of the tip portion of the lead 110. When the design angle of the lead 110 with respect to the body 101 is called the ideal angle and the actual angle of the lead 110 with respect to the body 101 is called the actual angle, the amount of bending of the lead 110 is the difference between the ideal angle and the actual angle.
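The amount of bending defined above can be computed as the angle between the design axis of the lead and its measured axis. A minimal sketch, assuming the ideal lead points straight down along -Z; the function name and default axis are illustrative:

```python
import numpy as np

def bend_amount_deg(actual_dir, ideal_dir=(0.0, 0.0, -1.0)):
    """Amount of bending: the angle (degrees) between the actual lead
    axis and the ideal (design) axis. The default ideal axis assumes
    the lead is designed to point straight down along -Z."""
    a = np.asarray(actual_dir, dtype=float)
    b = np.asarray(ideal_dir, dtype=float)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    cosang = np.clip(np.dot(a, b), -1.0, 1.0)
    return float(np.degrees(np.arccos(cosang)))

# A lead tilted 5 degrees from straight down, in the XZ plane
tilt = np.radians(5.0)
actual = (np.sin(tilt), 0.0, -np.cos(tilt))
```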
(operation of robot manipulator)
Fig. 7 is a diagram for explaining the operation of the robot manipulator 6 according to the embodiment. The component mounting apparatus 1 includes a control device 9 that controls the robot manipulator 6. The control device 9 includes a computer system. As shown in fig. 7, the substrate 200 is provided with a hole 210 into which the lead 110 of the lead member 100 is inserted. The control device 9 controls the robot manipulator 6 based on the image processing result of the arithmetic device 7C of the three-dimensional measuring device 7 so that the lead 110 of the lead member 100 is inserted into the hole 210 of the substrate 200.
(arithmetic device)
Fig. 8 is a block diagram showing the component mounting apparatus 1 of the embodiment. As shown in fig. 8, the component mounting apparatus 1 includes a computing device 7C and a control device 9.
The arithmetic device 7C includes a three-dimensional image generation unit 11, a three-dimensional point group conversion unit 12, a three-dimensional point group integration unit 13, a division unit 14, a candidate point group generation unit 15, a lead state calculation unit 16, and an output unit 17.
The three-dimensional image generation unit 11 acquires the imaging data of the lead member 100 from the imaging device 7B. The three-dimensional image generation unit 11 performs arithmetic processing on the imaging data of the lead member 100 based on the phase shift method, and generates three-dimensional image data Da of the lead member 100.
In the embodiment, the control device 9 controls the robot manipulator 6 so that the imaging device 7B images the lead member 100 from a plurality of viewpoints. The three-dimensional image generation unit 11 generates a plurality of three-dimensional image data Da of the lead member 100 as viewed from the respective viewpoints, based on the pieces of imaging data captured from those viewpoints.
The three-dimensional point group conversion unit 12 converts the three-dimensional image data Da generated by the three-dimensional image generation unit 11 into three-dimensional point group data Db. The three-dimensional point group data Db represents the three-dimensional shape of the surface of the lead member 100: it is an aggregate of the measurement points measured by the three-dimensional measuring device 7 on the surface of the lead member 100, the position of each measurement point being specified by its X, Y, and Z coordinates.
As described above, in the embodiment, the three-dimensional image generation unit 11 generates a plurality of three-dimensional image data Da based on the pieces of imaging data captured from the plurality of viewpoints. The three-dimensional point group conversion unit 12 generates, from the plurality of three-dimensional image data Da, a plurality of three-dimensional point group data Db of the lead member 100 as viewed from the respective viewpoints.
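The conversion from three-dimensional image data to point group data can be pictured as mapping each valid pixel of a height map to one (X, Y, Z) measurement point. The sketch below assumes a simple orthographic calibration with a hypothetical pixel pitch; the actual calibration of the device is not specified in the text.

```python
import numpy as np

def height_map_to_point_group(z_map, pixel_pitch_mm=0.05, valid_mask=None):
    """Convert a per-pixel height map (the 3-D image data Da) into an
    N x 3 array of (X, Y, Z) measurement points (the point group Db).
    `pixel_pitch_mm` is an assumed lateral calibration constant."""
    h, w = z_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.column_stack([
        xs.ravel() * pixel_pitch_mm,   # X from pixel column
        ys.ravel() * pixel_pitch_mm,   # Y from pixel row
        z_map.ravel(),                 # Z from measured height
    ])
    if valid_mask is not None:
        pts = pts[valid_mask.ravel()]  # drop pixels with no valid phase
    return pts
```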
The three-dimensional point group integrating unit 13 aligns the plurality of three-dimensional point group data Db generated by the three-dimensional point group conversion unit 12 based on a predetermined algorithm, and generates integrated point group data Dc.
An example of such an algorithm is ICP (Iterative Closest Point) matching. The three-dimensional point group integrating unit 13 aligns the plurality of three-dimensional point group data Db in the three-dimensional space based on an existing algorithm such as ICP matching, and generates the integrated point group data Dc.
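A minimal point-to-point ICP alignment of the kind the integrating unit 13 might use can be sketched as follows. This is a brute-force illustration, not the patent's implementation: real systems add spatial indexing, outlier rejection, and convergence checks.

```python
import numpy as np

def icp(source, target, iterations=10):
    """Minimal point-to-point ICP: match each source point to its
    nearest target point (brute force), solve the best rigid transform
    with the Kabsch/SVD method, apply it, and iterate."""
    src = np.asarray(source, dtype=float).copy()
    tgt_all = np.asarray(target, dtype=float)
    for _ in range(iterations):
        # Nearest-neighbour correspondences (O(N*M), fine for a sketch)
        d2 = ((src[:, None, :] - tgt_all[None, :, :]) ** 2).sum(-1)
        tgt = tgt_all[d2.argmin(axis=1)]
        # Best rotation R and translation t for the matched pairs
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        h = (src - mu_s).T @ (tgt - mu_t)
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflection
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t = mu_t - r @ mu_s
        src = src @ r.T + t
    return src
```

With a small initial misalignment the nearest-neighbour matches are correct and the alignment converges in one or two iterations.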
The dividing unit 14 divides the three-dimensional point group data Db and extracts the lead point group data Dd from the three-dimensional point group data Db. The lead point group data Dd represents the three-dimensional shape of the surface of the lead 110, which includes the outer peripheral surface and the lower end surface of the lead 110. By the division, the measurement points corresponding to the lead 110 are separated from the measurement points corresponding to the background or to noise: the former are extracted, and the latter are removed.
The division is a process of separating a point group of interest, according to a predetermined rule, from the point group constituting the three-dimensional point group data Db. In the embodiment, the dividing unit 14 captures feature points of the three-dimensional point group data Db and classifies the measurement points constituting the three-dimensional point group data Db into a plurality of groups. The dividing unit 14 divides the three-dimensional point group data Db by a predetermined division method.
Examples of the division method include the LCCP (Locally Convex Connected Patches) method and the region growing method. The dividing unit 14 extracts the lead point group data Dd from the three-dimensional point group data Db based on an existing division method such as the LCCP method or the region growing method.
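As a toy stand-in for the division step, the sketch below grows clusters from seed points by Euclidean distance, which is the core of the region growing idea (the LCCP method additionally considers local convexity between patches). Names and parameter values are assumptions:

```python
import numpy as np

def euclidean_clusters(points, radius):
    """Grow a cluster outward from a seed by repeatedly absorbing
    unvisited points closer than `radius` to any point already in the
    cluster; repeat until every point is assigned."""
    points = np.asarray(points, dtype=float)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            d = np.linalg.norm(points - points[i], axis=1)
            near = [j for j in unvisited if d[j] < radius]
            for j in near:
                unvisited.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(sorted(cluster))
    return clusters
```

Applied to lead measurement data, each resulting cluster would correspond to one division point group De.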
In the embodiment, dividing the three-dimensional point group data Db includes dividing the integrated point group data Dc generated by the three-dimensional point group integrating unit 13. The dividing unit 14 divides the integrated point group data Dc generated by the three-dimensional point group integrating unit 13, and extracts the lead point group data Dd from the integrated point group data Dc.
The candidate point group generating unit 15 connects a plurality of division point groups De based on the design values of the lead member 100, the viewpoint of the imaging device 7B at the time of imaging the lead member 100, and predetermined connection conditions, and generates candidate point group data Df as a candidate for the lead 110. Although the lead point group data Dd is extracted from the integrated point group data Dc by the division, it may come out separated into a plurality of division point groups De, in which case it can be unclear which division point groups De correspond to the lead 110 and which do not. The candidate point group generating unit 15 therefore extracts the division point groups De predicted to correspond to the lead 110, based on the design values of the lead member 100, the imaging viewpoint, and the predetermined connection conditions, and connects them to generate the candidate point group data Df as a candidate for the lead 110.
The design values of the lead member 100 are known data. The design values of the lead member 100 are stored in the candidate point group generating unit 15 in advance. As design values of the lead member 100, the length of the lead 110, the relative position of the body 101 and the lead 110, the angle (ideal angle) of the lead 110 with respect to the body 101, and the intervals of the plurality of leads 110 are exemplified.
The candidate point group generating unit 15 can calculate the viewpoint of the imaging device 7B when the lead member 100 is imaged based on a control command output from the control device 9 to the robot manipulator 6 when the lead member 100 is imaged by the imaging device 7B. In addition, when the robot manipulator 6 is provided with a sensor capable of detecting the position and orientation of the robot manipulator 6, the candidate point group generating unit 15 can calculate the viewpoint of the imaging device 7B when imaging the lead member 100, based on the detection value of the sensor.
The connection conditions are determined in advance and stored in the candidate point group generating unit 15. In the embodiment, the connection conditions include a first connection condition and a second connection condition. The first connection condition is that the relative distance between the two division point groups De to be connected is equal to or less than a predetermined distance. The second connection condition is that [α < β × γ] is satisfied, where α is the area of the smallest face of the axis-aligned bounding box (AABB) that surrounds the connected division point groups De, β is the diameter of the lead 110, and γ is a coefficient. The axis-aligned bounding box is a rectangular parallelepiped bounding figure whose edges are parallel to the X, Y, and Z axes of the three-dimensional space defined by the local coordinate system.
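The two connection conditions can be sketched directly: a gap test between the two division point groups, and the [α < β × γ] test on the smallest face of the merged axis-aligned bounding box. The default γ below is an assumed value, not one given in the text:

```python
import numpy as np

def satisfies_connection(points_a, points_b, max_gap, lead_diameter, gamma=1.5):
    """Check both connection conditions for a pair of division point
    groups De: (1) the closest gap between the groups is at most
    `max_gap`; (2) alpha < beta * gamma, where alpha is the area of the
    smallest face of the merged group's AABB and beta is the lead
    diameter. gamma=1.5 is an assumed coefficient."""
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    # First condition: minimum point-to-point distance between the groups
    gap = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).min()
    if gap > max_gap:
        return False
    # Second condition: smallest face of the merged axis-aligned bounding box
    merged = np.vstack([a, b])
    extent = np.sort(merged.max(axis=0) - merged.min(axis=0))  # AABB edges
    alpha = extent[0] * extent[1]  # area = product of two smallest edges
    return bool(alpha < lead_diameter * gamma)
```

Intuitively, two fragments of one thin lead produce a long, thin merged AABB with a tiny smallest face, so the test passes; unrelated fragments produce a fat box and fail it.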
The lead state calculating unit 16 performs principal component analysis on the lead point group data Dd to calculate three-dimensional data of the lead 110. The lead state calculating unit 16 calculates the amount of bending of the lead 110 and the coordinates of the tip portion of the lead 110 in a three-dimensional space defined by the local coordinate system as three-dimensional data of the lead 110.
Principal component analysis is an analysis method that determines principal components by choosing the first principal component so as to maximize its variance, and choosing each subsequent principal component so as to maximize its variance under the constraint of being orthogonal to the preceding principal components.
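For a lead point group, the principal component analysis above amounts to an eigen-decomposition of the 3×3 covariance matrix: the eigenvector of the largest eigenvalue is the lead axis, and the extreme projections onto it locate the two end points, one of which is the tip. A minimal sketch with assumed names:

```python
import numpy as np

def principal_axis(points):
    """Return (axis, end_low, end_high) for a lead point group: the
    first principal component and the two points with extreme
    projections onto it. The sign of `axis` is arbitrary."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues ascending
    axis = eigvecs[:, -1]                   # largest-eigenvalue eigenvector
    proj = centered @ axis
    return axis, pts[np.argmin(proj)], pts[np.argmax(proj)]
```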
In the embodiment, performing the principal component analysis on the lead point group data Dd includes performing the principal component analysis on the candidate point group data Df generated by the candidate point group generating unit 15. The lead state calculation unit 16 performs principal component analysis on the candidate point group data Df generated by the candidate point group generation unit 15 to calculate three-dimensional data of the lead 110.
In the embodiment, the lead state calculating unit 16 extracts, as the lead 110, the principal component of the candidate point group data Df whose principal axis vector and tip coordinates best approximate the axis vector and tip coordinates of the lead 110 predicted from the design values of the lead 110 and the viewpoint of the imaging device 7B at the time of imaging the lead member 100.
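The selection described above can be sketched as scoring each candidate by how far its principal axis and tip deviate from the predicted axis and tip. The score weights and function name below are illustrative assumptions:

```python
import numpy as np

def select_lead_candidate(candidates, predicted_axis, predicted_tip,
                          w_angle=1.0, w_tip=1.0):
    """Pick, among candidate (axis, tip) pairs from the principal
    component analysis, the one closest to the axis and tip predicted
    from the design values and the imaging viewpoint."""
    pa = np.asarray(predicted_axis, dtype=float)
    pa = pa / np.linalg.norm(pa)
    best_i, best_score = -1, np.inf
    for i, (axis, tip) in enumerate(candidates):
        a = np.asarray(axis, dtype=float)
        a = a / np.linalg.norm(a)
        # PCA axis sign is arbitrary, so compare |cos(angle)|
        angle = np.arccos(np.clip(abs(np.dot(a, pa)), -1.0, 1.0))
        dist = np.linalg.norm(np.asarray(tip, dtype=float) - predicted_tip)
        score = w_angle * angle + w_tip * dist
        if score < best_score:
            best_i, best_score = i, score
    return best_i
```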
The output unit 17 outputs the three-dimensional data of the lead 110 calculated by the lead state calculating unit 16 to the control device 9. The image processing result of the arithmetic device 7C includes the three-dimensional data of the lead 110 calculated by the lead state calculating unit 16. The control device 9 controls the robot manipulator 6 based on the three-dimensional data of the lead 110 so that the lead 110 is inserted into the hole 210 of the substrate 200.
(image processing method)
Fig. 9 is a flowchart showing the image processing method of the arithmetic device 7C according to the embodiment. To mount the lead member 100 on the substrate 200, the control device 9 controls the robot manipulator 6 so that the robot hand 5 approaches the component supply member 3. The robot hand 5 moved to the component supply member 3 holds the main body 101 of a lead member 100 disposed on the component supply member 3. After the main body 101 of the lead member 100 is held by the robot hand 5, the control device 9 controls the robot manipulator 6 so that the three-dimensional measuring device 7 measures the lead member 100 held by the robot hand 5. That is, as described with reference to fig. 6, the control device 9 controls the robot manipulator 6 so that the lead member 100 held by the robot hand 5 is disposed above the three-dimensional measuring device 7.
In the embodiment, the control device 9 controls the robot manipulator 6 so that the imaging device 7B images the lead member 100 from a plurality of viewpoints. That is, the control device 9 controls the robot manipulator 6 so as to change the position and angle of the lead member 100 held by the robot hand 5 above the three-dimensional measuring device 7. The imaging device 7B thereby images the lead member 100 from a plurality of different viewpoints.
The three-dimensional image generation unit 11 acquires the image data of the lead member 100 from the imaging device 7B (step S1).
The three-dimensional image generating unit 11 performs an arithmetic process on the captured data of the lead member 100 based on the phase shift method, and generates three-dimensional image data Da of the lead member 100. The three-dimensional image generating unit 11 generates a plurality of three-dimensional image data Da of the lead member 100 viewed from a plurality of viewpoints, based on a plurality of pieces of captured data captured from a plurality of viewpoints, respectively (step S2).
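The patent does not disclose the exact phase shift computation used by the three-dimensional image generating unit 11; a minimal sketch of one common variant, the four-step phase shift, is shown below (the function name and the assumption of exactly four 90-degree shifts are illustrative, not taken from the source):

```python
import numpy as np

def wrapped_phase(images):
    """Four-step phase shift: fringe images captured at 0, 90, 180, 270 degree shifts.

    Returns the wrapped phase in (-pi, pi] for every pixel. Height follows from
    the phase after unwrapping and triangulation, which is not shown here.
    """
    i0, i1, i2, i3 = images
    # For I_k = A + B*cos(phi + k*pi/2): i3 - i1 = 2B*sin(phi), i0 - i2 = 2B*cos(phi)
    return np.arctan2(i3 - i1, i0 - i2)
```

The ambient term A and the fringe contrast B cancel out in the ratio, which is why phase shift profilometry is robust to uneven surface reflectance.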
The three-dimensional point cloud conversion unit 12 converts the three-dimensional image data Da into three-dimensional point cloud data Db. The three-dimensional point group converting unit 12 generates a plurality of three-dimensional point group data Db of the lead member 100 viewed from a plurality of viewpoints, respectively, based on the plurality of three-dimensional image data Da generated in step S2 (step S3).
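The patent does not specify how the three-dimensional point group conversion unit 12 performs the conversion. One common approach, sketched here purely for illustration, back-projects each pixel of a range image through a pinhole camera model; the intrinsic parameters fx, fy, cx, cy are hypothetical:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into an (N, 3) point cloud (pinhole model)."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]          # pixel row (v) and column (u) grids
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no valid measurement
```

Pixels where the phase measurement failed are commonly encoded as zero depth, so filtering on z > 0 removes them from the resulting point group.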
Fig. 10 is a diagram schematically showing an example of the three-dimensional point cloud data Db according to the embodiment. When the lead member 100 is photographed from a plurality of viewpoints, the three-dimensional point group conversion unit 12 generates a plurality of three-dimensional point group data Db of the lead member 100 viewed from the plurality of viewpoints, respectively. For example, when the lead member 100 is photographed from 3 viewpoints, as shown in fig. 10, the three-dimensional point group conversion unit 12 generates three-dimensional point group data Db1 of the lead member 100 viewed from a first viewpoint, three-dimensional point group data Db2 of the lead member 100 viewed from a second viewpoint, and three-dimensional point group data Db3 of the lead member 100 viewed from a third viewpoint.
The three-dimensional point cloud integrating unit 13 aligns the plurality of three-dimensional point cloud data Db generated in step S3 based on a predetermined algorithm to generate integrated point cloud data Dc (step S4).
Fig. 11 is a diagram schematically showing an example of the integrated point group data Dc according to the embodiment. When the 3 three-dimensional point group data Db (Db1, Db2, Db3) are generated as shown in fig. 10, the three-dimensional point group integrating unit 13 aligns the 3 three-dimensional point group data Db generated in step S3 in a three-dimensional space based on an existing algorithm such as the ICP matching algorithm, and generates the integrated point group data Dc.
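The ICP matching mentioned above can be sketched as follows. This is a minimal brute-force point-to-point variant written for illustration; a production system would use an optimized library implementation with robust correspondence rejection rather than this exhaustive nearest-neighbour search:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch: least-squares rotation R and translation t mapping src onto dst."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Iteratively re-match nearest neighbours and re-fit the rigid transform."""
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(1)]          # nearest neighbour in dst per point
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
    return cur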
The dividing unit 14 divides the integrated point group data Dc generated in step S4, and extracts the lead point group data Dd from the integrated point group data Dc (step S5).
Fig. 12 is a diagram schematically showing an example of the lead point group data Dd according to the embodiment. The dividing unit 14 divides the integrated point group data Dc based on an existing division method such as the LCCP method or the region growing method, and extracts the lead point group data Dd from the integrated point group data Dc. As shown in fig. 12, the lead point group data Dd is extracted by the division. Further, measurement points corresponding to the background or noise are removed by the division.
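The role of the division step, separating connected surface regions and discarding isolated noise points, can be illustrated with a simple Euclidean clustering sketch. The patent names the LCCP and region growing methods; this stand-in (function name and thresholds are assumptions) only shows how small isolated clusters are removed:

```python
import numpy as np

def euclidean_clusters(pts, radius, min_size):
    """Group points whose neighbours lie within `radius`; drop tiny clusters (noise)."""
    n = len(pts)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    adj = d2 <= radius * radius
    unvisited = set(range(n))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        stack, cluster = [seed], [seed]
        while stack:                      # flood-fill over the adjacency graph
            i = stack.pop()
            for j in list(unvisited):
                if adj[i, j]:
                    unvisited.remove(j)
                    stack.append(j)
                    cluster.append(j)
        if len(cluster) >= min_size:      # clusters below min_size are noise
            clusters.append(pts[cluster])
    return clusters
```

A stray measurement point far from any surface never reaches `min_size` and is discarded, which mirrors how the division removes background and noise from the integrated point group data Dc.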
The lead point group data Dd is extracted from the integrated point group data Dc by the division in step S5, but the lead point group data Dd may be separated into a plurality of divided point groups De. The candidate point group generating unit 15 connects the plurality of divided point groups De based on the design value of the lead member 100, the viewpoint of the imaging device 7B at the time of imaging the lead member 100, and a predetermined connection condition, and generates candidate point group data Df as candidates of the lead 110 (step S6).
Fig. 13 is a diagram for explaining the division point group De according to the embodiment. As shown in fig. 13, the lead point group data Dd is extracted from the integrated point group data Dc by division, but the lead point group data Dd may be separated into a plurality of divided point groups De. In the example shown in fig. 13, the division point group De includes a first division point group De1, a second division point group De2, a third division point group De3, and a fourth division point group De4. When the lead point group data Dd is divided into a plurality of divided point groups De by division, it may be unclear which divided point group De corresponds to the lead 110 and which divided point group De does not correspond to the lead 110. Therefore, the candidate point group generating unit 15 extracts a plurality of divided point groups De predicted to correspond to the lead 110 based on the design value of the lead member 100, the viewpoint of the imaging device 7B at the time of imaging the lead member 100, and a predetermined connection condition, and connects the plurality of divided point groups De to generate candidate point group data Df as candidates for the lead 110.
As described above, in the embodiment, the predetermined connection condition includes a first connection condition and a second connection condition. The first connection condition is that the relative distance between the two division point groups De to be connected is equal to or less than a predetermined distance. The second connection condition is that [α < β × γ] is satisfied, where α is the area of the smallest face of the axis-aligned bounding box enclosing the connected division point groups De, β is the diameter of the lead 110, and γ is a coefficient.
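The two connection conditions can be sketched in code. The patent leaves "relative distance" and the exact bounding box construction unspecified, so this sketch assumes the minimum point-to-point distance between groups and a world-axis-aligned box; the function name and parameters are illustrative:

```python
import numpy as np

def satisfies_connection(groups, max_gap, beta, gamma):
    """groups: list of (N_i, 3) arrays, ordered along the predicted lead axis."""
    # First condition: each pair of consecutive groups is closer than max_gap
    # (minimum point-to-point distance is assumed here as the "relative distance").
    for a, b in zip(groups, groups[1:]):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        if np.sqrt(d2.min()) > max_gap:
            return False
    # Second condition: the smallest face of the axis-aligned bounding box
    # around the merged groups must satisfy alpha < beta * gamma.
    pts = np.vstack(groups)
    ext = np.sort(pts.max(0) - pts.min(0))  # box edge lengths, ascending
    alpha = ext[0] * ext[1]                 # area of the smallest face
    return alpha < beta * gamma
```

Intuitively, a straight lead yields a long, thin box whose smallest face stays tiny; appending a group that sits off the lead axis widens the box and violates the second condition, as in the De4 case of fig. 13.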
The two division point groups De to be connected are determined based on the design value of the lead member 100 and the viewpoint of the imaging device 7B at the time of imaging the lead member 100. The candidate point group generating unit 15 predicts the actual extending direction of the lead 110 based on the design value of the lead member 100 and the viewpoint of the imaging device 7B at the time of imaging the lead member 100. The candidate point group generating unit 15 determines a plurality of divided point groups De arranged along the extending direction of the actual lead 110 as the connection target.
In the example shown in fig. 13, the first division point group De1, the second division point group De2, the third division point group De3, and the fourth division point group De4 are arranged along the predicted extending direction of the actual lead 110. Therefore, the first division point group De1, the second division point group De2, the third division point group De3, and the fourth division point group De4 are determined as connection targets. More specifically, the first division point group De1 and the second division point group De2 adjacent to the first division point group De1 are determined as connection targets. The second division point group De2 and the third division point group De3 adjacent to the second division point group De2 are determined as connection targets. The third division point group De3 and the fourth division point group De4 adjacent to the third division point group De3 are determined as connection targets.
In the example shown in fig. 13, focusing on the first connection condition, the first division point group De1 and the second division point group De2 are close to each other, and the relative distance between them is equal to or less than the predetermined distance. The second division point group De2 and the third division point group De3 are likewise close to each other, and the relative distance between them is equal to or less than the predetermined distance. On the other hand, the third division point group De3 is separated from the fourth division point group De4, and the relative distance between them is longer than the predetermined distance. Therefore, in the example shown in fig. 13, it is determined that the fourth division point group De4 does not satisfy the first connection condition.
In the example shown in fig. 13, focusing on the second connection condition, when the first division point group De1, the second division point group De2, and the third division point group De3 are connected, the area α of the bottom surface, which is the smallest face Aa of the axis-aligned bounding box Ba enclosing the connected 3 division point groups De (De1, De2, De3), is less than [β × γ]. On the other hand, when the first division point group De1, the second division point group De2, the third division point group De3, and the fourth division point group De4 are connected, the area α of the bottom surface, which is the smallest face Ab of the axis-aligned bounding box Bb enclosing the connected 4 division point groups De (De1, De2, De3, De4), is greater than [β × γ]. Therefore, in the example shown in fig. 13, it is determined that the fourth division point group De4 also does not satisfy the second connection condition.
The extending direction of the connected 3 division point groups De (De1, De2, De3) coincides with the predicted extending direction of the actual lead 110. On the other hand, the extending direction of the connected 4 division point groups De (De1, De2, De3, De4) does not coincide with the predicted extending direction of the actual lead 110.
In this way, in the case of the example shown in fig. 13, the candidate point group generating unit 15 predicts that the first division point group De1, the second division point group De2, and the third division point group De3 correspond to the lead 110, and predicts that the fourth division point group De4 does not correspond to the lead 110.
The candidate point group generating unit 15 extracts the plurality of division point groups De (De1, De2, De3) predicted to correspond to the lead 110, connects the plurality of division point groups De (De1, De2, De3), and generates the candidate point group data Df as a candidate for the lead 110 (step S6).
In the example shown in fig. 13, the candidate point group data Df is composed of a first division point group De1, a second division point group De2, and a third division point group De 3.
The lead state calculating unit 16 performs principal component analysis on the candidate point group data Df generated in step S6 to calculate three-dimensional data of the lead 110 (step S7).
In the embodiment, the lead state calculating unit 16 extracts, as the lead 110, the principal component whose principal axis vector and tip coordinates approximate the principal axis vector and tip coordinates of the lead 110 predicted based on the design value of the lead 110 and the viewpoint of the imaging device 7B at the time of imaging the lead member 100. As shown in fig. 13, the predicted principal axis vector of the lead 110 lies along the predicted extending direction of the actual lead 110.
The lead state calculating unit 16 calculates the amount of bending of the lead 110 and the coordinates of the tip portion of the lead 110 in a three-dimensional space defined by the local coordinate system as three-dimensional data of the lead 110.
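As an illustrative sketch of step S7, principal component analysis via SVD yields the principal axis vector of the candidate point group, and the tip coordinate and a bending measure follow from projections onto that axis. The patent does not disclose its exact formulas; the function name and the use of the maximum perpendicular deviation as the bending amount are assumptions:

```python
import numpy as np

def lead_axis_and_tip(points):
    """Principal axis (first PCA component), tip coordinate, and a bend measure."""
    centroid = points.mean(0)
    centered = points - centroid
    # First right-singular vector of the centered cloud = principal axis
    # (sign of the axis is arbitrary; SVD does not fix its direction).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    proj = centered @ axis
    tip = points[proj.argmax()]            # extreme point along the axis
    # Bend: largest perpendicular deviation of any point from the axis line.
    perp = centered - np.outer(proj, axis)
    bend = np.linalg.norm(perp, axis=1).max()
    return axis, tip, bend
```

A straight lead gives a bend of essentially zero, while a lead pushed sideways at mid-length reports the lateral deviation, which is the kind of three-dimensional data the control device 9 needs to plan the insertion.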
The output unit 17 outputs the three-dimensional data of the lead 110 calculated in step S7 to the control device 9 (step S8).
The control device 9 controls the robot manipulator 6 based on the three-dimensional data of the lead 110 so that the lead 110 is inserted into the hole 210 of the substrate 200.
(insertion of lead wire)
Fig. 14 and 15 are views for explaining an operation of inserting the lead 110 of the lead member 100 according to the embodiment into the hole 210 of the substrate 200. In the example shown in fig. 14 and 15, the lead member 100 has two leads 110. The lead 110 includes a first lead 111 and a second lead 112. The holes 210 of the substrate 200 include a first hole 211 into which the first lead 111 is inserted, and a second hole 212 into which the second lead 112 is inserted.
In the embodiment, the robot manipulator 6 is a multi-joint robot. Therefore, the robot manipulator 6 can tilt the lead member 100 with respect to the upper surface of the substrate 200. The robot manipulator 6 can arbitrarily adjust the angle formed by the upper surface of the substrate 200 and the lower surface of the main body 101 held by the robot hand 5.
As shown in fig. 14, for example, when the second lead 112 is bent, the control device 9 controls the robot manipulator 6, based on the image processing result of the arithmetic device 7C, so that the second lead 112 is inserted into the second hole 212 before the first lead 111 is inserted into the first hole 211. The robot manipulator 6 inserts the second lead 112 into the second hole 212 while tilting the lead member 100.
After the second lead 112 is inserted into the second hole 212, the controller 9 moves the lead member 100 in the -X direction until the tip of the first lead 111 faces the first hole 211 of the substrate 200. Thereby, the second lead 112 is straightened. After the second lead 112 is straightened and the first lead 111 faces the first hole 211 of the substrate 200, the controller 9 controls the robot manipulator 6 so that the first lead 111 is inserted into the first hole 211 while the second lead 112 remains disposed in the second hole 212, as shown in fig. 15.
(computer System)
Fig. 16 is a block diagram showing a computer system 1000 according to an embodiment. The arithmetic device 7C and the control device 9 each include the computer system 1000. The computer system 1000 has: a processor 1001 such as a CPU (Central Processing Unit); a main memory 1002 including a nonvolatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory); a storage 1003; and an interface 1004 including an input/output circuit. The functions of the arithmetic device 7C and the control device 9 are each stored in the storage 1003 as a computer program. The processor 1001 reads the computer program from the storage 1003, loads it into the main memory 1002, and executes the above-described processing in accordance with the program. The computer program may also be delivered to the computer system 1000 via a network.
The computer program can cause the computer system 1000 to execute the following processing according to the above-described embodiment: irradiating the lead member 100 with stripe pattern light in a state where the body 101 of the lead member 100 is held by the robot hand 5 provided at the distal end portion of the robot manipulator 6; imaging, from a predetermined viewpoint, the lead member 100 irradiated with the stripe pattern light; performing image processing on the captured data of the lead member 100; and controlling the robot manipulator 6 based on the image processing result so that the lead 110 of the lead member 100 is inserted into the hole 210 of the substrate 200. In the image processing, the computer program can further cause the computer system 1000 to execute: performing arithmetic processing on the captured data of the lead member 100 based on a phase shift method to generate three-dimensional image data Da; converting the three-dimensional image data Da into three-dimensional point group data Db; dividing the three-dimensional point group data Db and extracting, from the three-dimensional point group data Db, lead point group data Dd indicating the three-dimensional shape of the surface of the lead 110; and performing principal component analysis on the lead point group data Dd to calculate three-dimensional data of the lead 110.
(Effect)
As described above, according to the embodiment, the component mounting device 1 includes the robot manipulator 6, the manipulator 5, and the three-dimensional measuring device 7. The three-dimensional measurement device 7 can recognize the amount of bending of the lead 110 and the coordinates of the tip portion of the lead 110 in the three-dimensional space defined by the local coordinate system as the state of the lead 110. The robot manipulator 6 and the robot hand 5 can insert the lead 110 into the hole 210 of the substrate 200 in accordance with the state of the lead 110. Therefore, a decrease in productivity of the component mounting apparatus 1 can be suppressed.
The arithmetic device 7C of the three-dimensional measurement device 7 includes: a three-dimensional image generation unit 11 that generates three-dimensional image data Da by performing arithmetic processing on the image data of the lead member 100 based on a phase shift method; a three-dimensional point group conversion unit 12 for converting the three-dimensional image data Da into three-dimensional point group data Db; a dividing unit 14 that divides the three-dimensional point group data Db and extracts lead point group data Dd indicating the three-dimensional shape of the surface of the lead 110 from the three-dimensional point group data Db; and a lead state calculation unit 16 for performing principal component analysis on the lead point group data Dd and calculating three-dimensional data of the lead 110. The division by the dividing unit 14 removes the measurement points corresponding to the background or noise from the three-dimensional point cloud data Db. The principal component analysis by the lead state calculation unit 16 calculates a principal axis vector indicating the state of the lead 110. Therefore, the three-dimensional measurement device 7 can appropriately recognize the state of the lead wire 110.
When the lead point group data Dd is divided into a plurality of divided point groups De by division, the candidate point group generating unit 15 connects the plurality of divided point groups De based on the design value of the lead member 100, the viewpoint of the imaging device 7B at the time of imaging the lead member 100, and a predetermined connection condition, and generates candidate point group data Df as candidates for the lead 110. The three-dimensional measurement device 7 can appropriately recognize the state of the lead 110 based on the candidate point group data Df.
The three-dimensional image generating unit 11 generates a plurality of three-dimensional image data Da based on a plurality of pieces of captured data captured from a plurality of viewpoints, respectively. The three-dimensional point cloud converting unit 12 generates a plurality of three-dimensional point cloud data Db based on the plurality of three-dimensional image data Da. The three-dimensional point group integrating unit 13 aligns the plurality of three-dimensional point group data based on a predetermined algorithm to generate integrated point group data Dc. Thus, even if the positioning accuracy of the robot manipulator 6 is insufficient in the imaging of the lead member 100 by the imaging device 7B, the error of the three-dimensional point group data Db due to the positioning accuracy of the robot manipulator 6 can be absorbed by generating the integrated point group data Dc.

Claims (5)

1. A component mounting device is characterized by comprising:
a robot manipulator;
a manipulator provided at a distal end portion of the robot manipulator and holding a main body of a lead member;
a projection device that irradiates stripe pattern light to the lead member in a state where the main body is held by the manipulator;
an imaging device that images the lead member irradiated with the stripe pattern light from a predetermined viewpoint;
an arithmetic device that performs image processing on the image data of the lead member captured by the imaging device; and
a control device for controlling the robot manipulator based on the image processing result of the arithmetic device so that the lead of the lead component is inserted into the hole of the substrate,
the arithmetic device includes:
a three-dimensional image generation unit that generates three-dimensional image data by performing arithmetic processing on the captured data of the lead member based on a phase shift method;
a three-dimensional point group conversion unit that converts the three-dimensional image data into three-dimensional point group data;
a dividing unit that divides the three-dimensional point group data and extracts, from the three-dimensional point group data, lead point group data indicating a three-dimensional shape of a surface of the lead; and
and a lead state calculation unit for performing principal component analysis on the lead point group data to calculate three-dimensional data of the lead.
2. Component mounting apparatus according to claim 1,
by the division, the lead point group data is separated into a plurality of divided point groups,
the computing device includes a candidate point group generating unit that generates candidate point group data as candidates for the lead by connecting a plurality of the divided point groups based on a design value of the lead member, a viewpoint of the imaging device at the time of imaging the lead member, and a predetermined connection condition,
performing principal component analysis on the lead point group data includes performing principal component analysis on the candidate point group data.
3. The component mounting apparatus according to claim 1 or 2,
the control device controls the robot manipulator so that the photographing device photographs the lead members from a plurality of viewpoints, respectively,
the three-dimensional image generation unit generates a plurality of three-dimensional image data based on a plurality of pieces of captured data captured from a plurality of viewpoints,
the three-dimensional point group conversion unit generates a plurality of three-dimensional point group data based on a plurality of three-dimensional image data,
the arithmetic device includes a three-dimensional point group integrating unit that aligns a plurality of three-dimensional point group data based on a predetermined algorithm to generate integrated point group data,
segmenting the three-dimensional point cloud data includes segmenting the aggregate point cloud data.
4. Component mounting apparatus according to any one of claims 1 to 3,
the lead state calculation unit calculates a bending amount of the lead and coordinates of a tip portion of the lead in a three-dimensional space as three-dimensional data of the lead.
5. A component mounting method, comprising:
irradiating a stripe pattern light to a lead member in a state where a body of the lead member is held by a manipulator provided at a distal end portion of a robot manipulator;
shooting the lead member irradiated with the stripe pattern light from a predetermined viewpoint;
performing image processing on the shot data of the lead component; and
controlling the robot manipulator to insert the lead of the lead member into the hole of the substrate based on the image processing result,
the image processing includes:
performing arithmetic processing on the shooting data of the lead component based on a phase shift method to generate three-dimensional image data;
converting the three-dimensional image data into three-dimensional point group data;
segmenting the three-dimensional point cloud data, and extracting lead point cloud data representing a three-dimensional shape of the surface of the lead from the three-dimensional point cloud data; and
and performing principal component analysis on the lead point group data, and calculating three-dimensional data of the lead.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-164065 2021-10-05
JP2021164065A JP2023054992A (en) 2021-10-05 2021-10-05 Component mounting device and component mounting method

Publications (1)

Publication Number Publication Date
CN115939904A true CN115939904A (en) 2023-04-07

Family

ID=85830945

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211206617.XA Pending CN115939904A (en) 2021-10-05 2022-09-30 Component mounting device and component mounting method

Country Status (2)

Country Link
JP (1) JP2023054992A (en)
CN (1) CN115939904A (en)

Also Published As

Publication number Publication date
JP2023054992A (en) 2023-04-17


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination