CN114945450A - Robot system

Robot system

Info

Publication number
CN114945450A
CN114945450A (application CN202180008788.9A)
Authority
CN
China
Prior art keywords
robot
image
coordinate system
unit
vision sensor
Prior art date
Legal status
Pending
Application number
CN202180008788.9A
Other languages
Chinese (zh)
Inventor
并木勇太
Current Assignee
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Publication of CN114945450A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1692 Calibration of manipulator
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/39 Robotics, robotics to robotics hand
    • G05B 2219/39045 Camera on end effector detects reference pattern
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/39 Robotics, robotics to robotics hand
    • G05B 2219/39057 Hand eye calibration, eye, camera on hand, end effector

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Provided is a robot system capable of appropriately correcting the operation of a robot. The robot system includes: a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a second image of the object at a position reached by moving the robot from the predetermined position by a predetermined distance; a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other; a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the first image and the calibration data; a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the second image; and a determination unit that determines whether or not a difference between the first position and the second position is within a predetermined range.

Description

Robot system
Technical Field
The present invention relates to a robot system.
Background
Conventionally, a robot system includes a vision sensor such as an imaging device, recognizes the position of an object with the vision sensor, and performs work such as processing or machining of the object. The image of the object is captured by an imaging device mounted near the hand of the robot or by an imaging device installed around the robot.
Such a robot system detects an object from a captured image, and controls the operation of the robot so that the robot performs a task with respect to the position of the detected object.
The robot system uses the calibration data of the imaging device to convert the position of the detected object (the position in the image coordinate system, or the position in the sensor coordinate system as seen from the vision sensor) into the position of the workpiece as seen from the robot (the position in the robot coordinate system) (see, for example, Patent Document 1). The robot system can thus correct the motion of the robot with respect to the detected position of the object and cause the robot to perform the work.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 4-35885
Disclosure of Invention
Problems to be solved by the invention
In such a robot system, the motion correction of the robot using the vision sensor may not be performed correctly because of errors in various settings. In that case the robot performs an unintended motion, so it is desirable to be able to correct the motion of the robot appropriately.
Means for solving the problems
The robot system according to the present disclosure includes: a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a second image of the object at a position reached by moving the robot from the predetermined position by a predetermined distance; a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other; a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the first image and the calibration data; a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the second image; and a determination unit that determines whether or not a difference between the first position and the second position is within a predetermined range.
The robot system according to the present disclosure includes: a vision sensor that captures a plurality of images of the object at a plurality of positions reached by moving the robot by a predetermined distance; a calibration data storage unit that stores calibration data in which a robot coordinate system serving as a reference for operation control of the robot and an image coordinate system serving as a reference for measurement processing by the vision sensor are associated with each other; a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the plurality of images and the calibration data; a second acquisition unit that acquires a second position of the object in the robot coordinate system based on the plurality of images; and a determination unit that determines whether or not a difference between the first position and the second position is within a predetermined range.
The robot control method according to the present disclosure includes the steps of: setting the action correction of the robot; capturing a first image of an object at a first position of the robot; capturing a second image of the object at a second position reached by moving the robot a predetermined distance from the first position; acquiring a first position of the object in a robot coordinate system based on the first image and the calibration data; acquiring a second position of the object in a robot coordinate system based on the first image and the second image; determining whether a difference between the first position and the second position is within a predetermined range; estimating a cause of an abnormality in the motion correction of the robot based on the first position, the second position, the position of the robot corresponding to the first position, and the position of the robot corresponding to the second position when it is determined that the difference is outside the predetermined range; and changing the setting of the motion correction of the robot based on the estimated cause of the abnormality of the motion correction of the robot.
The robot system according to the present disclosure includes: a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a second image of the object at a position reached by moving the robot from the predetermined position by a predetermined distance; a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other; a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the first image and the calibration data; a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the second image; and a determination unit that determines whether or not a relationship between the first position and the second position is within a predetermined range based on the first position and the second position.
The robot system according to the present disclosure includes: a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a plurality of images of the object at a plurality of positions reached by moving the robot from the predetermined position; a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other; a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the calibration data and a first image of the object; a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the plurality of images; and a determination unit that determines whether or not a relationship between the first position and the plurality of positions is within a predetermined range, based on the first position and the plurality of positions.
ADVANTAGEOUS EFFECTS OF INVENTION
According to the present invention, the operation of the robot can be appropriately corrected.
Drawings
Fig. 1 is a diagram showing a configuration of a robot system.
Fig. 2 is a diagram showing the configurations of the vision sensor control device and the robot control device.
Fig. 3 is a diagram illustrating movement of a vision sensor.
Fig. 4 is a diagram showing the second position Ps.
Fig. 5 is a diagram showing the relationship between the first position Pw and the second position Ps.
Fig. 6 is a diagram illustrating movement of the vision sensor.
Fig. 7 is a diagram showing a calibration jig for calibration.
Fig. 8 is a diagram showing calibration performed using the calibration jig.
Fig. 9 is a flowchart showing a process of the vision sensor control apparatus.
Detailed Description
Next, an example of an embodiment of the present invention will be described.
Fig. 1 is a diagram showing a configuration of a robot system 1.
As shown in fig. 1, the robot system 1 includes a robot 2, an arm 3, a vision sensor 4, a vision sensor control device 5, and a robot control device 6.
The robot system 1 recognizes the position of the object W based on the image of the object W captured by the vision sensor 4, for example, and performs a task such as processing or machining of the object W.
A hand or a tool is attached to a tip end of the arm 3 of the robot 2. The robot 2 performs operations such as processing and machining of the object W under the control of the robot controller 6. Further, a vision sensor 4 is attached to a distal end portion of the arm 3 of the robot 2.
The vision sensor 4 images the object W under the control of the vision sensor control device 5. The vision sensor 4 may be a two-dimensional camera having an imaging element constituted by a CCD (Charge Coupled Device) image sensor and an optical system including a lens, or may be a stereo camera capable of three-dimensional measurement.
The robot controller 6 executes an operation program of the robot 2 to control the operation of the robot 2. At this time, the robot controller 6 corrects the operation of the robot 2 with respect to the position of the object W detected by the vision sensor controller 5 so that the robot 2 performs a predetermined operation.
Fig. 2 is a diagram showing the configurations of the vision sensor control device 5 and the robot control device 6. The vision sensor control device 5 includes a storage unit 51 and a control unit 52.
The storage unit 51 is a storage device such as a ROM (Read Only Memory) that stores an OS (Operating System), application programs, and the like, a RAM (Random Access Memory), or a hard disk drive or SSD (Solid State Drive) that stores other various information.
The storage unit 51 includes a model pattern storage unit 511 and a calibration data storage unit 512.
The model pattern storage unit 511 stores a model pattern obtained by modeling the image of the object W, for example, a model pattern representing the feature of the image of the object W.
The calibration data storage unit 512 stores calibration data in which a robot coordinate system serving as a reference for operation control of the robot 2 and an image coordinate system serving as a reference for measurement processing by the vision sensor 4 are associated with each other. Various methods have been proposed for the form of the calibration data and the method of obtaining the form of the calibration data, and any of these methods can be used.
The control unit 52 is a processor such as a CPU (Central Processing Unit), and functions as a first acquisition unit 521, a second acquisition unit 522, a determination unit 523, an estimation unit 524, a calibration unit 525, and a notification unit 526 by executing the programs stored in the storage unit 51.
The first acquisition unit 521 acquires a first position Pw of the object W in the robot coordinate system based on the calibration data and the first image of the object W captured by the vision sensor 4.
The second acquisition unit 522 acquires the second position Ps of the object W in the robot coordinate system based on the first image and the second image of the object W.
The determination unit 523 determines whether or not the difference between the first position Pw and the second position Ps is within a predetermined range.
When the determination unit 523 determines that the difference is outside the predetermined range, the estimation unit 524 estimates the cause of the abnormality in the motion correction of the robot 2 based on the first position Pw, the second position Ps, the position of the robot 2 corresponding to the first position Pw, and the position of the robot 2 corresponding to the second position Ps.
When the determination unit 523 determines that the difference is outside the predetermined range, the vision sensor 4 captures a plurality of images of the object W at a plurality of positions. Then, the estimation unit 524 estimates the cause of the abnormality in the motion correction of the robot 2 based on the plurality of images and the plurality of positions.
The calibration unit 525 performs calibration of the vision sensor 4. Details of the calibration of the vision sensor 4 will be described later.
When the determination unit 523 determines that the difference is outside the predetermined range, the notification unit 526 issues a notification that there is an abnormality in the motion correction of the robot 2.
The notification unit 526 also reports the cause of the abnormality estimated by the estimation unit 524, for example by displaying a message indicating the abnormality on a display unit (not shown).
The robot controller 6 includes an operation control unit 61. The operation control unit 61 executes an operation program of the robot 2 to control the operation of the robot 2.
Next, the detailed operation of the vision sensor control device 5 will be described with reference to figs. 3 to 8.
First, the calibration unit 525 calibrates the vision sensor 4, for example as follows.
Fig. 7 is a diagram showing the calibration jig 7 for calibration. Fig. 8 is a diagram illustrating calibration performed using the calibration jig 7.
The calibration jig 7 has a dot pattern as shown in fig. 7 as a pattern that can be recognized by the vision sensor 4. The dot pattern is composed of large dots and small dots arranged in a lattice shape. The dots having a large size are arranged in an L shape, and indicate the coordinate system of the calibration jig 7.
The calibration unit 525 sets in advance the position of the calibration jig 7 as viewed from the robot coordinate system.
In addition, the spacing of the dots of the calibration jig 7 is known from the design drawings of the calibration jig and the like. The calibration unit 525 therefore takes the specified dot spacing as a known value and stores it as a part of the calibration data in the calibration data storage unit 512.
The vision sensor 4 photographs the calibration jig 7. The calibration unit 525 acquires the position of the robot when the calibration jig 7 is imaged.
The operation control unit 61 moves the robot 2 in the vertical direction with respect to the calibration jig 7, and the vision sensor 4 images the calibration jig 7 again. The calibration unit 525 also acquires the position of the robot when the calibration jig 7 is imaged.
The calibration unit 525 performs calibration of the vision sensor 4 using the positions of the plurality of dots of the calibration jig 7 in the image coordinate system of each image and their positions in the robot coordinate system.
By performing the calibration of the vision sensor 4, the external parameters and the internal parameters of the vision sensor 4 are obtained.
Here, the external parameters are information on the position and orientation of the vision sensor, and the internal parameters are information on the conditions of the optical system of the vision sensor, such as the focal length of the lens, lens distortion, and the size of the light receiving element.
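As a rough illustration of this calibration step, the external and internal parameters can be estimated from the detected dot centres with a standard camera-calibration routine. The disclosure does not name a particular solver; the use of OpenCV, the dot spacing, the grid size, and all identifiers below are assumptions made only for this sketch.
```python
import numpy as np
import cv2

# Assumed jig layout: dots on a regular grid with known spacing (taken from
# the design drawings of the calibration jig); the values are placeholders.
DOT_SPACING = 10.0          # mm
GRID_W, GRID_H = 7, 5       # dots per row / column

# 3D dot positions in the coordinate system of the calibration jig (Z = 0 plane).
JIG_POINTS = np.array(
    [[x * DOT_SPACING, y * DOT_SPACING, 0.0]
     for y in range(GRID_H) for x in range(GRID_W)], dtype=np.float32)

def calibrate_from_jig(dot_centres_per_image, image_size):
    """dot_centres_per_image: list of (N, 2) pixel arrays, one per captured
    image, ordered to match JIG_POINTS. image_size: (width, height).
    Returns internal and per-view external parameters."""
    object_points = [JIG_POINTS] * len(dot_centres_per_image)
    image_points = [np.asarray(p, dtype=np.float32) for p in dot_centres_per_image]
    # Internal parameters: camera matrix and lens distortion.
    # External parameters: rotation/translation of the jig w.r.t. the camera per view.
    rms_error, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    return camera_matrix, dist_coeffs, rvecs, tvecs
```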
Fig. 3 is a diagram illustrating the movement of the vision sensor 4. First, the operation control unit 61 of the robot controller 6 moves the robot 2 so that the object W enters the imaging range, and the vision sensor 4 captures a first image of the object W. At this time, the first acquisition unit 521 stores the position of the robot 2 at the time of photographing.
The first acquisition unit 521 detects the object W from a predetermined range of the captured first image by using the model pattern stored in the model pattern storage unit 511. Thereby, the first acquisition unit 521 acquires the position and orientation of the object W in the image coordinate system.
The first acquisition unit 521 converts the position of the object W in the image coordinate system into a three-dimensional position Pw in the robot coordinate system (world coordinate system) based on the position of the robot 2 at the time of imaging and the calibration data stored in the calibration data storage unit 512. In this way, the first acquisition unit 521 acquires the first position Pw of the object W in the robot coordinate system.
Here, when a two-dimensional camera is used to detect the object W, the first acquisition unit 521 acquires the first position Pw, which is a three-dimensional position, on the assumption that the detection region of the object W lies on a certain plane (the correction plane). The correction plane is typically set by a user using an operation panel (not shown) or the like connected to the vision sensor control device 5.
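The conversion from a detected pixel to the first position Pw on the correction plane can be sketched as follows, assuming a pinhole camera model and that the camera pose in the robot coordinate system has been composed from the calibration data and the stored robot position; the function and variable names are illustrative, not taken from the disclosure.
```python
import numpy as np

def image_to_first_position(u, v, K, T_robot_cam, plane_point, plane_normal):
    """Convert a detected pixel (u, v) into the first position Pw by
    intersecting the line of sight with the correction plane.
    K: 3x3 camera matrix; T_robot_cam: 4x4 camera pose in the robot frame."""
    # Line of sight in camera coordinates (undistorted pixel assumed).
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Express the line of sight in the robot coordinate system.
    R, t = T_robot_cam[:3, :3], T_robot_cam[:3, 3]
    origin, direction = t, R @ ray_cam
    # Intersect with the correction plane: origin + s * direction lies on the plane.
    s = np.dot(plane_normal, plane_point - origin) / np.dot(plane_normal, direction)
    return origin + s * direction   # Pw in the robot coordinate system
```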
In general, the motion of the robot 2 is corrected using the first position Pw of the object W. Two types of motion correction are used: absolute position correction and relative position correction.
In the absolute position correction, the operation control unit 61 moves the TCP (Tool Center Point) 8 (see fig. 8) of the robot 2 to the first position Pw.
In the relative position correction, the control unit 52 determines in advance the position of the object to be used as the reference (the reference position), and calculates the difference between the reference position and the first position Pw as the correction amount. The robot controller 6 then adds the obtained correction amount to the preset motion of the robot 2 to correct the motion of the robot 2.
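A minimal sketch of the relative position correction just described, with assumed identifiers:
```python
import numpy as np

def relative_correction(reference_position, first_position_pw, preset_target):
    """Correction amount = detected first position Pw - reference position;
    the preset robot target is shifted by this amount."""
    correction = np.asarray(first_position_pw) - np.asarray(reference_position)
    return np.asarray(preset_target) + correction
```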
When the motion correction is set up as described above, it may not work correctly because of errors in the various settings.
Therefore, in order to determine an error in the operation correction of the robot 2, the robot system 1 according to the present embodiment executes the following processing.
As shown in fig. 3, the operation control unit 61 of the robot controller 6 moves the robot 2 by a predetermined distance D from the reference position in the direction orthogonal to the optical axis of the vision sensor 4, and the vision sensor 4 captures a second image of the object W at the position where the robot 2 has moved by the predetermined distance D.
Here, the predetermined distance D is a distance from the reference position over which the robot moves sufficiently far while the object W can still be detected by the vision sensor 4.
Fig. 4 is a diagram showing the second position Ps. Fig. 5 is a diagram showing the relationship between the first position Pw and the second position Ps.
The second acquisition unit 522 detects the object in each of the first image, obtained by imaging the object W at the reference position, and the second image, obtained by imaging the object W at the position reached by moving the robot 2 by the predetermined distance D. The position of the object in the image coordinate system is thus obtained at each of the two viewpoints A and B.
Then, the second acquisition unit 522 calculates two lines of sight from the positions of the object W in the image coordinate system. The second acquisition unit 522 acquires the intersection of the two lines of sight, or the point at which the distance between the two lines of sight is smallest, as the second position Ps of the object W.
In the example of fig. 4, the two lines of sight do not intersect, so the second acquisition unit 522 acquires the point at which the distance between the two lines of sight is smallest as the second position Ps of the object W.
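A standard way to obtain this point is the midpoint of the shortest segment between the two lines of sight. The sketch below (NumPy, illustrative names) also returns the gap between the lines, which is useful later when estimating the cause of an abnormality; the closed-form solution is an assumption, not prescribed by the disclosure.
```python
import numpy as np

def second_position_from_two_sightlines(p1, d1, p2, d2):
    """Lines of sight: p1 + t*d1 and p2 + s*d2 (origins p, directions d).
    Returns (Ps, gap): the point closest to both lines (their intersection
    if they meet) and the shortest distance between the two lines."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                 # ~0 only if the lines are parallel
    t = (b * e - c * d) / denom           # parameter on line 1
    s = (a * e - b * d) / denom           # parameter on line 2
    q1, q2 = p1 + t * d1, p2 + s * d2     # closest points on each line
    return (q1 + q2) / 2.0, float(np.linalg.norm(q1 - q2))
```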
The vision sensor 4 may capture a plurality of images as the second image. In this case, the following processing is repeated: the object W is imaged at a position reached by moving the robot 2 by the predetermined distance D, and the object W is then imaged at a position reached by moving the robot 2 by the predetermined distance D again from that position.
When a plurality of images (for example, three or more images) are captured as the second image, the second acquisition unit 522 calculates three or more lines of sight from the positions of the object W in the image coordinate system. In this case, the second acquisition unit 522 acquires the intersection of the three or more lines of sight as the second position Ps of the object W, or acquires, using the least squares method, the position whose distances to the three or more lines of sight are smallest as the second position Ps of the object W.
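For three or more lines of sight, the least-squares position mentioned above is the point minimizing the sum of squared distances to all the lines; a short sketch under the same assumptions:
```python
import numpy as np

def second_position_least_squares(origins, directions):
    """origins, directions: sequences of 3D vectors describing n lines of sight.
    Returns the point minimizing the sum of squared distances to all lines."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to d
        A += M
        b += M @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)         # second position Ps
```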
Here, since the object W itself does not move, the position of the object W in the image changes between captures, but the positional deviation (difference) between the first position Pw and the second position Ps should fall within the error range.
Therefore, the determination unit 523 determines whether or not the difference between the first position Pw and the second position Ps is within a predetermined range.
When the determination unit 523 determines that the difference is within the predetermined range, the estimation unit 524 determines that the operation correction of the robot 2 is normal.
When the determination unit 523 determines that the difference is outside the predetermined range, the estimation unit 524 determines that the motion of the robot 2 is corrected to be abnormal, and estimates the cause of the abnormality in the motion correction of the robot 2 based on the first position Pw, the second position Ps, the position of the robot 2 corresponding to the first position Pw, and the position of the robot 2 corresponding to the second position Ps.
Specifically, when the determination unit 523 determines that the difference is outside the predetermined range and the two lines of sight do not intersect (for example, as in fig. 4), the estimation unit 524 estimates that the calibration data stored in the calibration data storage unit 512 is erroneous.
When the determination unit 523 determines that the difference is outside the predetermined range and the two lines of sight do intersect, the estimation unit 524 estimates that the correction plane is incorrect, as shown in fig. 5.
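A rough sketch of this decision logic follows; the disclosure states only the qualitative criteria, so the thresholds and names here are assumptions.
```python
def estimate_cause(position_difference, sightline_gap,
                   difference_tolerance, intersection_tolerance):
    """position_difference: |Pw - Ps|; sightline_gap: shortest distance
    between the two lines of sight (0 when they intersect)."""
    if position_difference <= difference_tolerance:
        return "motion correction normal"
    if sightline_gap > intersection_tolerance:
        # Lines of sight that should meet at the object do not intersect.
        return "calibration data suspected to be erroneous"
    # Lines of sight intersect, but the converted position is still off.
    return "correction plane suspected to be erroneous"
```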
In the case where the calibration data is estimated to be erroneous, it is possible to estimate where the calibration data is erroneous by performing calibration using another calibration method. Fig. 6 is a diagram illustrating the movement of the vision sensor 4. As shown in fig. 6, the operation control unit 61 moves the robot 2 in a grid-like manner from the reference position in a direction orthogonal to the optical axis of the vision sensor 4. The vision sensor 4 captures images of the object W at a plurality of positions where the robot 2 has moved. Thereby, the vision sensor 4 captures a plurality of images of the object W.
The calibration unit 525 detects the object W from the plurality of captured images, and acquires the position Pci of the object W in the image coordinate system.
The calibration unit 525 obtains the position Psi of the object W viewed from the position of the flange of the robot 2 when the image is captured.
In this manner, the operation control unit 61 moves the robot 2 n times (i = 1, 2, 3, ..., n), and the vision sensor 4 captures an image of the object W n times. The calibration unit 525 performs calibration of the vision sensor 4 using the n sets (Pci, Psi) thus obtained.
The estimation unit 524 can determine whether the external parameters or the internal parameters are abnormal by comparing the external and internal parameters of the vision sensor 4 obtained in this way with the external and internal parameters previously obtained by the calibration unit 525 as described above (see figs. 7 and 8).
For example, when the external parameter is incorrect, the estimation unit 524 estimates that the internal parameter obtained by the calibration is incorrect or the position of the calibration jig 7 is incorrect.
When the internal parameters are incorrect, the estimation unit 524 estimates that the specified dot spacing is incorrect or that inappropriate points were used for the calibration.
Fig. 9 is a flowchart showing the processing of the vision sensor control apparatus 5.
In step S1, the operation control unit 61 moves the robot 2 so that the object W enters the imaging range, and the vision sensor 4 captures a first image of the object W. At this time, the first acquisition unit 521 stores the position of the robot 2 at the time of photographing.
In step S2, the first acquisition unit 521 detects the object W from a predetermined range of the captured first image using the model pattern stored in the model pattern storage unit 511.
The first acquisition unit 521 converts the position of the object W in the image coordinate system into a first position Pw in the robot coordinate system based on the calibration data stored in the calibration data storage unit 512 and the position of the robot 2 at the time of imaging.
In step S3, the operation control unit 61 of the robot controller 6 moves the robot 2 by a predetermined distance D from the reference position in the direction orthogonal to the optical axis of the vision sensor 4, and the vision sensor 4 captures a second image of the object W at the position where the robot 2 has moved by the predetermined distance D.
In step S4, the second acquisition unit 522 detects the object in each of the first image obtained by imaging the object W at the reference position and the second image obtained by imaging the object W at the position where the robot 2 has moved by the predetermined distance D.
The second acquisition unit 522 calculates two lines of sight from the positions of the object W in the image coordinate system, and acquires the intersection of the two lines of sight, or the point at which the distance between the two lines of sight is smallest, as the second position Ps of the object W.
In step S5, the determination unit 523 determines whether or not the difference between the first position Pw and the second position Ps is within the predetermined range. If the difference is within the predetermined range (YES), the process proceeds to step S6; if the difference is outside the predetermined range (NO), the process proceeds to step S7.
In step S6, the estimating unit 524 determines that the motion correction of the robot 2 is normal, and then ends the process.
In step S7, the estimation unit 524 determines that there is an abnormality in the motion correction of the robot 2, and estimates the cause of the abnormality in the motion correction of the robot 2 based on the first position Pw, the second position Ps, the position of the robot 2 corresponding to the first position Pw, and the position of the robot 2 corresponding to the second position Ps.
In step S8, the notification unit 526 issues a notification that there is an abnormality in the motion correction of the robot. The notification unit 526 also reports the cause of the abnormality estimated by the estimation unit 524.
In step S9, the operation control unit 61 moves the robot 2 in a grid-like manner from the reference position in a direction orthogonal to the optical axis of the vision sensor 4. The vision sensor 4 captures images of the object W at a plurality of positions where the robot 2 has moved. Thereby, the vision sensor 4 captures a plurality of images of the object W.
In step S10, the calibration unit 525 detects the object W from the plurality of captured images, and acquires the position Pci of the object W in the image coordinate system.
The calibration unit 525 obtains the position Psi of the object W viewed from the position of the flange of the robot 2 when the image is captured. The calibration unit 525 performs calibration of the vision sensor 4 using the n sets (Pci, Psi) thus obtained.
According to the present embodiment, the robot system 1 includes: a vision sensor 4 that captures a first image of the object W at a reference position of the robot 2 and captures a second image of the object W at a position reached by moving the robot 2 by a predetermined distance D from the reference position; a calibration data storage unit 512 that stores calibration data in which a robot coordinate system serving as a reference for operation control of the robot 2 and an image coordinate system serving as a reference for measurement processing by the vision sensor 4 are associated with each other; a first acquisition unit 521 for acquiring a first position Pw of the object W in the robot coordinate system based on the first image and the calibration data; a second acquisition unit 522 that acquires a second position Ps of the object W in the robot coordinate system based on the first image and the second image; and a determination unit 523 that determines whether or not the difference between the first position Pw and the second position Ps is within a predetermined range. Thus, the robot system 1 can determine an abnormality in the motion correction of the robot 2 based on the difference between the first position Pw and the second position Ps.
The robot system 1 further includes an estimating unit 524, and when the determining unit 523 determines that the difference is outside the predetermined range, the estimating unit 524 estimates the cause of the abnormality in the motion correction of the robot 2 based on the first position, the second position, the position of the robot 2 corresponding to the first position, and the position of the robot 2 corresponding to the second position. Thus, when there is an abnormality in the motion correction of the robot 2, the robot system 1 can estimate the cause of the abnormality. Therefore, the robot system 1 can appropriately correct the motion of the robot 2 based on the estimated cause of the motion correction of the robot 2.
In the robot system 1, when the determination unit 523 determines that the difference is outside the predetermined range, the vision sensor 4 captures a plurality of images of the object W at a plurality of positions, and the estimation unit 524 estimates the cause of the abnormality in the motion correction of the robot 2 based on the plurality of images and the plurality of positions. Thus, the robot system 1 can appropriately correct the operation of the robot 2.
Further, the vision sensor 4 may capture a plurality of second images of the object W, and the second acquisition unit 522 may acquire a plurality of second positions of the object W in the robot coordinate system based on the first image and the plurality of second images.
Thus, the robot system 1 can determine an abnormality in the motion correction of the robot 2 using the plurality of second images and the plurality of second positions.
The robot system 1 may further include a setting changing unit that changes the setting of the operation correction of the robot based on the abnormality cause of the operation correction of the robot estimated by the estimating unit 524. Thus, the robot system 1 can appropriately change the setting of the operation correction of the robot.
In the above-described embodiment, the robot system 1 uses the first image and the second image, but it may instead use only images corresponding to the second image.
For example, the robot system 1 may include: a first acquisition unit 521 for acquiring a first position of the object W in the robot coordinate system based on the calibration data and the plurality of second images; and a second acquisition unit 522 for acquiring a second position of the object W in the robot coordinate system based on the plurality of second images.
In addition, instead of the above-described embodiment, the determination unit 523 of the robot system 1 may use a method of checking each element of the difference between the three-dimensional first position Pw and second position Ps, a method of checking the distance between them, or the like. That is, the determination unit 523 may determine whether or not the relationship between the first position Pw and the second position Ps is within a predetermined range based on the first position Pw and the second position Ps.
In addition, instead of the above-described embodiment, the determination unit 523 of the robot system 1 may use a method of measuring the variance or standard deviation of three or more positions. That is, the determination unit 523 may determine whether or not the relationship between the first position and the plurality of positions is within a predetermined range based on the first position and the plurality of positions.
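The checks described in the two preceding paragraphs could look roughly as follows (illustrative sketch; tolerances and identifiers are assumptions):
```python
import numpy as np

def within_range_per_element(pw, ps, tol_xyz):
    """Check each element of the difference between Pw and Ps."""
    return bool(np.all(np.abs(np.asarray(pw) - np.asarray(ps)) <= np.asarray(tol_xyz)))

def within_range_distance(pw, ps, tol):
    """Check the distance between Pw and Ps."""
    return float(np.linalg.norm(np.asarray(pw) - np.asarray(ps))) <= tol

def within_range_scatter(positions, tol):
    """Check the standard deviation of three or more acquired positions."""
    return bool(np.all(np.asarray(positions).std(axis=0) <= tol))
```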
The embodiments of the present invention have been described above, but the present invention is not limited to the above embodiments. The effects described in the present embodiment are merely the most preferable effects arising from the present invention, and the effects of the present invention are not limited to those described in the present embodiment.
Description of the reference numerals
1: a robotic system; 2: a robot; 3: an arm; 4: a vision sensor; 5: a vision sensor control device; 6: a robot control device; 61: an operation control unit; 511: a model pattern storage unit; 512: a calibration data storage unit; 521: a first acquisition unit; 522: a second acquisition unit; 523: a determination unit; 524: an estimation unit; 525: a calibration unit; 526: a notification unit.

Claims (11)

1. A robot system is provided with:
a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a second image of the object at a position reached by moving the robot from the predetermined position by a predetermined distance;
a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other;
a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the first image and the calibration data;
a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the second image; and
a determination unit that determines whether or not a difference between the first position and the second position is within a predetermined range.
2. The robotic system of claim 1, wherein,
the robot system further includes an estimation unit that estimates a cause of an abnormality in the motion correction of the robot based on the first position, the second position, the position of the robot corresponding to the first position, and the position of the robot corresponding to the second position when the determination unit determines that the difference is outside the predetermined range.
3. The robotic system of claim 2, wherein,
the vision sensor captures a plurality of images of the object at a plurality of positions when the determination unit determines that the difference is outside the predetermined range,
the estimation unit estimates the cause of the abnormality in the motion correction of the robot based on the plurality of images and the plurality of positions.
4. The robotic system of claim 2, wherein,
the robot system further includes a notification unit configured to notify that there is an abnormality in the motion correction of the robot when the determination unit determines that the difference is outside the predetermined range.
5. The robotic system of claim 4, wherein,
the notification unit notifies the abnormality cause estimated by the estimation unit.
6. The robotic system of any one of claims 1-5, wherein,
the vision sensor captures a plurality of second images of the object,
the second acquisition unit acquires a plurality of second positions of the object in a robot coordinate system based on the first image and the plurality of second images.
7. The robotic system of claim 2 or 3, wherein,
the robot system further includes a setting changing unit that changes a setting of the motion correction of the robot based on the cause of the abnormality of the motion correction of the robot estimated by the estimation unit.
8. A robot system is provided with:
a vision sensor that captures a plurality of images of an object at a plurality of positions reached by moving the robot by a predetermined distance;
a calibration data storage unit that stores calibration data in which a robot coordinate system serving as a reference for operation control of the robot and an image coordinate system serving as a reference for measurement processing by the vision sensor are associated with each other;
a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the plurality of images and the calibration data;
a second acquisition unit that acquires a second position of the object in the robot coordinate system based on the plurality of images; and
a determination unit that determines whether or not a difference between the first position and the second position is within a predetermined range.
9. A robot control method comprising the steps of:
setting the action correction of the robot;
capturing a first image of an object at a first position of the robot;
capturing a second image of the object at a second position reached by moving the robot a predetermined distance from the first position;
acquiring a first position of the object in a robot coordinate system based on the first image and the calibration data;
acquiring a second position of the object in a robot coordinate system based on the first image and the second image;
determining whether a difference between the first position and the second position is within a predetermined range;
estimating a cause of an abnormality in the motion correction of the robot based on the first position, the second position, the position of the robot corresponding to the first position, and the position of the robot corresponding to the second position when it is determined that the difference is outside the predetermined range; and
changing the setting of the motion correction of the robot based on the estimated cause of the abnormality of the motion correction of the robot.
10. A robot system is provided with:
a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a second image of the object at a position reached by moving the robot from the predetermined position by a predetermined distance;
a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other;
a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the first image and the calibration data;
a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the second image; and
a determination unit that determines whether or not a relationship between the first position and the second position is within a predetermined range based on the first position and the second position.
11. A robot system is provided with:
a vision sensor that captures a first image of an object at a predetermined position of a robot and captures a plurality of images of the object at a plurality of positions reached by moving the robot from the predetermined position;
a calibration data storage unit that stores calibration data in which a robot coordinate system of the robot and an image coordinate system of the vision sensor are associated with each other;
a first acquisition unit that acquires a first position of the object in a robot coordinate system based on the calibration data and a first image of the object;
a second acquisition unit that acquires a second position of the object in a robot coordinate system based on the first image and the plurality of images; and
a determination unit that determines whether or not a relationship between the first position and the plurality of positions is within a predetermined range based on the first position and the plurality of positions.
CN202180008788.9A 2020-01-14 2021-01-08 Robot system Pending CN114945450A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020003862 2020-01-14
JP2020-003862 2020-01-14
PCT/JP2021/000486 WO2021145280A1 (en) 2020-01-14 2021-01-08 Robot system

Publications (1)

Publication Number Publication Date
CN114945450A (en) 2022-08-26

Family

ID=76864418

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180008788.9A Pending CN114945450A (en) 2020-01-14 2021-01-08 Robot system

Country Status (5)

Country Link
US (1) US20230032421A1 (en)
JP (1) JP7414850B2 (en)
CN (1) CN114945450A (en)
DE (1) DE112021000229T5 (en)
WO (1) WO2021145280A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023157067A1 (en) * 2022-02-15 2023-08-24 ファナック株式会社 Robot system and calibration method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2690603B2 (en) 1990-05-30 1997-12-10 ファナック株式会社 Vision sensor calibration method
JP4220920B2 (en) * 2004-03-08 2009-02-04 ファナック株式会社 Visual sensor
JP2006250722A (en) * 2005-03-10 2006-09-21 Toshiba Corp Device, method and program for calibration
JP5496008B2 (en) * 2010-08-06 2014-05-21 キヤノン株式会社 Position / orientation measuring apparatus, position / orientation measuring method, and program
JP5670416B2 (en) * 2012-12-28 2015-02-18 ファナック株式会社 Robot system display device
JP6392908B2 (en) * 2017-01-12 2018-09-19 ファナック株式会社 Visual sensor abnormality cause estimation system
JP6871220B2 (en) * 2018-11-08 2021-05-12 ファナック株式会社 Control system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1727839A (en) * 2004-07-28 2006-02-01 发那科株式会社 Method of and device for re-calibrating three-dimensional visual sensor in robot system
JP2011230249A (en) * 2010-04-28 2011-11-17 Daihen Corp Method for calibrating sensor of manipulator having visual sensor and robot control system
CN104842352A (en) * 2014-02-13 2015-08-19 发那科株式会社 Robot system using visual feedback
CN107053167A (en) * 2015-12-01 2017-08-18 精工爱普生株式会社 Control device, robot and robot system
CN110114191A (en) * 2017-03-09 2019-08-09 三菱电机株式会社 Robot controller and calibration method
CN109382821A (en) * 2017-08-09 2019-02-26 欧姆龙株式会社 Calibration method, calibration system and program
CN109489558A (en) * 2017-09-11 2019-03-19 发那科株式会社 Range Measurement System and distance measurement method
JP2019063954A (en) * 2017-10-03 2019-04-25 株式会社ダイヘン Robot system, calibration method and calibration program
CN109816717A (en) * 2017-11-20 2019-05-28 天津工业大学 The vision point stabilization of wheeled mobile robot in dynamic scene

Also Published As

Publication number Publication date
US20230032421A1 (en) 2023-02-02
DE112021000229T5 (en) 2022-09-15
JP7414850B2 (en) 2024-01-16
WO2021145280A1 (en) 2021-07-22
JPWO2021145280A1 (en) 2021-07-22

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination