WO2012076038A1 - A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit - Google Patents

A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit Download PDF

Info

Publication number
WO2012076038A1
WO2012076038A1 (PCT/EP2010/068997)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
target point
unit
camera unit
pose
Prior art date
Application number
PCT/EP2010/068997
Other languages
French (fr)
Inventor
Soenke Kock
Mikael Hedelind
Original Assignee
Abb Research Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Abb Research Ltd. filed Critical Abb Research Ltd.
Priority to PCT/EP2010/068997 priority Critical patent/WO2012076038A1/en
Publication of WO2012076038A1 publication Critical patent/WO2012076038A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39016Simultaneous calibration of manipulator and camera

Definitions

  • Fig. 1 shows an example of a robot unit with a first coordinate system and an object identification unit with a second coordinate system, which first and second coordinate systems are adapted to be calibrated by an embodiment of the invention.
  • Fig. 2 shows examples of parameters of the camera unit used in the method.
  • Fig. 3 shows an example of a first evaluation of the target points according to an embodiment of the invention.
  • Fig. 4 shows an example of a second evaluation of the target points.
  • Fig. 5 shows an example of a first evaluation of the robot poses.
  • Fig. 6 shows an example of a second evaluation of the robot poses.
  • Fig. 7 shows a general flow chart of the method.
  • Figure 1 shows an example of a robot unit 1 with a first coordinate system and an object identification unit 2 with a second coordinate system.
  • the robot unit 1 comprises a robot arm 3 with a calibration tool 4.
  • the robot unit 1 and the object identification unit 2 are located at a work station 7.
  • the robot unit 1 is adapted to perform work at the work station 7.
  • the robot unit 1 comprises a robot controller 9 adapted to control the movements of the robot arm 3 by means of controlling a plurality of electric motors on the robot arm 3.
  • the robot controller 9 comprises a central processing unit (CPU) 10, a memory unit 11 and a drive unit 12.
  • the CPU 10 is adapted to execute a robot program located on the memory unit, wherein the robot arm is moved to a plurality of positions using a plurality of robot poses.
  • the drive unit 12 is adapted to control the electric motors of the robot arm 3 in dependence on the executed robot program.
  • the object identification unit 2 comprises a camera unit 20 and an information processing unit 22. The camera unit is adapted to be directed towards the work station 7 of the robot unit 1.
  • the information processing unit 22 comprises a central processing unit (CPU) 24 and a memory unit 26.
  • the information processing unit 22 is adapted to receive information from the camera unit 20 in the form of a depiction of one or more objects at the work station 7.
  • the information processing unit 22 is adapted to process the information so that the object is recognized and the position of the object in the second coordinate system is determined by means of certain object recognition algorithms.
  • the object identification unit 2 is adapted to recognize a calibration feature of the calibration tool 4 on the robot arm 3.
  • the robot unit 1 is adapted to move the robot arm 3 to the position of the object and perform work on the object, such as picking, welding, painting, assembly, etcetera. Accordingly, the robot unit 1 and the object identification unit 2 are co-operating in the work at the work station 7.
  • the first coordinate system of the robot unit 1 and the second coordinate system of the object identification unit 2 must be essentially the same. Therefore, the first and the second coordinate system must be calibrated with each other by means of a calibration method prior to performing work at the work station 7. It shall be understood that the calibration comprises correcting one of the first and the second coordinate system with the other of the first and the second coordinate system. The calibration is in some working conditions repeated frequently in order to assure the accuracy of the work performed by the robot unit 1.
  • the robot unit 1 further comprises a computer unit 30 comprising a central processing unit (CPU) 32 and a memory unit 34.
  • the computer unit 30 is adapted to generate a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3, and a plurality of robot poses in which the robot arm 3 is adapted to be oriented.
  • the computer unit 30 is further adapted to evaluate the target points and robot poses prior to executing the calibration. After the evaluation a set of maintained target points and maintained robot poses is obtained.
  • After receiving this information, the robot controller 9 is adapted to generate a robot program based on the maintained target points and maintained robot poses, and the robot program is adapted to be executed while calibrating the first coordinate system with the second coordinate system.
  • Figure 2 shows an example of parameters used in an embodiment of the method for calibrating the first coordinate system of the robot unit 1 with the second coordinate system of the object identification unit 2.
  • the parameters are determined by the characteristics of a camera unit 20.
  • the camera unit 20 comprises a view cone, in fig. 2 illustrated as a triangle. Only target points within the view cone are visible and thus useful for the calibration.
  • the camera unit 20 comprises a focus distance.
  • the focus distance is a distance from the camera unit 20 to a focal plane of the camera unit 20.
  • the focal plane is a plane where objects are depicted with the highest quality.
  • the object identification unit 2 is arranged to determine the position of objects in the focal plane with the highest accuracy.
  • the camera unit 20 is also characterized by the term depth of field.
  • the depth of field is a range of distance around the focal plane where the quality of the depicted object is sufficient for determining the position of the objects.
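  • The view cone, focus distance and depth of field can be thought of as a small camera model that the evaluation consults. Below is a minimal Python sketch of such a model; the class name CameraModel, its fields and its two methods are illustrative assumptions and not taken from the patent text.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class CameraModel:
    """Simplified camera description used when evaluating target points."""
    position: np.ndarray       # camera origin in the work station frame
    optical_axis: np.ndarray   # unit vector pointing into the work station
    half_view_angle: float     # half opening angle of the view cone [rad]
    focus_distance: float      # distance from the camera to the focal plane
    depth_of_field: float      # acceptable range of distance around the focal plane

    def is_in_view_cone(self, point: np.ndarray) -> bool:
        """A point can only be visible if it lies inside the view cone."""
        to_point = point - self.position
        distance = np.linalg.norm(to_point)
        if distance == 0.0:
            return False
        cos_angle = np.dot(to_point / distance, self.optical_axis)
        return np.arccos(np.clip(cos_angle, -1.0, 1.0)) <= self.half_view_angle

    def is_within_depth_of_field(self, point: np.ndarray) -> bool:
        """A point gives sufficient image quality only near the focal plane."""
        along_axis = np.dot(point - self.position, self.optical_axis)
        return abs(along_axis - self.focus_distance) <= self.depth_of_field / 2.0
```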
  • Figure 7 shows a general flow chart of an embodiment of the method for calibrating the first coordinate system of a robot unit 1 with the second coordinate system of the object identification unit 2.
  • the first step of the method comprises generating a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3 for calibration.
  • the target points are preferably generated so that they are separated from each other by an equal distance.
  • the target points are preferably generated in the focal plane of the camera unit 20.
  • Each target point is a point in space to which the robot arm 3 is adapted to move the calibration tool 4 for calibration of the first coordinate system with the second coordinate system.
  • the target point further comprises an orientation. Accordingly, each target point comprises six degrees of freedom, three degrees regarding the position of the target point and three degrees regarding the orientation of the target point.
  • the number of generated target points depends on the desired quality of the calibration. Accordingly, a large number of target points is necessary in order to obtain a high quality calibration.
  • the target points shall preferably be located in the focal plane of the camera unit 20.
  • In order to enable calibration of the first coordinate system with the second coordinate system, the target points are required to be located within a certain range of distance from the camera unit 20.
  • the certain range of distance from the camera unit 20 provides an acceptable quality of the target points in order to perform the calibration.
  • the certain range of distance from the camera unit 20 is the distance from the camera unit 20 to and within the depth of field of the camera unit 20, see fig. 2.
  • target points that are located outside the certain range of distance from the camera unit 20 will not provide sufficient quality in order to calibrate the first coordinate system with the second coordinate system.
  • the evaluation of a target point comprises a first evaluation and a second evaluation.
  • the first evaluation comprises determination and correction in order to assure that the target points are visible for the camera unit 20 and within the certain range of distance from the camera unit 20 in order to enable the calibration.
  • the first evaluation is performed for each target point.
  • the first evaluation is initiated by determining whether the target point is visible for the camera unit 20. If a target point is visible for the camera unit 20, the target point is maintained. If a target point is not visible for the camera unit 20, the target point is corrected by moving the target point by a certain increment in a direction towards the camera unit 20, illustrated as "+1 camera" in fig. 3.
  • the moved target point is rejected if the moved target point is not within the certain range of distance from the camera unit 20. If the moved target point is within the certain range of distance from the camera unit 20, the process is repeated until the moved target point is either maintained or rejected.
  • a maintained target point is a target point that has been evaluated and possibly corrected by moving the target point towards the camera unit 20.
  • the maintained target point is stored in the memory unit 26 of the object identification unit.
  • the maintained target point is adapted to be used for generating a robot program or for further evaluation.
  • a rejected target point is a target point that cannot be used for the calibration. In regard to the first evaluation, the rejected target point is either not visible for the camera unit 20 or is outside the certain range of distance from the camera unit 20.
  • the second evaluation of the target point is shown in figure 4.
  • the second evaluation is performed on each of the target points and is initiated by determining whether the target point is within reach of the robot arm 3. If a target point is within reach of the robot arm 3, the target point is maintained. If a target point is not within reach of the robot arm 3, the target point is moved by a certain increment in a direction towards a robot base 28 of the robot arm 3, illustrated as "+1 robot" in fig. 4. By moving the target point towards the robot base, the target point is moved closer to the robot base 28.
  • the robot unit 1 is adapted to reach target points that are located at a distance beyond the certain distance from the robot base 28.
  • the certain distance from the robot base 28 regards a minimum distance at which the target points shall be located in order for the robot unit 1 to be able to move the calibration tool 4 to the target point.
  • If a target point is not beyond the certain distance from the robot base 28, the target point is rejected. If a target point is beyond the certain distance from the robot base 28, the target point is evaluated according to the first evaluation in figure 3 and thereafter re-evaluated regarding whether the robot arm 3 can reach the target point, or rejected. The second evaluation is performed for each target point until the target point is either maintained or rejected.
  • a set of maintained target points is stored in the memory unit 34 of the computer unit 30.
  • the first evaluation and second evaluation of the target points are performed by means of a simulation software program on the computer unit.
  • the software program uses a model of the characteristics of the work space 7, the robot unit 1 and the camera unit 20.
  • a robot pose associated with each of the maintained target points is generated by the computer unit 30.
  • Each robot pose defines a certain orientation of the robot arm 3 that positions the calibration tool 4 in the specific target point.
  • a specific target point can often be reached by a plurality of different robot poses.
  • each robot pose is evaluated.
  • the evaluation of the robot poses comprises a first evaluation and a second evaluation.
  • the first evaluation and second evaluation of the robot poses are performed by means of a simulation software program on the computer unit 30.
  • the software program checks whether the robot pose involves a collision and whether the calibration tool is visible using a certain robot pose.
  • a first evaluation of the robot poses is shown in figure 5.
  • the evaluation comprises, for each robot pose, determining whether the robot pose is collision free. If the robot pose is collision free, the robot pose is maintained. If a robot pose is not collision free, it is determined whether at least one alternative robot pose for the same target point is collision free. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point is collision free, the robot pose is rejected and the target point associated with the robot pose is also rejected.
  • Alternatively, the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose without collision.
  • the moved target point is also re-evaluated according to the first and second evaluation of the target point according to fig. 3 and 4.
  • the maintained collision free robot poses are stored in the memory unit 11 of the robot controller 9 for generating the robot program or for further evaluation.
  • a rejected robot pose and its associated target point are deleted and thus not used for the calibration.
  • the second evaluation is shown in figure 6.
  • the evaluation comprises, for each robot pose, determining whether the calibration tool 4 is visible for the camera unit 20 using the robot pose. If the calibration tool 4 is visible for the camera unit 20 using the robot pose, the robot pose is maintained. If the calibration tool 4 is not visible for the camera unit 20 using the robot pose, it is determined whether the calibration tool 4 is visible for the camera unit 20 in at least one alternative robot pose for the same target point. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point enables the calibration tool 4 to be visible for the camera unit 20, the robot pose is rejected and the target point associated with the robot pose is also rejected.
  • Alternatively, the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose which enables the calibration tool 4 to be visible for the camera unit 20.
  • the moved target point is also re-evaluated according to the first and second evaluation of the target point according to fig. 3 and 4.
  • a set of maintained robot poses is stored in the memory unit 34 of the computer unit 30.
  • the first evaluation and second evaluation of the robot poses are performed by means of a simulation software program on the computer unit.
  • The software program uses a model of the characteristics of the work space 7, the robot unit 1 and the camera unit 20.
  • the rejected robot poses and their associated target points are deleted and are thus not used for the calibration.
  • a set of maintained target points and a set of maintained robot poses are stored in the memory unit 34 of the computer unit 30, and the information is transferred to the robot controller 9.
  • a robot program is generated by the robot controller 9 based on the set of maintained target points and the set of maintained robot poses.
  • the robot program is executed while calibrating the first coordinate system with the second coordinate system. The method has the benefit that the target points and the robot poses are generated and evaluated before the actual calibration procedure.
  • the duration of the calibration, during which the robot unit 1 by means of the robot arm 3 moves the calibration tool 4 to the plurality of target points while calibrating the first coordinate system with the second coordinate system, is reduced in comparison to prior art calibration methods.
  • a further benefit is that the operator is not required to visually confirm the target points and the robot poses while the robot arm 3 moves the calibration tool 4 to the plurality of target points by means of the plurality of robot poses.
  • the robot controller 9, the information processing unit 22 and the computer unit 30 may be the same unit. It shall also be understood that a single CPU 10 and a single memory unit 11 of the robot controller 9 may be used for achieving the method. It shall furthermore be understood that the evaluation steps of the target points and robot poses are performed "off-line" without requiring movement of the robot arm 3 of the robot unit 1.
  • the first and the second evaluation in fig. 7 also involve the corrective measure of moving the target point.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A method for calibrating a first coordinate system of a robot unit (1) with a second coordinate system of an object identification unit (2). The method comprises generating a plurality of target points to which a calibration tool (4) is to be moved by the robot unit for calibration, evaluating the target points for visibility for the camera unit and for range of distance from the camera unit, moving the target points towards the camera unit until the target points are either maintained or rejected, generating a robot program based on the maintained target points, and executing the robot program while calibrating the first coordinate system with the second coordinate system.

Description

A METHOD FOR CALIBRATING A ROBOT UNIT, A COMPUTER UNIT, A ROBOT UNIT AND USE OF A ROBOT UNIT
FIELD OF THE INVENTION

The present invention relates to a method for calibrating a first coordinate system of a robot unit with a second coordinate system of an object identification unit. The robot unit comprises a robot arm with a calibration tool and the object identification unit comprises a camera unit.
The present invention furthermore relates to a computer unit adapted to execute the method, a robot unit comprising an object identification unit, which robot unit is adapted to execute the method, and use of the robot unit.
PRIOR ART
A robot unit may use an object identification unit for identifying certain objects in a work station of the robot unit in order to perform work on the objects, such as picking and sorting objects, welding, assembling, etc.
The robot unit is arranged with the first coordinate system and the object identification unit is arranged with the second coordinate system. The object identification unit comprises the camera unit and a computer unit adapted to, based on the information from the camera, identify an object in relation to the second coordinate system. The first and the second coordinate systems need to be essentially the same in order for the robot unit to perform work on the object. During the calibration procedure the first and the second coordinate systems are adjusted to be the same.
In prior art robot units, the robot unit is calibrated by an operator manually operating the robot arm of the robot unit comprising the calibration tool, so that the calibration tool is moved to a plurality of target points in which the robot unit is calibrated. At the target points the calibration tool is held in a static position for a period of time during which the object identification unit determines the coordinates of the calibration tool in the second coordinate system. Thereafter the second coordinate system is adjusted to the coordinate of the target point according to the first coordinate system of the robot unit.
A problem with the above described calibration procedure is that the result of the calibration is dependent on the operator, because different operators have different ways of setting the calibration tool in the target points. Furthermore, the manual calibration procedure is time consuming. The result of the calibration is improved by increasing the number of target points. This, however, further increases the calibration time. The quality of the calibration is also influenced by how the target points are selected. In order to obtain a high quality, the target points shall be distributed over the area of the work station. A problem with the prior art calibration procedure is that the target points are not utilized optimally.
Another problem is that the operator must visually verify that the calibration tool is in a target point that can be used for calibration. For example, some target points cannot be used for calibration because the calibration tool is not visible for the camera. Furthermore, a target point may involve a collision with objects at the work station of the robot unit. There is a trend to design robot units with a less rigid structure, and in some cases the robot unit is not firmly attached at the work station. Such robot units need to be calibrated frequently and thus it is desired to reduce the duration of the calibration procedure.
EP1468792 discloses a method for calibrating a coordinate system of a camera with a coordinate system of a robot unit, wherein the method may be executed automatically. A problem with the method is that target points which are not suitable for the calibration may be used. Accordingly, an operator is required to visually confirm that the target points are valid and collision free. The quality of the calibration would however not be optimal, because not all possible target points are used for the calibration.
OBJECTS AND SUMMARY OF THE INVENTION

The object of the present invention is to provide an improved method for calibrating a first coordinate system of a robot unit with a second coordinate system of an object identification unit. A first object of the method is to reduce the duration of the calibration procedure. A second object of the method is to utilize the target points in order to obtain an optimal quality of the calibration. A third object of the method is to enable calibration without visual confirmation of an operator.
This object is obtained by a method as defined by claim 1. The method comprises
- generating a plurality of target points to which the calibration tool is to be moved by the robot arm for calibration,
the method further comprises, for each target point, evaluating the target point by executing the following steps until all target points are either maintained or rejected: a) determining whether the target point is visible for the camera unit,
b) maintaining the target point if the target point is visible for the camera unit,
c) if the target point is not visible for the camera unit, execute the following steps:
d) moving the target point by a certain increment in a direction towards the camera unit,
e) determining whether the moved target point is within a certain range of distance from the camera unit,
f) reject the moved target point if the moved target point is not within said range of distance from the camera unit,
g) if the moved target point is within the certain range of distance from the camera unit, repeat the steps a-g until the moved target point is either maintained or rejected, and
perform the following steps after the evaluation has been completed:
- generating a robot program based on the maintained target points,
- executing the robot program while calibrating the first coordinate system with the second coordinate system.
The plurality of target points are generated so that the target points are separated from each other. The method steps a-g regard an evaluation of the validity of the generated target point.
In case a target point is not visible for the camera unit, the target point cannot be used for calibration and the target point is moved towards the camera unit by the certain increment. The lack of visibility may for example occur because an object blocks the visibility. The object may for example be a fixture, a shelf, a machine, etc. By moving the target point towards the camera unit, the obstruction of the visibility by the object may be removed. Non-visible target points are moved towards the camera unit until the target points become visible or until the target points reach the limit of the certain range of distance from the camera unit. Outside the certain range of distance from the camera unit, the quality of the target points is insufficient for the calibration, wherein such a target point is rejected.
Each target point is evaluated and moved until the target point is either maintained or rejected. The method step of generating and evaluating the target point is performed without the necessity of physical movement of the robot arm. Accordingly, the term "move" means that the position of a target point is changed in a direction by the certain increment without involving a physical movement.
The term "increment" refers to a distance that a target point is to be moved. The evaluation time is dependent on the size of the increment. Small increments result in a high quality of the target points at the cost of a high evaluation time, involving a large number of repetitions of evaluating and moving the target points, and vice versa. Preferably, the size of the increment shall strike a balance between quality of the target points and evaluation time. The term "target point" refers to a position and an orientation of the calibration tool to which the calibration tool is adapted to be moved by the robot arm for calibrating the first coordinate system with the second coordinate system. The target point comprises six degrees of freedom; three degrees of freedom in regard to a position of the calibration tool and three degrees of freedom in regard to the orientation of the calibration tool.
The term "maintained target point" refers to a target point that has been evaluated and is stored in a memory unit for generating the robot program or for further evaluation. The term "rejected target point" refers to a target point that cannot be used for the calibration. A target point that is not visible for the camera unit cannot be used for the calibration. Likewise, a target point that is outside the certain range of distance from the camera unit will not provide sufficient quality for the calibration procedure and thus the target point cannot be used for calibration.
Each target point is evaluated and moved until it is either maintained or rejected. Thereafter, the robot program is generated based on the maintained target points and the robot program is executed, wherein the robot arm physically moves the calibration tool to the target points, while calibrating the first coordinate system with the second coordinate system.
The evaluation and correction of the target points are performed prior to executing the calibration. By means of the method, the calibration is performed automatically, without the operator having to visually confirm that all the target points can be used.
By generating, evaluating and moving the target points, the actual physical calibration procedure, where the calibration tool is moved to the maintained target points by means of the robot program, is performed in a time efficient manner.
The method further has the benefit that the generated target points are utilized optimally. Target points that are not visible are, if possible, moved so that they become visible, otherwise they are rejected. Target points that lack sufficient quality are rejected. Accordingly, the method assures that the quality of the calibration is as high as possible given the certain number of generated target points. According to one embodiment of the invention, the calibration tool comprises a calibration feature that is adapted to be recognized by the object identification unit. The calibration feature comprises a specific form in order to enable recognition of the calibration tool for calibration. For example, the calibration feature comprises an L-shaped element, a rectangular element, etc.
According to one embodiment of the invention, the steps a-g are simulated based on a virtual model of the robot arm, characteristics and position of the camera unit and characteristics of a work station of the robot unit. Accordingly, the simulation is performed without moving the robot arm prior to the execution of the generated robot program. Thereby, the evaluation of the target points is performed prior to physical movement of the calibration tool to the maintained target points. Accordingly, the duration of the calibration procedure that requires the physical movement of the robot unit is reduced.
According to one embodiment of the invention,
the method further comprises:
- generating the plurality of target points within a range of distance from a focal plane of the camera unit.
The quality of the target point is optimal in the focal plane. As the target point is moved towards the camera unit, the quality of the target point gradually decreases. If the target point is outside the certain range of distance from the camera unit, the quality of the target point is not sufficient for the calibration.
A target point in the focal plane of the camera unit provides the highest quality picture for determining the position of the target point in the second coordinate system. Accordingly, the target points are generated in positions providing the best quality. Target points that are moved towards the camera unit due to lack of visibility will thus have a lower quality compared to the generated positions of the target points. According to one embodiment of the invention, the target points are generated so that they are distributed evenly from each other in a plane within the range of distance from the focal plane of the camera unit. Thereby, the generated target points will provide an optimal calibration quality if all the target points are maintained. Preferably, the target points are generated in the focal plane of the camera unit.
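A minimal sketch of how such an evenly distributed grid of target points could be generated in the focal plane, reusing the CameraModel sketch from above; the function name, the grid parameters and the fixed camera-facing orientation are assumptions made for the example.

```python
import numpy as np


def generate_target_points(camera, rows, cols, spacing):
    """Generate target points evenly spaced in the focal plane of the camera.

    Each target point is returned as a (position, orientation) pair, the
    orientation being a 3x3 rotation matrix with the tool z-axis facing the camera.
    """
    axis = camera.optical_axis
    # Build an orthonormal basis (u, v) spanning the focal plane.
    u = np.cross(axis, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-9:               # optical axis parallel to z
        u = np.cross(axis, [0.0, 1.0, 0.0])
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)

    centre = camera.position + camera.focus_distance * axis
    z_tool = -axis                             # tool z-axis points back towards the camera
    orientation = np.column_stack((u, np.cross(z_tool, u), z_tool))

    points = []
    for i in range(rows):
        for j in range(cols):
            offset = (i - (rows - 1) / 2) * spacing * u + (j - (cols - 1) / 2) * spacing * v
            points.append((centre + offset, orientation))
    return points
```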
According to one embodiment of the invention,
the method further comprises for each of the target points, evaluating the target points by executing the following steps until all target points are either maintained or rejected:
h) determining whether the target point is within reach of the robot arm,
i) maintaining the target point if the target point is within reach of the robot arm,
j) if the target point is not within reach of the robot arm, move the target point by a certain increment in a direction towards a robot base of the robot arm,
k) determining whether the moved target point is beyond a certain distance from the robot base,
l) reject the moved target point if the moved target point is not beyond said distance from the robot base,
m) if the moved target point is beyond said distance from the robot base, evaluate the moved target point according to step a,
n) repeat the steps h-m until the moved target point is either maintained or rejected if the moved target point is visible for the camera unit,
o) if the moved target point is not visible for the camera unit, execute the steps d-g until the moved target point is either maintained or rejected.
The method steps h-o regard an evaluation of whether it is possible for the robot unit to move the calibration tool to the maintained target points. In case a target point is not reachable for the robot arm, the target point cannot be used for calibration and the target point is moved towards the robot base by a certain increment. The lack of reach may be due to the target point being at a distance too far away from the robot base, or due to the robot arm having to stretch around an object to the target point and therefore not being able to reach the target point. By moving the target point towards the robot base, the target point may be reached by the robot arm.
The target points are moved towards the robot base until the target points reach the certain distance from the robot base. The robot arm cannot reach target points within the certain distance from the robot base. Accordingly, such target points are rejected.
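A corresponding sketch for steps h-o, again purely illustrative; robot.can_reach(), robot.base_position and robot.min_reach_distance are assumed queries of a virtual robot model (for example an inverse kinematics check), and the visibility evaluation is the one sketched for steps a-g.

```python
import numpy as np


def evaluate_target_point_reach(point, robot, camera, workstation, increment):
    """Second evaluation (steps h-o): move unreachable points towards the robot base."""
    while True:
        # h/i) maintain the target point if the robot arm can reach it
        if robot.can_reach(point):
            return point
        # j) otherwise move the point by a certain increment towards the robot base
        direction = robot.base_position - point
        point = point + increment * direction / np.linalg.norm(direction)
        # k/l) reject the moved point if it is too close to the robot base
        if np.linalg.norm(point - robot.base_position) <= robot.min_reach_distance:
            return None
        # m/o) a moved point must again be visible and within range of the camera unit
        point = evaluate_target_point_visibility(point, camera, workstation, increment)
        if point is None:
            return None
        # n) repeat until the moved target point is either maintained or rejected
```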
According to one embodiment of the invention, the steps a-o are simulated based on a virtual model of the robot arm, characteristics and position of the camera unit and characteristics of a work station of the robot unit.
According to one embodiment of the invention, the method further comprises for each of the target points:
- generating a robot pose for each of the target points, in which robot pose the robot arm is to be moved for calibration,
the method further comprises, for each robot pose, evaluating the robot pose by executing the following steps until all robot poses are either maintained or rejected:
p) determining whether the robot pose is collision free,
q) maintain the robot pose if the robot pose is collision free,
r) if the robot pose is not collision free, determine whether at least one alternative robot pose for the target point is collision free,
s) change the robot pose to the alternative robot pose if the alternative robot pose is collision free,
t) reject the alternative robot pose and the target point if the alternative robot pose is not collision free.
perform the following steps after the evaluation has been completed:
- generating a robot program based on the maintained target points and maintained robot poses,
The robot pose is generated for each maintained target point. The robot pose defines a certain orientation of the robot arm.
The method steps p-t regard an evaluation of possible collision of the robot poses. In case a robot pose for a target point involves a collision, the robot pose cannot be used for calibration and one or more alternative robot poses for the same target point are evaluated for collision. If an alternative robot pose can reach the target point without collision, the generated robot pose is changed to the alternative robot pose, otherwise the target point and the robot pose are rejected. The term "maintained robot pose" refers to a robot pose that has been evaluated and is stored in a memory unit for generating the robot program or for further evaluation.
The term "rejected robot pose" refers to a robot pose that cannot reach its target point without collision. The robot pose is therefore not stored and the target point associated with it is rejected.
Each generated robot pose is evaluated for collision and is maintained or changed to an alternative robot pose. Otherwise the robot pose is rejected. Thereafter, the robot program is generated based on the maintained target points and the maintained robot poses. Thereafter the robot program is executed while performing the calibration. According to one embodiment of the invention, the steps a-t are simulated based on a virtual model of the robot arm, characteristics and position of the camera unit and characteristics of a work station of the robot unit.
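For illustration, the collision evaluation of steps p-t can be sketched as a search over candidate poses; robot.poses_for() and workstation.in_collision() are assumed queries of the virtual robot and work station models and not part of the patent text.

```python
def evaluate_robot_pose_collision(target_point, robot, workstation):
    """Pose evaluation (steps p-t): keep a collision-free pose or try alternatives.

    Returns a collision-free robot pose for the target point, or None if the
    target point has to be rejected together with all of its candidate poses.
    """
    for pose in robot.poses_for(target_point):
        # p/q) maintain the (first) pose that is collision free,
        # r/s) otherwise fall through to the next alternative pose
        if not workstation.in_collision(pose):
            return pose
    # t) no alternative pose is collision free: reject pose and target point
    return None
```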
According to one embodiment of the invention,
the method comprises for each of the target points:
- in step p, determining whether the calibration tool is visible for the camera unit using the robot pose,
- in step q, maintain the robot pose if the calibration tool is visible for the camera unit using the robot pose,
- in step r, if the calibration tool is not visible for the camera unit using the robot pose, determine whether the calibration tool is visible for the camera unit in at least one alternative robot pose for the target point,
- in step s, change the robot pose to the alternative robot pose if the calibration tool is visible for the camera unit in the alternative robot pose,
- in step t, reject the alternative robot pose and the target point if the calibration tool is not visible for the camera unit in the alternative robot pose. The method steps p-t furthermore regard an evaluation of the visibility of the calibration tool in the generated robot pose. The robot pose may for example be oriented so that the robot arm itself obstructs the visibility of the target point for the camera unit.
In case the calibration tool is not visible for the camera unit in a robot pose for a target point, the robot pose cannot be used for calibration and one or more alternative robot poses for the same target point are evaluated for visibility. If the calibration tool is visible in an alternative robot pose at the target point, the generated robot pose is changed to the alternative robot pose, otherwise the target point and the robot pose are rejected.
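A sketch of the corresponding visibility variant of steps p-t, with the same assumed model queries; workstation.tool_occluded() is a hypothetical check that also accounts for the robot arm itself blocking the view of the calibration tool.

```python
def evaluate_robot_pose_visibility(target_point, robot, camera, workstation):
    """Keep a pose in which the calibration tool is visible, or try alternatives."""
    for pose in robot.poses_for(target_point):
        # maintain the first pose in which the calibration tool is visible (steps q/s)
        if not workstation.tool_occluded(camera, pose):
            return pose
    # no alternative pose leaves the tool visible: reject pose and target point (step t)
    return None
```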
According to one embodiment of the invention, the method further comprises:
- in step j, determining whether at least one alternative robot pose enables the robot arm to reach the target point,
- in step k, changing the robot pose to the alternative robot pose if the alternative robot pose enables the target point to be reached, and otherwise rejecting the alternative robot pose.
The reach of a target point may depend on the robot pose. By means of an alternative robot pose, the target point may be brought within reach without moving the target point.
According to one embodiment of the invention, the robot pose comprises a first orientation of the robot arm and the at least one alternative robot pose comprises a second orientation of the robot arm.
According to one embodiment of the invention, the robot pose comprises a first orientation of the calibration tool and the at least one alternative robot pose comprises a second orientation of the calibration tool.
According to one embodiment of the invention, the steps a-o are simulated without moving the robot arm prior to the execution of the generated robot program.

According to one embodiment of the invention, the determination of whether the target point is visible for the camera unit is based on information on the geometry of a work station of the robot unit and the position of the camera unit.

According to one embodiment of the invention, the determination of whether the target point is within reach of the robot arm is based on the position of the robot base and the geometry of the robot unit.

According to one embodiment of the invention, the determination of whether the robot poses are collision free is based on information on the geometry of the robot unit and a work station of the robot unit.
According to one embodiment of the invention, the determination of whether the calibration tool is visible for the camera unit is based on information on the geometry of the robot unit, the calibration tool and the camera unit.
According to one embodiment of the invention, said range of distance from the camera unit in step e is dependent on a focus distance of the camera unit. The target points must be within the range of distance to provide sufficient quality for the calibration.
According to one embodiment of the invention, said range of distance from the camera unit in step e is dependent on the depth of field of the camera unit. The term "depth of field" refers to the range of distance around the focal plane within which the target points are sufficiently sharp for the calibration.

According to one embodiment of the invention, the determination of whether the target point is visible is based on information on the field of view of the camera unit. The term "field of view" refers to the extent to which target points are visible for the camera unit and is related to the angle of view of the camera unit.
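The patent text does not specify how the acceptance range of step e is computed. The following is a minimal sketch, assuming the range is simply the depth of field centred on the focal plane; the CameraModel fields, the numeric values and the helper name within_acceptable_range are illustrative assumptions, not part of the disclosed method.

```python
from dataclasses import dataclass


@dataclass
class CameraModel:
    focus_distance: float  # distance from the camera unit to the focal plane, metres (assumed)
    depth_of_field: float  # total usable depth around the focal plane, metres (assumed)


def within_acceptable_range(distance_to_camera: float, cam: CameraModel) -> bool:
    """Step e sketch: accept a target point only if its distance to the camera
    unit lies within the depth of field centred on the focal plane."""
    near = cam.focus_distance - cam.depth_of_field / 2.0
    far = cam.focus_distance + cam.depth_of_field / 2.0
    return near <= distance_to_camera <= far


# Example: with a 0.8 m focus distance and a 0.2 m depth of field,
# points between 0.7 m and 0.9 m from the camera unit are accepted.
print(within_acceptable_range(0.75, CameraModel(0.8, 0.2)))  # True
```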
According to one embodiment of the invention, said distance from the robot base in step k is dependent on the geometry of the robot arm.

According to one embodiment of the invention, the method further comprises:
- determining a calibration quality based on the number of maintained target points and the extent to which the target points have been moved from their generated positions, and
- presenting the calibration quality to an operator.

According to one embodiment of the invention, the evaluation in any of steps a-t is simulated based on a model of the characteristics of the robot unit and the object identification unit. Accordingly, the evaluation is performed prior to the actual calibration that is performed according to the robot program, and the duration of the actual calibration, where the calibration tool is moved to the plurality of target points, is reduced.
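No formula for the calibration quality is given in the text. The sketch below merely illustrates one plausible score that rewards a high number of maintained target points and small corrections from the generated positions; the weighting, the max_shift normalisation and all names are hypothetical.

```python
import math


def calibration_quality(maintained, generated, max_shift=0.1, weight=0.5):
    """Hypothetical quality score in [0, 1].

    'maintained' maps a target point index to its final (possibly moved) xyz
    position; 'generated' lists the originally generated xyz positions.
    max_shift (metres) and weight are illustrative assumptions."""
    if not generated:
        return 0.0
    kept_ratio = len(maintained) / len(generated)
    if maintained:
        # mean displacement between maintained points and their generated positions
        shifts = [math.dist(pos, generated[i]) for i, pos in maintained.items()]
        shift_penalty = min(sum(shifts) / len(shifts) / max_shift, 1.0)
    else:
        shift_penalty = 1.0
    return weight * kept_ratio + (1.0 - weight) * (1.0 - shift_penalty)


generated = [(0.0, 0.0, 0.8), (0.1, 0.0, 0.8), (0.2, 0.0, 0.8)]
maintained = {0: (0.0, 0.0, 0.8), 1: (0.1, 0.0, 0.78)}  # point 2 was rejected
print(round(calibration_quality(maintained, generated), 2))
```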
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be explained more closely by the description of different embodiments of the invention and with reference to the appended figures.

Fig. 1 shows an example of a robot unit with a first coordinate system and an object identification unit with a second coordinate system, which first and second coordinate systems are adapted to be calibrated by an embodiment of the invention.
Fig. 2 shows examples of parameters of the camera unit used in the method.
Fig. 3 shows an example of a first evaluation of the target points according to an embodiment of the invention.
Fig. 4 shows an example of a second evaluation of the target points.
Fig. 5 shows an example of a first evaluation of the robot poses.
Fig. 6 shows an example of a second evaluation of the robot poses.
Fig. 7 shows a general flow chart of the method.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
Figure 1 shows an example of a robot unit 1 with a first coordinate system and an object identification unit 2 with a second coordinate system. The robot unit 1 comprises a robot arm 3 with a calibration tool 4.
The robot unit 1 and the object identification unit 2 are located at a work station 7. The robot unit 1 is adapted to perform work at the work station 7.
The robot unit 1 comprises a robot controller 9 adapted to control the movements of the robot arm 3 by means of controlling a plurality of electric motors on the robot arm 3.
The robot controller 9 comprises a central processing unit (CPU) 10, a memory unit 11 and a drive unit 12. The CPU 10 is adapted to execute a robot program located on the memory unit 11, wherein the robot arm 3 is moved to a plurality of positions using a plurality of robot poses. The drive unit 12 is adapted to control the electric motors of the robot arm 3 in dependency of the executed robot program.

The object identification unit 2 comprises a camera unit 20 and an information processing unit 22. The camera unit 20 is adapted to be directed towards the work station 7 of the robot unit 1.
The information processing unit 22 comprises a central processing unit (CPU) 24 and a memory unit 26. The information processing unit 22 is adapted to receive information from the camera unit 20 in the form of a depiction of one or more objects at the work station 7. The information processing unit 22 is adapted to process the information so that the object is recognized and the position of the object in the second coordinate system is determined by means of certain object recognition algorithms. In particular, the object identification unit 2 is adapted to recognize a calibration feature of the calibration tool 4 on the robot arm 3.

Based on the position of the recognized object, the robot unit 1 is adapted to move the robot arm 3 to the position of the object and perform work on the object, such as picking, welding, painting, assembly, etcetera. Accordingly, the robot unit 1 and the object identification unit 2 co-operate in the work at the work station 7.
In order to enable the co-operation between the robot unit 1 and the object identification unit 2, the first coordinate system of the robot unit 1 and the second coordinate system of the object identification unit 2 must be essentially the same. Therefore, the first and the second coordinate systems must be calibrated with each other by means of a calibration method prior to performing work at the work station 7. It shall be understood that the calibration comprises correcting one of the first and the second coordinate systems with the other of the first and the second coordinate systems. The calibration is in some working conditions repeated frequently in order to assure the accuracy of the work performed by the robot unit 1.
The robot unit 1 further comprises a computer unit 30 comprising a central processing unit (CPU) 32 and a memory unit 34. The computer unit 30 is adapted to generate a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3, and a plurality of robot poses in which the robot arm 3 is adapted to be oriented.
The computer unit 30 is further adapted to evaluate the target points and robot poses prior to executing the calibration. After the evaluation, a set of maintained target points and maintained robot poses is obtained.
Information on the set of maintained target points and maintained robot poses is transmitted to the robot controller 9. After receiving the information, the robot controller 9 is adapted to generate a robot program based on the maintained target points and maintained robot poses, and the robot program is adapted to be executed while calibrating the first coordinate system with the second coordinate system.
Figure 2 shows an example of parameters used in an embodiment of the method for calibrating the first coordinate system of the robot unit 1 with the second coordinate system of the object identification unit 2. The parameters are determined by the characteristics of the camera unit 20. The camera unit 20 comprises a view cone, in fig. 2 illustrated as a triangle. Only target points within the view cone are visible and thus useful for the calibration. The camera unit 20 comprises a focus distance. The focus distance is the distance from the camera unit 20 to a focal plane of the camera unit 20. The focal plane is the plane where objects are depicted with the highest quality. According to an embodiment of the invention, the object identification unit 2 is arranged to determine the position of objects in the focal plane with the highest accuracy. The camera unit 20 is also characterized by the term depth of field. The depth of field is a range of distance around the focal plane where the quality of the depicted object is sufficient for determining the position of the objects.

Figure 7 shows a general flow chart of an embodiment of the method for calibrating the first coordinate system of the robot unit 1 with the second coordinate system of the object identification unit 2. The first step of the method comprises generating a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3 for calibration. The target points are preferably generated so that they are separated from each other by an equal distance. Moreover, the target points are preferably generated in the focal plane of the camera unit 20. Each target point is a point in space to which the robot arm 3 is adapted to move the calibration tool 4 for calibration of the first coordinate system with the second coordinate system. The target point further comprises an orientation. Accordingly, each target point comprises six degrees of freedom: three degrees regarding the position of the target point and three degrees regarding the orientation of the target point. The number of generated target points depends on the desired quality of the calibration. Accordingly, a large number of target points is necessary in order to obtain a high quality calibration.
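As an illustration of the generation step only (not the patent's implementation), the sketch below lays out equally spaced target points on an assumed focal plane and gives each point an orientation so that the calibration feature faces the camera unit. The plane parametrisation, grid size, spacing and function name are all assumptions.

```python
import numpy as np


def generate_target_points(camera_pos, focal_plane_center, plane_u, plane_v,
                           grid=(5, 5), spacing=0.05):
    """Generate equally spaced target points on the focal plane.

    Each point has six degrees of freedom: an xyz position plus an orientation,
    here simply chosen so the calibration feature faces the camera unit.
    camera_pos and focal_plane_center are 3-vectors; plane_u and plane_v are
    unit vectors spanning the focal plane. All values are illustrative."""
    points = []
    rows, cols = grid
    for r in range(rows):
        for c in range(cols):
            # offset within the focal plane, centred on the plane centre
            offset = (r - (rows - 1) / 2) * spacing * plane_u \
                   + (c - (cols - 1) / 2) * spacing * plane_v
            position = focal_plane_center + offset
            z_axis = camera_pos - position           # point the tool towards the camera
            z_axis = z_axis / np.linalg.norm(z_axis)
            points.append({"position": position, "z_axis": z_axis})
    return points


pts = generate_target_points(np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 0.8]),
                             np.array([1.0, 0.0, 0.0]),
                             np.array([0.0, 1.0, 0.0]))
print(len(pts), pts[0]["position"])
```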
In order to obtain a high quality calibration, the target points shall preferably be located in the focal plane of the camera unit 20. In order to enable calibration of the first coordinate system with the second coordinate system, the target points are required to be located within a certain range of distance from the camera unit 20. The certain range of distance from the camera unit 20 provides an acceptable quality of the target points in order to perform the calibration. According to an embodiment of the invention, the certain range of distance from the camera unit 20 is the distance from the camera unit 20 to and within the depth of field of the camera unit 20, see fig. 2.
On the other hand, target points that are located outside the certain range of distance from the camera unit 20 will not provide sufficient quality in order to calibrate the first coordinate system with the second coordinate system.
After the target points have been generated, the target points are evaluated with respect to camera unit 20 visibility and quality. The evaluation of a target point comprises a first evaluation and a second evaluation.
An example of the first evaluation is shown in figure 3. The first evaluation comprises determination and correction in order to assure that the target points are visible for the camera unit 20 and within the certain range of distance from the camera unit 20 in order to enable the calibration.
The first evaluation is performed for each target point. The first evaluation is initiated by determining whether the target point is visible for the camera unit 20. If a target point is visible for the camera unit 20, the target point is maintained. If a target point is not visible for the camera unit 20, the target point is corrected by moving the target point by a certain increment in a direction towards the camera unit 20, illustrated as "+1 camera" in fig. 3.
It is further determined whether the moved target point is within the certain range of distance from the camera unit 20. The moved target point is rejected if the moved target point is not within the certain range of distance from the camera unit 20. If the moved target point is within the certain range of distance from the camera unit 20, the process is repeated until the moved target point is either maintained or rejected.
A maintained target point is a target point that has been evaluated and possibly corrected by moving the target point towards the camera unit 20. The maintained target point is stored in the memory unit 26 of the object identification unit. The maintained target point is adapted to be used for generating a robot program or for further evaluation.
A rejected target point is a target point that cannot be used for the calibration. With regard to the first evaluation, the rejected target point is either not visible for the camera unit 20 or outside the certain range of distance from the camera unit 20.
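Purely as a sketch of the first evaluation in figure 3: the visibility and range predicates are placeholders that would be backed by the simulation model, and the iteration cap is an added safeguard not mentioned in the text.

```python
def first_evaluation(point, camera_pos, is_visible, within_range,
                     step=0.01, max_iterations=100):
    """Figure 3 sketch: keep the point if visible; otherwise nudge it towards
    the camera unit by a fixed increment ("+1 camera") until it is visible,
    rejecting it as soon as it leaves the acceptable distance range.
    is_visible and within_range are assumed callables from the simulation."""
    for _ in range(max_iterations):
        if is_visible(point):
            return point          # maintained (possibly corrected)
        # move the point by a certain increment towards the camera
        direction = [c - p for p, c in zip(point, camera_pos)]
        norm = sum(d * d for d in direction) ** 0.5
        point = tuple(p + step * d / norm for p, d in zip(point, direction))
        if not within_range(point):
            return None           # rejected: outside the range of distance
    return None                   # rejected: no visible position found


# Toy usage with made-up visibility and depth-of-field models.
kept = first_evaluation(
    point=(0.3, 0.0, 1.2), camera_pos=(0.0, 0.0, 0.0),
    is_visible=lambda p: p[2] < 1.15,            # toy visibility model
    within_range=lambda p: 0.7 <= p[2] <= 1.3)   # toy distance window
print(kept)  # a corrected point slightly closer to the camera
```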
The second evaluation of the target point is shown in figure 4. The second evaluation is performed on each of the target points and is initiated by determining whether the target point is within reach of the robot arm 3. If a target point is within reach of the robot arm 3, the target point is maintained. If a target point is not within reach of the robot arm 3, the target point is moved by a certain increment in a direction towards a robot base 28 of the robot arm 3, illustrated as "+1 robot" in fig. 4. By moving the target point towards the robot base 28, the target point is moved closer to the robot base 28, whereby the chance that the target point is within reach of the robot arm 3 increases.

It is furthermore determined whether the moved target point is beyond a certain range of distance from the robot base 28. The robot unit 1 is adapted to reach target points that are located at a distance beyond the certain distance from the robot base 28. Accordingly, the certain range of distance from the robot base 28 regards a minimum distance at which the target points shall be located in order for the robot unit 1 to be able to move the calibration tool 4 to the target point.
If a target point is not beyond the certain distance from the robot base 28, the target point is rejected. If a target point is beyond the certain distance from the robot base 28, the target point is evaluated according to the first evaluation in figure 3 and thereafter re-evaluated regarding whether the robot arm 3 can reach the target point, or rejected. The second evaluation is performed for each target point until the target point is either maintained or rejected.
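A corresponding sketch of the second evaluation in figure 4, under the same caveats: in_reach, beyond_min_distance and first_eval are assumed callables (first_eval being the figure 3 routine above), and the iteration cap is an assumption.

```python
def second_evaluation(point, base_pos, in_reach, beyond_min_distance,
                      first_eval, step=0.01, max_iterations=100):
    """Figure 4 sketch: if the point cannot be reached, nudge it towards the
    robot base ("+1 robot"); reject it once it falls inside the minimum
    distance from the base; otherwise re-run the first evaluation (figure 3)
    and check reach again. All predicates are assumed callables."""
    for _ in range(max_iterations):
        if in_reach(point):
            return point                  # maintained
        # move the point by a certain increment towards the robot base
        direction = [b - p for p, b in zip(point, base_pos)]
        norm = sum(d * d for d in direction) ** 0.5
        point = tuple(p + step * d / norm for p, d in zip(point, direction))
        if not beyond_min_distance(point):
            return None                   # rejected: too close to the base
        point = first_eval(point)         # re-check camera visibility and range
        if point is None:
            return None                   # rejected during the first evaluation
    return None
```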
After the target points have been evaluated, a set of maintained target points is stored in the memory unit 34 of the computer unit 30. The first evaluation and the second evaluation of the target points are performed by means of a simulation software program on the computer unit 30. The software program uses a model of the characteristics of the work station 7, the robot unit 1 and the camera unit 20.

After the evaluation of the target points, a robot pose associated with each of the maintained target points is generated by the computer unit 30. Each robot pose defines a certain orientation of the robot arm 3 that positions the calibration tool 4 in the specific target point. A specific target point can often be reached by a plurality of different robot poses.
After the robot poses have been generated for the maintained target points, each robot pose is evaluated. The evaluation of the robot poses comprises a first evaluation and a second evaluation. The first evaluation and the second evaluation of the robot poses are performed by means of a simulation software program on the computer unit 30. The software program checks whether the robot pose involves a collision and whether the calibration tool 4 is visible using a certain robot pose.
The first evaluation of the robot poses is shown in figure 5. The evaluation comprises, for each robot pose, determining whether the robot pose is collision free. If the robot pose is collision free, the robot pose is maintained. If a robot pose is not collision free, it is determined whether at least one alternative robot pose for the same target point is collision free. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point is collision free, the robot pose is rejected and the target point associated with the robot pose is also rejected.

According to an embodiment of the invention, the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose without collision. The moved target point is also re-evaluated according to the first and second evaluations of the target point according to fig. 3 and 4.

The maintained collision free robot poses are stored in the memory unit 11 of the robot controller 9 for generating the robot program or for further evaluation. A rejected robot pose and its associated target point are deleted and thus not used for the calibration.
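A minimal sketch of the first evaluation of the robot poses (figure 5): the collision_free predicate stands in for the simulation software's collision check, and the pose representation and names are invented for illustration.

```python
def evaluate_pose_collision(target_point, pose, alternative_poses, collision_free):
    """Figure 5 sketch: keep the pose if collision free, otherwise try the
    alternative poses for the same target point; if none is collision free,
    both the pose and the target point are rejected (returned as None)."""
    if collision_free(pose):
        return target_point, pose            # maintained
    for alt in alternative_poses:
        if collision_free(alt):
            return target_point, alt         # changed to the alternative pose
    return None                              # pose and target point rejected


# Toy usage: the original pose collides, the alternative does not.
result = evaluate_pose_collision("p1", {"q": [0.0, 0.0, 0.0]},
                                 [{"q": [0.2, 0.0, 0.0]}],
                                 collision_free=lambda pose: pose["q"][0] > 0.1)
print(result)  # ('p1', {'q': [0.2, 0.0, 0.0]})
```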
The second evaluation of the robot poses is shown in figure 6. The evaluation comprises, for each robot pose, determining whether the calibration tool 4 is visible for the camera unit 20 using the robot pose. If the calibration tool 4 is visible for the camera unit 20 using the robot pose, the robot pose is maintained. If the calibration tool 4 is not visible for the camera unit 20 using the robot pose, it is determined whether at least one alternative robot pose for the same target point makes the calibration tool 4 visible for the camera unit 20. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point enables the calibration tool 4 to be visible for the camera unit 20, the robot pose is rejected and the target point associated with the robot pose is also rejected.
According to an embodiment of the invention, the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose which enables the calibration tool 4 to be visible for the camera unit 20. The moved target point is also re-evaluated according to the first and second evaluations of the target point according to fig. 3 and 4.
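The second evaluation of the robot poses (figure 6) has the same structure; only the predicate changes. A sketch under the same assumptions (tool_visible is an assumed simulation callback):

```python
def evaluate_pose_visibility(target_point, pose, alternative_poses, tool_visible):
    """Figure 6 sketch: identical structure to the collision check, but the
    predicate asks whether the calibration tool is visible for the camera
    unit in the given robot pose."""
    if tool_visible(pose):
        return target_point, pose            # maintained
    for alt in alternative_poses:
        if tool_visible(alt):
            return target_point, alt         # changed to the alternative pose
    return None                              # pose and target point rejected
```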
After the robot poses have been evaluated, a set of maintained robot poses is stored in the memory unit 34 of the computer unit 30. The first evaluation and the second evaluation of the robot poses are performed by means of a simulation software program on the computer unit 30. The software program uses a model of the characteristics of the work station 7, the robot unit 1 and the camera unit 20. The rejected robot poses and their associated target points are deleted and are thus not used for the calibration.

After the evaluation of the target points and the evaluation of the robot poses, a set of maintained target points and a set of maintained robot poses are stored in the memory unit 34 of the computer unit 30 and the information is transferred to the robot controller 9. A robot program is generated by the robot controller 9 based on the set of maintained target points and the set of maintained robot poses. After the robot program has been generated, the robot program is executed while calibrating the first coordinate system with the second coordinate system.

The method has the benefit that the target points and the robot poses are generated and evaluated before the actual calibration procedure. Accordingly, the duration of the calibration, during which the robot unit 1 by means of the robot arm 3 moves the calibration tool 4 to the plurality of target points while calibrating the first coordinate system with the second coordinate system, is reduced in comparison to prior art calibration methods. A further benefit is that the operator is not required to visually confirm the target points and the robot poses while the robot arm 3 moves the calibration tool 4 to the plurality of target points by means of the plurality of robot poses.
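Finally, a sketch of how the maintained target points and robot poses could be assembled into a calibration program: the step names "move_to" and "capture" are invented for illustration and do not correspond to any real robot controller programming language.

```python
def generate_robot_program(maintained):
    """Sketch: turn the maintained (target point, robot pose) pairs into a flat
    list of program steps: move to the pose, then let the camera unit capture
    the calibration tool at the target point. Step names are hypothetical."""
    program = []
    for target_point, pose in maintained:
        program.append(("move_to", pose))
        program.append(("capture", target_point))
    return program


program = generate_robot_program([((0.0, 0.0, 0.8), {"q": [0.0, 0.3, 0.1]}),
                                  ((0.1, 0.0, 0.8), {"q": [0.1, 0.3, 0.1]})])
print(len(program), "steps")
```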
The present invention is not limited to the embodiments disclosed but may be varied and modified within the scope of the following claims.
It shall be understood that the robot controller 9, the information processing unit 22 and the computer unit 30 may be the same unit. It shall also be understood that a single CPU 10 and a single memory unit 11 of the robot controller 9 may be used for achieving the method. It shall furthermore be understood that the evaluation steps of the target points and robot poses are performed "off-line" without requiring movement of the robot arm 3 of the robot unit 1.
It shall furthermore be understood that the first and the second evaluations in fig. 7 also involve the corrective measures of moving the target point.

Claims

1. A method for calibrating a first coordinate system of a robot unit (1) with a second coordinate system of an object identification unit (2), wherein the robot unit (1) comprises a robot arm (3) with a calibration tool (4) and the object identification unit (2) comprises a camera unit (20), the method comprises:
- generating a plurality of target points to which the calibration tool (4) is to be moved by the robot arm (3) for calibration,
the method further comprises, for each target point, evaluating the target point by executing the following steps until all target points are either maintained or rejected:
a) determining whether the target point is visible for the camera unit (20),
b) maintaining the target point if the target point is visible for the camera unit (20),
c) if the target point is not visible for the camera unit (20), execute the following steps:
d) moving the target point by a certain increment in a direction towards the camera unit (20),
e) determining whether the moved target point is within a certain range of distance from the camera unit (20),
f) reject the moved target point if the moved target point is not within said range of distance from the camera unit (20),
g) if the moved target point is within the certain range of distance from the camera unit (20), repeat the steps a-g until the moved target point is either maintained or rejected,
and perform the following steps after the evaluation has been completed:
- generating a robot program based on the maintained target points,
- executing the robot program while calibrating the first coordinate system with the second coordinate system.
2. A method according to claim 1, wherein the method further comprises:
- generating the plurality of target points within a range of distance from a focal plane of the camera unit (20).
3. A method according to claim 1 or 2, wherein the method further comprises for each of the target points, evaluating the target points by executing the following steps until all target points are either maintained or rejected:
h) determining whether the target point is within reach of the robot arm (3),
i) maintaining the target point if the target point is within reach of the robot arm (3),
j) if the target point is not within reach of the robot arm (3), move the target point by a certain increment in a direction towards a robot base (28) of the robot arm (3),
k) determining whether the moved target point is beyond a certain distance from the robot base (28),
l) reject the moved target point if the moved target point is not beyond the said distance from the robot base (28),
m) if the moved target point is beyond said distance from the robot base (28), evaluate the moved target point according to step a,
n) repeat the steps h-m until the moved target point is either maintained or rejected if the moved target point is visible for the camera unit (20),
o) if the moved target point is not visible for the camera unit (20), execute the steps d-g until the moved target point is either maintained or rejected.
4. A method according to any of the preceding claims, wherein the method further comprises for each of the target points:
- generating a robot pose for each of the target points, in which robot pose the robot arm (3) is to be moved for calibration, the method further comprises, for each robot pose, evaluating the robot pose by executing the following steps until all robot poses are either maintained or rejected:
p) determining whether the robot pose is collision free,
q) maintain the robot pose if the robot pose is collision free,
r) if the robot pose is not collision free, determine whether at least one alternative robot pose for the target point is collision free,
s) change the robot pose to the alternative robot pose if the alternative robot pose is collision free,
t) reject the alternative robot pose and the target point if the alternative robot pose is not collision free,
perform the following steps after the evaluation has been completed:
- generating a robot program based on the maintained target points and maintained robot poses.
5. A method according to claim 4, wherein the method comprises for each of the target points:
- in step p, determining whether the calibration tool (4) is visible for the camera unit (20) using the robot pose,
- in step q, maintain the robot pose if the calibration tool (4) is visible for the camera unit (20) using the robot pose,
- in step r, if the calibration tool (4) is not visible for the camera unit (20) using the robot pose, determine whether at least one alternative robot pose for the target point is visible for the camera unit (20),
- in step s, change the robot pose to the alternative robot pose if the alternative robot pose is visible for the camera unit (20),
- in step t, reject the alternative robot pose and the target point if the alternative robot pose is not visible for the camera unit (20).
6. A method according to any of claims 4-5, wherein the robot pose comprises a first orientation of the robot arm (3) and the at least one alternative robot pose comprises a second orientation of the robot arm (3).
7. A method according to any of claims 4-6, wherein the robot pose comprises a first orientation of the calibration tool and the at least one alternative robot pose comprises a second orientation of the calibration tool.
8. A method according to any of the preceding claims, wherein the steps a-t are simulated based on a virtual model of the robot arm (3), characteristics and position of the camera unit (20), and characteristics of a work station (7) of the robot unit (1 ).
9. A method according to any of the preceding claims, wherein the determination on whether the target point is visible for the camera unit (20) is based on information on the geometry of a work station (7) of the robot unit (1) and the position of the camera unit (20).
10. A method according to any of the preceding claims, wherein the determination of whether the target point is within reach of the robot arm (3) is based on the position of the robot base (28) and the geometry of the robot unit (1).
11. A method according to any of the preceding claims, wherein the determination of whether the robot poses are collision free is based on information on the geometry of the robot unit (1) and a work station (7) of the robot unit (1).
12. A method according to any of the preceding claims, wherein the determination of whether the robot poses are visible for the camera unit (20) is based on information on the geometry of the robot unit (1), the calibration tool (4) and the camera unit (20).
13. A computer unit (30) adapted to execute the method according to any of claims 1-12.
14. A robot unit (1) comprising an object identification unit (2), which robot unit (1) is adapted to execute the method according to any of claims 1-12.
15. Use of a robot unit (1 ) according to claim 14.
PCT/EP2010/068997 2010-12-06 2010-12-06 A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit WO2012076038A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/068997 WO2012076038A1 (en) 2010-12-06 2010-12-06 A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/068997 WO2012076038A1 (en) 2010-12-06 2010-12-06 A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit

Publications (1)

Publication Number Publication Date
WO2012076038A1 true WO2012076038A1 (en) 2012-06-14

Family

ID=43909967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/068997 WO2012076038A1 (en) 2010-12-06 2010-12-06 A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit

Country Status (1)

Country Link
WO (1) WO2012076038A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103363899A (en) * 2013-07-05 2013-10-23 科瑞自动化技术(深圳)有限公司 Calibration device and calibration method for calibrating coordinate system of robot arm
WO2014065744A1 (en) 2012-10-23 2014-05-01 Cognibotics Ab Method and system for determination of at least one property of a joint
WO2014161603A1 (en) * 2013-04-05 2014-10-09 Abb Technology Ltd A robot system and method for calibration
JP2015062991A (en) * 2013-08-28 2015-04-09 キヤノン株式会社 Coordinate system calibration method, robot system, program, and recording medium
CN105773613A (en) * 2016-03-30 2016-07-20 东莞市速美达自动化有限公司 Horizontal robot camera coordinate system calibration method
WO2017167687A2 (en) 2016-03-29 2017-10-05 Cognibotics Ab Method, constraining device and system for determining geometric properties of a manipulator
CN108463313A (en) * 2016-02-02 2018-08-28 Abb瑞士股份有限公司 Robot system is calibrated
CN108942927A (en) * 2018-06-29 2018-12-07 齐鲁工业大学 A method of pixel coordinate and mechanical arm coordinate unification based on machine vision
CN111531547A (en) * 2020-05-26 2020-08-14 华中科技大学 Robot calibration and detection method based on vision measurement
US10926414B2 (en) 2017-09-29 2021-02-23 Industrial Technology Research Institute System and method for calibrating tool center point of robot
CN112802122A (en) * 2021-01-21 2021-05-14 珠海市运泰利自动化设备有限公司 Robot vision guiding assembly method
CN113246128A (en) * 2021-05-20 2021-08-13 菲烁易维(重庆)科技有限公司 Robot teaching method based on vision measurement technology
CN113635311A (en) * 2021-10-18 2021-11-12 杭州灵西机器人智能科技有限公司 Method and system for out-of-hand calibration of eye for fixing calibration plate

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1468792A2 (en) 2003-04-16 2004-10-20 VMT Bildverarbeitungssysteme GmbH Method for robot calibration
US20090062960A1 (en) * 2007-08-30 2009-03-05 Sekar Krishnasamy Method and system for robot calibrations with a camera
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
EP2070664A1 (en) * 2007-12-14 2009-06-17 Montanuniversität Leoben Object processing system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1468792A2 (en) 2003-04-16 2004-10-20 VMT Bildverarbeitungssysteme GmbH Method for robot calibration
US20090062960A1 (en) * 2007-08-30 2009-03-05 Sekar Krishnasamy Method and system for robot calibrations with a camera
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
EP2070664A1 (en) * 2007-12-14 2009-06-17 Montanuniversität Leoben Object processing system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HANQI ZHUANG ET AL: "Camera-Assisted Calibration of SCARA Arms", IEEE ROBOTICS & AUTOMATION MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 3, no. 4, 1 December 1996 (1996-12-01), pages 46 - 53, XP011089687, ISSN: 1070-9932, DOI: DOI:10.1109/100.556482 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104736307A (en) * 2012-10-23 2015-06-24 康格尼博提克斯股份公司 Method and system for determination of at least one property of a joint
WO2014065744A1 (en) 2012-10-23 2014-05-01 Cognibotics Ab Method and system for determination of at least one property of a joint
US9645565B2 (en) 2012-10-23 2017-05-09 Cognibotics Ab Method and system for determination of at least one property of a joint
CN104736307B (en) * 2012-10-23 2017-03-08 康格尼博提克斯股份公司 For determining the method and system of at least one characteristic of joint
US9457470B2 (en) 2013-04-05 2016-10-04 Abb Technology Ltd Robot system and method for calibration
CN105073348B (en) * 2013-04-05 2016-11-09 Abb技术有限公司 Robot system and method for calibration
CN105073348A (en) * 2013-04-05 2015-11-18 Abb技术有限公司 A robot system and method for calibration
WO2014161603A1 (en) * 2013-04-05 2014-10-09 Abb Technology Ltd A robot system and method for calibration
CN103363899A (en) * 2013-07-05 2013-10-23 科瑞自动化技术(深圳)有限公司 Calibration device and calibration method for calibrating coordinate system of robot arm
CN103363899B (en) * 2013-07-05 2016-09-14 深圳科瑞技术股份有限公司 A kind of caliberating device for demarcating coordinate system of robot arm and scaling method
JP2015062991A (en) * 2013-08-28 2015-04-09 キヤノン株式会社 Coordinate system calibration method, robot system, program, and recording medium
CN108463313A (en) * 2016-02-02 2018-08-28 Abb瑞士股份有限公司 Robot system is calibrated
US11230011B2 (en) 2016-02-02 2022-01-25 Abb Schweiz Ag Robot system calibration
CN109196429B (en) * 2016-03-29 2021-10-15 康格尼博提克斯股份公司 Method, constraint device and system for determining geometrical characteristics of manipulator
WO2017167687A2 (en) 2016-03-29 2017-10-05 Cognibotics Ab Method, constraining device and system for determining geometric properties of a manipulator
CN109196429A (en) * 2016-03-29 2019-01-11 康格尼博提克斯股份公司 For determining method, restraint device and the system of the geometrical property of executor
US11192243B2 (en) 2016-03-29 2021-12-07 Cognibotics Ab Method, constraining device and system for determining geometric properties of a manipulator
CN105773613A (en) * 2016-03-30 2016-07-20 东莞市速美达自动化有限公司 Horizontal robot camera coordinate system calibration method
US10926414B2 (en) 2017-09-29 2021-02-23 Industrial Technology Research Institute System and method for calibrating tool center point of robot
CN108942927A (en) * 2018-06-29 2018-12-07 齐鲁工业大学 A method of pixel coordinate and mechanical arm coordinate unification based on machine vision
CN108942927B (en) * 2018-06-29 2022-04-26 齐鲁工业大学 Method for unifying pixel coordinates and mechanical arm coordinates based on machine vision
CN111531547B (en) * 2020-05-26 2021-10-26 华中科技大学 Robot calibration and detection method based on vision measurement
CN111531547A (en) * 2020-05-26 2020-08-14 华中科技大学 Robot calibration and detection method based on vision measurement
CN112802122A (en) * 2021-01-21 2021-05-14 珠海市运泰利自动化设备有限公司 Robot vision guiding assembly method
CN112802122B (en) * 2021-01-21 2023-08-29 珠海市运泰利自动化设备有限公司 Robot vision guiding assembly method
CN113246128A (en) * 2021-05-20 2021-08-13 菲烁易维(重庆)科技有限公司 Robot teaching method based on vision measurement technology
CN113246128B (en) * 2021-05-20 2022-06-21 菲烁易维(重庆)科技有限公司 Robot teaching method based on vision measurement technology
CN113635311A (en) * 2021-10-18 2021-11-12 杭州灵西机器人智能科技有限公司 Method and system for out-of-hand calibration of eye for fixing calibration plate

Similar Documents

Publication Publication Date Title
WO2012076038A1 (en) A method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit
US10525597B2 (en) Robot and robot system
EP3705239B1 (en) Calibration system and method for robotic cells
US11396100B2 (en) Robot calibration for AR and digital twin
US8989897B2 (en) Robot-cell calibration
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US9043024B2 (en) Vision correction method for tool center point of a robot manipulator
KR102276259B1 (en) Calibration and operation of vision-based manipulation systems
EP2350750B1 (en) A method and an apparatus for calibration of an industrial robot system
US20190022867A1 (en) Automatic Calibration Method For Robot System
US20090234502A1 (en) Apparatus for determining pickup pose of robot arm with camera
US20140229005A1 (en) Robot system and method for controlling the same
JP6235664B2 (en) Measuring device used to calibrate mechanism parameters of robot
JP2018012184A (en) Control device, robot, and robot system
WO2012004232A2 (en) A method for calibration of a robot positioned on a movable platform
CN103442858A (en) Robotic work object cell calibration device, system, and method
JP6900290B2 (en) Robot system
Mustafa et al. A geometrical approach for online error compensation of industrial manipulators
JP2006110705A5 (en)
JP6565175B2 (en) Robot and robot system
JP6869159B2 (en) Robot system
WO2014206787A1 (en) Method for robot calibration
US20190030722A1 (en) Control device, robot system, and control method
CN109531604A (en) Robot controller, measuring system and the calibration method calibrated
US20090228144A1 (en) Method For Calculating Rotation Center Point And Axis Of Rotation, Method For Generating Program, Method For Moving Manipulator And Positioning Device, And Robotic System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10784327

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10784327

Country of ref document: EP

Kind code of ref document: A1