WO2021012124A1 - Robot hand-eye calibration method and apparatus, computing device, medium, and product - Google Patents

Robot hand-eye calibration method and apparatus, computing device, medium, and product

Info

Publication number
WO2021012124A1
WO2021012124A1 (application PCT/CN2019/096908)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
laser
calibration
robot arm
projection
Prior art date
Application number
PCT/CN2019/096908
Other languages
English (en)
French (fr)
Inventor
贺银增
陈颀潇
Original Assignee
西门子(中国)有限公司 (Siemens Ltd., China)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西门子(中国)有限公司 (Siemens Ltd., China)
Priority to PCT/CN2019/096908 priority Critical patent/WO2021012124A1/zh
Priority to US17/625,824 priority patent/US20220250248A1/en
Priority to EP19938228.4A priority patent/EP3974779A4/en
Priority to CN201980096398.4A priority patent/CN113825980B/zh
Publication of WO2021012124A1 publication Critical patent/WO2021012124A1/zh

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39057Hand eye calibration, eye, camera on hand, end effector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Definitions

  • the present disclosure generally relates to the field of robotics, and more specifically, to methods, devices, computing devices, media, and products for robot hand-eye calibration.
  • the hand-eye system is a vision system composed of a camera and a robot arm.
  • the camera is equivalent to a human eye
  • the end of the robot arm is equivalent to a human hand.
  • For vision positioning to guide the robot arm in performing tasks, the camera coordinates and the robot coordinates must first be calibrated so that the camera coordinates obtained by vision positioning can be converted into robot coordinates to complete the visual guidance.
  • the hand-eye calibration is the key to the vision-guided robot arm.
  • At present, this hand-eye calibration process is usually performed manually, and the robot must be taught with the camera.
  • Specifically, a calibration needle is installed at the end of the robot arm, and the robot arm is manually operated to move to nine points on the calibration board. Because the target positions in both the camera coordinate system and the robot coordinate system must be collected to compute the calibration data, this requires a great deal of work from the developer. In addition, the accuracy of the calibration needle affects the calibration accuracy, manually moving the robot arm to the nine points demands high precision, the calibration accuracy is strongly affected by human factors, and the calibration takes a long time. The traditional hand-eye calibration method therefore suffers from a complex calibration process, low calibration efficiency, and calibration accuracy that is heavily affected by human factors.
  • the present disclosure provides an automatic hand-eye calibration method with higher measurement accuracy.
  • a laser is introduced on the robot arm, and the laser can be used to realize automatic hand-eye calibration.
  • A robot hand-eye calibration method is provided, which includes a coordinate recording step: controlling the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: controlling the laser mounted on the robot arm to project onto the calibration board, recording the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and controlling the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system; and a transformation matrix calculation step: calculating the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end point of the robot arm in the robot coordinate system at each projection.
  • the laser includes a laser distance measuring component and a laser projection component, wherein the laser projection component is used to project on the calibration plate.
  • The advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.
  • Before the robot coordinate recording step is performed, the method further includes a parallel correction step: using the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.
  • The parallel correction step includes: controlling the end of the robot arm to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points; calculating the motion plane of the end of the robot arm from the three points; using the laser distance measuring component to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board; and adjusting the posture of the end of the robot arm according to the calculated angle so that the imaging plane of the camera provided at the end of the robot arm is parallel to the calibration board.
  • the imaging plane of the camera can be parallel to the plane of the calibration plate, thereby reducing the calibration error caused by the non-parallelism of the imaging plane of the camera and the plane of the calibration plate, and improving the accuracy of hand-eye calibration.
  • Before the robot coordinate recording step is performed, the method further includes a distance adjustment step: measuring the distance between the camera and the calibration board with the laser distance measuring component, and adjusting the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.
  • A robot hand-eye calibration device is provided, including: a coordinate recording unit that controls the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: controls the laser mounted on the robot arm to project onto the calibration board, records the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and controls the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system; and a transformation matrix calculation unit that calculates the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end of the robot arm in the robot coordinate system at each projection.
  • the laser includes a laser distance measuring component and a laser projection component, wherein the laser projection component is used to project on the calibration plate.
  • The advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.
  • The robot hand-eye calibration device further includes a parallel correction unit that controls the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.
  • The parallel correction unit is further configured to: control the end of the robot arm to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points; calculate the motion plane of the end of the robot arm from the three points; use the laser distance measuring component to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board; and adjust the posture of the end of the robot arm according to the calculated angle so that the imaging plane of the camera provided at the end of the robot arm is parallel to the calibration board.
  • The robot hand-eye calibration device further includes a distance adjustment unit that controls the laser distance measuring component to measure the distance between the camera and the calibration board, and adjusts the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.
  • A robot arm is provided, including: a laser; and a camera; where the laser includes a laser distance measuring component and a laser projection component, the laser projection component is used to project onto the calibration board, and the laser distance measuring component is used to correct the imaging plane of the camera to be parallel to the plane of the calibration board.
  • A computing device is provided, including: at least one processor; and a memory coupled with the at least one processor, the memory being used to store instructions that, when executed by the at least one processor, cause the processor to perform the method described above.
  • a non-transitory machine-readable storage medium which stores executable instructions, which when executed cause the machine to perform the method as described above.
  • A computer program product is provided that is tangibly stored on a computer-readable medium and includes computer-executable instructions that, when executed, cause at least one processor to perform the method described above.
  • Fig. 1 shows a schematic diagram of a robot arm according to an embodiment of the present disclosure
  • Fig. 2 shows a flowchart of an exemplary process of a robot hand-eye calibration method according to an embodiment of the present disclosure
  • FIG. 3 is a flowchart showing an exemplary process of the operation in block S206 in FIG. 2;
  • FIG. 4 is a block diagram showing an exemplary configuration of a robot hand-eye calibration device according to an embodiment of the present disclosure.
  • FIG. 5 shows a block diagram of a computing device for hand-eye calibration of a robot according to an embodiment of the present disclosure.
  • the term “including” and its variants means open terms, meaning “including but not limited to.”
  • the term “based on” means “based at least in part on.”
  • the terms “one embodiment” and “an embodiment” mean “at least one embodiment.”
  • the term “another embodiment” means “at least one other embodiment.”
  • the terms “first”, “second”, etc. may refer to different or the same objects. Other definitions can be included below, either explicit or implicit. Unless clearly indicated in the context, the definition of a term is consistent throughout the specification.
  • the present disclosure provides an automatic hand-eye calibration method with higher measurement accuracy.
  • a laser is introduced on the robot arm, and the laser can be used to realize automatic hand-eye calibration.
  • Fig. 1 shows a schematic diagram of a robot according to an embodiment of the present disclosure.
  • 102 is a robot arm
  • 104 is a laser set at the end of the robot arm
  • 106 is a camera set at the end of the robot arm
  • 108 is a calibration board.
  • the laser 104 includes a laser distance measuring component and a laser projection component.
  • the laser projection component is used to project on the calibration plate 108, and the laser distance measurement component is used to correct the imaging plane of the camera 106 to be parallel to the plane of the calibration plate 108.
  • The laser 104 may integrate two separate laser components, a laser distance measuring component and a laser projection component, or it may be implemented by a single laser that has both a distance measuring function and a projection function.
  • FIG. 1 is a schematic diagram of a robot arm provided with a laser, in which the relationships between the components are only schematic; those skilled in the art can arrange the positional relationships between the components as needed and are not limited to what is shown in FIG. 1.
  • FIG. 2 shows a flowchart of an exemplary process of a robot hand-eye calibration method 200 according to an embodiment of the present disclosure.
  • The coordinate recording step is performed: the end of the robot arm is controlled to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position the laser mounted on the robot arm is controlled to project onto the calibration board, the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection are recorded, and the camera at the end of the robot arm is controlled to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system.
  • the laser includes a laser distance measuring component and a laser projection component, wherein the laser projection component is used for projecting on the calibration plate.
  • the advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.
  • the recorded coordinates of the projection in the camera coordinate system are the coordinates of the geometric center of the projection.
  • the end of the robot arm can be controlled by a program to move to at least three predetermined positions in sequence.
  • the three predetermined positions are set to be non-collinear, and the plane formed by the three predetermined positions is parallel to the imaging plane of the camera.
  • The distance between the three points must take the size of the calibration board into account: it should not be so large that the projections fall outside the calibration board, nor so small that it degrades the calibration accuracy.
  • Preferably, the span between the three points can be approximately two-thirds of the length and width of the calibration board.
  • The transformation matrix calculation step is executed: the calibration transformation matrix is calculated from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end point of the robot arm in the robot coordinate system at each projection.
  • Before the hand-eye calibration process, the method according to the present disclosure may further include the parallel correction step in block S206: using the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.
  • the operation in block S206 can be implemented through the following process.
  • First, the end of the robot arm is controlled to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points. The motion plane of the end of the robot arm is then calculated from these three points. Next, the laser distance measuring component is used to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board.
  • That is, before robot hand-eye calibration is performed, the parallel correction step first determines a posture of the end of the robot arm in which the imaging plane of the camera at the end of the robot arm is parallel to the plane of the calibration board. Thereafter, whenever the robot arm is moved for hand-eye calibration, its posture is kept unchanged and it moves only in the x, y, and z directions.
  • The method according to the present disclosure may further include a distance adjustment step in block S208: measuring the distance between the camera and the calibration board with the laser distance measuring component, and adjusting the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.
  • The industrial lenses usually used with robots do not auto-focus; manual focusing is required. This means that objects at one particular distance are photographed most clearly, and the image is blurred if the object is too close to or too far from the lens.
  • On the other hand, at different distances the imaged size of an object on the CCD/CMOS sensor differs (reflected in the physical distance corresponding to two pixels being different), and calibration is only meaningful at a fixed distance. Therefore, by keeping a predetermined distance between the camera and the calibration board through the operations in block S208 above, the sharpness of the camera image can be ensured and the calibration accuracy further improved.
  • FIG. 4 is a block diagram showing an exemplary configuration of a robot hand-eye calibration device 400 according to an embodiment of the present disclosure.
  • the robot hand-eye calibration device 400 includes: a coordinate recording unit 402 and a transformation matrix calculation unit 404.
  • The coordinate recording unit 402 controls the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: controls the laser mounted on the robot arm to project onto the calibration board, records the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and controls the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system.
  • The transformation matrix calculation unit 404 calculates the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end of the robot arm in the robot coordinate system at each projection.
  • the laser includes a laser distance measuring component and a laser projection component, wherein the laser projection component is used for projecting on the calibration plate.
  • the advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.
  • The robot hand-eye calibration device 400 may further include a parallel correction unit 406 that controls the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.
  • The parallel correction unit 406 may be configured to: control the end of the robot arm to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points; calculate the motion plane of the end of the robot arm from the three points; use the laser distance measuring component to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board; and adjust the posture of the end of the robot arm according to the calculated angle so that the imaging plane of the camera provided at the end of the robot arm is parallel to the calibration board.
  • The robot hand-eye calibration device 400 may further include a distance adjustment unit 408 that controls the laser distance measuring component to measure the distance between the camera and the calibration board, and adjusts the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.
  • each part of the robot hand-eye calibration device 400 may be the same or similar to the relevant parts of the embodiment of the robot hand-eye calibration method of the present disclosure described with reference to FIGS. 1-3, and will not be described in detail here.
  • the above robot hand-eye calibration device can be implemented by hardware, or by software or a combination of hardware and software.
  • the robot hand-eye calibration device 400 may be implemented using a computing device.
  • FIG. 5 shows a block diagram of a computing device 500 for hand-eye calibration of a robot according to an embodiment of the present disclosure.
  • The computing device 500 may include at least one processor 502 that executes at least one computer-readable instruction (that is, the above-mentioned elements implemented in the form of software) stored or encoded in a computer-readable storage medium, i.e., the memory 504.
  • Computer-executable instructions are stored in the memory 504 that, when executed, cause the at least one processor 502 to perform the following actions: control the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: control the laser mounted on the robot arm to project onto the calibration board, record the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and control the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system; and calculate the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end point of the robot arm in the robot coordinate system at each projection.
  • A non-transitory machine-readable medium may have machine-executable instructions (that is, the above-mentioned elements implemented in the form of software) that, when executed by a machine, cause the machine to perform the methods described in the various embodiments of the present disclosure.
  • A computer program product including computer-executable instructions is provided that, when executed, cause at least one processor to perform the methods described above in conjunction with FIGS. 1-4 in the various embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

A robot hand-eye calibration method, apparatus, computing device, storage medium, and computer program product. The robot hand-eye calibration method includes a coordinate recording step: controlling the end of the robot arm (102) to move in sequence to at least three predetermined positions above the calibration board (108), and at each predetermined position: controlling the laser (104) mounted on the robot arm (102) to project onto the calibration board, recording the coordinates of the end point of the robot arm (102) in the robot coordinate system at the time of projection, and controlling the camera (106) at the end of the robot arm (102) to photograph the projection on the calibration board (108) and record the coordinates of the projection in the camera coordinate system; and a transformation matrix calculation step: calculating the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board (108) in the camera coordinate system and the coordinates of the end point of the robot arm (102) in the robot coordinate system at each projection.

Description

Robot hand-eye calibration method, apparatus, computing device, medium, and product

Technical Field

The present disclosure generally relates to the field of robotics and, more specifically, to robot hand-eye calibration methods, apparatuses, computing devices, media, and products.

Background

In industrial applications, robots rely on hand-eye systems to perform machining, assembly, and similar tasks. A hand-eye system is a vision system composed of a camera and a robot arm, in which the camera acts as the human eye and the end of the robot arm acts as the human hand.

For vision positioning to guide the robot arm in performing tasks, the camera coordinates and the robot coordinates must first be calibrated so that the camera coordinates obtained by vision positioning can be converted into robot coordinates to complete the visual guidance; hand-eye calibration is the key to the vision-guided robot arm.

At present, this hand-eye calibration process is usually completed manually, and the robot must be taught with the camera. Specifically, a calibration needle is installed at the end of the robot arm, and the robot arm is manually operated to move to nine points on the calibration board. Because the target positions in both the camera coordinate system and the robot coordinate system must be collected to compute the calibration data, this requires a great deal of work from the developer. In addition, the accuracy of the calibration needle affects the calibration accuracy, manually moving the robot arm to the nine points demands high precision, the calibration accuracy is strongly affected by human factors, and the calibration takes a long time. The traditional hand-eye calibration method therefore suffers from a complex calibration process, low calibration efficiency, and calibration accuracy that is heavily affected by human factors.

In addition, the calibration accuracy is also affected when the calibration tool center point is not exactly perpendicular to the calibration board.

An automatic hand-eye calibration method with higher measurement accuracy is therefore needed.
Summary

A brief summary of the present invention is given below in order to provide a basic understanding of some of its aspects. It should be understood that this summary is not an exhaustive overview of the invention. It is not intended to identify key or critical parts of the invention, nor to limit its scope. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description discussed later.

In view of the above, the present disclosure provides an automatic hand-eye calibration method with higher measurement accuracy. In the technical solution of the present disclosure, a laser is introduced on the robot arm, and automatic hand-eye calibration can be realized with this laser.

According to one aspect of the present disclosure, a robot hand-eye calibration method is provided, including a coordinate recording step: controlling the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: controlling the laser mounted on the robot arm to project onto the calibration board, recording the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and controlling the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system; and a transformation matrix calculation step: calculating the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end point of the robot arm in the robot coordinate system at each projection.

In this way, by using a program to control the laser to project onto the calibration board at least three times, photographing each projection with the camera, and calculating the calibration transformation matrix from the recorded coordinates of these projections in the camera coordinate system and the coordinates of the end point of the robot arm in the robot coordinate system at each projection, the hand-eye calibration process can be completed automatically without manual operation.

Optionally, in one example of the above aspect, the laser includes a laser distance measuring component and a laser projection component, where the laser projection component is used to project onto the calibration board.

Optionally, in one example of the above aspect, the advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.

Optionally, in one example of the above aspect, before the robot coordinate recording step is performed, the method further includes a parallel correction step: using the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.

Optionally, in one example of the above aspect, the parallel correction step includes: controlling the end of the robot arm to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points; calculating the motion plane of the end of the robot arm from the three points; using the laser distance measuring component to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board; and adjusting the posture of the end of the robot arm according to the calculated angle so that the imaging plane of the camera provided at the end of the robot arm is parallel to the calibration board.

In this way, the imaging plane of the camera can be made parallel to the plane of the calibration board, reducing the calibration error caused by non-parallelism between the imaging plane of the camera and the plane of the calibration board and improving the accuracy of hand-eye calibration.

Optionally, in one example of the above aspect, before the robot coordinate recording step is performed, the method further includes a distance adjustment step: measuring the distance between the camera and the calibration board with the laser distance measuring component, and adjusting the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.

In this way, a suitable distance can be maintained between the camera and the calibration board during shooting to ensure the sharpness of the camera image, further improving the calibration accuracy.
According to another aspect of the present disclosure, a robot hand-eye calibration device is provided, including: a coordinate recording unit that controls the end of the robot arm to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: controls the laser mounted on the robot arm to project onto the calibration board, records the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection, and controls the camera at the end of the robot arm to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system; and a transformation matrix calculation unit that calculates the calibration transformation matrix from the recorded coordinates of the at least three projections on the calibration board in the camera coordinate system and the coordinates of the end of the robot arm in the robot coordinate system at each projection.

Optionally, in one example of the above aspect, the laser includes a laser distance measuring component and a laser projection component, where the laser projection component is used to project onto the calibration board.

Optionally, in one example of the above aspect, the advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.

Optionally, in one example of the above aspect, the robot hand-eye calibration device further includes a parallel correction unit that controls the laser distance measuring component of the laser to correct the imaging plane of the camera at the end of the robot arm to be parallel to the plane of the calibration board.

Optionally, in one example of the above aspect, the parallel correction unit is further configured to: control the end of the robot arm to move, above the calibration board and within a plane parallel to the imaging plane of the camera, to three non-collinear points; calculate the motion plane of the end of the robot arm from the three points; use the laser distance measuring component to calculate the included angle between the motion plane and the calibration board, thereby obtaining the included angle between the imaging plane and the calibration board; and adjust the posture of the end of the robot arm according to the calculated angle so that the imaging plane of the camera provided at the end of the robot arm is parallel to the calibration board.

Optionally, in one example of the above aspect, the robot hand-eye calibration device further includes a distance adjustment unit that controls the laser distance measuring component to measure the distance between the camera and the calibration board, and adjusts the camera so that a predetermined distance is maintained between the camera and the calibration board, where the predetermined distance is preset according to the focal length of the camera so that the projection on the calibration board is imaged clearly on the imaging plane of the camera.

According to another aspect of the present disclosure, a robot arm is provided, including: a laser; and a camera; where the laser includes a laser distance measuring component and a laser projection component, the laser projection component is used to project onto the calibration board, and the laser distance measuring component is used to correct the imaging plane of the camera to be parallel to the plane of the calibration board.

According to another aspect of the present disclosure, a computing device is provided, including: at least one processor; and a memory coupled with the at least one processor, the memory being used to store instructions that, when executed by the at least one processor, cause the processor to perform the method described above.

According to another aspect of the present disclosure, a non-transitory machine-readable storage medium is provided, which stores executable instructions that, when executed, cause the machine to perform the method described above.

According to another aspect of the present disclosure, a computer program product is provided that is tangibly stored on a computer-readable medium and includes computer-executable instructions that, when executed, cause at least one processor to perform the method described above.
Brief Description of the Drawings

The above and other objects, features, and advantages of the present invention will be more easily understood with reference to the following description of its embodiments taken in conjunction with the accompanying drawings. The components in the drawings merely illustrate the principles of the invention. In the drawings, identical or similar technical features or components are denoted by identical or similar reference numerals. The following drawings are intended only to illustrate and explain the invention schematically and do not limit its scope.

Fig. 1 shows a schematic diagram of a robot arm according to an embodiment of the present disclosure;

Fig. 2 shows a flowchart of an exemplary process of a robot hand-eye calibration method according to an embodiment of the present disclosure;

Fig. 3 is a flowchart showing an exemplary process of the operation in block S206 in Fig. 2;

Fig. 4 is a block diagram showing an exemplary configuration of a robot hand-eye calibration device according to an embodiment of the present disclosure; and

Fig. 5 shows a block diagram of a computing device for hand-eye calibration of a robot according to an embodiment of the present disclosure.
Reference Numerals

102: robot arm
104: laser
106: camera
108: calibration board
200: robot hand-eye calibration method
S202, S204, S206, S208, S302, S304, S306, S308: steps
400: robot hand-eye calibration device
402: coordinate recording unit
404: transformation matrix calculation unit
406: parallel correction unit
408: distance adjustment unit
500: computing device
502: processor
504: memory
Detailed Description

The subject matter described herein will now be discussed with reference to example embodiments. It should be understood that these embodiments are discussed only to enable those skilled in the art to better understand and implement the subject matter described herein, and are not limitations on the scope of protection, applicability, or examples set forth in the claims. The function and arrangement of the elements discussed can be changed without departing from the scope of the present disclosure. Each example may omit, substitute, or add various procedures or components as needed. For example, the described methods may be performed in an order different from that described, and individual steps may be added, omitted, or combined. In addition, features described with respect to some examples may also be combined in other examples.

As used herein, the term "including" and its variants denote open terms meaning "including but not limited to". The term "based on" means "based at least in part on". The terms "one embodiment" and "an embodiment" mean "at least one embodiment". The term "another embodiment" means "at least one other embodiment". The terms "first", "second", and so on may refer to different or the same objects. Other definitions, whether explicit or implicit, may be included below. Unless clearly indicated in the context, the definition of a term is consistent throughout the specification.

The present disclosure provides an automatic hand-eye calibration method with higher measurement accuracy. In the technical solution of the present disclosure, a laser is introduced on the robot arm, and automatic hand-eye calibration can be realized with this laser.

Fig. 1 shows a schematic diagram of a robot according to an embodiment of the present disclosure. In Fig. 1, 102 is a robot arm, 104 is a laser provided at the end of the robot arm, 106 is a camera provided at the end of the robot arm, and 108 is a calibration board.

The laser 104 includes a laser distance measuring component and a laser projection component; the laser projection component is used to project onto the calibration board 108, and the laser distance measuring component is used to correct the imaging plane of the camera 106 to be parallel to the plane of the calibration board 108.

The laser 104 may integrate two separate laser components, a laser distance measuring component and a laser projection component, or it may be implemented by a single laser that has both a distance measuring function and a projection function.

It will be understood that Fig. 1 is a schematic diagram of a robot arm provided with a laser, in which the relationships between the components are only schematic; those skilled in the art can arrange the positional relationships between the components as needed and are not limited to what is shown in Fig. 1.
The robot hand-eye calibration method and apparatus according to embodiments of the present disclosure will now be described with reference to the accompanying drawings.

Fig. 2 shows a flowchart of an exemplary process of a robot hand-eye calibration method 200 according to an embodiment of the present disclosure.

First, in block S202, the coordinate recording step is performed: the end of the robot arm is controlled to move in sequence to at least three predetermined positions above the calibration board, and at each predetermined position: the laser mounted on the robot arm is controlled to project onto the calibration board, the coordinates of the end point of the robot arm in the robot coordinate system at the time of projection are recorded, and the camera at the end of the robot arm is controlled to photograph the projection on the calibration board and record the coordinates of the projection in the camera coordinate system.

In an embodiment according to the present disclosure, the laser includes a laser distance measuring component and a laser projection component, where the laser projection component is used to project onto the calibration board.

The advancing direction of the laser beam of the laser distance measuring component is parallel to the optical axis of the camera, and the advancing direction of the laser beam of the laser projection component is parallel to the optical axis of the camera.

The recorded coordinates of a projection in the camera coordinate system are the coordinates of the geometric center of the projection.
通过该步骤,可以记录标定板上的至少三个投影在相机坐标系下和每次投影时所述机器人手臂末端端点在机器人坐标系下的至少三组坐标。
其中,可以通过程序来控制机器人手臂末端依次移动到至少三个预定位置,这些预定位置被设定为不共线,且它们所形成的平面平行于相机的成像平面。
此外,设定预定位置时还应考虑机器人合理的活动范围,即不应设定到机器人到达不了或不易到达的地方;三点之间的距离还需要考虑标定板的大小,既不要过大而超出标定板,也不要过小而影响标定的精度。优选地,可以将三点之间跨越的距离设为约标定板长宽的三分之二。
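上述坐标记录步骤(方框S202)的控制流程可以概括为如下极简示意。其中RobotStub、LaserStub、CameraStub以及各方法名均为假设性的占位实现,仅用于说明流程,实际使用时应替换为具体机器人、激光器与相机的控制接口:

```python
class RobotStub:
    """占位:模拟机器人手臂末端的平移控制(姿态保持不变)。"""
    def __init__(self):
        self.pos = (0.0, 0.0, 0.0)
    def move_to(self, pos):
        self.pos = pos
    def tcp_position(self):
        # 机器人坐标系下末端端点坐标
        return self.pos

class LaserStub:
    """占位:触发激光投影部件在标定板上投影。"""
    def project(self):
        pass

class CameraStub:
    """占位:拍摄标定板并返回投影几何中心的像素坐标。"""
    def capture_projection_center(self, robot):
        # 此处直接由机器人位置模拟像素坐标,仅为演示
        x, y, _ = robot.pos
        return (10.0 * x + 100.0, 10.0 * y + 200.0)

def record_calibration_pairs(robot, laser, camera, positions):
    """依次移动至各预定位置,记录每次投影时的两组坐标。"""
    pairs = []
    for pos in positions:
        robot.move_to(pos)      # 姿态不变,仅在x、y、z方向平移
        laser.project()         # 在标定板上投影
        robot_xyz = robot.tcp_position()
        cam_xy = camera.capture_projection_center(robot)
        pairs.append((robot_xyz, cam_xy))
    return pairs
```

由此得到的坐标对即为后续变换矩阵计算步骤的输入。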
然后,在方框S204中,执行变换矩阵计算步骤,根据所记录的所述标定板上的至少三个投影在所述相机坐标系下的坐标和每次投影时所述机器人手臂末端在所述机器人坐标系下的坐标来计算标定变换矩阵。
本领域技术人员可以理解基于这三组坐标来计算变换矩阵的具体过程,在此不再赘述。
在根据本实施例的方法中,通过用程序控制激光器在标定板上分别进行至少三次投影,并分别用相机拍摄这些投影,基于所记录的这些投影分别在相机坐标系下的坐标和投影时机器人手臂末端端点在机器人坐标系下的坐标来计算标定变换矩阵,可以在没有人为操作的情况下,自动完成手眼标定的过程。
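作为一种可能的实现(本公开并未限定具体解法),在相机成像平面与标定板平行、姿态保持不变的前提下,可以假设相机坐标与机器人坐标近似满足二维仿射关系,并用最小二乘由至少三组对应点求解2×3标定变换矩阵。以下为一个假设性的示意,其中solve_calibration_matrix为示例性命名:

```python
import numpy as np

def solve_calibration_matrix(cam_pts, robot_pts):
    """由至少三组对应点用最小二乘求解 2x3 仿射标定矩阵 T,
    使 robot ≈ T @ [cam_x, cam_y, 1]^T。
    三个不共线的点恰好唯一确定 T;多于三点时可抑制测量噪声。"""
    cam = np.asarray(cam_pts, dtype=float)      # N x 2,相机坐标系
    rob = np.asarray(robot_pts, dtype=float)    # N x 2,机器人坐标系
    A = np.hstack([cam, np.ones((len(cam), 1))])  # N x 3 齐次坐标
    T, *_ = np.linalg.lstsq(A, rob, rcond=None)   # 解 A @ T = rob
    return T.T                                    # 2 x 3
```

标定完成后,任一投影中心的像素坐标左乘该矩阵(齐次形式)即可换算为机器人坐标。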
在一个示例中,在进行手眼标定的过程之前,根据本公开的方法还可以包括方框S206中的平行校正步骤,利用所述激光器的激光测距部件来校正所述机器人手臂末端上的相机的成像平面与所述标定板的平面平行。
具体地,如图3所示,可以通过以下过程来实现方框S206中的操作。
首先,在方框S302中,控制所述机器人手臂末端在所述标定板上方沿平行于所述相机的成像平面的平面移动到不共线的三个点。
接着,在方框S304中,基于所述三个点计算出所述机器人手臂末端的运动平面。
在方框S306中,利用所述激光测距部件计算所述运动平面与所述标定板的夹角,从而获得所述成像平面与所述标定板的夹角。
最后,在方框S308中,根据所计算的夹角调整所述机器人手臂末端的姿态,使得设置在所述机器人手臂末端的所述相机的成像平面与所述标定板平行。
也就是说,在进行机器人手眼标定之前,先通过平行校正步骤来确定机器人手臂末端的姿态使得机器人手臂末端上的相机的成像平面与标定板平面平行。在此后移动机器人手臂进行手眼标定时都保持机器人手臂的姿态不变,而仅在x、y、z方向上移动。
通过采用图3所示的方式来实现相机的成像平面与标定板的平行,可以减小由于相机的成像平面与标定板不平行所引起的标定误差,从而可以提高手眼标定的精确度。
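上述平行校正中夹角的几何计算可示意如下。这是一个假设性示例:假设激光束方向与相机光轴平行,且测距读数为沿该方向到标定板的距离,angle_between_planes为示例性命名:

```python
import numpy as np

def angle_between_planes(tcp_points, distances, beam_dir=(0.0, 0.0, -1.0)):
    """tcp_points:运动平面上不共线的三个末端位置;
    distances:各位置处激光测距部件测得的到标定板的距离;
    beam_dir:激光束方向(假设与相机光轴平行)。
    返回运动平面与标定板平面的夹角(弧度)。"""
    tcp = np.asarray(tcp_points, dtype=float)
    d = np.asarray(distances, dtype=float)
    beam = np.asarray(beam_dir, dtype=float)
    board = tcp + d[:, None] * beam          # 激光落在标定板上的三个点
    # 两组三点分别叉乘求法向量
    n_motion = np.cross(tcp[1] - tcp[0], tcp[2] - tcp[0])
    n_board = np.cross(board[1] - board[0], board[2] - board[0])
    cosang = abs(n_motion @ n_board) / (
        np.linalg.norm(n_motion) * np.linalg.norm(n_board))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

求得夹角后,即可据此调整末端姿态,使成像平面与标定板平行;夹角为零时无需调整。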
在一个示例中,根据本公开的方法还可以包括方框S208中的距离调整步骤:用所述激光测距部件测量所述相机与所述标定板之间的距离,并且调整所述相机与所述标定板之间保持预定距离,其中,所述预定距离根据所述相机的焦距预先设定,以使得所述标定板上的投影在所述相机的成像平面上清晰成像。
由于机器人通常采用的工业镜头不具备自动对焦功能,需要手动对焦,这意味着只有在某个距离上的物体才拍得最清晰,离镜头太近或太远,图像都不清晰。另一方面,在不同距离下,物体在CCD/CMOS上的成像尺寸是不同的(反映为相邻两个像素对应的物理距离不一样),只有在固定距离下,标定才有意义。因此,通过上述方框S208中的操作使得相机与标定板之间保持预定距离,来确保相机成像的清晰度,可以进一步提高标定精确度。
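上述关于成像尺寸随距离变化的说明,可用针孔相机模型做一个简单示意(mm_per_pixel为示例性命名,参数取值仅为假设,并非本公开限定的数值):

```python
def mm_per_pixel(distance_mm, focal_mm, pixel_size_mm):
    """针孔模型下,工作距离处一个像素对应的物理尺寸:
    scale = Z * s / f(Z 为物距,s 为像元尺寸,f 为焦距)。
    距离 Z 变化会直接改变该比例,
    这也是标定必须在固定的预定距离下进行的原因。"""
    return distance_mm * pixel_size_mm / focal_mm
```

例如,在假设焦距12mm、像元5μm的情况下,工作距离300mm与450mm处一个像素对应的物理尺寸相差1.5倍,可见保持预定距离对标定的必要性。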
图4是示出了根据本公开的一个实施例的机器人手眼标定装置400的示例性配置的框图。
如图4所示,机器人手眼标定装置400包括:一个坐标记录单元402和一个变换矩阵计算单元404。
其中,坐标记录单元402控制机器人手臂末端依次移动至标定板上方的至少三个预定位置,在每一个预定位置处:控制设置在机器人手臂上的激光器在所述标定板上投影,记录投影时机器人手臂末端端点在机器人坐标系下的坐标,并且控制机器人手臂末端上的相机对所述标定板上的投影进行拍摄并记录该投影在相机坐标系下的坐标。
变换矩阵计算单元404根据所记录的所述标定板上的至少三个投影在所述相机坐标系下的坐标和每次投影时所述机器人手臂末端在所述机器人坐标系下的坐标来计算标定变换矩阵。
在根据本公开的一个实施例中,激光器包括激光测距部件和激光投影部件,其中,激光投影部件用于在标定板上投影。
其中,所述激光测距部件的激光束前进方向与所述相机的光轴平行,以及所述激光投影部件的激光束前进方向与所述相机的光轴平行。
在一个示例中,机器人手眼标定装置400还可以包括一个平行校正单元406,所述平行校正单元406控制所述激光器的激光测距部件来校正所述机器人手臂末端上的相机的成像平面与所述标定板的平面平行。
具体地,所述平行校正单元406可以被配置为:控制所述机器人手臂末端在所述标定板上方沿平行于所述相机的成像平面的平面移动到不共线的三个点;基于所述三个点计算出所述机器人手臂末端的运动平面;利用所述激光测距部件计算所述运动平面与所述标定板的夹角,从而获得所述成像平面与所述标定板的夹角;以及根据所计算的夹角调整所述机器人手臂末端的姿态,使得设置在所述机器人手臂末端的所述相机的成像平面与所述标定板平行。
在另一个示例中,机器人手眼标定装置400还可以包括一个距离调整单元408,所述距离调整单元408控制所述激光测距部件测量所述相机与所述标定板之间的距离,并且调整所述相机与所述标定板之间保持预定距离,其中,所述预定距离根据所述相机的焦距预先设定,以使得所述标定板上的投影在所述相机的成像平面上清晰成像。
机器人手眼标定装置400的各个部分的操作和功能的细节,例如可以与上文结合图1-3所描述的本公开的机器人手眼标定方法的实施例的相关部分相同或类似,这里不再详细描述。
在此需要说明的是,图4所示的机器人手眼标定装置400及其组成单元的结构仅仅是示例性的,本领域技术人员可以根据需要对图4所示的结构框图进行修改。
如上参照图1到图4,对根据本公开的实施例的机器人手眼标定方法和装置进行了描述。上面的机器人手眼标定装置可以采用硬件实现,也可以采用软件或者硬件和软件的组合来实现。
在本公开中,机器人手眼标定装置400可以利用计算设备实现。图5示出了根据本公开的实施例的对机器人进行手眼标定的计算设备500的方框图。根据一个实施例,计算设备500可以包括至少一个处理器502,处理器502执行在计算机可读存储介质(即,存储器504)中存储或编码的至少一个计算机可读指令(即,上述以软件形式实现的元素)。
在一个实施例中,在存储器504中存储计算机可执行指令,其当执行时使得至少一个处理器502完成以下动作:控制机器人手臂末端依次移动至标定板上方的至少三个预定位置,在每一个预定位置处:控制设置在机器人手臂上的激光器在所述标定板上投影,记录投影时机器人手臂末端端点在机器人坐标系下的坐标,并且控制机器人手臂末端上的相机对所述标定板上的投影进行拍摄并记录该投影在相机坐标系下的坐标;以及根据所记录的所述标定板上的至少三个投影在所述相机坐标系下的坐标和每次投影时所述机器人手臂末端端点在所述机器人坐标系下的坐标来计算标定变换矩阵。
应该理解,在存储器504中存储的计算机可执行指令当执行时使得至少一个处理器502进行本公开的各个实施例中以上结合图1-4描述的各种操作和功能。
根据一个实施例,提供了一种非暂时性机器可读介质。该非暂时性机器可读介质可以具有机器可执行指令(即,上述以软件形式实现的元素),该指令当被机器执行时,使得机器执行本公开的各个实施例中以上结合图1-4描述的各种操作和功能。
根据一个实施例,提供了一种计算机程序产品,包括计算机可执行指令,所述计算机可执行指令在被执行时使至少一个处理器执行本公开的各个实施例中以上结合图1-4描述的各种操作和功能。
上面结合附图阐述的具体实施方式描述了示例性实施例,但并不表示可以实现的或者落入权利要求书的保护范围的所有实施例。在整个本说明书中使用的术语“示例性”意味着“用作示例、实例或例示”,并不意味着比其它实施例“优选”或“具有优势”。出于提供对所描述技术的理解的目的,具体实施方式包括具体细节。然而,可以在没有这些具体细节的情况下实施这些技术。在一些实例中,为了避免使所描述的实施例的概念难以理解,公知的结构和装置以框图形式示出。
本公开内容的上述描述被提供来使得本领域任何普通技术人员能够实现或者使用本公开内容。对于本领域普通技术人员来说,对本公开内容进行的各种修改是显而易见的,并且,也可以在不脱离本公开内容的保护范围的情况下,将本文所定义的一般性原理应用于其它变型。因此,本公开内容并不限于本文所描述的示例和设计,而是与符合本文公开的原理和新颖性特征的最广范围相一致。

Claims (16)

  1. 机器人手眼标定方法,包括:
    坐标记录步骤:控制机器人手臂末端依次移动至标定板上方的至少三个预定位置,在每一个预定位置处:控制设置在机器人手臂上的激光器在所述标定板上投影,记录投影时机器人手臂末端端点在机器人坐标系下的坐标,并且控制机器人手臂末端上的相机对所述标定板上的投影进行拍摄并记录该投影在相机坐标系下的坐标;以及
    变换矩阵计算步骤:根据所记录的所述标定板上的至少三个投影在所述相机坐标系下的坐标和每次投影时所述机器人手臂末端端点在所述机器人坐标系下的坐标来计算标定变换矩阵。
  2. 如权利要求1所述的方法,其中,所述激光器包括激光测距部件和激光投影部件,其中,所述激光投影部件用于在所述标定板上投影。
  3. 如权利要求2所述的方法,其中,所述激光测距部件的激光束前进方向与所述相机的光轴平行,以及所述激光投影部件的激光束前进方向与所述相机的光轴平行。
  4. 如权利要求1-3中任意一项所述的方法,其中,在执行所述坐标记录步骤之前,所述方法还包括:
    平行校正步骤:利用所述激光器的激光测距部件来校正所述机器人手臂末端上的相机的成像平面与所述标定板的平面平行。
  5. 如权利要求4所述的方法,其中,所述平行校正步骤包括:
    控制所述机器人手臂末端在所述标定板上方沿平行于所述相机的成像平面的平面移动到不共线的三个点;
    基于所述三个点计算出所述机器人手臂末端的运动平面;
    利用所述激光测距部件计算所述运动平面与所述标定板的夹角,从而获得所述成像平面与所述标定板的夹角;以及
    根据所计算的夹角调整所述机器人手臂末端的姿态,使得设置在所述机器人手臂末端的所述相机的成像平面与所述标定板平行。
  6. 如权利要求1-3中任意一项所述的方法,其中,在执行所述坐标记录步骤之前,所述方法还包括:距离调整步骤:用所述激光测距部件测量所述相机与所述标定板之间的距离,并且调整所述相机与所述标定板之间保持预定距离,其中,所述预定距离根据所述相机的焦距预先设定,以使得所述标定板上的投影在所述相机的成像平面上清晰成像。
  7. 机器人手眼标定装置(400),包括:
    坐标记录单元(402),所述坐标记录单元(402)控制机器人手臂末端依次移动至标定板上方的至少三个预定位置,在每一个预定位置处:控制设置在机器人手臂上的激光器在所述标定板上投影,记录投影时机器人手臂末端端点在机器人坐标系下的坐标,并且控制机器人手臂末端上的相机对所述标定板上的投影进行拍摄并记录该投影在相机坐标系下的坐标;以及
    变换矩阵计算单元(404),所述变换矩阵计算单元(404)根据所记录的所述标定板上的至少三个投影在所述相机坐标系下的坐标和每次投影时所述机器人手臂末端在所述机器人坐标系下的坐标来计算标定变换矩阵。
  8. 如权利要求7所述的装置(400),其中,所述激光器包括激光测距部件和激光投影部件,其中,所述激光投影部件用于在所述标定板上投影。
  9. 如权利要求8所述的装置(400),其中,所述激光测距部件的激光束前进方向与所述相机的光轴平行,以及所述激光投影部件的激光束前进方向与所述相机的光轴平行。
  10. 如权利要求7-9中任意一项所述的装置(400),还包括:
    一个平行校正单元(406),所述平行校正单元(406)控制所述激光器的激光测距部件来校正所述机器人手臂末端上的相机的成像平面与所述标定板的平面平行。
  11. 如权利要求10所述的装置(400),其中,所述平行校正单元(406)进一步被配置为:
    控制所述机器人手臂末端在所述标定板上方沿平行于所述相机的成像平面的平面移动到不共线的三个点;
    基于所述三个点计算出所述机器人手臂末端的运动平面;
    利用所述激光测距部件计算所述运动平面与所述标定板的夹角,从而获得所述成像平面与所述标定板的夹角;以及
    根据所计算的夹角调整所述机器人手臂末端的姿态,使得设置在所述机器人手臂末端的所述相机的成像平面与所述标定板平行。
  12. 如权利要求7-9中任意一项所述的装置(400),还包括:
    一个距离调整单元(408),所述距离调整单元(408)控制所述激光测距部件测量所述相机与所述标定板之间的距离,并且调整所述相机与所述标定板之间保持预定距离,其中,所述预定距离根据所述相机的焦距预先设定,以使得所述标定板上的投影在所述相机的成像平面上清晰成像。
  13. 机器人手臂(102),包括:
    一个激光器(104);以及
    一个相机(106);
    其中,所述激光器(104)包括激光测距部件和激光投影部件,所述激光投影部件用于在标定板(108)上投影,所述激光测距部件用于校正所述相机(106)的成像平面与所述标定板(108)的平面平行。
  14. 计算设备(500),包括:
    至少一个处理器(502);以及
    与所述至少一个处理器(502)耦合的一个存储器(504),所述存储器用于存储指令,当所述指令被所述至少一个处理器(502)执行时,使得所述处理器(502)执行如权利要求1到6中任意一项所述的方法。
  15. 一种非暂时性机器可读存储介质,其存储有可执行指令,所述指令当被执行时使得机器执行如权利要求1到6中任意一项所述的方法。
  16. 一种计算机程序产品,所述计算机程序产品被有形地存储在计算机可读介质上并且包括计算机可执行指令,所述计算机可执行指令在被执行时使至少一个处理器执行根据权利要求1至6中任意一项所述的方法。
PCT/CN2019/096908 2019-07-19 2019-07-19 机器人手眼标定方法、装置、计算设备、介质以及产品 WO2021012124A1 (zh)
