WO2022062464A1 - Computer vision-based hand-eye calibration method and apparatus, and storage medium - Google Patents


Info

Publication number
WO2022062464A1
WO2022062464A1 (PCT/CN2021/097271)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
preset
target point
coordinates
transformation
Application number
PCT/CN2021/097271
Other languages
French (fr)
Chinese (zh)
Inventor
喻凌威
周宸
周宝
陈远旭
Original Assignee
平安科技(深圳)有限公司 (Ping An Technology (Shenzhen) Co., Ltd.)
Application filed by 平安科技(深圳)有限公司 (Ping An Technology (Shenzhen) Co., Ltd.)
Publication of WO2022062464A1 publication Critical patent/WO2022062464A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/77: Manipulators with motion or force scaling
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration

Definitions

  • the present application relates to the technical field of computer vision, and in particular, to a computer vision-based hand-eye calibration method and device, storage medium, and computer equipment.
  • for example, the well-known Da Vinci surgical robot uses an external high-accuracy visual tracking tool to assist the calibration process; this inevitably introduces additional calibration error sources and can easily cause calibration errors.
  • therefore, how to perform accurate hand-eye calibration for a robot with limited freedom of movement and limited range of movement has become a pressing issue.
  • the present application provides a computer vision-based hand-eye calibration method and device, storage medium, and computer equipment.
  • a computer vision-based hand-eye calibration method comprising:
  • the robotic arm of the Eye-in-hand robot is controlled to move within a preset range under the constraint of four degrees of freedom, and a calibration image is collected by a camera set at the end of the robotic arm, wherein, during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the position of the preset target point;
  • based on the calibration image, the coordinates of the preset target point are determined, together with the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm; the rigid transformation between the end of the robotic arm and the camera is then determined according to these quantities.
  • the determining the coordinates of the preset target point specifically includes:
  • based on the calibration image, the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system are determined, wherein the auxiliary point is any point on the Z axis of the camera coordinate system;
  • the coordinates of the preset target point are determined according to the preset error term formula and the coordinates of the auxiliary point.
  • the calibration image includes N images, and the preset error term formula is:

        E = Σ_{i=1}^{N} w_i · ‖(o_ref,i − p_ref) − ((o_ref,i − p_ref)^T d_ref,i) · d_ref,i‖²

    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the ith calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the ith calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the ith calibration image.
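Minimizing the error term above is a linear least-squares problem: each calibration image contributes the squared distance from p_ref to that image's camera Z-axis line, so p_ref has a closed-form solution. A minimal numpy sketch (the function name and test values are illustrative, not from the patent):

```python
import numpy as np

def estimate_target_point(origins, directions, weights=None):
    """Least-squares estimate of the point p_ref closest to N lines.

    Line i is the camera Z axis of calibration image i: it passes through
    the auxiliary point origins[i] (o_ref,i) with unit direction
    directions[i] (d_ref,i).  Minimizing
        sum_i w_i * || (o_i - p) - ((o_i - p)^T d_i) d_i ||^2
    is linear in p, so the estimate reduces to a 3x3 solve.
    """
    origins = np.asarray(origins, dtype=float)
    directions = np.asarray(directions, dtype=float)
    directions = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    if weights is None:
        weights = np.ones(len(origins))
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d, w in zip(origins, directions, weights):
        P = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to d
        A += w * P
        b += w * P @ o
    return np.linalg.solve(A, b)

# Two lines that both pass through (1, 2, 3) recover that point exactly.
target = np.array([1.0, 2.0, 3.0])
o1, d1 = np.zeros(3), target / np.linalg.norm(target)
o2 = np.array([5.0, 0.0, 0.0])
d2 = (target - o2) / np.linalg.norm(target - o2)
p = estimate_target_point([o1, o2], [d1, d2])
```

With noisy real data from N ≥ 2 images the same solve returns the weighted least-squares point closest to all Z-axis lines.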
  • the determining the coordinates of the preset target point specifically includes:
  • if the camera is a binocular camera, the first coordinate of the preset target point is determined based on the left-eye calibration image corresponding to the left-eye camera in the binocular camera, and the second coordinate of the preset target point is determined based on the right-eye calibration image corresponding to the right-eye camera in the binocular camera;
  • the average value of the first coordinate and the second coordinate is taken as the coordinate of the preset target point.
  • the determining the rigid transformation between the end of the robotic arm and the camera specifically includes:
  • the rigid transformation between the end of the robotic arm and the camera is determined according to a preset rigid transformation formula, wherein the preset rigid transformation formula is X · (T_1 · p_ref) = T_2 · p_world, where X is the rigid transformation between the end of the robotic arm and the camera, T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  • the rigid transformation includes a rotational rigid transformation and a translational rigid transformation; and the determining the rigid transformation between the end of the robotic arm and the camera specifically includes:
  • the rotational rigid transformation is determined according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; and the translational rigid transformation is determined according to the rotational rigid transformation.
  • the method further includes:
  • a control image collected by the camera is acquired, and the motion of the robotic arm is controlled based on the control image and the rigid transformation between the end of the robotic arm and the camera.
  • a computer vision-based hand-eye calibration device comprising:
  • the calibration image acquisition module is used to control the robotic arm of the Eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom, and to collect calibration images through a camera set at the end of the robotic arm, wherein, during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the preset target point position;
  • a target point coordinate determination module configured to determine the coordinates of the preset target point based on the calibration image, and to determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm;
  • a rigid transformation determination module configured to determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  • the target point coordinate determination module specifically includes:
  • an auxiliary point determination unit configured to determine the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system based on the calibration image, wherein the auxiliary point is any point on the Z axis of the camera coordinate system;
  • the first target point determination unit is configured to determine the coordinates of the preset target point according to the preset error term formula and the coordinates of the auxiliary point.
  • the calibration image includes N images, and the preset error term formula is:

        E = Σ_{i=1}^{N} w_i · ‖(o_ref,i − p_ref) − ((o_ref,i − p_ref)^T d_ref,i) · d_ref,i‖²

    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the ith calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the ith calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the ith calibration image.
  • the target point coordinate determination module specifically includes:
  • the second target point determination unit is configured to, if the camera is a binocular camera, determine the first coordinate of the preset target point based on the left-eye calibration image corresponding to the left-eye camera in the binocular camera, and determine the second coordinate of the preset target point based on the right-eye calibration image corresponding to the right-eye camera in the binocular camera;
  • a third target point determination unit configured to use the average value of the first coordinate and the second coordinate as the coordinate of the preset target point.
  • the rigid transformation determination module specifically includes:
  • a first rigid transformation determining unit configured to determine the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is X · (T_1 · p_ref) = T_2 · p_world, where X is the rigid transformation, T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  • the rigid transformation includes rotational rigid transformation and translation rigid transformation;
  • the rigid transformation determination module specifically includes:
  • a second rigid transformation determining unit configured to determine the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point;
  • a third rigid transformation determining unit configured to determine the translation rigid transformation according to the rotational rigid transformation.
  • the device further includes:
  • an image acquisition module configured to acquire a control image collected by the camera after determining the rigid transformation between the end of the robotic arm and the camera;
  • a control module configured to control the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
  • a computer-readable storage medium on which computer-readable instructions are stored, wherein the computer-readable instructions, when executed by a processor, implement the above-mentioned computer vision-based hand-eye calibration method.
  • a computer device comprising a computer-readable storage medium, a processor, and computer-readable instructions stored on the computer-readable storage medium and executable on the processor, wherein the processor, when executing the computer-readable instructions, implements the above-mentioned computer vision-based hand-eye calibration method.
  • the computer vision-based hand-eye calibration method and device, storage medium, and computer equipment of the present application realize hand-eye calibration of the eye-in-hand robot under the dual restrictions of freedom of movement and range of movement.
  • by estimating the wound position and introducing a known point in the world coordinate system, combined with the calibration images collected by the camera at the end of the robotic arm, the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm are determined. The camera position coordinates are then determined based on the first relative transformation and the estimated coordinates of the preset target point, and the position coordinates of the end of the robotic arm are determined based on the second relative transformation and the coordinates of the preset calibration point, so that the rigid transformation from the end of the robotic arm to the camera can be determined. The hand-eye calibration problem with restricted degrees of freedom and movement range is thereby transformed from the original ill-posed problem into a well-posed problem.
  • compared with the existing technology for hand-eye calibration under restricted degrees of freedom and movement range, this not only saves the tedious operation of introducing auxiliary calibration tools, but also, because there is no need to introduce auxiliary calibration tools, reduces the introduction of calibration error sources and improves calibration accuracy.
  • FIG. 1 shows a schematic flowchart of a computer vision-based hand-eye calibration method provided by an embodiment of the present application
  • FIG. 2 shows a schematic diagram of a movement degree of freedom provided by an embodiment of the present application
  • FIG. 3 shows a schematic diagram of a camera movement trajectory provided by an embodiment of the present application
  • FIG. 4 shows a schematic diagram of the movement range of the end of a robotic arm provided by an embodiment of the present application
  • FIG. 5 shows a schematic structural diagram of a computer vision-based hand-eye calibration device provided by an embodiment of the present application
  • FIG. 6 shows a schematic structural diagram of another computer vision-based hand-eye calibration device provided by an embodiment of the present application.
  • This application relates to computer vision technology, which can convert still image or video data into a decision or a new representation.
  • This application applies computer vision technology to medical robots.
  • the camera collects images and analyzes the images to obtain decision-making results to control the work of the robotic arm.
  • an embodiment of the present application provides a computer vision-based hand-eye calibration method, as shown in FIG. 1; the method includes:
  • Step 101: Control the robotic arm of the Eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom, and collect a calibration image through a camera set at the end of the robotic arm, wherein, during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the preset target point position.
  • in general, the hand-eye calibration process is the solution of equation (1): A_ij · X = X · B_ij, where A_ij is the relative transformation of the camera position from the ith frame to the jth frame in the camera coordinate system, B_ij is the relative transformation of the position of the end of the robotic arm from the ith frame to the jth frame in the world coordinate system, and X is the rigid transformation between the end of the robotic arm and the camera. In this way, dynamic calibration of the end of the robotic arm and the end camera can be realized during the movement of the robot end.
  • the commonly used method is to decouple the rotation component in formula (1) from the rigid transformation X, that is, to decompose the matrix X into rotation and translation components, and to perform the same decomposition on A_ij and B_ij. Formula (1) can then be written as a product of homogeneous matrices on both sides and, by equating corresponding elements, transformed into the following two equations (2) and (3), where R_X and t_X are the rotation and translation components of the matrix X:

        R_A · R_X = R_X · R_B                     (2)
        (R_A − I) · t_X = R_X · t_B − t_A         (3)
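The AX = XB relation and its decoupled forms (2) and (3) can be checked numerically. The sketch below constructs a hand-eye transform X and a camera motion A, derives the consistent arm motion B = X⁻¹AX, and verifies both decoupled equations (all numeric values are illustrative, not from the patent):

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot(axis, theta):
    """Rotation matrix via Rodrigues' formula."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

# Hand-eye transform X and a camera motion A; B is the arm motion
# implied by equation (1), A X = X B.
X = make_T(rot([0, 0, 1], 0.3), [0.10, -0.02, 0.05])
A = make_T(rot([1, 1, 0], 0.7), [0.20, 0.10, -0.30])
B = np.linalg.inv(X) @ A @ X

R_A, t_A = A[:3, :3], A[:3, 3]
R_B, t_B = B[:3, :3], B[:3, 3]
R_X, t_X = X[:3, :3], X[:3, 3]

# Equation (2): the rotation components satisfy R_A R_X = R_X R_B.
ok_rot = np.allclose(R_A @ R_X, R_X @ R_B)
# Equation (3): (R_A - I) t_X = R_X t_B - t_A.
ok_trans = np.allclose((R_A - np.eye(3)) @ t_X, R_X @ t_B - t_A)
```

Both checks hold by construction, which is exactly the consistency that the decoupling argument relies on.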
  • the degree of freedom and the range of motion are limited.
  • the degrees of freedom are reduced from 6 to 4 (the two translational degrees of freedom on the plane where the dots in the figure are located are lost, otherwise the wound would be enlarged). The 4 degrees of freedom are: pan rotation with the x-axis direction of the camera coordinate system as the central axis; roll rotation with the z-axis direction of the camera coordinate system as the central axis; tilt rotation with the y-axis direction of the camera coordinate system as the central axis; and insertion movement along the z-axis of the camera coordinate system.
  • the camera coordinate system is determined by the right-hand rule, and the angle ranges through which tilt and pan can rotate are limited, as shown in the figure.
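As a rough sketch of the constrained motion described above: the pose below is parametrized by pan, tilt, roll, and insertion depth so that the camera z axis always passes through the target point. The spherical pan/tilt parametrization and the `camera_pose` helper are this sketch's assumptions, not the patent's exact construction:

```python
import numpy as np

def camera_pose(p_ref, pan, tilt, roll, depth):
    """Camera pose under four remote-center-of-motion style degrees of freedom.

    pan/tilt pick a viewing direction through p_ref, depth is the insertion
    distance along the camera z axis, and roll spins the camera about z.
    The returned 4x4 pose always has its z axis passing through p_ref.
    """
    # Unit z axis pointing from the camera toward p_ref.
    z = np.array([np.cos(tilt) * np.sin(pan),
                  np.sin(tilt),
                  np.cos(tilt) * np.cos(pan)])
    cam_center = np.asarray(p_ref, float) - depth * z
    # Any x axis orthogonal to z, then rolled about z.
    x0 = np.cross([0.0, 1.0, 0.0], z)
    x0 = x0 / np.linalg.norm(x0)
    y0 = np.cross(z, x0)
    x = np.cos(roll) * x0 + np.sin(roll) * y0
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, cam_center
    return T

p_ref = np.array([0.0, 0.0, 0.5])
T = camera_pose(p_ref, pan=0.2, tilt=-0.1, roll=0.4, depth=0.3)
```

Sweeping pan and tilt over their limited ranges moves the camera center over the cone-shaped surface mentioned later, while the z axis keeps intersecting p_ref.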
  • the key problem to be solved in hand-eye calibration is to establish a rigid relationship between the end of the robotic arm and the end camera. Since this cannot be solved by the traditional methods (2) and (3) alone, the inventors introduce additional reference quantities to assist the solution process: preset calibration points and preset target points are introduced to solve the hand-eye calibration problem of surgical robots in the above technical scenario.
  • the embodiment of the present application introduces a preset target point p_ref, where p_ref is a point near the actual target point; p_ref represents the estimated location of the wound site.
  • the robotic arm of the robot is controlled to move within the range corresponding to the preset target point under the constraint of four degrees of freedom.
  • the Z axis of the camera coordinate system corresponding to the camera is controlled to always pass through the preset target point p_ref. Here p_ref serves as a reference point to be determined and lies in a certain area of the wound plane (the wound is assumed to be a plane, because the treated wound area is small enough). Under the restriction of the four degrees of freedom at the end of the robotic arm, the movement trajectory of the end of the manipulator is a three-dimensional cone-shaped space, as shown in FIG. 4, which is the main view of the movement trajectory of the end of the manipulator.
  • Step 102: Based on the calibration image, determine the coordinates of the preset target point, and determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm.
  • determining the coordinates of the preset target point in step 102 of the present application may include the following steps:
  • Step 102-1: Based on the calibration image, determine the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system, wherein the auxiliary point is any point on the Z axis of the camera coordinate system;
  • Step 102-2: Determine the coordinates of the preset target point according to the preset error term formula and the coordinates of the auxiliary point.
  • in the error term, (o_ref − p_ref)^T d_ref is the projection length, onto the Z-axis direction vector d_ref, of the line segment from the preset target point p_ref to the auxiliary point o_ref on the camera Z axis; ((o_ref − p_ref)^T d_ref) · d_ref is the vector form of that projection; (o_ref − p_ref) is the vector form of the original line segment; and the error term is the second norm of the difference between the two vectors, i.e., a description of the error in the estimate of the position of p_ref.
  • the embodiments of the present application take into account the influence of the four degrees of freedom during movement (especially insertion), which leads to different insertion lengths; the penalty factor w_i is introduced to eliminate the influence of this length.
  • the preset error term formula is:

        E = Σ_{i=1}^{N} w_i · ‖(o_ref,i − p_ref) − ((o_ref,i − p_ref)^T d_ref,i) · d_ref,i‖²

    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the ith calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the ith calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the ith calibration image.
  • a binocular camera is used for image acquisition in some application scenarios.
  • if the camera is a binocular camera, the first coordinate of the preset target point is determined based on the left-eye calibration image corresponding to the left-eye camera, and the second coordinate of the preset target point is determined based on the right-eye calibration image corresponding to the right-eye camera; the average value of the first coordinate and the second coordinate is taken as the coordinate of the preset target point.
  • that is, the estimated positions of the preset target point are determined separately from the calibration images collected by the left-eye camera and the right-eye camera, giving the first coordinate p_ref,left and the second coordinate p_ref,right; the final position of the preset target point is the average of the two, namely p_ref = (p_ref,left + p_ref,right) / 2. At this point, the position of p_ref is obtained.
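In code, the binocular averaging step is simply the mean of the two monocular estimates (the coordinate values below are hypothetical, in metres):

```python
import numpy as np

# Left- and right-eye estimates of the preset target point p_ref,
# each obtained from its own set of calibration images.
p_ref_left = np.array([0.101, 0.048, 0.502])
p_ref_right = np.array([0.099, 0.052, 0.498])

# Final estimate: the average of the two monocular estimates.
p_ref = (p_ref_left + p_ref_right) / 2.0
```

Averaging the two estimates damps independent per-eye estimation noise without requiring an explicit stereo triangulation step.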
  • the first relative transformation and the second relative transformation may be determined based on the forward kinematics principle of the manipulator.
  • the camera coordinate system can take the center of the preset target point as the origin, with the x and y axes located on the plane where the preset target point is located, and the z axis perpendicular to the plane where the x and y axes are located.
  • the preset position of the end of the manipulator can be a position reserved in the structural design stage of the manipulator; the relative position between this position and the end of the manipulator is fixed, and its relative position to the camera is likewise fixed. The preset target point can be fixed at the preset position for a long time, or can be placed at the preset position during use.
  • when the preset target point is located at the above-mentioned preset position, the preset target point and the preset position match completely and precisely, and the first relative transformation between the preset target point and the camera can then be determined.
  • Step 103: Determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  • the embodiment of the present application also introduces a preset calibration point p_world, where p_world is any known point in the world coordinate system. According to the first relative transformation between the camera and the preset target point determined from the calibration image, and the second relative transformation between p_world and the end of the robotic arm, the preset rigid transformation formula can be established and solved.
  • specifically, step 103 may be: determining the rigid transformation X between the end of the manipulator and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is X · (T_1 · p_ref) = T_2 · p_world, where T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
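The patent solves the constrained problem with least squares; as an illustrative stand-in, the sketch below recovers a rigid transform X from noiseless camera-frame/end-frame point pairs using the Kabsch (SVD) construction. The point values, the ground-truth transform, and the `solve_rigid` helper are assumptions for the demo, not the patent's exact solver:

```python
import numpy as np

def solve_rigid(P_cam, P_end):
    """Recover (R, t) with P_end = R @ p + t for matching point rows,
    via the Kabsch/Procrustes construction (SVD of the covariance)."""
    P_cam, P_end = np.asarray(P_cam, float), np.asarray(P_end, float)
    c_cam, c_end = P_cam.mean(axis=0), P_end.mean(axis=0)
    H = (P_cam - c_cam).T @ (P_end - c_end)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1.
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = c_end - R @ c_cam
    return R, t

# Ground-truth X, used only to synthesize consistent observations:
# a rotation about z (matching the parallel-Z assumption) plus an offset.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.05, 0.0, 0.12])

# p_ref expressed in the camera frame (first relative transformation applied)
# over several arm poses, and the matching p_world expressed in the end frame
# (second relative transformation applied): X must map one set onto the other.
P_cam = np.array([[0.01, -0.02, 0.40],
                  [0.03,  0.05, 0.35],
                  [-0.04, 0.01, 0.45],
                  [0.02,  0.02, 0.50]])
P_end = P_cam @ R_true.T + t_true

R, t = solve_rigid(P_cam, P_end)
```

On noiseless correspondences the recovery is exact; with measurement noise the same construction returns the least-squares best-fit rigid transform.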
  • in the process of solving, the rigid transformation can be decomposed into two parts, the rotation component and the translation component, and the rigid transformation matrix X of the hand-eye calibration can be determined by solving the rotational rigid transformation and the translational rigid transformation respectively.
  • the rotation rigid transformation is determined according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; and the translation rigid transformation is determined according to the rotation rigid transformation.
  • in the technical scenario of this application, the Z axis of the end effector of the manipulator is parallel to the Z axis of the camera, thereby reducing the degrees of freedom of the rigid matrix X from 6 to 4; that is, the rotation component of X reduces to a rotation about the common Z axis.
  • the preset rigid transformation formula by itself does not separate the rotation and translation components of the hand-eye calibration, and because X must also satisfy formula (1), solving the preset rigid transformation formula alone is not sufficient; what needs to be solved is a constrained problem. From (2) and (3), constraints on the decoupled rotation and translation components can be derived, and the preset rigid transformation formula must be satisfied at the same time.
  • the corresponding hand-eye calibration matrix X can finally be determined by setting up least-squares problems for the rotation matrix and the translation vector respectively, and then obtaining the rotation component and the translation component.
  • the solution of the rotation component R_X can be transformed into an optimization problem and solved through the LM algorithm (Levenberg-Marquardt algorithm): the residual of equation (2), accumulated over the relative transformations between consecutive calibration frames, is minimized in the least-squares sense. Here the relative transformation of the camera coordinates from the (N−1)th frame calibration image to the Nth frame calibration image and the relative transformation of the coordinates of the end of the robotic arm between the same frames supply the rotation and translation components entering equations (2) and (3); N is the number of calibration images, and for a binocular camera the camera coordinate of any frame is taken as the average of the left-eye and right-eye camera coordinates.
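A minimal numpy-only Levenberg-Marquardt sketch for the rotation component, using the parallel-Z assumption so that R_X reduces to a rotation R_z(θ) about the shared z axis. The synthetic motions, the random seed, and the helper names are this sketch's assumptions; a production implementation would typically use e.g. scipy.optimize.least_squares:

```python
import numpy as np

def Rz(theta):
    """Rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rand_rot(rng):
    """Random rotation via Rodrigues' formula (synthetic arm motion)."""
    axis = rng.normal(size=3)
    axis = axis / np.linalg.norm(axis)
    ang = rng.uniform(0.2, 1.0)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(ang) * K + (1.0 - np.cos(ang)) * K @ K

def residual(theta, motions):
    """Stacked residual of equation (2), R_A R_X - R_X R_B,
    with R_X restricted to a rotation about the shared z axis."""
    RX = Rz(theta)
    return np.concatenate([(RA @ RX - RX @ RB).ravel() for RA, RB in motions])

def solve_theta(motions, iters=60):
    """Tiny one-parameter Levenberg-Marquardt with a numeric Jacobian."""
    theta, lam = 0.0, 1e-3
    for _ in range(iters):
        r = residual(theta, motions)
        J = (residual(theta + 1e-6, motions) - r) / 1e-6
        step = -(J @ r) / (J @ J + lam)   # damped Gauss-Newton step
        if np.linalg.norm(residual(theta + step, motions)) < np.linalg.norm(r):
            theta, lam = theta + step, lam / 2.0
        else:
            lam *= 10.0
    return theta

# Synthetic frame-to-frame motions consistent with a true R_X = Rz(0.3):
# equation (2) gives R_A = R_X R_B R_X^T for each consecutive frame pair.
RX_true = Rz(0.3)
rng = np.random.default_rng(0)
motions = [(RX_true @ RB @ RX_true.T, RB)
           for RB in (rand_rot(rng) for _ in range(5))]

theta = solve_theta(motions)
```

Once R_X is recovered, the translation component follows from the linear relation (3) together with the preset rigid transformation formula, as the text above describes.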
  • by means of the solution provided in the embodiments of the present application, hand-eye calibration of the eye-in-hand robot can be performed under the double restrictions of the freedom of movement and the range of movement.
  • the parameters are decoupled to determine the rigid transformation from the end of the robotic arm to the camera at the rotation and translation levels, and the hand-eye calibration problem with limited degrees of freedom and movement range is transformed from the original ill-posed problem into a well-posed problem. Compared with the existing technology for hand-eye calibration under limited degrees of freedom and movement range, this not only saves the tedious operation of introducing auxiliary calibration tools, but also, because there is no need to introduce auxiliary calibration tools, reduces the introduction of calibration error sources and improves calibration accuracy.
  • after step 103, the method may further include: acquiring a control image captured by the camera; and controlling the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
  • the movement of the robotic arm of the robot can be controlled according to the calibration results, thereby realizing the use of the robot to treat wounds and realize medical functions.
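Once X is known, using it at control time amounts to mapping a target observed in the camera frame into the end-effector frame so the arm can be commanded toward it. The numeric transform and target below are hypothetical:

```python
import numpy as np

# Hand-eye calibration result X (end-effector <- camera), hypothetical values:
# a 90-degree rotation about z plus a fixed lever-arm offset.
X = np.eye(4)
X[:3, :3] = np.array([[0.0, -1.0, 0.0],
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
X[:3, 3] = [0.05, 0.0, 0.10]

# A target detected in the control image, back-projected to camera
# coordinates (homogeneous, in metres).
target_cam = np.array([0.02, -0.01, 0.30, 1.0])

# Express the target in the end-effector frame to command the arm.
target_end = X @ target_cam
```

Chaining this with the arm's forward kinematics then gives the target in the world frame, which is what the motion controller consumes.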
  • an embodiment of the present application provides a computer vision-based hand-eye calibration device. As shown in FIG. 5 , the device includes:
  • the calibration image acquisition module 51 is used to control the robotic arm of the Eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom, and to acquire a calibration image through a camera set at the end of the robotic arm, wherein, during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the preset target point position;
  • the target point coordinate determination module 52 is used to determine the coordinates of the preset target point based on the calibration image, and to determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm;
  • the rigid transformation determining module 53 is configured to determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  • the target point coordinate determination module 52 specifically includes:
  • the auxiliary point determination unit 521 is used to determine the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system based on the calibration image, wherein the auxiliary point is any point on the Z axis of the camera coordinate system;
  • the first target point determination unit 522 is configured to determine the coordinates of the preset target point according to the preset error term formula and the coordinates of the auxiliary points.
  • the calibration images include N images, and the preset error term formula is:

        E = Σ_{i=1}^{N} w_i · ‖(o_ref,i − p_ref) − ((o_ref,i − p_ref)^T d_ref,i) · d_ref,i‖²

    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the ith calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the ith calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the ith calibration image.
  • the target point coordinate determination module 52 specifically includes:
  • the second target point determining unit 523 is configured to, if the camera is a binocular camera, determine the first coordinate of the preset target point based on the left-eye calibration image corresponding to the left-eye camera in the binocular camera, and determine the second coordinate of the preset target point based on the right-eye calibration image corresponding to the right-eye camera in the binocular camera;
  • the third target point determination unit 524 is configured to use the average value of the first coordinate and the second coordinate as the coordinate of the preset target point.
  • the rigid transformation determination module 53 specifically includes:
  • the first rigid transformation determination unit 531 is used to determine the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is X · (T_1 · p_ref) = T_2 · p_world, where X is the rigid transformation, T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  • the rigid transformation includes rotational rigid transformation and translation rigid transformation;
  • the rigid transformation determination module 53 specifically includes:
  • the second rigid transformation determining unit 532 is configured to determine the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation and the coordinates of the preset calibration point;
  • the third rigid transformation determining unit 533 is configured to determine the translation rigid transformation according to the rotational rigid transformation.
  • the device further includes:
  • the image acquisition module 54 is used to acquire, after the rigid transformation between the end of the robotic arm and the camera is determined, the control image captured by the camera;
  • the control module 55 is configured to control the movement of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
  • an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium may be non-volatile or volatile.
  • the computer-readable storage medium stores computer-readable instructions, and when the computer-readable instructions are executed by a processor, they implement the computer vision-based hand-eye calibration method shown in FIG. 1 to FIG. 4 above.
  • the technical solution of the present application can be embodied in the form of a software product, and the software product can be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a portable hard drive) or a volatile storage medium.
  • the storage medium includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the various implementation scenarios of this application.
  • an embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, etc.
  • the computer device includes a computer-readable storage medium and a processor; the computer-readable storage medium is used to store computer-readable instructions, and the processor is used to execute the computer-readable instructions to implement the computer vision-based hand-eye calibration method shown in FIG. 1 to FIG. 4 above.
  • the computer device may further include a user interface, a network interface, a camera, a radio frequency (Radio Frequency, RF) circuit, a sensor, an audio circuit, a WI-FI module, and the like.
  • the user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, and the like.
  • Optional network interfaces may include standard wired interfaces, wireless interfaces (such as Bluetooth interfaces, WI-FI interfaces), and the like.
  • the structure described above does not constitute a limitation on the computer device, which may include more or fewer components, combine some components, or use a different arrangement of components.
  • the computer-readable storage medium may further include an operating system and a network communication module.
  • An operating system is a program that manages and maintains the hardware and software resources of the computer device and supports the operation of the information processing program and other software and/or programs.
  • the network communication module is used to implement communication between various components in the computer-readable storage medium, as well as communication with other hardware and software in the physical device.
  • under the above constraints, hand-eye calibration of the eye-in-hand robot is carried out: the coordinates of the preset target point in the camera coordinate system are determined, together with the first relative transformation between the preset target point and the camera and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm; these parameters are then decoupled to determine the rigid transformation from the end of the robotic arm to the camera at the rotation and translation levels, respectively.
  • this converts the hand-eye calibration problem with limited degrees of freedom and a limited movement range from an ill-posed problem into a well-posed one, and finally solves the hand-eye calibration problem with limited degrees of freedom and a limited movement range.
  • the accompanying drawings are only schematic diagrams of preferred implementation scenarios, and the modules or processes in the accompanying drawings are not necessarily required to implement the present application.
  • the modules in the apparatus of an implementation scenario may be distributed in the device of that scenario as described, or may, with corresponding changes, be located in one or more devices different from that of the scenario.
  • the modules of the above implementation scenarios may be combined into one module, or further split into multiple sub-modules.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)

Abstract

A computer vision-based hand-eye calibration method and apparatus, a storage medium, and a computer device. The hand-eye calibration method comprises: controlling the mechanical arm of a robot to move within a preset range under the constraint condition of four degrees of freedom, and collecting a calibration image by means of a camera provided at the tail end of the mechanical arm, wherein during the movement of the mechanical arm within the preset range, the Z-axis of a camera coordinate system corresponding to the camera always passes through a preset target point position (101); according to the calibration image, determining the coordinates of a preset target point, and determining first relative transformation between the preset target point and the camera in the camera coordinate system, and second relative transformation between a preset calibration point and the tail end of the mechanical arm in a world coordinate system (102); and determining rigid transformation between the tail end of the mechanical arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point (103).

Description

Computer Vision-Based Hand-Eye Calibration Method and Device, and Storage Medium
This application claims priority to the Chinese patent application filed with the China Patent Office on September 27, 2020, with application number 202011030608.0 and entitled "Computer Vision-Based Hand-Eye Calibration Method and Device, and Storage Medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of computer vision, and in particular to a computer vision-based hand-eye calibration method and device, a storage medium, and a computer device.
Background
At present, the development of medical electromechanical devices has greatly helped the development of advanced treatment technologies, such as medical robots. Besides precise position and force control and a complete visual sensing system, another technology that surgical robots depend on heavily is accurate and robust hand-eye calibration, which guarantees the coordinate-system relationship between the robot and the surgical camera. In the eye-in-hand manipulator setup (i.e., the camera is mounted at the end of the manipulator), hand-eye calibration plays a key role in providing the rigid transformation between the manipulator end and the camera coordinates. As far as the hand-eye calibration algorithm itself is concerned, there are many mature and accurate algorithms that reduce the calibration error and achieve stable application results. However, these algorithms assume that the end moves with 6 degrees of freedom and over a certain range of motion; under the dual limitation of the number of degrees of freedom and the range of motion, how to complete the calibration process is therefore a problem worth studying.
The inventors found that, when this problem is mapped to a practical scene — for example, during the operation of a medical surgical robot — the degrees of freedom of the end of the robotic arm are limited and its range of movement must be kept very small, since the slightest carelessness affects the wound size and the surgical outcome. Achieving accurate hand-eye calibration in this process is therefore very important. The well-known Da Vinci surgical robot introduces an external high-precision visual tracking tool to assist the calibration process, which inevitably introduces an additional source of calibration error. At present, how to perform accurate hand-eye calibration for a robot with limited degrees of freedom and a limited range of movement has become a hot issue in this technical field.
Summary of the Invention
In view of this, the present application provides a computer vision-based hand-eye calibration method and device, a storage medium, and a computer device.
According to one aspect of the present application, a computer vision-based hand-eye calibration method is provided, comprising:
controlling the robotic arm of an eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom, and collecting calibration images through a camera arranged at the end of the robotic arm, wherein during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the position of the preset target point;
determining, based on the calibration images, the coordinates of the preset target point, and determining a first relative transformation between the preset target point and the camera in the camera coordinate system and a second relative transformation between a preset calibration point in the world coordinate system and the end of the robotic arm;
determining the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
Specifically, determining the coordinates of the preset target point includes:
determining, based on the calibration images, the coordinates of an auxiliary point corresponding to the Z-axis of the camera coordinate system, where the auxiliary point is any point on the Z-axis of the camera coordinate system;
determining the coordinates of the preset target point according to a preset error term formula and the coordinates of the auxiliary point.
Specifically, the calibration images include N images, and the preset error term formula is

E = Σ_{i=1}^{N} w_i‖d_ref,i × (p_ref − o_ref,i)‖²

where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image; for unit d_ref,i, each term is the weighted squared distance from p_ref to the camera Z-axis line of the i-th pose.
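The weighted point-to-line least-squares problem described above admits a closed-form minimizer. The following sketch is illustrative only (the function and variable names are assumptions, not part of the patent); it solves for p_ref as the point closest, in the weighted least-squares sense, to the N camera Z-axis lines, using the identity ‖d × (p − o)‖ = ‖(I − ddᵀ)(p − o)‖ for a unit direction d:

```python
import numpy as np

def estimate_target_point(o_list, d_list, w_list):
    """Estimate p_ref as the weighted least-squares point closest to N lines,
    each passing through o_ref,i with unit direction d_ref,i and weight w_i."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d, w in zip(o_list, d_list, w_list):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += w * P
        b += w * P @ o
    return np.linalg.solve(A, b)         # normal equations of the error term

# Two or more non-parallel lines through the same point recover that point.
p_true = np.array([0.10, 0.25, 0.40])
dirs = [np.array([0.0, 0.0, 1.0]),
        np.array([1.0, 0.0, 1.0]),
        np.array([0.0, 1.0, 1.0])]
origins = [p_true - 0.8 * d / np.linalg.norm(d) for d in dirs]
p_est = estimate_target_point(origins, dirs, [1.0, 1.0, 2.0])
print(np.allclose(p_est, p_true))
```

With error-free lines the recovered point is exact; with noisy o_ref,i and d_ref,i it becomes the weighted least-squares estimate, the penalty factors w_i down-weighting less reliable calibration images.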
Specifically, if the camera is a binocular camera, determining the coordinates of the preset target point includes:
determining a first coordinate of the preset target point based on the left-eye calibration image corresponding to the left-eye camera of the binocular camera, and determining a second coordinate of the preset target point based on the right-eye calibration image corresponding to the right-eye camera of the binocular camera;
taking the average value of the first coordinate and the second coordinate as the coordinate of the preset target point.
Specifically, determining the rigid transformation between the end of the robotic arm and the camera includes:
determining the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, where the preset rigid transformation formula is

X · (T_1 · p_ref) = T_2 · p_world

in which X is the rigid transformation between the end of the robotic arm and the camera, T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
Specifically, the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and determining the rigid transformation between the end of the robotic arm and the camera includes:
determining the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point;
determining the translational rigid transformation according to the rotational rigid transformation.
Specifically, after the rigid transformation between the end of the robotic arm and the camera is determined, the method further includes:
acquiring a control image captured by the camera;
controlling the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
According to another aspect of the present application, a computer vision-based hand-eye calibration device is provided, comprising:
a calibration image acquisition module, configured to control the robotic arm of an eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom and to collect calibration images through a camera arranged at the end of the robotic arm, wherein during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the position of the preset target point;
a target point coordinate determination module, configured to determine the coordinates of the preset target point based on the calibration images, and to determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm;
a rigid transformation determination module, configured to determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
Specifically, the target point coordinate determination module includes:
an auxiliary point determination unit, configured to determine, based on the calibration images, the coordinates of the auxiliary point corresponding to the Z-axis of the camera coordinate system, where the auxiliary point is any point on the Z-axis of the camera coordinate system;
a first target point determination unit, configured to determine the coordinates of the preset target point according to the preset error term formula and the coordinates of the auxiliary point.
Specifically, the calibration images include N images, and the preset error term formula is E = Σ_{i=1}^{N} w_i‖d_ref,i × (p_ref − o_ref,i)‖², where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref is the coordinate of the preset target point, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
Specifically, the target point coordinate determination module includes:
a second target point determination unit, configured to, if the camera is a binocular camera, determine the first coordinate of the preset target point based on the left-eye calibration image corresponding to the left-eye camera of the binocular camera, and determine the second coordinate of the preset target point based on the right-eye calibration image corresponding to the right-eye camera of the binocular camera;
a third target point determination unit, configured to take the average value of the first coordinate and the second coordinate as the coordinate of the preset target point.
Specifically, the rigid transformation determination module includes:
a first rigid transformation determination unit, configured to determine the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, where the preset rigid transformation formula is X · (T_1 · p_ref) = T_2 · p_world, in which X is the rigid transformation between the end of the robotic arm and the camera, T_1 is the first relative transformation, p_ref is the coordinate of the preset target point, T_2 is the second relative transformation, and p_world is the coordinate of the preset calibration point.
Specifically, the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and the rigid transformation determination module includes:
a second rigid transformation determination unit, configured to determine the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point;
a third rigid transformation determination unit, configured to determine the translational rigid transformation according to the rotational rigid transformation.
Specifically, the device further includes:
an image acquisition module, configured to acquire, after the rigid transformation between the end of the robotic arm and the camera is determined, a control image captured by the camera;
a control module, configured to control the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
According to yet another aspect of the present application, a computer-readable storage medium is provided, on which computer-readable instructions are stored; when the instructions are executed by a processor, the above computer vision-based hand-eye calibration method is implemented.
According to still another aspect of the present application, a computer device is provided, including a computer-readable storage medium, a processor, and computer-readable instructions stored on the computer-readable storage medium and executable on the processor; when the processor executes the instructions, the above computer vision-based hand-eye calibration method is implemented.
With the above technical solutions, the computer vision-based hand-eye calibration method and device, storage medium, and computer device provided by the present application achieve hand-eye calibration of an eye-in-hand robot under the dual restrictions of limited degrees of freedom and a limited range of movement. By estimating the wound position, introducing a known point in the world coordinate system, and combining the calibration images collected by the camera at the end of the robotic arm, the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm are determined. The camera position coordinates are then determined from the first relative transformation and the estimated coordinates of the preset target point, and the position coordinates of the end of the robotic arm are determined from the second relative transformation and the coordinates of the preset calibration point, so that the rigid transformation from the end of the robotic arm to the camera can be determined. This converts the hand-eye calibration problem with limited degrees of freedom and a limited range of movement from an ill-posed problem into a well-posed one. Compared with the prior art, in which auxiliary calibration tools must be introduced for hand-eye calibration under limited degrees of freedom and a limited range of movement, this not only avoids the tedious operation of introducing auxiliary calibration tools but also, because no auxiliary tools are needed, reduces the sources of calibration error that they would introduce, improving calibration accuracy.
The above description is only an overview of the technical solution of the present application. It can be implemented according to the content of the specification so that the technical means of the present application can be understood more clearly, and specific embodiments of the present application are given below so that the above and other objects, features, and advantages of the present application become more obvious and easier to understand.
Brief Description of the Drawings
The drawings described here are used to provide a further understanding of the present application and constitute a part of the present application. The schematic embodiments of the present application and their descriptions are used to explain the present application and do not constitute an improper limitation of it. In the accompanying drawings:
FIG. 1 shows a schematic flowchart of a computer vision-based hand-eye calibration method provided by an embodiment of the present application;
FIG. 2 shows a schematic diagram of the degrees of freedom of movement provided by an embodiment of the present application;
FIG. 3 shows a schematic diagram of a camera movement trajectory provided by an embodiment of the present application;
FIG. 4 shows a schematic diagram of the movement range of the end of a robotic arm provided by an embodiment of the present application;
FIG. 5 shows a schematic structural diagram of a computer vision-based hand-eye calibration device provided by an embodiment of the present application;
FIG. 6 shows a schematic structural diagram of another computer vision-based hand-eye calibration device provided by an embodiment of the present application.
Detailed Description
Hereinafter, the present application will be described in detail with reference to the accompanying drawings and in conjunction with the embodiments. It should be noted that, where there is no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other.
The present application relates to computer vision, a technology that converts still images or video data into a decision or a new representation. The present application applies computer vision to medical robots: a camera arranged at the end of the robot's arm collects images, and the images are analyzed to obtain a decision result that controls the work of the robotic arm.
In this embodiment, a computer vision-based hand-eye calibration method is provided. As shown in FIG. 1, the method includes:
Step 101: control the robotic arm of the eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom, and collect calibration images through a camera arranged at the end of the robotic arm, wherein during the movement of the robotic arm within the preset range, the Z-axis of the camera coordinate system corresponding to the camera always passes through the position of the preset target point.
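To make the constraint of step 101 concrete, the sketch below (illustrative only; the `look_at` helper is an assumption, not part of the patent) constructs a camera orientation at an arbitrary position on the motion cone such that the Z-axis of the camera coordinate system passes through a given preset target point:

```python
import numpy as np

def look_at(camera_center, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a rotation matrix whose third column (the camera Z-axis)
    points from the camera center towards the target point."""
    z = target - camera_center
    z = z / np.linalg.norm(z)
    x = np.cross(up, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])    # orthonormal, right-handed

p_ref = np.array([0.0, 0.0, 0.0])        # estimated wound position
center = np.array([0.2, -0.1, 0.6])      # one camera position on the cone
R = look_at(center, p_ref)

# The ray center + s * R[:, 2] hits p_ref, i.e. the camera Z-axis passes
# through the preset target point, so these two vectors are parallel.
print(np.allclose(np.cross(p_ref - center, R[:, 2]), 0.0))
print(np.allclose(R.T @ R, np.eye(3)))
```

Every pose sampled during the calibration motion satisfies this condition, which is what makes the auxiliary points on the camera Z-axis usable for estimating p_ref later.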
The mathematical description of the hand-eye calibration problem is AX = XB, where A and B are relative transformations in different reference frames and X is the rigid transformation between the end of the robotic arm and the camera. Specifically, in the embodiment of the present application, the hand-eye calibration process amounts to solving the equation

T_cam(i,j) · X = X · T_world(i,j)    (1)

where T_cam(i,j) is the relative transformation of the camera position from the i-th frame to the j-th frame in the camera coordinate system, T_world(i,j) is the relative transformation of the position of the end of the robotic arm from the i-th frame to the j-th frame in the world coordinate system, and X is still the rigid transformation between the end of the robotic arm and the camera. In this way, dynamic calibration of the end of the robotic arm and the end camera can be achieved while the robot end moves.
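Equation (1) can be sanity-checked numerically. The sketch below is a minimal illustration (not the patent's procedure): it builds 4×4 homogeneous transforms in which the camera motion A and the end-effector motion B are conjugate through a ground-truth X, and verifies that AX = XB:

```python
import numpy as np

def rigid(angle, t):
    """4x4 homogeneous rigid transform: rotation about the z-axis plus translation."""
    c, s = np.cos(angle), np.sin(angle)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    T[:3, 3] = t
    return T

X = rigid(0.3, [0.01, 0.02, 0.10])   # hypothetical end-to-camera transform
B = rigid(0.7, [0.05, -0.03, 0.02])  # end motion T_world(i,j), frame i -> j
A = X @ B @ np.linalg.inv(X)         # induced camera motion T_cam(i,j)

print(np.allclose(A @ X, X @ B))     # equation (1) holds
```

The conjugation A = X B X⁻¹ is exactly the relationship that hand-eye calibration inverts: the observable motions are A and B, and X is the unknown.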
Figure PCTCN2021097271-appb-000011
and
Figure PCTCN2021097271-appb-000012
Perform the same decoupling operation, (1) can be transformed into the form of multiplying two homogeneous matrices on the left and right sides, and then can be transformed into the following two equations (2) and (3) by the condition that the corresponding elements are equal , where R X and
Figure PCTCN2021097271-appb-000013
are the rotation and translation components of matrix X:
Figure PCTCN2021097271-appb-000014
R_A · R_X = R_X · R_B    (2)

(R_A − I) · t_X = R_X · t_B − t_A    (3)

where R_A, t_A and R_B, t_B denote the rotation and translation components of T_cam(i,j) and T_world(i,j), respectively.
Figure PCTCN2021097271-appb-000016
Figure PCTCN2021097271-appb-000017
由于旋转角度接近于零而导致两个矩阵都近似为单位矩阵,因此(2)式在该情况下为恒等式,在这种情况下,(3)式中就失去了对旋转自由度的限定,从而导致该背景下的手眼标定问题是一个ill-posed(不适定)的病态问题,这种情况下无法确定机器人机械臂末端到相机的刚性变换,即无法实现手眼标定。基于上述描述,可以很清楚地看到导致ill-posed的问题的根本原因是平移自由度的损失和旋转范围的急剧缩小和限制,因而在这种情况下,基于(2)、(3)两式的求解方式不再适用。因为手眼标定的要解决的关键问题就是建立机械臂末端和末端相机之间的刚性关系,由于不能通过基于传统方式的(2)、(3)式求解,这里发明人引入了一些其他的参考量来辅助求解过程,通过引入预设标定点和预设目标点来解决适应于上述技术场景的手术机器人的手眼标定问题。
For surgical robots, in some cases, the degree of freedom and the range of motion are limited. As shown in Figure 2, when the end of the robotic arm handles the wound area, in order not to make the wound larger, the freedom of the end of the robotic arm The degree of freedom is reduced from 6 to 4 (two translation degrees of freedom are lost, that is, the two degrees of freedom on the plane where the dots in the figure are located, otherwise the wound will be enlarged), and the 4 degrees of freedom are pan (in the camera coordinate system) The x-axis direction is the central axis of rotation), roll (rotation with the z-axis direction of the camera coordinate system as the central axis), tilt (rotation with the y-axis direction of the camera coordinate system as the central axis), insertion (movement along the z-axis of the camera coordinate system) , the camera coordinate system is determined by the right-hand rule, and the angle range that tilt and pan can rotate in Fig. 2 is very small. In this case, in equation (2)
Figure PCTCN2021097271-appb-000016
and
Figure PCTCN2021097271-appb-000017
Since the rotation angle is close to zero, both matrices are approximated as identity matrices, so equation (2) is an identity in this case. In this case, the restriction on the degree of freedom of rotation is lost in equation (3), As a result, the hand-eye calibration problem in this context is an ill-posed (ill-posed) ill-posed problem. In this case, the rigid transformation from the end of the robot arm to the camera cannot be determined, that is, hand-eye calibration cannot be achieved. Based on the above description, it can be clearly seen that the root cause of the ill-posed problem is the loss of translation degrees of freedom and the sharp reduction and limitation of the rotation range, so in this case, based on (2), (3) two The solution method of formula no longer applies. Because the key problem to be solved in hand-eye calibration is to establish a rigid relationship between the end of the robotic arm and the end camera, since it cannot be solved by the traditional methods (2) and (3), the inventors here introduce some other reference quantities To assist the solution process, by introducing preset calibration points and preset target points to solve the problem of hand-eye calibration of surgical robots adapted to the above technical scenarios.
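The degeneracy described above can be illustrated numerically. The sketch below uses the standard AX = XB rotation relation R_A R_X = R_X R_B with made-up rotations (none of the values come from the patent): a large relative motion rejects a wrong candidate for R_X, while near-zero tilt/pan motions accept any candidate, which is exactly the ill-posedness.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical ground-truth hand-eye rotation (illustrative values only).
R_X = rot_x(0.4) @ rot_z(0.7)

# Large relative motion: equation (2), R_A R_X = R_X R_B, pins down R_X.
R_B = rot_z(0.5)                      # arm-end rotation between two poses
R_A = R_X @ R_B @ R_X.T               # camera rotation consistent with R_X
R_wrong = rot_x(1.0)                  # an incorrect candidate for R_X
residual_true = np.linalg.norm(R_A @ R_X - R_X @ R_B)
residual_wrong = np.linalg.norm(R_A @ R_wrong - R_wrong @ R_B)

# Tiny tilt/pan: R_A and R_B collapse to ~identity, so equation (2) accepts
# essentially ANY candidate rotation -- the ill-posedness described above.
R_B_small = rot_z(1e-6)
R_A_small = R_X @ R_B_small @ R_X.T
residual_wrong_small = np.linalg.norm(R_A_small @ R_wrong - R_wrong @ R_B_small)
```

With large motions the wrong candidate leaves a residual of order 0.4, while with near-zero rotations even the wrong candidate satisfies equation (2) to within about 1e-6, so the equation carries no information about R_X.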
In the above steps, as shown in FIG. 3, the embodiment of the present application introduces a preset target point p_ref, a point near the actual target point; since the wound is small in this embodiment, p_ref represents the estimated location of the wound. The robotic arm of the robot is controlled to move, under the four-degree-of-freedom constraint, within the range corresponding to the preset target point, and during the motion the Z axis of the camera coordinate system corresponding to the camera always passes through p_ref. Here p_ref serves as a reference point to be determined, lying somewhere on the wound plane (the wound is treated as planar because the treated area is small enough). Under the four-degree-of-freedom constraint at the arm end, the trajectory of the arm end sweeps a three-dimensional cone-like volume; FIG. 4 shows a front view of this trajectory, which always intersects p_ref during the motion. The insertion motion changes the extension length: lines 1, 2 and 3 in the upper part of FIG. 4 represent the upper limits of the arm end for different extension lengths and rotation amounts (tilt, pan). Note that roll only changes the attitude of the arm end and does not affect the motion volume shown in FIG. 4.
Step 102: based on the calibration images, determine the coordinates of the preset target point, and determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm.

In the above embodiment, in order to estimate the position of the target point, the concept of an auxiliary point is introduced: the auxiliary point o_ref is an arbitrary point on the Z axis of the camera coordinate system. Specifically, determining the coordinates of the preset target point in step 102 of the present application may include the following steps:

Step 102-1: based on the calibration images, determine the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system, where the auxiliary point is an arbitrary point on that Z axis.

Step 102-2: determine the coordinates of the preset target point according to the preset error-term formula and the coordinates of the auxiliary point.
As shown in FIG. 3, assuming the Z axis of the camera coordinate system passes through p_ref, determining the position of p_ref is equivalent to finding the p_ref that minimizes D = |(o_ref − p_ref) − ((o_ref − p_ref)^T d_ref) d_ref|, where o_ref is the coordinate of the auxiliary point, p_ref is the coordinate of the preset target point, and d_ref is the Z-axis direction vector of the camera coordinate system. D is in fact an error term: ideally p_ref lies exactly on the camera Z axis, but because of errors p_ref usually cannot be estimated perfectly. In D, (o_ref − p_ref)^T d_ref is the length of the projection of a segment on the camera Z axis onto the Z-axis direction vector, ((o_ref − p_ref)^T d_ref) d_ref is that projection in vector form, and (o_ref − p_ref) is the original segment in vector form; D is the 2-norm of the difference of these two vectors, i.e. a quantity describing the error in the estimated position of p_ref.
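The error term D is simply the perpendicular distance from the candidate point p_ref to the line through o_ref with direction d_ref. A minimal sketch (all numeric values are illustrative, not from the patent):

```python
import numpy as np

def point_to_ray_error(o_ref, p_ref, d_ref):
    """D = |(o_ref - p_ref) - ((o_ref - p_ref)^T d_ref) d_ref|: the distance
    from the candidate target point p_ref to the camera Z-axis line."""
    d = d_ref / np.linalg.norm(d_ref)     # unit direction of the camera Z axis
    v = o_ref - p_ref                     # vector from the target to the auxiliary point
    return np.linalg.norm(v - (v @ d) * d)

o = np.array([0.0, 0.0, 5.0])             # auxiliary point on the camera Z axis
d = np.array([0.0, 0.0, 1.0])             # camera Z-axis direction vector
err_on_axis = point_to_ray_error(o, np.array([0.0, 0.0, 1.0]), d)   # point on the axis
err_off_axis = point_to_ray_error(o, np.array([2.0, 0.0, 1.0]), d)  # 2 units off the axis
```

A point on the axis gives D = 0, and a point displaced 2 units perpendicular to the axis gives D = 2, confirming that D measures the perpendicular offset.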
Further, the embodiment of the present application takes into account the influence of the four degrees of freedom during the motion (in particular insertion), which leads to different extension lengths, and introduces w_i to eliminate the influence of length.
Specifically, with D_i denoting the error term of the i-th calibration image, the preset error-term formula minimizes the weighted sum

Σ_{i=1}^{N} w_i ‖(o_ref,i − p_ref,i) − ((o_ref,i − p_ref,i)^T d_ref,i) d_ref,i‖²

where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref,i is the coordinate of the preset target point corresponding to the i-th calibration image, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
In the above embodiment, because the length of the line segment changes during the motion, when the penalty factors w_i take the values given by the formula of Figure PCTCN2021097271-appb-000019, replacing d_ref,i and o_ref,i with the camera Z axis and the camera position allows p_ref to be solved by the least-squares method.
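The least-squares step amounts to finding the single point closest (in the weighted sense) to the bundle of camera Z-axis rays, one ray per calibration image. A self-contained sketch of that solve, using the normal equations of the weighted objective (ray origins, directions and weights below are simulated, not from the patent):

```python
import numpy as np

def estimate_p_ref(origins, directions, weights):
    """Weighted least-squares point minimizing sum_i w_i * D_i^2 over a
    bundle of rays (one camera Z axis per calibration image).
    Setting the gradient to zero gives the 3x3 system A p = b below."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d, w in zip(origins, directions, weights):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += w * P
        b += w * P @ o
    return np.linalg.solve(A, b)

p_true = np.array([1.0, 2.0, 3.0])       # simulated wound location
origins = [np.array([0.0, 0.0, 0.0]),
           np.array([5.0, 0.0, 1.0]),
           np.array([0.0, 6.0, -1.0])]   # simulated camera positions
directions = [p_true - o for o in origins]  # each Z axis passes through p_true
weights = [1.0, 0.5, 2.0]                   # penalty factors w_i (illustrative)
p_est = estimate_p_ref(origins, directions, weights)
```

With at least two non-parallel rays the matrix A is invertible; with noiseless rays through a common point the solve recovers that point exactly.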
In the embodiment of the present application, since a medical robot is a precision instrument, a binocular camera is used for image acquisition in some application scenarios. In this case, specifically, the first coordinate of the preset target point is determined based on the left-eye calibration image corresponding to the left-eye camera of the binocular camera, the second coordinate of the preset target point is determined based on the right-eye calibration image corresponding to the right-eye camera, and the average of the first coordinate and the second coordinate is taken as the coordinate of the preset target point.
In the above embodiment, in the binocular case, the estimated positions of the preset target point are determined separately from the calibration images collected by the left-eye camera and the right-eye camera, giving the first coordinate p_ref,left and the second coordinate p_ref,right; the final position of the preset target point is their average:

p_ref = (p_ref,left + p_ref,right) / 2

At this point the position of p_ref has been obtained.
In addition, after the position of the preset target point has been determined, the first relative transformation and the second relative transformation can be determined based on the forward kinematics of the robotic arm. Taking the first relative transformation as an example, the preset target point is located at a preset position at the end of the robotic arm. The camera coordinate system may take the center of the preset target point as its origin, with the x and y axes lying in the plane of the preset target point and the z axis perpendicular to that plane. The preset position at the arm end may be a position reserved during the structural design of the arm; its position relative to the arm end is fixed, and its position relative to the camera is fixed. The preset target point may be permanently fixed at the preset position, or placed there at the time of use. When the preset target point is located at the preset position, it coincides with that position exactly, and the first relative transformation between the preset target point and the camera can then be determined.

Step 103: determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.

In the above steps, the embodiment of the present application further introduces a preset calibration point p_world, an arbitrary known point in the world coordinate system. Given the first relative transformation between the preset target point and the camera determined from the calibration images, the second relative transformation between the preset calibration point and the arm end, and the known positions of the preset target point and the preset calibration point, a preset rigid-transformation formula can be established and solved.
Specifically, step 103 may be: determine the rigid transformation X between the end of the robotic arm and the camera according to a preset rigid-transformation formula, where the preset rigid-transformation formula is

T_cam p_ref = X T_robot p_world

in which T_cam is the first relative transformation, p_ref is the coordinate of the preset target point, T_robot is the second relative transformation, and p_world is the coordinate of the preset calibration point.
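The formula above can be exercised numerically with homogeneous coordinates. In the sketch below, all transforms and points (X, T_robot, p_world, p_ref, and the translation-only T_cam constructed for the check) are made-up illustrative values; the point is only to show how the homogeneous quantities compose so that both sides of the formula name the same point.

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def hom(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative, made-up transforms and points (not values from the patent).
X = hom(rot_z(0.3), np.array([0.02, -0.01, 0.10]))     # arm end -> camera
T_robot = hom(rot_z(-0.5), np.array([1.0, 2.0, 0.5]))  # second relative transformation
p_world = np.array([0.5, -0.5, 0.0, 1.0])              # preset calibration point (homogeneous)
p_ref = np.array([0.0, 0.0, 0.2, 1.0])                 # preset target point (homogeneous)

rhs = X @ T_robot @ p_world        # right-hand side of the formula
# Any first relative transformation T_cam consistent with the formula must map
# p_ref onto the same point; a translation-only choice suffices for the check.
T_cam = hom(np.eye(3), (rhs - p_ref)[:3])
lhs = T_cam @ p_ref
```

Both sides evaluate to the same homogeneous point, which is the constraint the calibration exploits at every arm pose.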
Further, in order to guarantee precise control of the medical robot and the accuracy of the rigid transformation, the rigid transformation can be decomposed into a rotation component and a translation component, and the hand-eye calibration matrix X is determined by solving the rotational rigid transformation and the translational rigid transformation separately. Specifically, the rotational rigid transformation is determined according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; the translational rigid transformation is then determined according to the rotational rigid transformation.
In the above embodiment, based on the above settings, the Z axis of the arm-end effector can be regarded as parallel to the camera Z axis, which reduces the degrees of freedom of the rigid matrix X from six to four: a single rotation about the common Z axis plus a three-dimensional translation, i.e.

R_X = [cos θ, −sin θ, 0; sin θ, cos θ, 0; 0, 0, 1]
According to the above settings, the preset rigid-transformation formula T_cam,i p_ref = X T_robot,i p_world can now be simplified to

p_cam,i = X p_robot,i    (*)

where p_cam,i = T_cam,i p_ref and p_robot,i = T_robot,i p_world. Once p_ref and p_world have been obtained, this can be converted into a homography-estimation problem. The preset rigid-transformation formula by itself does not separate the rotation and translation components of the hand-eye calibration, and because X must simultaneously satisfy equation (1), solving the preset rigid-transformation formula alone is not sufficient; what must be solved is a constrained problem. From (2) and (3) it can be inferred that the decoupled translation component t_X must simultaneously satisfy the preset rigid-transformation formula. Therefore, by setting up least-squares problems for the rotation matrix and the translation vector separately and then solving for the rotation component and the translation component, the hand-eye calibration matrix X can finally be determined. Specifically, the solution of the rotation component R_X can be converted into an optimization problem via the LM algorithm (Levenberg-Marquardt), namely by solving the objective shown in Figure PCTCN2021097271-appb-000028, where Figure PCTCN2021097271-appb-000029 is the relative transformation from the camera coordinates in the (N−1)-th calibration frame to the camera coordinates in the N-th calibration frame, N is the number of calibration images, Figure PCTCN2021097271-appb-000030 is the translational part of the relative transformation from the arm-end coordinates in the (N−1)-th frame to those in the N-th frame, Figure PCTCN2021097271-appb-000031 is the translational part of the relative transformation from the camera coordinates in the (N−1)-th frame to those in the N-th frame, Figure PCTCN2021097271-appb-000032 is the average of the camera coordinates of any one calibration frame over the left-eye and right-eye cameras, Figure PCTCN2021097271-appb-000033 is the corresponding average of the arm-end coordinates, p_cam,i is the camera coordinate corresponding to the i-th calibration frame, p_robot,i is the arm-end coordinate corresponding to the i-th calibration frame, and Figure PCTCN2021097271-appb-000034 is the Rodrigues representation of R_X. The translation vector can then be solved by stacking equations (3) and (*) into the linear system shown in Figure PCTCN2021097271-appb-000035, yielding the translation component t_X.
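The patent solves the rotation with Levenberg-Marquardt and the translation from a stacked linear system. Under the Z-axes-parallel assumption, the same correspondence fit p_cam,i ≈ R_z(θ) p_robot,i + t_X also admits a closed-form least-squares sketch, shown below with synthetic correspondences from a made-up ground truth (illustrative only; this is not the patent's LM formulation):

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def solve_planar_hand_eye(p_cam, p_robot):
    """Least-squares fit of a 4-DOF transform X = (R_z(theta), t) such that
    p_cam_i ~= R_z(theta) @ p_robot_i + t, for (N, 3) correspondence arrays."""
    a = p_cam - p_cam.mean(axis=0)       # centered camera-side points
    b = p_robot - p_robot.mean(axis=0)   # centered robot-side points
    C = np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1])
    S = np.sum(a[:, 1] * b[:, 0] - a[:, 0] * b[:, 1])
    theta = np.arctan2(S, C)             # closed-form LS angle about the Z axis
    R = rot_z(theta)
    t = p_cam.mean(axis=0) - R @ p_robot.mean(axis=0)
    return R, t

# Synthetic correspondences from a hypothetical ground truth.
R_true = rot_z(0.9)
t_true = np.array([0.1, -0.2, 0.05])
p_robot_pts = np.array([[0.0, 0.0, 0.0],
                        [1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.5],
                        [1.0, 1.0, -0.2]])
p_cam_pts = (R_true @ p_robot_pts.T).T + t_true
R_est, t_est = solve_planar_hand_eye(p_cam_pts, p_robot_pts)
```

With noiseless correspondences the fit recovers the rotation angle and translation exactly; with noisy measurements the same expressions give the least-squares estimate, which is why the patent's iterative LM refinement and stacked linear solve serve the same roles.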
By applying the technical solution of this embodiment, hand-eye calibration of an eye-in-hand robot is achieved under the double restriction of limited degrees of freedom and limited range of motion. By estimating the wound position and introducing a known point in the world coordinate system, and combining them with the calibration images collected by the camera at the arm end, the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the arm end are determined; these quantities are then decoupled to determine the rigid transformation from the arm end to the camera at the rotation and translation levels separately. The hand-eye calibration problem with limited degrees of freedom and limited range of motion is thus converted from an ill-posed problem into a well-posed one. Compared with the prior art, which requires an auxiliary calibration tool for hand-eye calibration under such restrictions, this not only removes the cumbersome operations of introducing an auxiliary tool but also, because no auxiliary tool is needed, reduces the sources of calibration error and improves the calibration accuracy.

In the embodiment of the present application, after step 103 the method may further include: acquiring a control image captured by the camera; and controlling the motion of the robotic arm based on the control image and the rigid transformation between the arm end and the camera.

In the above embodiment, after the rigid transformation between the arm end and the camera has been determined and the hand-eye calibration of the robot completed, the motion of the robotic arm can be controlled according to the calibration result, so that the robot can be used to treat the wound and realize its medical function.
Further, as a specific implementation of the method of FIG. 1, an embodiment of the present application provides a computer-vision-based hand-eye calibration apparatus. As shown in FIG. 5, the apparatus includes:

a calibration image acquisition module 51, configured to control the robotic arm of an eye-in-hand robot to move within a preset range under the constraint of four degrees of freedom and to acquire calibration images through a camera arranged at the end of the robotic arm, where during the movement of the arm within the preset range the Z axis of the camera coordinate system corresponding to the camera always passes through the position of the preset target point;

a target point coordinate determination module 52, configured to determine the coordinates of the preset target point based on the calibration images, and to determine the first relative transformation between the preset target point and the camera in the camera coordinate system and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robotic arm; and

a rigid transformation determination module 53, configured to determine the rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.

Specifically, as shown in FIG. 6, the target point coordinate determination module 52 includes:

an auxiliary point determination unit 521, configured to determine, based on the calibration images, the coordinates of the auxiliary point corresponding to the Z axis of the camera coordinate system, where the auxiliary point is an arbitrary point on that Z axis; and

a first target point determination unit 522, configured to determine the coordinates of the preset target point according to the preset error-term formula and the coordinates of the auxiliary point.
Specifically, there are N calibration images, and the preset error-term formula minimizes the weighted sum

Σ_{i=1}^{N} w_i ‖(o_ref,i − p_ref,i) − ((o_ref,i − p_ref,i)^T d_ref,i) d_ref,i‖²

where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref,i is the coordinate of the preset target point corresponding to the i-th calibration image, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
Specifically, as shown in FIG. 6, the target point coordinate determination module 52 includes:

a second target point determination unit 523, configured to, when the camera is a binocular camera, determine the first coordinate of the preset target point based on the left-eye calibration image corresponding to the left-eye camera of the binocular camera, and determine the second coordinate of the preset target point based on the right-eye calibration image corresponding to the right-eye camera; and

a third target point determination unit 524, configured to take the average of the first coordinate and the second coordinate as the coordinate of the preset target point.

Specifically, as shown in FIG. 6, the rigid transformation determination module 53 includes:
a first rigid transformation determination unit 531, configured to determine the rigid transformation between the end of the robotic arm and the camera according to a preset rigid-transformation formula, where the preset rigid-transformation formula is

T_cam p_ref = X T_robot p_world

in which T_cam is the first relative transformation, p_ref is the coordinate of the preset target point, T_robot is the second relative transformation, and p_world is the coordinate of the preset calibration point.
Specifically, as shown in FIG. 6, the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and the rigid transformation determination module 53 includes:

a second rigid transformation determination unit 532, configured to determine the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; and

a third rigid transformation determination unit 533, configured to determine the translational rigid transformation according to the rotational rigid transformation.

Specifically, as shown in FIG. 6, the apparatus further includes:

an image acquisition module 54, configured to acquire a control image captured by the camera after the rigid transformation between the end of the robotic arm and the camera has been determined; and

a control module 55, configured to control the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.

It should be noted that, for other corresponding descriptions of the functional units involved in the computer-vision-based hand-eye calibration apparatus provided in the embodiments of the present application, reference may be made to the corresponding descriptions of the methods in FIG. 1 to FIG. 4, which are not repeated here.
Based on the methods shown in FIG. 1 to FIG. 4, an embodiment of the present application further provides a computer-readable storage medium, which may be non-volatile or volatile. The computer-readable storage medium stores computer-readable instructions that, when executed by a processor, implement the computer-vision-based hand-eye calibration method shown in FIG. 1 to FIG. 4.

Based on this understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) or a volatile storage medium, and which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the various implementation scenarios of the present application.

Based on the methods shown in FIG. 1 to FIG. 4 and the virtual apparatus embodiments shown in FIG. 5 and FIG. 6, in order to achieve the above purpose, an embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, or the like. The computer device includes a computer-readable storage medium and a processor; the computer-readable storage medium is configured to store computer-readable instructions, and the processor is configured to execute the computer-readable instructions to implement the computer-vision-based hand-eye calibration method shown in FIG. 1 to FIG. 4.

Optionally, the computer device may further include a user interface, a network interface, a camera, a radio frequency (RF) circuit, a sensor, an audio circuit, a Wi-Fi module, and the like. The user interface may include a display and an input unit such as a keyboard, and may optionally also include a USB interface, a card reader interface, and the like. The network interface may optionally include a standard wired interface, a wireless interface (such as a Bluetooth or Wi-Fi interface), and the like.

Those skilled in the art will understand that the computer device structure provided in this embodiment does not constitute a limitation on the computer device, which may include more or fewer components, combine certain components, or use a different arrangement of components.

The computer-readable storage medium may further include an operating system and a network communication module. The operating system is a program that manages and maintains the hardware and software resources of the computer device and supports the running of information-processing programs and other software and/or programs. The network communication module is used to implement communication among the components within the computer-readable storage medium, as well as communication with other hardware and software in the physical device.
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到本申请可以借助软件加必要的通用硬件平台的方式来实现,也可以通过硬件实现实现了在移动自由度和移动范围的双重限制下对eye-in-hand机器人进行手眼标定,通过对创伤位置的预估以及引入世界坐标系下的已知点,结合机械臂末端的相机采集的标定图片确定相机坐标系下预设目标点与相机的第一相对变换以及世界坐标系下的预设标定点与机械臂末端的第二相对变换,进而对上述参数进行解耦分别确定机器人机械臂末端到相机在旋转和平移层面上的刚性变换,将限制自由度和限制移动范围的手眼标定问题从原来的不适定问题转化为适定问题,最终解决了限制自由度和限制移动范围的手眼标定问题。From the description of the above embodiments, those skilled in the art can clearly understand that the present application can be implemented by means of software plus a necessary general hardware platform, and can also be implemented by hardware to achieve dual freedom of movement and range of movement. The hand-eye calibration of the eye-in-hand robot is carried out under the constraints. By estimating the trauma position and introducing known points in the world coordinate system, combined with the calibration pictures collected by the camera at the end of the robot arm, the preset target point in the camera coordinate system is determined. The first relative transformation with the camera and the second relative transformation between the preset calibration point in the world coordinate system and the end of the robot arm, and then the above parameters are decoupled to determine the rigidity of the robot arm end to the camera on the rotation and translation levels, respectively. The transformation transforms the hand-eye calibration problem with limited degrees of freedom and limited movement range from the original ill-posed problem to a well-posed problem, and finally solves the hand-eye calibration problem with limited degrees of freedom and limited movement range.
Those skilled in the art can understand that the accompanying drawings are only schematic diagrams of a preferred implementation scenario, and that the modules or processes in the drawings are not necessarily required to implement the present application. Those skilled in the art can also understand that the modules of the apparatus in an implementation scenario may be distributed in the apparatus as described, or may, with corresponding changes, be located in one or more apparatuses different from that of the present scenario. The modules of the above implementation scenarios may be combined into one module, or further split into multiple sub-modules.
The above serial numbers of the present application are for description only and do not represent the merits of the implementation scenarios. The above disclosures are only a few specific implementation scenarios of the present application; however, the present application is not limited thereto, and any variation conceivable to those skilled in the art shall fall within the protection scope of the present application.

Claims (20)

  1. A computer-vision-based hand-eye calibration method, comprising:
    controlling a robotic arm of an eye-in-hand robot to move within a preset range under a four-degree-of-freedom constraint, and acquiring calibration images through a camera mounted at the end of the robotic arm, wherein the Z-axis of the camera coordinate system corresponding to the camera passes through the position of a preset target point throughout the movement of the robotic arm within the preset range;
    determining, based on the calibration images, the coordinates of the preset target point, a first relative transformation between the preset target point and the camera in the camera coordinate system, and a second relative transformation between a preset calibration point in a world coordinate system and the end of the robotic arm;
    determining a rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  2. The method according to claim 1, wherein determining the coordinates of the preset target point specifically comprises:
    determining, based on the calibration images, the coordinates of an auxiliary point on the Z-axis of the camera coordinate system, wherein the auxiliary point is any point on that Z-axis;
    determining the coordinates of the preset target point according to a preset error term formula and the coordinates of the auxiliary point.
  3. The method according to claim 2, wherein there are N calibration images and the preset error term formula is
    Figure PCTCN2021097271-appb-100001
    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref,i is the coordinate of the preset target point corresponding to the i-th calibration image, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
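The error term itself appears only as an equation image in the original. A common formulation consistent with the described symbols — the weighted squared distance from the target point to the line through o_ref,i along direction d_ref,i — can be minimized in closed form. The sketch below assumes that formulation and is not necessarily the claimed formula:

```python
import numpy as np

def estimate_target_point(origins, directions, weights=None):
    """Least-squares point minimizing sum_i w_i * dist(p, line_i)^2.

    line_i passes through origins[i] (the role of o_ref,i) with direction
    directions[i] (d_ref,i); weights[i] plays the role of the penalty factor
    w_i.  This is an assumed reading of the error term, whose exact form is
    an equation image in the original application.
    """
    origins = np.array(origins, dtype=float)
    dirs = np.array(directions, dtype=float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    if weights is None:
        weights = np.ones(len(origins))
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d, w in zip(origins, dirs, weights):
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
        A += w * P
        b += w * P @ o
    # Normal equations of the least-squares problem; solvable when at least
    # two of the lines are non-parallel.
    return np.linalg.solve(A, b)
```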
  4. The method according to any one of claims 1 to 3, wherein, if the camera is a binocular camera, determining the coordinates of the preset target point specifically comprises:
    determining a first coordinate of the preset target point based on a left-eye calibration image corresponding to the left-eye camera of the binocular camera, and a second coordinate of the preset target point based on a right-eye calibration image corresponding to the right-eye camera of the binocular camera;
    taking the average of the first coordinate and the second coordinate as the coordinates of the preset target point.
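The fusion step above is a plain average of the two monocular estimates; a one-line sketch (equal trust in both eyes assumed):

```python
import numpy as np

def fuse_binocular_estimates(p_left, p_right):
    """Average the left-eye and right-eye estimates of the target point."""
    return (np.asarray(p_left, dtype=float) + np.asarray(p_right, dtype=float)) / 2.0
```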
  5. The method according to claim 4, wherein determining the rigid transformation between the end of the robotic arm and the camera specifically comprises:
    determining the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is
    Figure PCTCN2021097271-appb-100002
    in which the first symbol (shown in the equation image) is the first relative transformation, p_ref is the coordinate of the preset target point,
    Figure PCTCN2021097271-appb-100003
    is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  6. The method according to claim 4, wherein the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and determining the rigid transformation between the end of the robotic arm and the camera specifically comprises:
    determining the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point;
    determining the translational rigid transformation according to the rotational rigid transformation.
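The application does not spell out the decoupling here. A standard way to realize "rotation first, then translation" is Kabsch's SVD method on direction correspondences, followed by a mean-residual translation; the sketch below is that generic approach, not necessarily the patented one:

```python
import numpy as np

def solve_rotation(src_vecs, dst_vecs):
    """Kabsch: the rotation R minimizing ||dst - R @ src||_F over paired vectors.

    A common way to decouple a rigid transform: solve rotation first from
    direction correspondences, then translation from point correspondences.
    """
    H = np.asarray(src_vecs, dtype=float).T @ np.asarray(dst_vecs, dtype=float)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])  # guard against an improper (reflected) solution
    return Vt.T @ D @ U.T

def solve_translation(R, src_pts, dst_pts):
    """Given R, the best translation is the mean residual dst - R @ src."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    return (dst - src @ R.T).mean(axis=0)
```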
  7. The method according to claim 4, wherein, after determining the rigid transformation between the end of the robotic arm and the camera, the method further comprises:
    obtaining a control image captured by the camera;
    controlling the motion of the robotic arm based on the control image and the rigid transformation between the end of the robotic arm and the camera.
  8. A computer-vision-based hand-eye calibration apparatus, comprising:
    a calibration image acquisition module, configured to control a robotic arm of an eye-in-hand robot to move within a preset range under a four-degree-of-freedom constraint and to acquire calibration images through a camera mounted at the end of the robotic arm, wherein the Z-axis of the camera coordinate system corresponding to the camera passes through the position of a preset target point throughout the movement of the robotic arm within the preset range;
    a target point coordinate determination module, configured to determine, based on the calibration images, the coordinates of the preset target point, a first relative transformation between the preset target point and the camera in the camera coordinate system, and a second relative transformation between a preset calibration point in a world coordinate system and the end of the robotic arm;
    a rigid transformation determination module, configured to determine a rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  9. A computer-readable storage medium storing computer-readable instructions, wherein the computer-readable instructions, when executed by a processor, implement a computer-vision-based hand-eye calibration method, comprising:
    controlling a robotic arm of an eye-in-hand robot to move within a preset range under a four-degree-of-freedom constraint, and acquiring calibration images through a camera mounted at the end of the robotic arm, wherein the Z-axis of the camera coordinate system corresponding to the camera passes through the position of a preset target point throughout the movement of the robotic arm within the preset range; determining, based on the calibration images, the coordinates of the preset target point, a first relative transformation between the preset target point and the camera in the camera coordinate system, and a second relative transformation between a preset calibration point in a world coordinate system and the end of the robotic arm; and determining a rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  10. The computer-readable storage medium according to claim 9, wherein the computer-readable instructions, when executed by the processor, determine the coordinates of the preset target point by:
    determining, based on the calibration images, the coordinates of an auxiliary point on the Z-axis of the camera coordinate system, wherein the auxiliary point is any point on that Z-axis; and determining the coordinates of the preset target point according to a preset error term formula and the coordinates of the auxiliary point.
  11. The computer-readable storage medium according to claim 10, wherein there are N calibration images and the preset error term formula is
    Figure PCTCN2021097271-appb-100004
    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref,i is the coordinate of the preset target point corresponding to the i-th calibration image, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
  12. The computer-readable storage medium according to any one of claims 9 to 11, wherein the computer-readable instructions, when executed by the processor, determine the coordinates of the preset target point by:
    if the camera is a binocular camera, determining a first coordinate of the preset target point based on a left-eye calibration image corresponding to the left-eye camera of the binocular camera, and a second coordinate of the preset target point based on a right-eye calibration image corresponding to the right-eye camera of the binocular camera; and taking the average of the first coordinate and the second coordinate as the coordinates of the preset target point.
  13. The computer-readable storage medium according to claim 12, wherein the computer-readable instructions, when executed by the processor, determine the rigid transformation between the end of the robotic arm and the camera by:
    determining the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is
    Figure PCTCN2021097271-appb-100005
    in which the first symbol (shown in the equation image) is the first relative transformation, p_ref is the coordinate of the preset target point,
    Figure PCTCN2021097271-appb-100006
    is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  14. The computer-readable storage medium according to claim 12, wherein the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and the computer-readable instructions, when executed by the processor, determine the rigid transformation between the end of the robotic arm and the camera by:
    determining the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; and determining the translational rigid transformation according to the rotational rigid transformation.
  15. A computer device, comprising a computer-readable storage medium, a processor, and computer-readable instructions stored on the computer-readable storage medium and executable on the processor, wherein the processor, when executing the computer-readable instructions, implements a computer-vision-based hand-eye calibration method, comprising:
    controlling a robotic arm of an eye-in-hand robot to move within a preset range under a four-degree-of-freedom constraint, and acquiring calibration images through a camera mounted at the end of the robotic arm, wherein the Z-axis of the camera coordinate system corresponding to the camera passes through the position of a preset target point throughout the movement of the robotic arm within the preset range; determining, based on the calibration images, the coordinates of the preset target point, a first relative transformation between the preset target point and the camera in the camera coordinate system, and a second relative transformation between a preset calibration point in a world coordinate system and the end of the robotic arm; and determining a rigid transformation between the end of the robotic arm and the camera according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point.
  16. The computer device according to claim 15, wherein the processor, when executing the computer-readable instructions, determines the coordinates of the preset target point by:
    determining, based on the calibration images, the coordinates of an auxiliary point on the Z-axis of the camera coordinate system, wherein the auxiliary point is any point on that Z-axis; and determining the coordinates of the preset target point according to a preset error term formula and the coordinates of the auxiliary point.
  17. The computer device according to claim 16, wherein there are N calibration images and the preset error term formula is
    Figure PCTCN2021097271-appb-100007
    where N is greater than or equal to 2, w_i is the penalty factor corresponding to the i-th calibration image, o_ref,i is the coordinate of the auxiliary point corresponding to the i-th calibration image, p_ref,i is the coordinate of the preset target point corresponding to the i-th calibration image, and d_ref,i is the Z-axis direction vector of the camera coordinate system corresponding to the i-th calibration image.
  18. The computer device according to any one of claims 15 to 17, wherein the processor, when executing the computer-readable instructions, determines the coordinates of the preset target point by:
    if the camera is a binocular camera, determining a first coordinate of the preset target point based on a left-eye calibration image corresponding to the left-eye camera of the binocular camera, and a second coordinate of the preset target point based on a right-eye calibration image corresponding to the right-eye camera of the binocular camera; and taking the average of the first coordinate and the second coordinate as the coordinates of the preset target point.
  19. The computer device according to claim 18, wherein the processor, when executing the computer-readable instructions, determines the rigid transformation between the end of the robotic arm and the camera by:
    determining the rigid transformation between the end of the robotic arm and the camera according to a preset rigid transformation formula, wherein the preset rigid transformation formula is
    Figure PCTCN2021097271-appb-100008
    in which the first symbol (shown in the equation image) is the first relative transformation, p_ref is the coordinate of the preset target point,
    Figure PCTCN2021097271-appb-100009
    is the second relative transformation, and p_world is the coordinate of the preset calibration point.
  20. The computer device according to claim 18, wherein the rigid transformation includes a rotational rigid transformation and a translational rigid transformation, and the processor, when executing the computer-readable instructions, determines the rigid transformation between the end of the robotic arm and the camera by:
    determining the rotational rigid transformation according to the first relative transformation, the coordinates of the preset target point, the second relative transformation, and the coordinates of the preset calibration point; and determining the translational rigid transformation according to the rotational rigid transformation.
PCT/CN2021/097271 2020-09-27 2021-05-31 Computer vision-based hand-eye calibration method and apparatus, and storage medium WO2022062464A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011030608.0 2020-09-27
CN202011030608.0A CN112022355B (en) 2020-09-27 2020-09-27 Hand-eye calibration method and device based on computer vision and storage medium

Publications (1)

Publication Number Publication Date
WO2022062464A1 true WO2022062464A1 (en) 2022-03-31

Family

ID=73574578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/097271 WO2022062464A1 (en) 2020-09-27 2021-05-31 Computer vision-based hand-eye calibration method and apparatus, and storage medium

Country Status (2)

Country Link
CN (1) CN112022355B (en)
WO (1) WO2022062464A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112022355B (en) * 2020-09-27 2022-06-10 平安科技(深圳)有限公司 Hand-eye calibration method and device based on computer vision and storage medium
CN113223048B (en) * 2021-04-20 2024-02-27 深圳瀚维智能医疗科技有限公司 Method and device for determining hand-eye calibration precision, terminal equipment and storage medium
CN113397704B (en) * 2021-05-10 2022-05-20 武汉联影智融医疗科技有限公司 Robot positioning method, device and system and computer equipment
CN114012718B (en) * 2021-10-18 2023-03-31 阿里云计算有限公司 Data processing method
CN116277035B (en) * 2023-05-15 2023-09-12 北京壹点灵动科技有限公司 Robot control method and device, processor and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102911A1 (en) * 2002-11-21 2004-05-27 Samsung Electronics Co., Ltd. Hand/eye calibration method using projective invariant shape descriptor of 2-dimensional image
CN109910016A (en) * 2019-04-22 2019-06-21 亿嘉和科技股份有限公司 Vision collecting scaling method, apparatus and system based on multi-degree-of-freemechanical mechanical arm
JP2019155556A (en) * 2018-03-15 2019-09-19 セイコーエプソン株式会社 Control device of robot, robot, robot system, and calibration method for camera
CN110276806A (en) * 2019-05-27 2019-09-24 江苏大学 Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system
CN110421562A (en) * 2019-07-24 2019-11-08 中国地质大学(武汉) Mechanical arm calibration system and scaling method based on four item stereo visions
CN112022355A (en) * 2020-09-27 2020-12-04 平安科技(深圳)有限公司 Hand-eye calibration method and device based on computer vision and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102794763B (en) * 2012-08-31 2014-09-24 江南大学 Systematic calibration method of welding robot guided by line structured light vision sensor
KR101766756B1 (en) * 2015-11-20 2017-08-10 경북대학교 산학협력단 Apparatus for Rectification of Stereo Vision System and Method thereof
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
CN110116411B (en) * 2019-06-06 2020-10-30 浙江汉振智能技术有限公司 Robot 3D vision hand-eye calibration method based on spherical target
CN110717943A (en) * 2019-09-05 2020-01-21 中北大学 Method and system for calibrating eyes of on-hand manipulator for two-dimensional plane


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089293A (en) * 2022-07-04 2022-09-23 山东大学 Calibration method for spinal endoscopic surgical robot
CN115861445A (en) * 2022-12-23 2023-03-28 广东工业大学 Hand-eye calibration method based on calibration plate three-dimensional point cloud
CN116878386A (en) * 2023-09-06 2023-10-13 北京华卓精科科技股份有限公司 Calibration method and calibration device for up-down alignment visual device
CN116878386B (en) * 2023-09-06 2023-12-08 北京华卓精科科技股份有限公司 Calibration method and calibration device for up-down alignment visual device
CN117103286A (en) * 2023-10-25 2023-11-24 杭州汇萃智能科技有限公司 Manipulator eye calibration method and system and readable storage medium
CN117103286B (en) * 2023-10-25 2024-03-19 杭州汇萃智能科技有限公司 Manipulator eye calibration method and system and readable storage medium

Also Published As

Publication number Publication date
CN112022355A (en) 2020-12-04
CN112022355B (en) 2022-06-10

Similar Documents

Publication Publication Date Title
WO2022062464A1 (en) Computer vision-based hand-eye calibration method and apparatus, and storage medium
US11911914B2 (en) System and method for automatic hand-eye calibration of vision system for robot motion
JP6180086B2 (en) Information processing apparatus and information processing method
KR20180080630A (en) Robot and electronic device for performing hand-eye calibration
JP7111114B2 (en) Information processing device, information processing method, and information processing system
US20090129631A1 (en) Method of Tracking the Position of the Head in Real Time in a Video Image Stream
JP2016173313A (en) Visual line direction estimation system, visual line direction estimation method and visual line direction estimation program
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
WO2020057121A1 (en) Data processing method and apparatus, electronic device and storage medium
US20010053204A1 (en) Method and apparatus for relative calibration of a mobile X-ray C-arm and an external pose tracking system
Pachtrachai et al. Hand-eye calibration with a remote centre of motion
Bianchi et al. High precision augmented reality haptics
CN113442169A (en) Method and device for calibrating hands and eyes of robot, computer equipment and readable storage medium
JP2019098409A (en) Robot system and calibration method
CN109785373A (en) A kind of six-freedom degree pose estimating system and method based on speckle
KR101349347B1 (en) System for generating a frontal-view image for augmented reality based on the gyroscope of smart phone and Method therefor
JP6924455B1 (en) Trajectory calculation device, trajectory calculation method, trajectory calculation program
WO2021097332A1 (en) Scene perception systems and methods
CN114833822B (en) Rapid hand-eye calibration method for robot
JP7323993B2 (en) Control device, robot system, operating method and program for control device
CN116135169A (en) Positioning method, positioning device, electronic equipment and computer readable storage medium
JP6253847B1 (en) Laser processing apparatus, laser processing method, and laser processing program
JP6343930B2 (en) Robot system, robot control apparatus, and robot control method
CN114740967A (en) Handle correction method, electronic device, chip and readable storage medium
CN117103286B (en) Manipulator eye calibration method and system and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21870851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21870851

Country of ref document: EP

Kind code of ref document: A1