CN108627178B - Robot eye calibration method and system - Google Patents



Publication number: CN108627178B
Authority: CN (China)
Prior art keywords: coordinate, machine, robot, perspective transformation, calibration
Legal status: Active
Application number: CN201810442460.8A
Other languages: Chinese (zh)
Other versions: CN108627178A
Inventors: 孙高磊, 吴丰礼, 罗小军, 李相前, 张文刚
Current Assignee: Guangdong Topstar Technology Co Ltd
Original Assignee: Guangdong Topstar Technology Co Ltd
Application filed by Guangdong Topstar Technology Co Ltd
Priority to CN201810442460.8A
Publication of CN108627178A
Application granted
Publication of CN108627178B


Classifications

    • G — Physics
    • G01 — Measuring; Testing
    • G01C — Measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry
    • G01C 25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manipulator (AREA)

Abstract

The application relates to a robot hand-eye calibration method and system. A camera and a standard jig are mounted at the end of the robot, and a feature point is provided on a fixed calibration object. The method comprises: acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate; acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate; acquiring, according to the perspective transformation matrix, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera field of view; and converting the pixel coordinate of a target point into a target machine coordinate according to the mapping relation. With this method, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera field of view is obtained, the pixel coordinate of the target point can be converted into the target machine coordinate according to the mapping relation, the hand-eye calibration of the robot is completed, and the precision with which the robot reaches the position where it grabs an object can be improved.

Description

Robot eye calibration method and system
Technical Field
The application relates to the technical field of machine vision, in particular to a robot hand-eye calibration method and a robot hand-eye calibration system.
Background
When machine vision is applied in robotics, a camera is often fixed on the end effector of the robot; when the end effector grabs a workpiece, the relative position of the end effector and the workpiece can be measured by the camera, forming a robot "hand-eye" vision system.
A robot "hand-eye" vision system generally requires that the focal plane of the mounted camera be strictly parallel to the plane of the robot's flange plate. In practice, however, errors of varying degrees occur when the end effector is mounted on the robot, so the "hand-eye" vision system also carries a considerable error and the robot grabs objects with low positional precision.
Disclosure of Invention
Therefore, it is necessary to provide a robot hand-eye calibration method and system to address the low precision of the position at which the robot grabs an object.
A robot hand-eye calibration method is characterized in that a camera and a standard jig are installed at the tail end of a robot, and characteristic points are arranged on a fixed calibration object, and the method comprises the following steps:
acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is a movement control coordinate of the robot when the tail end of the calibration jig is aligned with the feature point along a standard direction, the standard direction is a normal direction of a flange plate installed on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera visual field, and the calibration pixel coordinate is a position coordinate of the feature point in the camera visual field when the feature point moves to the specified position in the camera visual field along with the robot;
acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate;
and acquiring a mapping relation between the movement control coordinate of the robot and the position coordinate in the field of view of the camera according to the perspective transformation matrix, and converting the pixel coordinate of the target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the movement control coordinate corresponding to the movement of the robot to the target point.
A robot hand-eye calibration system, wherein a camera and a standard jig are mounted at the end of the robot and a feature point is provided on a fixed calibration object, comprises:
the system comprises a coordinate acquisition module, a calibration jig and a calibration pixel coordinate, wherein the coordinate acquisition module is used for acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, the first machine coordinate is a movement control coordinate of the robot when a tail end of the calibration jig aligns to a feature point along a standard direction, the standard direction is a normal direction of a flange plate installed on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera visual field, and the calibration pixel coordinate is a position coordinate of the feature point in the camera visual field when the feature point moves to the specified position in the camera visual field along with the robot;
the perspective transformation matrix acquisition module is used for acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate;
and the coordinate transformation module is used for acquiring the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field according to the perspective transformation matrix, and converting the pixel coordinate of the target point into the target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the movement control coordinate corresponding to the movement of the robot to the target point.
According to the robot hand-eye calibration method and system, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field is obtained, the pixel coordinate of the target point can be converted into the target machine coordinate according to the mapping relation, the hand-eye calibration of the robot is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
Drawings
FIG. 1 is a diagram of an exemplary environment in which a robot hand-eye calibration method may be implemented;
FIG. 2 is a flow diagram of a robot eye calibration method in one embodiment;
FIG. 3 is a flow diagram of target machine coordinate transformation in one embodiment;
FIG. 4 is a flow diagram of perspective transformation matrix acquisition in one embodiment;
FIG. 5 is a flow diagram of a perspective transformation matrix solution in one embodiment;
FIG. 6 is a flowchart of a robot hand-eye calibration method in another embodiment;
FIG. 7 is a schematic diagram of the configuration of the robot eye calibration system in one embodiment;
FIG. 8 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The robot hand-eye calibration method provided by the application can be applied to the application environment shown in fig. 1, and fig. 1 is an application environment diagram of the robot hand-eye calibration method in one embodiment. The end of the robot 10 comprises a movable tool arm 11; the camera 20 and the standard jig 30 can be mounted on the end of the robot 10 by means of the tool arm 11, and a stationary feature point 40 is placed below the end of the robot 10. The tool arm 11 is mounted on the end of the robot 10 through a flange 50, so that the placement direction of the standard jig 30 is parallel to the normal direction of the flange 50.
In an embodiment, as shown in fig. 2, fig. 2 is a flowchart of a robot hand-eye calibration method in an embodiment, which is described by applying the method to the application environment in fig. 1, a camera and a standard jig are installed at a tail end of a robot, and a feature point is set on a fixed calibration object, the method includes the following steps:
step S210: the method comprises the steps of obtaining a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is a movement control coordinate of the robot when the tail end of the calibration jig aligns to a feature point along a standard direction, the standard direction is a normal direction of a flange plate installed on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera visual field, and the calibration pixel coordinate is a position coordinate of the feature point in the camera visual field when the feature point moves to the specified position in the camera visual field along with the robot.
The standard jig is mounted indirectly at the tail end of the robot through the flange plate, so that the placement direction of the calibration jig is parallel to the normal direction of the flange plate. Aligning the tip of the calibration jig with the feature point along the standard direction therefore ensures that the jig points at the feature point along its own placement direction, with the tip of the jig aligned with the feature point.
The movement control coordinates of the robot may be used to record the spatial position of the tool arm at the end of the robot and to control the movement of the tool arm to a specified spatial position. Therefore, the first machine coordinate can be acquired and stored when the tip of the calibration jig is aligned with the feature point in the standard direction. The second machine coordinates may be collected and stored of the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view.
The camera view in the camera at the robot end may contain the feature points and the focal length of the camera may be such that the feature points can be clearly identified. The calibration pixel coordinates may be used to identify and store position coordinates of the feature point in the camera field of view acquired by the camera when the feature point moves to a specified position in the camera field of view along with the robot.
There are a plurality of designated positions, which may include the upper left, upper, upper right, left, center, right, lower left, lower and lower right of the camera field of view. For example, the robot may be moved so that the feature point appears in turn at each of these nine positions, and the second machine coordinates and calibration pixel coordinates at the 9 designated positions may be acquired; alternatively, the second machine coordinates and calibration pixel coordinates at any number of designated positions may be acquired.
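As a concrete illustration, the nine designated positions can be laid out as a 3-by-3 grid of pixel targets inside the camera view. The function name, view size and margin below are hypothetical, not taken from the patent:

```python
# Hypothetical sketch: the nine designated positions as a 3x3 grid of pixel
# targets inside a camera view of the given size.  The margin value is
# illustrative only.
def designated_positions(width, height, margin=0.2):
    """Return nine (u, w) pixel targets: upper-left, upper, upper-right,
    left, center, right, lower-left, lower, lower-right."""
    us = [margin * width, 0.5 * width, (1.0 - margin) * width]
    ws = [margin * height, 0.5 * height, (1.0 - margin) * height]
    return [(u, w) for w in ws for u in us]

# e.g. for a 1280x960 view the center target is (640.0, 480.0)
grid = designated_positions(1280, 960)
```

Moving the robot until the feature point's pixel coordinate coincides with each target in turn yields the nine paired second machine coordinates and calibration pixel coordinates.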
Step S220: and acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate.
From another point of view, it can be understood that when the position of the robot is at the first machine coordinates, in order to move the feature point to a specified position in the camera field of view, it is actually necessary to control the robot and move the robot to the second machine coordinates. Therefore, a perspective transformation relation exists between the second machine coordinate and the calibration pixel coordinate, and a perspective transformation matrix reflecting the perspective transformation relation can be calculated according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate.
Step S230: and acquiring the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field according to the perspective transformation matrix.
Step S240: and converting the pixel coordinates of the target point into target machine coordinates according to the mapping relation, wherein the pixel coordinates of the target point comprise position coordinates of the target point in the camera, and the target machine coordinates are movement control coordinates corresponding to the movement of the robot to the target point.
According to the perspective transformation matrix, the pixel coordinates of the target point can be converted into the coordinates of the target machine, so that the robot can move to the target point, and the hand-eye calibration of the robot is completed.
According to the robot hand-eye calibration method, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field is obtained, the pixel coordinate of the target point can be converted into the target machine coordinate according to the mapping relation, the hand-eye calibration of the robot is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
In one embodiment, as shown in fig. 3, fig. 3 is a flowchart of target machine coordinate conversion in one embodiment, and the step of converting pixel coordinates of the target point into target machine coordinates according to the mapping relationship includes the following steps:
step S241: and acquiring initial machine coordinates and pixel coordinates of the target point, wherein the initial machine coordinates are movement control coordinates when the camera view of the target point is acquired.
And acquiring the initial machine coordinate while acquiring the pixel coordinate of the target point.
Step S242: and acquiring the coordinates of the target machine according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relation.
The robot is in the position corresponding to the initial machine coordinate, and the target machine coordinate can be obtained according to the pixel coordinate and the mapping relation of the target point at the moment.
According to the robot hand-eye calibration method, the target machine coordinate is obtained according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation, the robot hand-eye calibration is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
In one embodiment, as shown in fig. 4, fig. 4 is a flowchart of the acquisition of the perspective transformation matrix in one embodiment, and the step of acquiring the perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate includes the following steps:
step S221: and acquiring a third machine coordinate according to the first machine coordinate and the second machine coordinate, wherein the third machine coordinate is a coordinate of the second machine coordinate projected on the plane where the first machine coordinate is located.
Solving the perspective transformation matrix from the third machine coordinates, which are obtained by means of the plane where the first machine coordinate is located, reduces the complexity and amount of computation and improves precision.
The first machine coordinate may include an X-axis coordinate, a Y-axis coordinate, and a rotation coordinate. And projecting the second machine coordinate on the plane where the first machine coordinate is located, namely projecting the second machine coordinate according to the angle corresponding to the rotation coordinate of the first machine coordinate, wherein the third machine coordinate obtained after the second machine coordinate is projected is the coordinate of the X axis and the Y axis under the corresponding angle.
For example, the first machine coordinate may include an X-axis coordinate, a Y-axis coordinate and a rotation coordinate, with the rotation coordinate equal to 0. Projecting the second machine coordinate onto the plane of a first machine coordinate whose rotation coordinate is 0 can then be understood as projecting the second machine coordinate onto the plane with rotation angle 0; that is, the third machine coordinate obtained by projection is a coordinate with rotation angle 0, which reduces the complexity and amount of computation and improves precision.
Step S222: and acquiring a perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate.
And solving and acquiring a perspective transformation matrix according to the one-to-one correspondence relationship between the third machine coordinate and the calibration pixel coordinate, wherein the acquired perspective transformation matrix can be used for expressing the mapping relationship between the movement control coordinate of the robot and the position coordinate in the camera visual field.
According to the robot hand-eye calibration method, the third machine coordinate is obtained according to the first machine coordinate and the second machine coordinate, and the perspective transformation matrix is obtained according to the third machine coordinate and the calibration pixel coordinate, so that the calculation complexity and the calculation amount can be reduced, and the precision can be improved.
In one embodiment, the step of obtaining the perspective transformation matrix from the third machine coordinates and the calibration pixel coordinates comprises the steps of:
according to Qi=PiA acquisition perspective transformationMatrix, where i is the number of the designated location, QiIs the ith third machine coordinate, PiAnd A is a perspective transformation matrix for the ith calibration pixel coordinate.
According to the robot hand-eye calibration method, the perspective transformation matrix is determined and obtained according to the third machine coordinate and the calibration pixel coordinate, the complexity and the calculation amount of calculation can be reduced, and the precision is improved.
In one embodiment, the step of obtaining third machine coordinates from the first machine coordinates and the second machine coordinates comprises the steps of:
according to
Figure BDA0001656355720000071
Obtaining a third machine coordinate, wherein i is a serial number of the designated position, and Qi(xi,yi) Is the ith third machine coordinate, xiAnd yiRespectively the abscissa and ordinate, vx, of the ith third machine coordinatei、vyiAnd vriRespectively the abscissa, ordinate and rotation coordinate, x, of the ith second machine coordinate0And y0Respectively the abscissa and ordinate of the first machine coordinate.
In this robot hand-eye calibration method, x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate, and the rotation coordinate of the first machine coordinate is 0. Calculating the third machine coordinates from a first machine coordinate whose rotation coordinate is 0 yields third machine coordinates whose rotation coordinate is likewise 0, which reduces the complexity and amount of computation and improves precision.
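The normalization step described above can be sketched as follows. Since the patent's projection formula is only available as an equation image, the rotation about the first machine coordinate shown here is a hedged reconstruction of "projecting onto the plane with rotation angle 0":

```python
import math

# Hedged reconstruction of the normalization step: project a second machine
# coordinate (vx, vy, vr) onto the rotation-0 plane about the first machine
# coordinate (x0, y0).  The exact rotation convention is an assumption.
def third_machine_coord(vx, vy, vr, x0, y0):
    """Return the third machine coordinate Q_i(x_i, y_i)."""
    dx, dy = vx - x0, vy - y0
    c, s = math.cos(vr), math.sin(vr)
    # rotate the offset by -vr so the result has rotation coordinate 0
    xi = x0 + dx * c + dy * s
    yi = y0 - dx * s + dy * c
    return xi, yi
```

When the rotation coordinate vr is already 0, the projection leaves the coordinate unchanged, consistent with the text's observation that the first machine coordinate has rotation coordinate 0.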
In one embodiment, as shown in fig. 5, fig. 5 is a flowchart of solving the perspective transformation matrix in one embodiment, and the step of obtaining the perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate includes the following steps:
step S223: and establishing a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate.
A perspective transformation equation for the subsequent solution of the perspective transformation matrix is established from the third machine coordinates and the calibration pixel coordinates at the designated positions, which together represent the relation between the movement control coordinate of the robot and the position coordinate in the camera field of view.
Step S224: the perspective transformation matrix is solved according to the perspective transformation equation.
According to the perspective transformation equation and the determined equation relation, the perspective transformation matrix can be accurately solved.
According to the robot eye calibration method, the perspective transformation matrix can be accurately solved through the perspective transformation equation established by the third machine coordinate and the calibration pixel coordinate, so that the precision of the robot reaching the position where the object is grabbed is improved subsequently.
In one embodiment, the step of establishing a perspective transformation equation based on the third machine coordinates and the calibration pixel coordinates comprises the steps of:
step S225: determining
Figure BDA0001656355720000081
The transformation relation between the coordinate of the third machine and the coordinate of the calibration pixel is shown, wherein i is the serial number of the designated position, and xiAnd yiRespectively the abscissa and ordinate of the ith third machine coordinate, ziSatisfies zi=a13ui+a23wi+a33,uiAnd wiRespectively an abscissa and an ordinate of the coordinate of the ith calibration pixel,
Figure BDA0001656355720000091
for perspective transformation of matrix, a11、a12、a13、a21、a22、a23、a31、a32And a33Respectively, elements of the perspective transformation matrix.
And acquiring a transformation relation between the third machine coordinate and the calibration pixel coordinate according to the third machine coordinate and the calibration pixel coordinate at the specified position, wherein the transformation relation can be used for representing the relation between the movement control coordinate of the robot and the position coordinate in the camera visual field.
Where 1 can be regarded as a given reference, the reference can be a constant value, and the reference can be other constant values besides 1. In addition, when the reference is 1, the calculation can be convenient and simplified, and the accuracy is improved.
Step S226: obtaining from the transformation relation the perspective transformation equations

$$x_i = \frac{a_{11} u_i + a_{21} w_i + a_{31}}{a_{13} u_i + a_{23} w_i + a_{33}}$$

and

$$y_i = \frac{a_{12} u_i + a_{22} w_i + a_{32}}{a_{13} u_i + a_{23} w_i + a_{33}},$$

wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and $a_{11}, a_{12}, a_{13}, a_{21}, a_{22}, a_{23}, a_{31}, a_{32}$ and $a_{33}$ are the elements of the perspective transformation matrix.
From the perspective transformation equations obtained at the plurality of specified positions, the elements in the perspective transformation matrix can be accurately solved.
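A minimal sketch of solving for the matrix elements: multiplying each perspective transformation equation through by its denominator makes it linear in the nine elements, and fixing a33 = 1 (the reference value mentioned in the text) leaves eight unknowns, so the nine point pairs give an overdetermined linear system. The function below is an illustrative least-squares solver, not the patent's own procedure:

```python
import numpy as np

# Illustrative solver for the perspective transformation matrix A from
# paired third machine coordinates (x_i, y_i) and calibration pixel
# coordinates (u_i, w_i), with a33 fixed to 1.  Each pair contributes two
# linear equations, so nine designated positions give 18 equations in
# eight unknowns, solved here by least squares.
def solve_perspective_matrix(machine_pts, pixel_pts):
    rows, rhs = [], []
    for (x, y), (u, w) in zip(machine_pts, pixel_pts):
        # x * (a13*u + a23*w + 1) = a11*u + a21*w + a31
        rows.append([u, w, 1.0, 0.0, 0.0, 0.0, -u * x, -w * x])
        rhs.append(x)
        # y * (a13*u + a23*w + 1) = a12*u + a22*w + a32
        rows.append([0.0, 0.0, 0.0, u, w, 1.0, -u * y, -w * y])
        rhs.append(y)
    a11, a21, a31, a12, a22, a32, a13, a23 = np.linalg.lstsq(
        np.array(rows), np.array(rhs), rcond=None)[0]
    return np.array([[a11, a12, a13],
                     [a21, a22, a23],
                     [a31, a32, 1.0]])
```

With exact, noise-free correspondences the system is consistent and the solver recovers the matrix exactly; with measurement noise it returns the least-squares fit.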
In addition, and importantly, the included angle between the plane of the flange plate and the plane of the camera field of view can be calculated from the perspective transformation matrix solved via the perspective transformation equation, i.e., the error between the focal plane of the mounted camera and the plane of the robot flange plate can be quantified. Here a13 reflects the included angle between the X axis and the U axis, and a23 reflects the included angle between the Y axis and the W axis; a11 reflects the scale factor between the X axis and the U axis, a12 the scale factor between the X axis and the W axis, a21 the scale factor between the Y axis and the U axis, and a22 the scale factor between the Y axis and the W axis; a31 reflects the translation between the X axis and the U axis, and a32 the translation between the Y axis and the W axis; a33 may in special cases be 1. The X and Y axes are the coordinate axes of the machine coordinates, and the U and W axes are the coordinate axes of the pixel coordinates.
According to the robot eye calibration method, the transformation relation between the third machine coordinate and the calibration pixel coordinate is determined, the perspective transformation equation is obtained, the elements in the perspective transformation matrix can be accurately solved, and the perspective transformation matrix is obtained.
In one embodiment, the step of obtaining a mapping relationship of the movement control coordinates of the robot and the position coordinates in the camera field of view from the perspective transformation matrix includes the steps of:
according to
Figure BDA0001656355720000101
Acquiring a mapping relation between a movement control coordinate of the robot and a position coordinate in a camera view field, wherein x and y are respectively an abscissa and an ordinate of the movement control coordinate of the robot, u and w are respectively the abscissa and the ordinate of the position coordinate in the camera view field, and a11、a12、a13、a21、a22、a23、a31、a32And a33Respectively, elements of the perspective transformation matrix, xt、ytAnd rtThe initial machine coordinate is a movement control coordinate when the visual field of the camera where the target point is located is acquired.
According to the robot hand-eye calibration method, after the solved perspective transformation matrix is obtained, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field can be accurately obtained, so that the target machine coordinate can be obtained subsequently, the robot hand-eye calibration is completed, and the precision of the robot reaching the position where the object is grabbed is improved.
In one embodiment, the step of converting the pixel coordinates of the target point to target machine coordinates according to the mapping relationship comprises the steps of:
acquiring initial machine coordinates and pixel coordinates of a target point;
according to
Figure BDA0001656355720000111
Obtaining target machine coordinates, where xfAnd yfRespectively the abscissa and ordinate of the target machine coordinate, utAnd wtRespectively the abscissa and the ordinate of the pixel coordinate of the target point, a11、a12、a13、a21、a22、a23、a31、a32And a33Respectively, elements of the perspective transformation matrix, xt、ytAnd rtRespectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate.
According to the robot hand-eye calibration method, the target machine coordinate is obtained according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation, the robot hand-eye calibration is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
And acquiring the initial machine coordinate while acquiring the pixel coordinate of the target point. The robot is in the position corresponding to the initial machine coordinate, and the target machine coordinate can be obtained according to the pixel coordinate and the mapping relation of the target point at the moment.
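The conversion of a target pixel coordinate into a target machine coordinate might be sketched as below. The composition with the initial machine coordinate's rotation r_t mirrors the earlier normalization step; because the patent's formula survives here only as an equation image, the exact sign and rotation convention are assumptions:

```python
import math
import numpy as np

# Hedged sketch of the final conversion: map the target point's pixel
# coordinate (ut, wt) through the perspective matrix A with the row-vector
# convention (u, w, 1) A, then carry the result into the frame of the
# initial machine coordinate (xt, yt, rt).  The rotation/sign convention
# below is an assumption, not the patent's verbatim formula.
def pixel_to_machine(A, ut, wt, xt, yt, rt):
    """Return the target machine coordinate (xf, yf)."""
    px, py, z = np.array([ut, wt, 1.0]) @ A   # (x'z, y'z, z)
    xp, yp = px / z, py / z                   # point in the rotation-0 plane
    c, s = math.cos(rt), math.sin(rt)
    # rotate by +rt, undoing the normalization, and translate by (xt, yt)
    return xt + xp * c - yp * s, yt + xp * s + yp * c
```

With r_t = 0 and a zero initial offset, the result reduces to the plain perspective mapping of the pixel coordinate.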
In one embodiment, after the step of solving the perspective transformation matrix according to the perspective transformation equation, the following steps are further included:
according to
Figure BDA0001656355720000112
Obtaining a perspective transformation matrix that minimizes the value, where i is the serial number of the designated location, xiAnd yiRespectively the abscissa and ordinate, u, of the ith third machine coordinateiAnd wiRespectively the abscissa and ordinate of the ith calibration pixel coordinate, a11、a12、a13、a21、a22、a23、a31、a32And a33Respectively a perspective transformation momentElements of the array.
According to the robot hand-eye calibration method, the perspective transformation matrix which enables the value to be minimum is obtained, the optimized perspective transformation matrix which is closest to the actual perspective transformation matrix can be obtained, the error is further reduced, the accuracy of the perspective transformation matrix is improved, and therefore the precision of the robot reaching the position where the object is grabbed can be improved.
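The quantity minimized in this refinement step is the sum of squared residuals between the third machine coordinates and the points predicted from the calibration pixel coordinates. A small helper that evaluates this objective for a candidate matrix, so that any off-the-shelf optimizer could refine A, might look like:

```python
import numpy as np

# Sketch of the least-squares objective from the refinement step: the sum
# of squared residuals between the third machine coordinates and the
# points predicted from the calibration pixel coordinates by a candidate
# matrix A.  The helper only evaluates the objective; the choice of
# optimizer is left open.
def reprojection_error(A, machine_pts, pixel_pts):
    err = 0.0
    for (x, y), (u, w) in zip(machine_pts, pixel_pts):
        z = A[0, 2] * u + A[1, 2] * w + A[2, 2]          # a13*u + a23*w + a33
        xp = (A[0, 0] * u + A[1, 0] * w + A[2, 0]) / z   # predicted x_i
        yp = (A[0, 1] * u + A[1, 1] * w + A[2, 1]) / z   # predicted y_i
        err += (x - xp) ** 2 + (y - yp) ** 2
    return err
```

A matrix that reproduces every correspondence exactly drives the objective to zero; residual noise in the nine measurements leaves a small positive value that the optimization minimizes.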
In another embodiment, as shown in fig. 6, fig. 6 is a flowchart of a robot hand-eye calibration method in another embodiment, where the robot hand-eye calibration method in this embodiment includes the following steps:
and acquiring a first machine coordinate and a second machine coordinate of the robot, acquiring a calibration pixel coordinate of the feature point in the camera view, and matching the corresponding second machine coordinate and calibration pixel coordinate under 9 groups of specified positions. The camera and the standard jig are installed at the tail end of the robot, a fixed characteristic point is placed below the tail end of the robot, the camera is adjusted to enable the focal plane of the camera to be clear, the characteristic point and the plane where the characteristic point is located can be easily identified and unique, and the shape of the characteristic point can be circular, round hole-shaped and the like. And aligning the tail end of the calibration jig to the center of the characteristic point, and recording the movement control coordinate of the robot at the moment as a first machine coordinate. The robot moves to drive the camera view to change, so that the feature points can respectively appear at the upper left, upper right, left, middle, right, lower left, lower right and lower right in the camera view, calibration pixel coordinates at the 9 specified positions are obtained, and corresponding 9 second machine coordinates are obtained while the 9 calibration pixel coordinates are collected; the images can be processed and the calibration pixel coordinates of the feature points can be extracted by collecting the images of the camera at the moment. According to the first machine coordinate, the 9 second machine coordinates are normalized, and corresponding 9 third machine coordinates are obtained, where the normalization may mean that the second machine coordinates are projected on a plane with a rotation angle of 0, and the obtained third machine coordinates are coordinates with a rotation angle of 0. According to
[equation image BDA0001656355720000121: formula projecting the i-th second machine coordinate (vx_i, vy_i, vr_i) about the first machine coordinate (x_0, y_0) onto the plane with rotation angle 0, giving the third machine coordinate Q_i(x_i, y_i)]
a third machine coordinate is obtained, where i is the serial number of the specified position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, vx_i, vy_i and vr_i are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate.
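As a rough illustration of the normalization step, the sketch below rotates each second machine coordinate about the first machine coordinate so that its rotation angle becomes 0. The patent gives the actual formula only as an image, so the rotation direction, sign convention, and use of radians here are assumptions, not the patent's own equation.

```python
import math

def normalize_to_third_coordinate(second, first):
    """Project a second machine coordinate (vx_i, vy_i, vr_i) onto the
    plane with rotation angle 0, about the first machine coordinate
    (x_0, y_0). Sign convention and radian units are assumptions, since
    the patent's formula is reproduced only as an image."""
    vx, vy, vr = second
    x0, y0 = first
    # Rotate (vx, vy) about (x0, y0) by -vr so the result has rotation 0.
    dx, dy = vx - x0, vy - y0
    xi = x0 + dx * math.cos(vr) + dy * math.sin(vr)
    yi = y0 - dx * math.sin(vr) + dy * math.cos(vr)
    return xi, yi
```

With vr_i = 0 the second machine coordinate is returned unchanged, which matches the statement that the third machine coordinates share the first machine coordinate's rotation angle of 0.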
And calculating a perspective transformation matrix according to the matched second machine coordinates and calibration pixel coordinates. After the third machine coordinates are obtained from the second machine coordinates, a perspective transformation model is established from the third machine coordinate and the corresponding calibration pixel coordinate at each specified position, so that Q_i = P_i·A, where i is the serial number of the specified position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix. The perspective transformation model gives the transformation relation between the third machine coordinates and the calibration pixel coordinates:

x_i = (a11·u_i + a21·w_i + a31) / z_i
y_i = (a12·u_i + a22·w_i + a32) / z_i

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and

A = [ a11 a12 a13 ; a21 a22 a23 ; a31 a32 a33 ]

is the perspective transformation matrix, whose elements are a11, a12, a13, a21, a22, a23, a31, a32 and a33. Rearranging the transformation relation yields the perspective transformation equations

a11·u_i + a21·w_i + a31 − x_i·z_i = 0

and

a12·u_i + a22·w_i + a32 − y_i·z_i = 0.

From the perspective transformation equations, the perspective transformation matrix is taken as the matrix A that minimizes

Σ_{i=1..9} [ (a11·u_i + a21·w_i + a31 − x_i·z_i)² + (a12·u_i + a22·w_i + a32 − y_i·z_i)² ].
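The least-squares solution described above can be sketched as follows. This is an illustrative implementation, not the patent's own code: it stacks the two perspective transformation equations for each of the 9 matched point pairs into a homogeneous linear system M·a = 0 and takes the right singular vector of M with the smallest singular value (the matrix is only determined up to scale, so it is normalized by a33).

```python
import numpy as np

def solve_perspective_matrix(pixel_coords, machine_coords):
    """pixel_coords:  iterable of (u_i, w_i) calibration pixel coordinates.
    machine_coords: iterable of (x_i, y_i) third machine coordinates.
    Returns the 3x3 perspective transformation matrix A in the
    row-vector convention Q_i = P_i @ A, scaled so that a33 = 1."""
    rows = []
    for (u, w), (x, y) in zip(pixel_coords, machine_coords):
        # a11*u + a21*w + a31 - x*(a13*u + a23*w + a33) = 0
        rows.append([u, 0, -x * u, w, 0, -x * w, 1, 0, -x])
        # a12*u + a22*w + a32 - y*(a13*u + a23*w + a33) = 0
        rows.append([0, u, -y * u, 0, w, -y * w, 0, 1, -y])
    M = np.asarray(rows, dtype=float)
    # Minimize ||M a||^2 subject to ||a|| = 1: smallest right singular vector.
    _, _, vt = np.linalg.svd(M)
    A = vt[-1].reshape(3, 3)   # rows: (a11,a12,a13), (a21,a22,a23), (a31,a32,a33)
    return A / A[2, 2]         # fix the free scale so a33 = 1

def pixel_to_machine(A, u, w):
    """Apply the transformation relation: divide by z = a13*u + a23*w + a33."""
    q = np.array([u, w, 1.0]) @ A
    return q[0] / q[2], q[1] / q[2]
```

With exact correspondences the stacked system has a one-dimensional null space containing the true matrix; with noisy measurements the same singular vector gives the minimizer of the summed squared residuals stated above.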
And converting the pixel coordinates of the target point in the camera view into target machine coordinates according to the perspective transformation matrix, so that the robot can move to the target point and the hand-eye calibration of the robot is completed. According to
[equation image BDA0001656355720000138: mapping between the movement control coordinate of the robot and the position coordinate in the camera field of view, expressed through the elements of the perspective transformation matrix and the initial machine coordinate]
the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera field of view is acquired, where x and y are respectively the abscissa and ordinate of the movement control coordinate of the robot, u and w are respectively the abscissa and ordinate of the position coordinate in the camera field of view, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate; the initial machine coordinate is the movement control coordinate at the time the camera view in which the target point is located is acquired. According to
[equation image BDA0001656355720000141: formula for the target machine coordinate in terms of the pixel coordinate of the target point, the elements of the perspective transformation matrix and the initial machine coordinate]
the target machine coordinates are obtained, where x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, and u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point.
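To illustrate the final conversion step, the sketch below maps a target pixel (u_t, w_t) through the perspective transformation matrix and then accounts for the initial machine coordinate (x_t, y_t, r_t). The patent's exact mapping formula is given only as an image, so the rotation compensation shown here (rotating the mapped point by r_t about the initial position) is an assumption made for illustration.

```python
import math

def target_machine_coordinate(A, target_pixel, initial_machine):
    """Map a target pixel (u_t, w_t) to a target machine coordinate
    (x_f, y_f). A is a 3x3 perspective transformation matrix in the
    row-vector convention Q = P·A (nested lists or array). The handling
    of the initial machine coordinate's rotation r_t is an assumption:
    the patent's formula is reproduced only as an image."""
    ut, wt = target_pixel
    xt, yt, rt = initial_machine
    # Perspective mapping of the pixel coordinate (cf. Q = P·A).
    q0 = A[0][0] * ut + A[1][0] * wt + A[2][0]
    q1 = A[0][1] * ut + A[1][1] * wt + A[2][1]
    q2 = A[0][2] * ut + A[1][2] * wt + A[2][2]
    x, y = q0 / q2, q1 / q2
    # Assumed rotation compensation about the initial machine coordinate.
    dx, dy = x - xt, y - yt
    xf = xt + dx * math.cos(rt) - dy * math.sin(rt)
    yf = yt + dx * math.sin(rt) + dy * math.cos(rt)
    return xf, yf
```

When r_t = 0 this reduces to the plain perspective mapping, which is consistent with the calibration data having been normalized to a rotation angle of 0.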
According to the robot hand-eye calibration method, with the second machine coordinates and the calibration pixel coordinates of the feature point at the 9 specified positions, an accurate, optimal perspective transformation matrix can be calculated, and the pixel coordinates of the target point can be converted into the target machine coordinates according to the mapping relation, completing the robot hand-eye calibration and improving the accuracy with which the robot reaches the position of the grabbed object. Meanwhile, from the perspective transformation matrix solved after the perspective transformation equations are calculated, the included angle between the plane where the flange plate is located and the plane where the camera view is located can be calculated, that is, the error between the focal plane of the installed camera and the plane where the robot flange plate is located. Among the elements, a13 reflects the included angle between the X axis and the U axis, and a23 reflects the included angle between the Y axis and the W axis.
It should be understood that although the steps in the flowcharts of figs. 2 to 6 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in figs. 2 to 6 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; these sub-steps or stages need not be performed sequentially, and may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In an embodiment, as shown in fig. 7, which is a schematic structural diagram of a robot hand-eye calibration system in an embodiment, a robot hand-eye calibration system is provided, in which a camera and a calibration jig are installed at the tail end of the robot and a feature point is arranged on a fixed calibration object. The system includes a coordinate acquisition module 310, a perspective transformation matrix acquisition module 320, and a coordinate transformation module 330, where:
the coordinate acquisition module 310 is configured to acquire a first machine coordinate, a second machine coordinate, and a calibration pixel coordinate, where the first machine coordinate is a movement control coordinate of the robot when the terminal of the calibration jig aligns with the feature point along a standard direction, the standard direction is a normal direction of a flange plate mounted on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera field of view, and the calibration pixel coordinate is a position coordinate of the feature point in the camera field of view when the feature point moves to the specified position in the camera field of view along with the robot;
a perspective transformation matrix obtaining module 320, configured to obtain a perspective transformation matrix according to the first machine coordinate, the second machine coordinate, and the calibration pixel coordinate;
and the coordinate transformation module 330 is configured to obtain a mapping relationship between the movement control coordinate of the robot and the position coordinate in the camera field of view according to the perspective transformation matrix, and convert the pixel coordinate of the target point into a target machine coordinate according to the mapping relationship, where the pixel coordinate of the target point includes the position coordinate of the target point in the camera, and the target machine coordinate is the movement control coordinate corresponding to the movement of the robot to the target point.
According to the robot hand-eye calibration system, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field is obtained, the pixel coordinate of the target point can be converted into the target machine coordinate according to the mapping relation, the hand-eye calibration of the robot is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
In one embodiment, the coordinate transformation module 330 is further configured to acquire initial machine coordinates and pixel coordinates of a target point, where the initial machine coordinates are movement control coordinates when acquiring a camera view in which the target point is located; and acquiring the coordinates of the target machine according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relation.
According to the robot hand-eye calibration system, the target machine coordinate is obtained according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation, the robot hand-eye calibration is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
In one embodiment, the perspective transformation matrix obtaining module 320 is further configured to obtain a third machine coordinate according to the first machine coordinate and the second machine coordinate, where the third machine coordinate is a coordinate of the second machine coordinate projected on the plane where the first machine coordinate is located; and acquiring a perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate.
According to the robot hand-eye calibration system, the third machine coordinate is obtained according to the first machine coordinate and the second machine coordinate, and the perspective transformation matrix is obtained according to the third machine coordinate and the calibration pixel coordinate, so that the calculation complexity and the calculation amount can be reduced, and the precision can be improved.
In one embodiment, the perspective transformation matrix acquisition module 320 is further configured to obtain the perspective transformation matrix according to Q_i = P_i·A, where i is the serial number of the specified position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
According to the robot hand-eye calibration system, the perspective transformation matrix is determined and obtained according to the third machine coordinate and the calibration pixel coordinate, the complexity and the calculation amount of calculation can be reduced, and the precision is improved.
In one embodiment, the perspective transformation matrix acquisition module 320 is further configured to obtain the third machine coordinates based on
[equation image BDA0001656355720000161: formula projecting the i-th second machine coordinate onto the plane with rotation angle 0]
where i is the serial number of the specified position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, vx_i, vy_i and vr_i are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate.
In the robot hand-eye calibration system, x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate, whose rotation coordinate is 0. Calculating the third machine coordinates from this first machine coordinate yields third machine coordinates that likewise have a rotation coordinate of 0, which reduces the complexity and amount of calculation and improves precision.
In one embodiment, the perspective transformation matrix obtaining module 320 is further configured to establish a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate; the perspective transformation matrix is solved according to the perspective transformation equation.
According to the robot eye calibration system, the perspective transformation matrix can be accurately solved through the perspective transformation equation established by the third machine coordinate and the calibration pixel coordinate, so that the precision of the robot reaching the position where the object is grabbed is improved subsequently.
In one embodiment, the perspective transformation matrix acquisition module 320 is further configured to determine the transformation relation between the third machine coordinates and the calibration pixel coordinates:

x_i = (a11·u_i + a21·w_i + a31) / z_i
y_i = (a12·u_i + a22·w_i + a32) / z_i

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and A = [ a11 a12 a13 ; a21 a22 a23 ; a31 a32 a33 ] is the perspective transformation matrix whose elements are a11, a12, a13, a21, a22, a23, a31, a32 and a33; and to obtain from the transformation relation the perspective transformation equations

a11·u_i + a21·w_i + a31 − x_i·z_i = 0

and

a12·u_i + a22·w_i + a32 − y_i·z_i = 0

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
According to the robot hand-eye calibration system, the transformation relation between the third machine coordinate and the calibration pixel coordinate is determined, the perspective transformation equation is obtained, the elements in the perspective transformation matrix can be accurately solved, and the perspective transformation matrix is obtained.
In one embodiment, the coordinate transformation module 330 is further configured to acquire, according to
[equation image BDA0001656355720000177: mapping between the movement control coordinate of the robot and the position coordinate in the camera field of view]
the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera field of view, where x and y are respectively the abscissa and ordinate of the movement control coordinate of the robot, u and w are respectively the abscissa and ordinate of the position coordinate in the camera field of view, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate; the initial machine coordinate is the movement control coordinate at the time the camera view in which the target point is located is acquired.
According to the robot hand-eye calibration system, after the solved perspective transformation matrix is obtained, the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field can be accurately obtained, so that the target machine coordinate can be obtained subsequently, the hand-eye calibration of the robot is completed, and the precision of the robot reaching the position where the object is grabbed is improved.
In one embodiment, the coordinate transformation module 330 is further configured to acquire the initial machine coordinate and the pixel coordinate of the target point, and to obtain the target machine coordinate according to
[equation image BDA0001656355720000181: formula for the target machine coordinate]
where x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate.
According to the robot hand-eye calibration system, the target machine coordinate is obtained according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation, the robot hand-eye calibration is completed, and the precision of the robot reaching the position where the object is grabbed can be improved.
In one embodiment, the perspective transformation matrix acquisition module 320 is further configured to obtain the perspective transformation matrix that minimizes

Σ_{i=1..9} [ (a11·u_i + a21·w_i + a31 − x_i·z_i)² + (a12·u_i + a22·w_i + a32 − y_i·z_i)² ], with z_i = a13·u_i + a23·w_i + a33,

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
According to the robot hand-eye calibration system, the perspective transformation matrix which enables the value to be minimum is obtained, the optimized perspective transformation matrix which is closest to the actual perspective transformation matrix can be obtained, the error is further reduced, the accuracy of the perspective transformation matrix is improved, and therefore the precision of the robot reaching the position where the object is grabbed can be improved.
For the specific limitations of the robot hand-eye calibration system, reference may be made to the limitations of the robot hand-eye calibration method above, which are not repeated here. All or part of the modules in the robot hand-eye calibration system may be implemented by software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor in a computer device in hardware form, or stored in a memory in the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 8. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a robot hand-eye calibration method.
Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of part of the structure related to the present solution and does not limit the computer device to which the present solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is a movement control coordinate of the robot when the tail end of the calibration jig is aligned with the feature point along a standard direction, the standard direction is a normal direction of a flange plate installed on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera visual field, and the calibration pixel coordinate is a position coordinate of the feature point in the camera visual field when the feature point moves to the specified position in the camera visual field along with the robot;
acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate;
and acquiring a mapping relation between the movement control coordinate of the robot and the position coordinate in the field of view of the camera according to the perspective transformation matrix, and converting the pixel coordinate of the target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the movement control coordinate corresponding to the movement of the robot to the target point.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring initial machine coordinates and pixel coordinates of a target point, wherein the initial machine coordinates are movement control coordinates when a camera view of the target point is acquired; and acquiring the coordinates of the target machine according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relation.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring a third machine coordinate according to the first machine coordinate and the second machine coordinate, wherein the third machine coordinate is a coordinate of the second machine coordinate projected on a plane where the first machine coordinate is located; and acquiring a perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
according to Q_i = P_i·A, obtaining the perspective transformation matrix, where i is the serial number of the specified position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
according to
[equation image BDA0001656355720000211: formula projecting the i-th second machine coordinate onto the plane with rotation angle 0]
obtaining a third machine coordinate, where i is the serial number of the specified position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, vx_i, vy_i and vr_i are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
establishing a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate; the perspective transformation matrix is solved according to the perspective transformation equation.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining the transformation relation between the third machine coordinates and the calibration pixel coordinates:

x_i = (a11·u_i + a21·w_i + a31) / z_i
y_i = (a12·u_i + a22·w_i + a32) / z_i

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and A = [ a11 a12 a13 ; a21 a22 a23 ; a31 a32 a33 ] is the perspective transformation matrix whose elements are a11, a12, a13, a21, a22, a23, a31, a32 and a33;

and obtaining from the transformation relation the perspective transformation equations

a11·u_i + a21·w_i + a31 − x_i·z_i = 0

and

a12·u_i + a22·w_i + a32 − y_i·z_i = 0

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring the initial machine coordinate and the pixel coordinate of the target point; and according to
[equation image BDA0001656355720000221: formula for the target machine coordinate]
obtaining the target machine coordinate, where x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
according to

Σ_{i=1..9} [ (a11·u_i + a21·w_i + a31 − x_i·z_i)² + (a12·u_i + a22·w_i + a32 − y_i·z_i)² ], with z_i = a13·u_i + a23·w_i + a33,

obtaining the perspective transformation matrix that minimizes this value, where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is a movement control coordinate of the robot when the tail end of the calibration jig is aligned with the feature point along a standard direction, the standard direction is a normal direction of a flange plate installed on the robot, the second machine coordinate is a movement control coordinate of the robot when the feature point is located at a specified position in a camera visual field, and the calibration pixel coordinate is a position coordinate of the feature point in the camera visual field when the feature point moves to the specified position in the camera visual field along with the robot;
acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate;
and acquiring a mapping relation between the movement control coordinate of the robot and the position coordinate in the field of view of the camera according to the perspective transformation matrix, and converting the pixel coordinate of the target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the movement control coordinate corresponding to the movement of the robot to the target point.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring initial machine coordinates and pixel coordinates of a target point, wherein the initial machine coordinates are movement control coordinates when a camera view of the target point is acquired; and acquiring the coordinates of the target machine according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relation.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring a third machine coordinate according to the first machine coordinate and the second machine coordinate, wherein the third machine coordinate is a coordinate of the second machine coordinate projected on a plane where the first machine coordinate is located; and acquiring a perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate.
In one embodiment, the computer program when executed by the processor further performs the steps of:
according to Q_i = P_i·A, obtaining the perspective transformation matrix, where i is the serial number of the specified position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
according to
[equation image BDA0001656355720000231: formula projecting the i-th second machine coordinate onto the plane with rotation angle 0]
obtaining a third machine coordinate, where i is the serial number of the specified position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, vx_i, vy_i and vr_i are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinate.
In one embodiment, the computer program when executed by the processor further performs the steps of:
establishing a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate; the perspective transformation matrix is solved according to the perspective transformation equation.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining the transformation relation between the third machine coordinates and the calibration pixel coordinates:

x_i = (a11·u_i + a21·w_i + a31) / z_i
y_i = (a12·u_i + a22·w_i + a32) / z_i

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and A = [ a11 a12 a13 ; a21 a22 a23 ; a31 a32 a33 ] is the perspective transformation matrix whose elements are a11, a12, a13, a21, a22, a23, a31, a32 and a33;

and obtaining from the transformation relation the perspective transformation equations

a11·u_i + a21·w_i + a31 − x_i·z_i = 0

and

a12·u_i + a22·w_i + a32 − y_i·z_i = 0

where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring the initial machine coordinate and the pixel coordinate of the target point; and according to
[equation image BDA0001656355720000247: formula for the target machine coordinate]
obtaining the target machine coordinate, where x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate.
In one embodiment, the computer program when executed by the processor further performs the steps of:
according to

Σ_{i=1..9} [ (a11·u_i + a21·w_i + a31 − x_i·z_i)² + (a12·u_i + a22·w_i + a32 − y_i·z_i)² ], with z_i = a13·u_i + a23·w_i + a33,

obtaining the perspective transformation matrix that minimizes this value, where i is the serial number of the specified position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any combination of them that involves no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and while their description is specific and detailed, it should not be construed as limiting the scope of the invention. A person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (13)

1. A robot hand-eye calibration method, wherein a camera and a calibration jig are installed at the end of a robot and a feature point is arranged on a fixed calibration object, the method comprising the following steps:
acquiring a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is a movement control coordinate of the robot when the tip of the calibration jig is aligned with the feature point along a standard direction, the standard direction is the normal direction of a flange plate installed on the robot, the second machine coordinate is the movement control coordinate of the robot when the feature point is located at a specified position in the camera visual field, and the calibration pixel coordinate is the position coordinate of the feature point in the camera visual field when the feature point moves with the robot to the specified position in the camera visual field;
acquiring a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate;
and acquiring a mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field according to the perspective transformation matrix, and converting the pixel coordinate of a target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera visual field, and the target machine coordinate is the movement control coordinate corresponding to the robot moving to the target point.
2. The robot hand-eye calibration method according to claim 1, wherein the step of converting the pixel coordinates of the target point into the target machine coordinates according to the mapping relationship comprises the steps of:
acquiring initial machine coordinates and pixel coordinates of the target point, wherein the initial machine coordinates are movement control coordinates when a camera view of the target point is acquired;
and acquiring the coordinates of the target machine according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relation.
3. A robot hand-eye calibration method according to claim 1, wherein the step of obtaining a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate comprises the steps of:
acquiring a third machine coordinate according to the first machine coordinate and the second machine coordinate, wherein the third machine coordinate is a coordinate of the second machine coordinate projected on a plane where the first machine coordinate is located;
and acquiring the perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate.
4. A robot hand-eye calibration method according to claim 3, wherein the step of acquiring the perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate comprises the steps of:
acquiring the perspective transformation matrix according to Qi = Pi·A, wherein i is the serial number of the designated position, Qi is the ith third machine coordinate, Pi is the ith calibration pixel coordinate, and A is the perspective transformation matrix.
5. A robot hand-eye calibration method according to claim 3, wherein the step of acquiring the third machine coordinate according to the first machine coordinate and the second machine coordinate comprises the steps of:
according to

xi = x0 + (vxi − x0)·cos(vri) − (vyi − y0)·sin(vri)
yi = y0 + (vxi − x0)·sin(vri) + (vyi − y0)·cos(vri)

acquiring the third machine coordinate, wherein i is the serial number of the designated position, Qi(xi, yi) is the ith third machine coordinate, xi and yi are respectively its abscissa and ordinate, vxi, vyi and vri are respectively the abscissa, ordinate and rotation coordinate of the ith second machine coordinate, and x0 and y0 are respectively the abscissa and ordinate of the first machine coordinate.
6. A robot hand-eye calibration method according to claim 3, wherein the step of acquiring the perspective transformation matrix according to the third machine coordinate and the calibration pixel coordinate comprises the steps of:
establishing a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate;
solving the perspective transformation matrix according to the perspective transformation equation.
7. A robot hand-eye calibration method according to claim 6, wherein the step of establishing a perspective transformation equation according to the third machine coordinate and the calibration pixel coordinate comprises the steps of:
determining

(xi·zi, yi·zi, zi) = (ui, wi, 1)·A

as the transformation relation between the third machine coordinate and the calibration pixel coordinate, wherein i is the serial number of the designated position, xi and yi are respectively the abscissa and ordinate of the ith third machine coordinate, zi satisfies zi = a13·ui + a23·wi + a33, ui and wi are respectively the abscissa and ordinate of the ith calibration pixel coordinate, and

A = | a11 a12 a13 |
    | a21 a22 a23 |
    | a31 a32 a33 |

is the perspective transformation matrix, a11, a12, a13, a21, a22, a23, a31, a32 and a33 being respectively the elements of the perspective transformation matrix;
obtaining from the transformation relation

xi = (a11·ui + a21·wi + a31)/(a13·ui + a23·wi + a33)

and

yi = (a12·ui + a22·wi + a32)/(a13·ui + a23·wi + a33)

wherein these two equations constitute the perspective transformation equation, i is the serial number of the designated position, xi and yi are respectively the abscissa and ordinate of the ith third machine coordinate, ui and wi are respectively the abscissa and ordinate of the ith calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
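As a quick sanity check (with arbitrary illustrative values, not data from the patent), the two rational equations of the perspective transformation equation are exactly the dehomogenized form of the transformation relation:

```python
import numpy as np

# Arbitrary illustrative perspective matrix and pixel coordinate.
A = np.array([[1.1, 0.2, 1e-4],
              [-0.3, 0.9, 2e-4],
              [4.0, -1.5, 1.0]])
u, w = 25.0, 60.0

# Homogeneous form: (x*z, y*z, z) = (u, w, 1) @ A.
xz, yz, z = np.array([u, w, 1.0]) @ A

# Rational form from the perspective transformation equation.
x = (A[0, 0] * u + A[1, 0] * w + A[2, 0]) / (A[0, 2] * u + A[1, 2] * w + A[2, 2])
y = (A[0, 1] * u + A[1, 1] * w + A[2, 1]) / (A[0, 2] * u + A[1, 2] * w + A[2, 2])

# Both forms agree up to floating-point round-off.
assert abs(x - xz / z) < 1e-9 and abs(y - yz / z) < 1e-9
```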
8. The robot hand-eye calibration method according to claim 7, wherein the step of acquiring the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field according to the perspective transformation matrix comprises the steps of:
according to

x = ((a11·u + a21·w + a31)/(a13·u + a23·w + a33))·cos(rt) − ((a12·u + a22·w + a32)/(a13·u + a23·w + a33))·sin(rt) + xt
y = ((a11·u + a21·w + a31)/(a13·u + a23·w + a33))·sin(rt) + ((a12·u + a22·w + a32)/(a13·u + a23·w + a33))·cos(rt) + yt

acquiring the mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field, wherein x and y are respectively the abscissa and ordinate of the movement control coordinate of the robot, u and w are respectively the abscissa and ordinate of the position coordinate in the camera visual field, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate, the initial machine coordinate being the movement control coordinate when the camera view of the target point is acquired.
9. The robot hand-eye calibration method according to claim 8, wherein the step of converting the pixel coordinates of the target point into the target machine coordinates according to the mapping relationship comprises the steps of:
acquiring the initial machine coordinate and the pixel coordinate of the target point;
according to

xf = ((a11·ut + a21·wt + a31)/(a13·ut + a23·wt + a33))·cos(rt) − ((a12·ut + a22·wt + a32)/(a13·ut + a23·wt + a33))·sin(rt) + xt
yf = ((a11·ut + a21·wt + a31)/(a13·ut + a23·wt + a33))·sin(rt) + ((a12·ut + a22·wt + a32)/(a13·ut + a23·wt + a33))·cos(rt) + yt

acquiring the target machine coordinate, wherein xf and yf are respectively the abscissa and ordinate of the target machine coordinate, ut and wt are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinate.
10. A robot hand-eye calibration method according to claim 7, further comprising, after the step of solving the perspective transformation matrix according to the perspective transformation equation, the step of:
acquiring the perspective transformation matrix that minimizes

Σi [ (xi − (a11·ui + a21·wi + a31)/(a13·ui + a23·wi + a33))² + (yi − (a12·ui + a22·wi + a32)/(a13·ui + a23·wi + a33))² ]

wherein i is the serial number of the designated position, xi and yi are respectively the abscissa and ordinate of the ith third machine coordinate, ui and wi are respectively the abscissa and ordinate of the ith calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
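The quantity minimized in this refinement step can be written as a reprojection-error function over the calibration correspondences. The following Python sketch (function and variable names are illustrative, not from the patent) evaluates it; the minimization itself would be handled by a nonlinear least-squares solver seeded with the algebraic solution.

```python
import numpy as np

def reprojection_error(A, pixel_coords, machine_coords):
    """Sum over designated positions i of the squared differences between
    the third machine coordinates (x_i, y_i) and the calibration pixel
    coordinates (u_i, w_i) mapped through the perspective matrix A."""
    err = 0.0
    for (u, w), (x, y) in zip(pixel_coords, machine_coords):
        z = A[0, 2] * u + A[1, 2] * w + A[2, 2]
        x_proj = (A[0, 0] * u + A[1, 0] * w + A[2, 0]) / z
        y_proj = (A[0, 1] * u + A[1, 1] * w + A[2, 1]) / z
        err += (x - x_proj) ** 2 + (y - y_proj) ** 2
    return err

# Exact correspondences generated by A itself give zero error.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [3.0, 4.0, 1.0]])  # pure translation by (3, 4)
pixels = [(10.0, 20.0), (30.0, 5.0)]
machines = [(13.0, 24.0), (33.0, 9.0)]
print(reprojection_error(A, pixels, machines))  # -> 0.0
```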
11. A robot hand-eye calibration system, wherein a camera and a calibration jig are installed at the end of the robot and a feature point is arranged on a fixed calibration object, the system comprising:
a coordinate acquisition module, configured to acquire a first machine coordinate, a second machine coordinate and a calibration pixel coordinate, wherein the first machine coordinate is the movement control coordinate of the robot when the tip of the calibration jig is aligned with the feature point along a standard direction, the standard direction is the normal direction of a flange plate installed on the robot, the second machine coordinate is the movement control coordinate of the robot when the feature point is located at a specified position in the camera visual field, and the calibration pixel coordinate is the position coordinate of the feature point in the camera visual field when the feature point moves with the robot to the specified position in the camera visual field;
a perspective transformation matrix acquisition module, configured to acquire a perspective transformation matrix according to the first machine coordinate, the second machine coordinate and the calibration pixel coordinate; and
a coordinate transformation module, configured to acquire a mapping relation between the movement control coordinate of the robot and the position coordinate in the camera visual field according to the perspective transformation matrix, and to convert the pixel coordinate of a target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera visual field, and the target machine coordinate is the movement control coordinate corresponding to the robot moving to the target point.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the robot hand-eye calibration method of any one of claims 1-10.
13. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the robot hand-eye calibration method of any one of claims 1 to 10.
CN201810442460.8A 2018-05-10 2018-05-10 Robot eye calibration method and system Active CN108627178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810442460.8A CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810442460.8A CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Publications (2)

Publication Number Publication Date
CN108627178A CN108627178A (en) 2018-10-09
CN108627178B true CN108627178B (en) 2020-10-13

Family

ID=63692516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810442460.8A Active CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Country Status (1)

Country Link
CN (1) CN108627178B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109702738B (en) * 2018-11-06 2021-12-07 深圳大学 Mechanical arm hand-eye calibration method and device based on three-dimensional object recognition
CN109737871B (en) * 2018-12-29 2020-11-17 南方科技大学 Calibration method for relative position of three-dimensional sensor and mechanical arm
CN110930442B (en) * 2019-11-26 2020-07-31 广东技术师范大学 Method and device for determining positions of key points in robot hand-eye calibration based on calibration block
CN112308931B (en) * 2020-11-02 2021-09-17 深圳市泰沃德技术有限公司 Camera calibration method and device, computer equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105073348B (en) * 2013-04-05 2016-11-09 Abb技术有限公司 Robot system and method for calibration
CN106767393B (en) * 2015-11-20 2020-01-03 沈阳新松机器人自动化股份有限公司 Hand-eye calibration device and method for robot
CN108122257B (en) * 2016-11-28 2021-11-30 沈阳新松机器人自动化股份有限公司 Robot hand-eye calibration method and device

Also Published As

Publication number Publication date
CN108627178A (en) 2018-10-09

Similar Documents

Publication Publication Date Title
CN108627178B (en) Robot eye calibration method and system
CN109829953B (en) Image acquisition device calibration method and device, computer equipment and storage medium
CN109129445B (en) Hand-eye calibration method, calibration plate, device, equipment and storage medium for mechanical arm
EP1607194B1 (en) Robot system comprising a plurality of robots provided with means for calibrating their relative position
CN110640747B (en) Hand-eye calibration method and system for robot, electronic equipment and storage medium
JP4191080B2 (en) Measuring device
CN112017205B (en) Automatic calibration method and system for space positions of laser radar and camera sensor
CN112241989A (en) External parameter calibration method and device, computer equipment and storage medium
US10569418B2 (en) Robot controller for executing calibration, measurement system and calibration method
CN109952176A (en) A kind of robot calibration method, system, robot and storage medium
CN110722558A (en) Origin correction method and device for robot, controller and storage medium
CN113172636B (en) Automatic hand-eye calibration method and device and storage medium
KR20200076628A (en) Location measuring method of mobile device, location measuring device and electronic device
US11577400B2 (en) Method and apparatus for managing robot system
EP4004488A1 (en) Improvements in or relating to photogrammetry
CN116469101A (en) Data labeling method, device, electronic equipment and storage medium
CN116117815A (en) Distribution network robot working tool path calibration method, controller, equipment and medium
CN113223148B (en) Automatic placement method and device of VCM framework and computer equipment
CN111699445A (en) Robot kinematics model optimization method and system and storage device
CN114186190A (en) Method, device and equipment for calculating coordinate transformation matrix and readable storage medium
CN113763400A (en) Robot vision guiding method, device, equipment and storage medium
CN114998561B (en) Category-level pose optimization method and device
CN110500953B (en) Thread R value measuring method and device, computer equipment and storage medium
CN111353932B (en) Coordinate conversion method and device, electronic equipment and storage medium
CN115781698B (en) Method, system, equipment and medium for automatically generating motion pose of layered hand-eye calibration robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20181009

Assignee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Assignor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Contract record no.: X2022980013974

Denomination of invention: Robot hand-eye calibration method and system

Granted publication date: 20201013

License type: Exclusive License

Record date: 20220902

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Robot hand-eye calibration method and system

Effective date of registration: 20220906

Granted publication date: 20201013

Pledgee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Pledgor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Registration number: Y2022980014594

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20231101

Granted publication date: 20201013

Pledgee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Pledgor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Registration number: Y2022980014594

EC01 Cancellation of recordation of patent licensing contract

Assignee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Assignor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Contract record no.: X2022980013974

Date of cancellation: 20231124