CN111975756A - Hand-eye calibration system and method of 3D vision measurement system - Google Patents
- Publication number: CN111975756A (application CN202010448062.4A)
- Authority: CN (China)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B25J9/02 — Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type; B25J9/023 — Cartesian coordinate type
- B25J13/08 — Controls for manipulators by means of sensing devices, e.g. viewing or touching devices; B25J13/087 — for sensing other physical parameters, e.g. electrical or chemical properties
- B25J9/08 — Programme-controlled manipulators characterised by modular constructions
- B25J9/16 — Programme controls; B25J9/1694 — characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion; B25J9/1697 — Vision controlled systems
Abstract
The invention relates to a hand-eye calibration system and a hand-eye calibration method for a 3D vision measurement system. The method comprises the following steps: mount a 3D vision sensor at the end of a robot and set up three non-coplanar straight lines in space that are mutually perpendicular and intersect at a single point; measure one first feature point on each of the three lines with the 3D vision sensor to obtain the coordinate positions of the three first feature points; translate the 3D vision sensor with the robot and measure one second feature point on each of the three lines to obtain the coordinate positions of the three second feature points; record the pose of the robot after the 3D vision sensor is translated; calculate the translation vector of the 3D vision sensor, and calculate the pose of the intersection point of the three lines in the second measurement coordinate system; and calibrate a robot base coordinate system and perform coordinate conversion to obtain the hand-eye matrix. Compared with the prior art, the method has simple steps, reduces workload and working time, and provides high hand-eye calibration stability.
Description
Technical Field
The invention relates to the technical field of robot vision sensing, and in particular to a hand-eye calibration system and calibration method for a 3D vision measurement system.
Background
In the prior art, an on-line hand-eye calibration and grasping-pose calculation method for the stereoscopic-vision hand-eye system of a four-degree-of-freedom 4-R(2-SS) parallel robot, as disclosed in application No. 201910446270.8, includes the following steps.
Stereoscopic Eye-to-hand model improvement with motion-error compensation: an Eye-to-hand basic model with the camera fixed outside the robot body and a stereoscopic vision model accounting for nonlinear distortion in the hand-eye system are constructed; a hand-eye model group linking each camera and the robot is then built from the pose relations between the cameras to improve the single-camera Eye-to-hand basic model, and the improved Eye-to-hand model is compensated for robot motion error.
Solving the Eye-to-hand model based on vertical-component correction: using the calibration data of multiple robot motions acquired by each camera, the vertical components of the hand-eye calibration pose parameters are corrected according to the vertical constraint between the calibration plate and the end clamping mechanism of the parallel robot, so that all poses and motion errors in the hand-eye calibration of the rotation-constrained four-degree-of-freedom 4-R(2-SS) parallel robot are solved accurately.
4-R(2-SS) parallel robot calibration-motion planning based on the non-trivial-solution constraint of the Eye-to-hand model: a non-trivial-solution constraint is established from the pose relationships between the calibration motions of the end clamping mechanism, and invalid poses are eliminated when planning the hand-eye calibration motions of the end clamping mechanism, realizing high-precision, high-efficiency on-line hand-eye calibration of the four-degree-of-freedom 4-R(2-SS) parallel robot.
Grasping-pose calculation based on stereoscopic vision and the 4-R(2-SS) parallel robot: a parallel-robot grasping model with error compensation is constructed from the robot motion errors obtained in hand-eye calibration; the optimal grasping pose of the object in the camera coordinate system is calculated from the stereoscopic vision model; the current pose of the end clamping mechanism in the parallel-robot base coordinate system is calculated from the parallel-robot kinematic equations; and, combining the grasping model with the pose of the camera base coordinate system in the parallel-robot base coordinate system obtained by the on-line hand-eye calibration, the conversion matrix between the current pose of the end clamping mechanism and the optimal grasping pose is computed, realizing grasping-pose calculation for stereoscopic vision and the end clamping mechanism of the 4-R(2-SS) parallel robot.
As can be seen from the above, the hand-eye calibration methods for robots in the prior art involve complicated steps, require a large amount of hand-eye calibration workload and time, place high demands on the professional level of technicians, and provide poor hand-eye calibration stability. These problems need to be solved.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the above-mentioned problems in the prior art. Therefore, an object of the present invention is to provide a hand-eye calibration system and method for a 3D vision measurement system with simple steps, reduced workload and working time, and high stability of hand-eye calibration.
The technical scheme of the invention for solving the above technical problems is as follows: a hand-eye calibration method of a 3D vision measurement system, comprising the following steps:
Step 1, mounting a 3D vision sensor at the end of a robot and setting up three non-coplanar straight lines in space, the three lines being mutually perpendicular and intersecting at the same point; measuring three first feature points on the three lines with the 3D vision sensor, the three first feature points lying in the same laser plane, and acquiring the coordinate positions of the three first feature points;
Step 2, measuring three second feature points on the three lines by translating the 3D vision sensor with the robot, the three second feature points lying in the same laser plane, and acquiring the coordinate positions of the three second feature points; recording the pose of the robot after the 3D vision sensor is translated;
Step 3, calculating the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the coordinate positions of the three second feature points, and calculating, by using the translation vector, the pose of the coordinate system at the intersection point of the three lines in the second measurement coordinate system {C} of the 3D vision sensor;
Step 4, calibrating a robot base coordinate system at the intersection point O of the three lines with the robot, performing coordinate conversion, and calculating the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, in step 1 the three straight lines are straight line OA, straight line OB and straight line OC, which intersect at point O; the coordinate position of the first feature point P1 on line OA is [X1 Y1 Z1]^T, that of the first feature point P2 on line OB is [X2 Y2 Z2]^T, and that of the first feature point P3 on line OC is [X3 Y3 Z3]^T.
In step 2, the coordinate position of the second feature point P4 on line OA is [X4 Y4 Z4]^T, that of the second feature point P5 on line OB is [X5 Y5 Z5]^T, and that of the second feature point P6 on line OC is [X6 Y6 Z6]^T. The pose of the robot after the 3D vision sensor is translated is the homogeneous transformation matrix T_B_T (the pose of {T} expressed in {B}), where {T} denotes the robot tool coordinate system and {B} the base coordinate system of the robot.
Further, the algorithm for calculating the translation vector of the 3D vision sensor in step 3 is as follows.
Let the translation vector of the 3D vision sensor be t = (x, y, z). Convert the coordinate positions of the three first feature points into the coordinate system of the three second feature points, i.e. the second measurement coordinate system {C} of the 3D vision sensor: the coordinate position of point P1 becomes [X1+x Y1+y Z1+z]^T, that of point P2 becomes [X2+x Y2+y Z2+z]^T, and that of point P3 becomes [X3+x Y3+y Z3+z]^T.
In this common coordinate system, point P1 and point P4 form the vector v1 = (X4-X1-x, Y4-Y1-y, Z4-Z1-z); point P2 and point P5 form v2 = (X5-X2-x, Y5-Y2-y, Z5-Z2-z); point P3 and point P6 form v3 = (X6-X3-x, Y6-Y3-y, Z6-Z3-z).
A system of equations is constructed from the included angles between line OA, line OB and line OC. Since lines OA, OB and OC are perpendicular to each other two by two, the angle conditions simplify to vanishing dot products:
v1 · v2 = 0, v2 · v3 = 0, v1 · v3 = 0.
That is, writing d1 = (X4-X1, Y4-Y1, Z4-Z1), d2 = (X5-X2, Y5-Y2, Z5-Z2) and d3 = (X6-X3, Y6-Y3, Z6-Z3), so that vi = di - t:
(d1 - t) · (d2 - t) = 0, (d2 - t) · (d3 - t) = 0, (d1 - t) · (d3 - t) = 0.
Unfolding:
d1 · d2 - (d1 + d2) · t + (x^2 + y^2 + z^2) = 0
d2 · d3 - (d2 + d3) · t + (x^2 + y^2 + z^2) = 0
d1 · d3 - (d1 + d3) · t + (x^2 + y^2 + z^2) = 0
Taking the difference of each pair of equations eliminates the quadratic term x^2 + y^2 + z^2 and leaves linear equations in x, y and z:
(d1 - d3) · t = d1 · d2 - d2 · d3
(d2 - d1) · t = d2 · d3 - d1 · d3
(d2 - d3) · t = d1 · d2 - d1 · d3
Writing these in matrix form A t = b gives t = A^+ b, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: A^+ is the generalized inverse matrix of A.
The values of x, y and z can thus be solved; the translation vector of the 3D vision sensor is recorded as t = (a, b, c).
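The derivation above can be sketched numerically. The following Python/NumPy illustration uses names of my own choosing (translation_candidates, P_first, P_second are not from the patent). One caveat worth making explicit: the differenced system A t = b has rank 2, so the generalized inverse alone yields only the minimum-norm solution; this sketch therefore substitutes the remaining null-space degree of freedom back into one of the original perpendicularity constraints, which leaves up to two candidate translations. In practice the robot-reported motion can disambiguate between them.

```python
import numpy as np

def translation_candidates(P_first, P_second):
    """Candidate sensor translations t = (x, y, z) from one point pair on
    each of three mutually perpendicular lines OA, OB, OC.
    P_first = [P1, P2, P3] (first measurement), P_second = [P4, P5, P6]."""
    d = [np.asarray(q, float) - np.asarray(p, float)
         for p, q in zip(P_first, P_second)]        # d_i = P_{i+3} - P_i

    # Differenced perpendicularity constraints: the quadratic term cancels,
    # leaving the linear system A t = b with rank(A) = 2.
    A = np.array([d[0] - d[2], d[1] - d[0], d[1] - d[2]])
    b = np.array([d[0] @ d[1] - d[1] @ d[2],
                  d[1] @ d[2] - d[0] @ d[2],
                  d[0] @ d[1] - d[0] @ d[2]])
    t0 = np.linalg.pinv(A) @ b                      # minimum-norm solution A^+ b

    # All solutions lie on the line t = t0 + s*n, where n spans the null
    # space of A; substituting into (d1 - t)·(d2 - t) = 0 gives a quadratic.
    n = np.linalg.svd(A)[2][-1]                     # unit null-space direction
    u, v = d[0] - t0, d[1] - t0
    roots = np.roots([n @ n, -(n @ u + n @ v), u @ v])
    return [t0 + s.real * n for s in roots if abs(s.imag) < 1e-9]
```

With synthetic data built from a known translation, one of the returned candidates matches it.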
Further, in step 3 the coordinate position of the intersection point O of the three straight lines is obtained from point-normal plane equations. Plane OAB contains lines OA and OB, so its normal is the direction of line OC, v3 = (X6-X3-a, Y6-Y3-b, Z6-Z3-c), and it passes through point P4 on line OA, giving the plane equation:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0;
In the same way, plane OBC has normal v1 = (X4-X1-a, Y4-Y1-b, Z4-Z1-c) and passes through point P5, and plane OAC has normal v2 = (X5-X2-a, Y5-Y2-b, Z5-Z2-c) and passes through point P6, which yields the system of equations for the coordinate values of the intersection point O:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0
(X4-X1-a)(x-X5) + (Y4-Y1-b)(y-Y5) + (Z4-Z1-c)(z-Z5) = 0
(X5-X2-a)(x-X6) + (Y5-Y2-b)(y-Y6) + (Z5-Z2-c)(z-Z6) = 0
Expanding the system and writing it in matrix form N p = c, the solution is p = N^+ c, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: N^+ is the generalized inverse matrix of N.
The values of x, y and z can thus be solved, i.e. the coordinate of the intersection point O of the three straight lines, recorded as [i j k]^T.
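The plane-intersection step can be sketched with the same illustrative naming as above (a sketch of the described technique, not the patent's implementation; t is the translation vector already recovered):

```python
import numpy as np

def intersection_point(P_first, P_second, t):
    """Coordinates of O in the second measurement frame {C}, as the
    intersection of planes OAB, OBC and OAC in point-normal form."""
    d = [np.asarray(q, float) - np.asarray(p, float)
         for p, q in zip(P_first, P_second)]
    w = [di - np.asarray(t, float) for di in d]     # directions of OA, OB, OC
    P4, P5, P6 = (np.asarray(p, float) for p in P_second)
    # Plane OAB: normal along OC (w3), through P4; plane OBC: normal w1,
    # through P5; plane OAC: normal w2, through P6.
    N = np.array([w[2], w[0], w[1]])
    c = np.array([w[2] @ P4, w[0] @ P5, w[1] @ P6])
    return np.linalg.pinv(N) @ c                    # N^+ c
```

Since the three normals are mutually perpendicular, N is full rank and the generalized inverse coincides with the ordinary inverse here.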
Further, in step 3 a Cartesian rectangular coordinate system {O} is established with the intersection point O of the three straight lines as the origin of the coordinate system; the X axis of {O} coincides with line OA, the Y axis with line OB, and the Z axis with line OC. Using the unit vectors u1 = v1/|v1|, u2 = v2/|v2| and u3 = v3/|v3| along lines OA, OB and OC, the pose T_C_O of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor is the homogeneous matrix whose first three columns are u1, u2 and u3 and whose last column is [i j k]^T:
T_C_O = [ u1 u2 u3 [i j k]^T ; 0 0 0 1 ],
where the column vector [i j k]^T represents the position of the origin of {O} in the second measurement coordinate system {C}, i.e. the previously obtained coordinate of the intersection point O of the three straight lines.
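Assembling this homogeneous pose can be sketched as follows (names are illustrative; origin is the recovered intersection coordinate [i j k]^T):

```python
import numpy as np

def pose_O_in_C(P_first, P_second, t, origin):
    """Homogeneous pose T_C_O of frame {O} in the second measurement
    frame {C}: rotation columns are unit vectors along OA, OB, OC,
    translation is the recovered intersection point."""
    d = [np.asarray(q, float) - np.asarray(p, float)
         for p, q in zip(P_first, P_second)]
    w = [di - np.asarray(t, float) for di in d]
    T = np.eye(4)
    for col, wi in enumerate(w):
        T[:3, col] = wi / np.linalg.norm(wi)        # X -> OA, Y -> OB, Z -> OC
    T[:3, 3] = origin
    return T
```

The rotation block is orthonormal by construction because the three lines are mutually perpendicular.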
Further, in step 4:
The robot calibrates a robot base coordinate system at the intersection point O of the three straight lines, obtaining the matrix T_B_O (the pose of {O} expressed in the robot base coordinate system {B}), and coordinate conversion is performed:
T_B_O = T_B_T · T_T_C · T_C_O, i.e. T_T_C = (T_B_T)^(-1) · T_B_O · (T_C_O)^(-1),
where the matrix T_T_C describes the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix to be solved; the matrix T_B_T is the pose data of the robot when it controls the 3D vision sensor to perform the second measurement; the matrix T_B_O is obtained by the robot calibrating the robot base coordinate system at the intersection point O of the three straight lines; and the matrix T_C_O is the pose of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor.
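The coordinate conversion reduces to one line of matrix algebra. A sketch under the naming convention T_X_Y = pose of frame {Y} expressed in frame {X} (the names are my own, chosen to match the description above):

```python
import numpy as np

def hand_eye_matrix(T_B_T, T_B_O, T_C_O):
    """Solve T_B_O = T_B_T @ T_T_C @ T_C_O for the hand-eye matrix
    T_T_C (pose of the sensor frame {C} in the tool frame {T})."""
    return np.linalg.inv(T_B_T) @ T_B_O @ np.linalg.inv(T_C_O)
```

The identity simply expresses the pose of {O} in {B} along the chain base -> tool -> camera -> {O} and isolates the unknown middle link.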
The beneficial effects of the invention are as follows: the method has simple steps, greatly reduces the workload and working time of hand-eye calibration for the robot vision system, places low demands on the professional level of technicians, and provides high hand-eye calibration stability.
Another technical solution of the present invention for solving the above technical problems is as follows: a hand-eye calibration system for a 3D vision measurement system, comprising:
the first coordinate acquisition module, which mounts the 3D vision sensor at the end of the robot and sets up three non-coplanar straight lines in space, mutually perpendicular and intersecting at the same point, then measures three first feature points on the three lines with the 3D vision sensor, the three first feature points lying in the same laser plane, and acquires the coordinate positions of the three first feature points;
the second coordinate acquisition module, which measures three second feature points on the three lines by translating the 3D vision sensor with the robot, the three second feature points lying in the same laser plane, acquires the coordinate positions of the three second feature points, and records the pose of the robot after the 3D vision sensor is translated;
the calculation module, which calculates the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the three second feature points, and uses the translation vector to calculate the pose of the coordinate system at the intersection point of the three straight lines in the second measurement coordinate system {C} of the 3D vision sensor;
and the coordinate conversion module, which calibrates a robot base coordinate system at the intersection point O of the three straight lines through the robot, performs coordinate conversion, and calculates the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix.
The invention has the beneficial effects that: the workload and the working time of calibrating the hands and the eyes of the robot vision system can be greatly reduced, the requirement on the professional level of technicians is low, and the stability of calibrating the hands and the eyes is high.
Drawings
FIG. 1 is a flow chart of a hand-eye calibration method of a 3D vision measurement system according to the present invention;
FIG. 2 is a schematic view of the second measurement performed by translating the 3D vision sensor according to the present invention;
FIG. 3 is a schematic diagram of a Cartesian rectangular coordinate system { O } established in accordance with the present invention;
fig. 4 is a block diagram of a hand-eye calibration system of a 3D vision measurement system according to the present invention.
In the drawings, the components represented by the respective reference numerals are listed below:
1. first coordinate acquisition module; 2. second coordinate acquisition module; 3. calculation module; 4. coordinate conversion module.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
Example 1:
As shown in fig. 1 to 3, a hand-eye calibration method of a 3D vision measurement system includes the following steps:
Step 1, mounting a 3D vision sensor at the end of a robot and setting up three non-coplanar straight lines in space, the three lines being mutually perpendicular and intersecting at the same point; measuring three first feature points on the three lines with the 3D vision sensor, the three first feature points lying in the same laser plane, and acquiring the coordinate positions of the three first feature points;
Step 2, measuring three second feature points on the three lines by translating the 3D vision sensor with the robot, the three second feature points lying in the same laser plane, and acquiring the coordinate positions of the three second feature points; recording the pose of the robot after the 3D vision sensor is translated;
Step 3, calculating the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the coordinate positions of the three second feature points, and calculating, by using the translation vector, the pose of the coordinate system at the intersection point of the three lines in the second measurement coordinate system {C} of the 3D vision sensor;
Step 4, calibrating a robot base coordinate system at the intersection point O of the three lines with the robot, performing coordinate conversion, and calculating the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix.
In the above embodiment, the three straight lines in step 1 are straight line OA, straight line OB and straight line OC, which intersect at point O; the coordinate position of the first feature point P1 on line OA is [X1 Y1 Z1]^T, that of the first feature point P2 on line OB is [X2 Y2 Z2]^T, and that of the first feature point P3 on line OC is [X3 Y3 Z3]^T.
In step 2, the coordinate position of the second feature point P4 on line OA is [X4 Y4 Z4]^T, that of the second feature point P5 on line OB is [X5 Y5 Z5]^T, and that of the second feature point P6 on line OC is [X6 Y6 Z6]^T. The pose of the robot after the 3D vision sensor is translated is the homogeneous transformation matrix T_B_T (the pose of {T} expressed in {B}), where {T} denotes the robot tool coordinate system and {B} the base coordinate system of the robot.
In the above embodiment, the algorithm for calculating the translation vector of the 3D vision sensor in step 3 is as follows.
Let the translation vector of the 3D vision sensor be t = (x, y, z). Convert the coordinate positions of the three first feature points into the coordinate system of the three second feature points, i.e. the second measurement coordinate system {C} of the 3D vision sensor: the coordinate position of point P1 becomes [X1+x Y1+y Z1+z]^T, that of point P2 becomes [X2+x Y2+y Z2+z]^T, and that of point P3 becomes [X3+x Y3+y Z3+z]^T.
In this common coordinate system, point P1 and point P4 form the vector v1 = (X4-X1-x, Y4-Y1-y, Z4-Z1-z); point P2 and point P5 form v2 = (X5-X2-x, Y5-Y2-y, Z5-Z2-z); point P3 and point P6 form v3 = (X6-X3-x, Y6-Y3-y, Z6-Z3-z).
A system of equations is constructed from the included angles between line OA, line OB and line OC. Since lines OA, OB and OC are perpendicular to each other two by two, the angle conditions simplify to vanishing dot products:
v1 · v2 = 0, v2 · v3 = 0, v1 · v3 = 0.
That is, writing d1 = (X4-X1, Y4-Y1, Z4-Z1), d2 = (X5-X2, Y5-Y2, Z5-Z2) and d3 = (X6-X3, Y6-Y3, Z6-Z3), so that vi = di - t:
(d1 - t) · (d2 - t) = 0, (d2 - t) · (d3 - t) = 0, (d1 - t) · (d3 - t) = 0.
Unfolding:
d1 · d2 - (d1 + d2) · t + (x^2 + y^2 + z^2) = 0
d2 · d3 - (d2 + d3) · t + (x^2 + y^2 + z^2) = 0
d1 · d3 - (d1 + d3) · t + (x^2 + y^2 + z^2) = 0
Taking the difference of each pair of equations eliminates the quadratic term x^2 + y^2 + z^2 and leaves linear equations in x, y and z:
(d1 - d3) · t = d1 · d2 - d2 · d3
(d2 - d1) · t = d2 · d3 - d1 · d3
(d2 - d3) · t = d1 · d2 - d1 · d3
Writing these in matrix form A t = b gives t = A^+ b, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: A^+ is the generalized inverse matrix of A.
The values of x, y and z can thus be solved; the translation vector of the 3D vision sensor is recorded as t = (a, b, c).
In the above embodiment, the coordinate position of the intersection point O of the three straight lines in step 3 is obtained from point-normal plane equations. Plane OAB contains lines OA and OB, so its normal is the direction of line OC, v3 = (X6-X3-a, Y6-Y3-b, Z6-Z3-c), and it passes through point P4 on line OA, giving the plane equation:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0;
In the same way, plane OBC has normal v1 = (X4-X1-a, Y4-Y1-b, Z4-Z1-c) and passes through point P5, and plane OAC has normal v2 = (X5-X2-a, Y5-Y2-b, Z5-Z2-c) and passes through point P6, which yields the system of equations for the coordinate values of the intersection point O:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0
(X4-X1-a)(x-X5) + (Y4-Y1-b)(y-Y5) + (Z4-Z1-c)(z-Z5) = 0
(X5-X2-a)(x-X6) + (Y5-Y2-b)(y-Y6) + (Z5-Z2-c)(z-Z6) = 0
Expanding the system and writing it in matrix form N p = c, the solution is p = N^+ c, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: N^+ is the generalized inverse matrix of N.
The values of x, y and z can thus be solved, i.e. the coordinate of the intersection point O of the three straight lines, recorded as [i j k]^T.
In the above embodiment, in step 3 a Cartesian rectangular coordinate system {O} is established with the intersection point O of the three straight lines as the origin of the coordinate system; the X axis of {O} coincides with line OA, the Y axis with line OB, and the Z axis with line OC. Using the unit vectors u1 = v1/|v1|, u2 = v2/|v2| and u3 = v3/|v3| along lines OA, OB and OC, the pose T_C_O of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor is the homogeneous matrix whose first three columns are u1, u2 and u3 and whose last column is [i j k]^T:
T_C_O = [ u1 u2 u3 [i j k]^T ; 0 0 0 1 ],
where the column vector [i j k]^T represents the position of the origin of {O} in the second measurement coordinate system {C}, i.e. the previously obtained coordinate of the intersection point O of the three straight lines.
In the above embodiment, in step 4:
The robot calibrates a robot base coordinate system at the intersection point O of the three straight lines, obtaining the matrix T_B_O (the pose of {O} expressed in the robot base coordinate system {B}), and coordinate conversion is performed:
T_B_O = T_B_T · T_T_C · T_C_O, i.e. T_T_C = (T_B_T)^(-1) · T_B_O · (T_C_O)^(-1),
where the matrix T_T_C describes the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix to be solved; the matrix T_B_T is the pose data of the robot when it controls the 3D vision sensor to perform the second measurement; the matrix T_B_O is obtained by the robot calibrating the robot base coordinate system at the intersection point O of the three straight lines; and the matrix T_C_O is the pose of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor.
The hand-eye calibration method of the 3D vision measurement system has simple steps, greatly reduces the workload and working time of hand-eye calibration for the robot vision system, places low demands on the professional level of technicians, and provides high hand-eye calibration stability.
Example 2:
as shown in fig. 2 to 4, a hand-eye calibration system of a 3D vision measurement system includes:
the first coordinate acquisition module 1, which mounts the 3D vision sensor at the end of the robot and sets up three non-coplanar straight lines in space, mutually perpendicular and intersecting at the same point, then measures three first feature points on the three lines with the 3D vision sensor, the three first feature points lying in the same laser plane, and acquires the coordinate positions of the three first feature points;
the second coordinate acquisition module 2, which measures three second feature points on the three lines by translating the 3D vision sensor with the robot, the three second feature points lying in the same laser plane, acquires the coordinate positions of the three second feature points, and records the pose of the robot after the 3D vision sensor is translated;
the calculation module 3, which calculates the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the three second feature points, and uses the translation vector to calculate the pose of the coordinate system at the intersection point of the three straight lines in the second measurement coordinate system {C} of the 3D vision sensor;
and the coordinate conversion module 4, which calibrates a robot base coordinate system at the intersection point O of the three straight lines through the robot, performs coordinate conversion, and calculates the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix.
In the above embodiment, the three straight lines used by the first coordinate acquisition module 1 are straight line OA, straight line OB and straight line OC, which intersect at point O; the coordinate position of the first feature point P1 on line OA is [X1 Y1 Z1]^T, that of the first feature point P2 on line OB is [X2 Y2 Z2]^T, and that of the first feature point P3 on line OC is [X3 Y3 Z3]^T.
For the second coordinate acquisition module 2, the coordinate position of the second feature point P4 on line OA is [X4 Y4 Z4]^T, that of the second feature point P5 on line OB is [X5 Y5 Z5]^T, and that of the second feature point P6 on line OC is [X6 Y6 Z6]^T. The pose of the robot after the 3D vision sensor is translated is the homogeneous transformation matrix T_B_T (the pose of {T} expressed in {B}), where {T} denotes the robot tool coordinate system and {B} the base coordinate system of the robot.
In the above embodiment, the algorithm used by the calculation module 3 to calculate the translation vector of the 3D vision sensor is as follows.
Let the translation vector of the 3D vision sensor be t = (x, y, z). Convert the coordinate positions of the three first feature points into the coordinate system of the three second feature points, i.e. the second measurement coordinate system {C} of the 3D vision sensor: the coordinate position of point P1 becomes [X1+x Y1+y Z1+z]^T, that of point P2 becomes [X2+x Y2+y Z2+z]^T, and that of point P3 becomes [X3+x Y3+y Z3+z]^T.
In this common coordinate system, point P1 and point P4 form the vector v1 = (X4-X1-x, Y4-Y1-y, Z4-Z1-z); point P2 and point P5 form v2 = (X5-X2-x, Y5-Y2-y, Z5-Z2-z); point P3 and point P6 form v3 = (X6-X3-x, Y6-Y3-y, Z6-Z3-z).
A system of equations is constructed from the included angles between line OA, line OB and line OC. Since lines OA, OB and OC are perpendicular to each other two by two, the angle conditions simplify to vanishing dot products:
v1 · v2 = 0, v2 · v3 = 0, v1 · v3 = 0.
That is, writing d1 = (X4-X1, Y4-Y1, Z4-Z1), d2 = (X5-X2, Y5-Y2, Z5-Z2) and d3 = (X6-X3, Y6-Y3, Z6-Z3), so that vi = di - t:
(d1 - t) · (d2 - t) = 0, (d2 - t) · (d3 - t) = 0, (d1 - t) · (d3 - t) = 0.
Unfolding:
d1 · d2 - (d1 + d2) · t + (x^2 + y^2 + z^2) = 0
d2 · d3 - (d2 + d3) · t + (x^2 + y^2 + z^2) = 0
d1 · d3 - (d1 + d3) · t + (x^2 + y^2 + z^2) = 0
Taking the difference of each pair of equations eliminates the quadratic term x^2 + y^2 + z^2 and leaves linear equations in x, y and z:
(d1 - d3) · t = d1 · d2 - d2 · d3
(d2 - d1) · t = d2 · d3 - d1 · d3
(d2 - d3) · t = d1 · d2 - d1 · d3
Writing these in matrix form A t = b gives t = A^+ b, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: A^+ is the generalized inverse matrix of A.
The values of x, y and z can thus be solved; the translation vector of the 3D vision sensor is recorded as t = (a, b, c).
In the above embodiment, the calculation module 3 obtains the coordinate position of the intersection point O of the three straight lines from point-normal plane equations. Plane OAB contains lines OA and OB, so its normal is the direction of line OC, v3 = (X6-X3-a, Y6-Y3-b, Z6-Z3-c), and it passes through point P4 on line OA, giving the plane equation:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0;
In the same way, plane OBC has normal v1 = (X4-X1-a, Y4-Y1-b, Z4-Z1-c) and passes through point P5, and plane OAC has normal v2 = (X5-X2-a, Y5-Y2-b, Z5-Z2-c) and passes through point P6, which yields the system of equations for the coordinate values of the intersection point O:
(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0
(X4-X1-a)(x-X5) + (Y4-Y1-b)(y-Y5) + (Z4-Z1-c)(z-Z5) = 0
(X5-X2-a)(x-X6) + (Y5-Y2-b)(y-Y6) + (Z5-Z2-c)(z-Z6) = 0
Expanding the system and writing it in matrix form N p = c, the solution is p = N^+ c, where the superscript + in the upper right corner of the matrix denotes the generalized inverse: N^+ is the generalized inverse matrix of N.
The values of x, y and z can thus be solved, i.e. the coordinate of the intersection point O of the three straight lines, recorded as [i j k]^T.
In the above embodiment, the calculation module 3 establishes a Cartesian rectangular coordinate system {O} with the intersection point O of the three straight lines as the origin of the coordinate system; the X axis of {O} coincides with line OA, the Y axis with line OB, and the Z axis with line OC. Using the unit vectors u1 = v1/|v1|, u2 = v2/|v2| and u3 = v3/|v3| along lines OA, OB and OC, the pose T_C_O of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor is the homogeneous matrix whose first three columns are u1, u2 and u3 and whose last column is [i j k]^T:
T_C_O = [ u1 u2 u3 [i j k]^T ; 0 0 0 1 ],
where the column vector [i j k]^T represents the position of the origin of {O} in the second measurement coordinate system {C}, i.e. the previously obtained coordinate of the intersection point O of the three straight lines.
In the above embodiment, in the coordinate conversion module 4:
The robot calibrates a robot base coordinate system at the intersection point O of the three straight lines, obtaining the matrix T_B_O (the pose of {O} expressed in the robot base coordinate system {B}), and coordinate conversion is performed:
T_B_O = T_B_T · T_T_C · T_C_O, i.e. T_T_C = (T_B_T)^(-1) · T_B_O · (T_C_O)^(-1),
where the matrix T_T_C describes the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, i.e. the hand-eye matrix to be solved; the matrix T_B_T is the pose data of the robot when it controls the 3D vision sensor to perform the second measurement; the matrix T_B_O is obtained by the robot calibrating the robot base coordinate system at the intersection point O of the three straight lines; and the matrix T_C_O is the pose of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor.
The hand-eye calibration system of the 3D vision measurement system can greatly reduce the workload and the working time of the hand-eye calibration of the robot vision system, has low requirements on the professional level of technicians, and has high stability of the hand-eye calibration.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (7)
1. A hand-eye calibration method of a 3D vision measurement system is characterized by comprising the following steps:
step 1, mounting a 3D vision sensor at the tail end of a robot, and setting three non-coplanar straight lines in space, wherein the three straight lines are mutually perpendicular and intersect at the same point; measuring three first feature points on the three straight lines by using the 3D vision sensor, wherein the three first feature points are located on the same laser plane, and acquiring the coordinate positions of the three first feature points;
step 2, translating the 3D vision sensor with the robot and measuring three second feature points on the three straight lines, wherein the three second feature points are located on the same laser plane, and acquiring the coordinate positions of the three second feature points; and recording the pose of the robot after the 3D vision sensor is translated;
step 3, calculating the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the coordinate positions of the three second feature points, and calculating the pose of the coordinate system at the intersection point of the three straight lines in the second measurement coordinate system {C} of the 3D vision sensor by using the translation vector; and
step 4, calibrating a robot base coordinate system at the intersection point O of the three straight lines through the robot, performing coordinate conversion, and calculating the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, namely the hand-eye matrix.
2. The hand-eye calibration method of the 3D vision measurement system according to claim 1, wherein the three straight lines in step 1 are a straight line OA, a straight line OB and a straight line OC, respectively, and the straight line OA, the straight line OB and the straight line OC intersect at a point O; the coordinate position of the first feature point P1 on the straight line OA is [X1 Y1 Z1]^T; the coordinate position of the first feature point P2 on the straight line OB is [X2 Y2 Z2]^T; and the coordinate position of the first feature point P3 on the straight line OC is [X3 Y3 Z3]^T;
the coordinate position of the second feature point P4 on the straight line OA in step 2 is [X4 Y4 Z4]^T; the coordinate position of the second feature point P5 on the straight line OB is [X5 Y5 Z5]^T; the coordinate position of the second feature point P6 on the straight line OC is [X6 Y6 Z6]^T; and the pose of the robot after the 3D vision sensor is translated is the homogeneous transformation matrix ^B T_T, where {T} represents the robot tool coordinate system and {B} represents the base coordinate system of the robot.
3. The hand-eye calibration method of 3D vision measurement system according to claim 2, wherein the algorithm for calculating the translation vector of the 3D vision sensor in step 3 is:
let the translation vector of the 3D vision sensor be(x, y, z); converting the coordinate positions of the three first characteristic points into a coordinate system of three second characteristic points, namely a second measurement coordinate system { C } of the 3D vision sensor; the coordinate position of point P1 is converted to [ X ]1+x Y1+y Z1+z]TThe coordinate position of point P2 is converted to [ X ]2+x Y2+y Z2+z]TThe coordinate position of point P3 is converted to [ X ]3+x Y3+y Z3+z]T;
in the second measurement coordinate system, the point P1 and the point P4 form a vector v1 = (X4-X1-x, Y4-Y1-y, Z4-Z1-z); the point P2 and the point P5 form a vector v2 = (X5-X2-x, Y5-Y2-y, Z5-Z2-z); and the point P3 and the point P6 form a vector v3 = (X6-X3-x, Y6-Y3-y, Z6-Z3-z);
constraint equations are constructed by using the included angles among the straight line OA, the straight line OB and the straight line OC:

cos θ12 = (v1·v2)/(|v1||v2|), cos θ13 = (v1·v3)/(|v1||v3|), cos θ23 = (v2·v3)/(|v2||v3|),

where θij denotes the included angle between the corresponding straight lines; since the straight lines OA, OB and OC are perpendicular to each other two by two, the above equations simplify to

v1·v2 = 0, v1·v3 = 0, v2·v3 = 0;

that is:

(X4-X1-x)(X5-X2-x) + (Y4-Y1-y)(Y5-Y2-y) + (Z4-Z1-z)(Z5-Z2-z) = 0,
(X4-X1-x)(X6-X3-x) + (Y4-Y1-y)(Y6-Y3-y) + (Z4-Z1-z)(Z6-Z3-z) = 0,
(X5-X2-x)(X6-X3-x) + (Y5-Y2-y)(Y6-Y3-y) + (Z5-Z2-z)(Z6-Z3-z) = 0;

expanding each equation yields the common quadratic term x^2 + y^2 + z^2, so subtracting the equations pairwise eliminates the quadratic terms and leaves linear equations in x, y and z, which are written in matrix form as

A [x y z]^T = b, so that [x y z]^T = A^+ b,

wherein the superscript + at the upper right corner of a matrix denotes the generalized inverse: A^+ represents the generalized inverse matrix of the matrix A;
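As a non-authoritative illustration of this claim, the pairwise-difference solve can be sketched in plain Python. All names are my assumptions; the generalized-inverse step A^+ is realized here as a minimum-norm solve of the rank-2 linear part (a 2x2 Gram system), after which one of the dropped quadratic constraints is restored along the null-space direction. Note that each perpendicularity condition constrains the translation to a sphere, so two candidate solutions generically survive:

```python
from math import sqrt

def sub(u, v): return [a - b for a, b in zip(u, v)]
def dot(u, v): return sum(a * b for a, b in zip(u, v))
def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def candidate_translations(P1, P2, P3, P4, P5, P6):
    """Translations t making v_i = a_i - t mutually orthogonal, where
    a_i joins the first and second feature point measured on line i."""
    a1, a2, a3 = sub(P4, P1), sub(P5, P2), sub(P6, P3)
    # Pairwise differences of the orthogonality equations cancel the common
    # x^2+y^2+z^2 term, leaving a rank-2 linear system with rows r1, r2.
    r1, r2 = sub(a2, a3), sub(a1, a3)
    d1, d2 = dot(a1, r1), dot(a2, r2)
    # Minimum-norm particular solution t0 = alpha*r1 + beta*r2
    # (2x2 Gram solve, playing the role of the generalized inverse A^+).
    g11, g12, g22 = dot(r1, r1), dot(r1, r2), dot(r2, r2)
    det = g11 * g22 - g12 * g12
    alpha = (d1 * g22 - g12 * d2) / det
    beta = (g11 * d2 - g12 * d1) / det
    t0 = [alpha * r1[k] + beta * r2[k] for k in range(3)]
    n = cross(r1, r2)  # null-space direction: every solution is t0 + s*n
    # Restore the dropped quadratic constraint v1.v2 = 0, which becomes
    # |n|^2 s^2 - n.(b1+b2) s + b1.b2 = 0 with b_i = a_i - t0.
    b1, b2 = sub(a1, t0), sub(a2, t0)
    A = dot(n, n)
    B = -dot(n, [b1[k] + b2[k] for k in range(3)])
    C = dot(b1, b2)
    disc = max(B * B - 4 * A * C, 0.0)
    return [[t0[k] + s * n[k] for k in range(3)]
            for s in ((-B + sqrt(disc)) / (2 * A), (-B - sqrt(disc)) / (2 * A))]
```

With exact data both candidates satisfy all three perpendicularity conditions; comparing each candidate's norm against the magnitude of the robot's commanded translation, which is frame-invariant, is one way to pick the physical solution.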
4. The hand-eye calibration method of the 3D vision measurement system according to claim 3, wherein in step 3 the coordinate position of the intersection point O of the three straight lines is obtained from the point-normal form of the plane equations; denoting the solved translation vector by (a, b, c), the plane OAB passes through the point P4 and has the vector v3 as its normal, so that its equation is:

(X6-X3-a)(x-X4) + (Y6-Y3-b)(y-Y4) + (Z6-Z3-c)(z-Z4) = 0;

the plane equations of the plane OBC (normal v1, through the point P5) and the plane OAC (normal v2, through the point P6) are obtained in the same way, giving the equation set for solving the coordinate value of the intersection point O of the three straight lines:

(X4-X1-a)(x-X5) + (Y4-Y1-b)(y-Y5) + (Z4-Z1-c)(z-Z5) = 0,
(X5-X2-a)(x-X6) + (Y5-Y2-b)(y-Y6) + (Z5-Z2-c)(z-Z6) = 0;

expanding the equation set gives a linear system in the coordinates (x, y, z) of the point O, written in matrix form as A [x y z]^T = b and solved as

[x y z]^T = A^+ b,

wherein the superscript + at the upper right corner of a matrix denotes the generalized inverse: A^+ represents the generalized inverse matrix of the matrix A; the values of x, y and z obtained by the solution are the coordinate of the intersection point O of the three straight lines, which is recorded as [i j k]^T.
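The three point-normal plane equations above form a square linear system, and because the three normals are mutually perpendicular it is always invertible. A minimal sketch in plain Python (helper and argument names are my assumptions; an explicit Cramer solve stands in for the generalized-inverse step):

```python
def intersection_point(P1, P2, P3, P4, P5, P6, t):
    """Intersect plane OAB (normal v3, through P4), plane OBC (normal v1,
    through P5) and plane OAC (normal v2, through P6), where the
    motion-compensated line directions are v_i = (second - first point) - t."""
    def sub(u, v): return [a - b for a, b in zip(u, v)]
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    v1, v2, v3 = sub(sub(P4, P1), t), sub(sub(P5, P2), t), sub(sub(P6, P3), t)
    rows = [v3, v1, v2]                            # plane normals
    rhs = [dot(v3, P4), dot(v1, P5), dot(v2, P6)]  # normal . point-on-plane
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det3(rows)
    # Cramer's rule: replace each column by the right-hand side in turn.
    O = []
    for col in range(3):
        mc = [row[:] for row in rows]
        for r in range(3):
            mc[r][col] = rhs[r]
        O.append(det3(mc) / D)
    return O  # the coordinate [i j k]^T of the intersection point O
```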
5. The hand-eye calibration method of the 3D vision measurement system according to claim 4, wherein in step 3 a Cartesian rectangular coordinate system {O} is established with the intersection point O of the three straight lines as the origin of the coordinate system; the X axis of the Cartesian rectangular coordinate system {O} coincides with the straight line OA, the Y axis coincides with the straight line OB, and the Z axis coincides with the straight line OC; and the pose ^C T_O of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor is calculated by using the vectors v1, v2 and v3 and the coordinate value of the point O, specifically:

^C T_O = [ v1/|v1|  v2/|v2|  v3/|v3|  [i j k]^T ; 0 0 0 1 ].
6. The hand-eye calibration method of the 3D vision measurement system according to claim 4, wherein in step 4:

the robot base coordinate system is calibrated at the intersection point O of the three straight lines through the robot to obtain the matrix ^B T_O, and the coordinate conversion

^B T_O = ^B T_T · ^T T_C · ^C T_O, i.e. ^T T_C = (^B T_T)^-1 · ^B T_O · (^C T_O)^-1,

is performed, wherein the matrix ^T T_C describes the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, namely the hand-eye matrix to be solved; the matrix ^B T_T is the pose data of the robot when the robot controls the 3D vision sensor to perform the second measurement; the matrix ^B T_O is obtained by calibrating a robot base coordinate system at the intersection point O of the three straight lines through the robot; and the matrix ^C T_O is the pose of the Cartesian rectangular coordinate system {O} in the second measurement coordinate system {C} of the 3D vision sensor.
7. A hand-eye calibration system for a 3D vision measurement system, comprising:
the first coordinate acquisition module (1) is used for installing the 3D vision sensor at the tail end of the robot and setting three non-coplanar straight lines in space, wherein the three straight lines are mutually perpendicular and intersect at the same point; and for measuring three first feature points on the three straight lines by using the 3D vision sensor, wherein the three first feature points are located on the same laser plane, and acquiring the coordinate positions of the three first feature points;
the second coordinate acquisition module (2) is used for measuring three second feature points on the three straight lines by translating the 3D vision sensor with the robot, wherein the three second feature points are located on the same laser plane; and for acquiring the coordinate positions of the three second feature points and recording the pose of the robot after the 3D vision sensor is translated;
the calculating module (3) is used for calculating the translation vector of the 3D vision sensor by using the coordinate positions of the three first feature points and the coordinate positions of the three second feature points, and for calculating the pose of the coordinate system at the intersection point of the three straight lines in the second measurement coordinate system {C} of the 3D vision sensor by using the translation vector; and
the coordinate conversion module (4) is used for calibrating a robot base coordinate system at the intersection point O of the three straight lines through the robot, performing coordinate conversion, and calculating the pose of the second measurement coordinate system {C} of the 3D vision sensor in the robot tool coordinate system {T}, namely the hand-eye matrix.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010448062.4A CN111975756B (en) | 2020-05-25 | 2020-05-25 | Hand-eye calibration system and method of 3D vision measurement system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111975756A true CN111975756A (en) | 2020-11-24 |
CN111975756B CN111975756B (en) | 2022-02-15 |
Family
ID=73442019
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010448062.4A Active CN111975756B (en) | 2020-05-25 | 2020-05-25 | Hand-eye calibration system and method of 3D vision measurement system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111975756B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101377404A (en) * | 2008-07-11 | 2009-03-04 | 北京航空航天大学 | Method for disambiguating space round gesture recognition ambiguity based on angle restriction |
US20090157226A1 (en) * | 2004-11-19 | 2009-06-18 | Dynalog ,Inc. | Robot-cell calibration |
CN102460065A (en) * | 2009-06-25 | 2012-05-16 | 佳能株式会社 | Information processing apparatus, information processing method, and program |
CN104006825A (en) * | 2013-02-25 | 2014-08-27 | 康耐视公司 | System and method for calibration of machine vision cameras along at least three discrete planes |
CN106940894A (en) * | 2017-04-12 | 2017-07-11 | 无锡职业技术学院 | A kind of hand-eye system self-calibrating method based on active vision |
US9821466B2 (en) * | 2014-12-21 | 2017-11-21 | X Development Llc | Devices and methods for encoder calibration |
WO2019019136A1 (en) * | 2017-07-28 | 2019-01-31 | Qualcomm Incorporated | Systems and methods for utilizing semantic information for navigation of a robotic device |
CN209820416U (en) * | 2019-04-29 | 2019-12-20 | 泉州华数机器人有限公司 | Calibration device for monocular vision system |
CN110645956A (en) * | 2019-09-24 | 2020-01-03 | 南通大学 | Multi-channel visual ranging method imitating stereoscopic vision of insect compound eye |
CN110827359A (en) * | 2019-10-29 | 2020-02-21 | 武汉大学 | Checkerboard trihedron-based camera and laser external reference checking and correcting method and device |
Non-Patent Citations (2)
Title |
---|
PINGJIANG WANG et al.: "Calibration Method of Roof Weld Coating Robot System Based", International Journal of Precision Engineering and Manufacturing * |
XINNIAN GUO et al.: "Hand-eye matrix and light plane calibration method based on active vision", Computer Engineering and Applications * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113514017A (en) * | 2021-05-06 | 2021-10-19 | 南京理工大学 | Parallel driving mechanism moving platform pose measuring method |
CN113500584A (en) * | 2021-07-15 | 2021-10-15 | 西北工业大学 | Tail end error correction system and method of three-degree-of-freedom parallel robot |
CN113500584B (en) * | 2021-07-15 | 2022-06-28 | 西北工业大学 | Tail end error correction system and method of three-degree-of-freedom parallel robot |
Also Published As
Publication number | Publication date |
---|---|
CN111975756B (en) | 2022-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102280663B1 (en) | Calibration method for robot using vision technology | |
CN110640745B (en) | Vision-based robot automatic calibration method, equipment and storage medium | |
JP4021413B2 (en) | Measuring device | |
EP3091405B1 (en) | Method, device and system for improving system accuracy of x-y motion platform | |
CN106777656B (en) | Industrial robot absolute accuracy calibration method based on PMPSD | |
CN110136204B (en) | Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera | |
CN110108208B (en) | Error compensation method of five-axis non-contact measuring machine | |
CN111531547B (en) | Robot calibration and detection method based on vision measurement | |
CN110757504B (en) | Positioning error compensation method of high-precision movable robot | |
CN113386136B (en) | Robot posture correction method and system based on standard spherical array target estimation | |
JP5618770B2 (en) | Robot calibration apparatus and calibration method | |
CN111975756B (en) | Hand-eye calibration system and method of 3D vision measurement system | |
CN114001653B (en) | Center point calibration method for robot tool | |
CN106737859B (en) | External parameter calibration method for sensor and robot based on invariant plane | |
CN112917513A (en) | TCP calibration method of three-dimensional dispensing needle head based on machine vision | |
CN112873213B (en) | Method for improving coordinate system calibration precision of six-joint robot tool | |
CN113681559B (en) | Line laser scanning robot hand-eye calibration method based on standard cylinder | |
CN113211431A (en) | Pose estimation method based on two-dimensional code correction robot system | |
CN112454366A (en) | Hand-eye calibration method | |
CN114347013A (en) | Method for assembling printed circuit board and FPC flexible cable and related equipment | |
CN115284292A (en) | Mechanical arm hand-eye calibration method and device based on laser camera | |
CN115179323A (en) | Machine end pose measuring device based on telecentric vision constraint and precision improving method | |
CN208350997U (en) | A kind of object movement monitoring system | |
CN114029982A (en) | Hand-eye calibration device and calibration method of camera outside robot arm | |
CN112975959B (en) | Machine vision-based radiator assembling and positioning method, system and medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| EE01 | Entry into force of recordation of patent licensing contract | Application publication date: 20201124; Assignee: QUANZHOU HUASHU ROBOT CO.,LTD.; Assignor: QUANZHOU-HUST INTELLIGENT MANUFACTURING FUTURE; Contract record no.: X2024350000062; Denomination of invention: A Hand Eye Calibration System and Method for 3D Vision Measurement System; Granted publication date: 20220215; License type: Common License; Record date: 20240429 |