CN112873213B - Method for improving coordinate system calibration precision of six-joint robot tool - Google Patents

Method for improving coordinate system calibration precision of six-joint robot tool

Info

Publication number
CN112873213B
CN112873213B (application CN202110231651.1A)
Authority
CN
China
Prior art keywords: coordinate, matrix, posture, tool, coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110231651.1A
Other languages
Chinese (zh)
Other versions
CN112873213A (en)
Inventor
张燚华
张东卫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Dafeng Cnc Technology Co ltd
Original Assignee
Nanjing Dafeng Cnc Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Dafeng Cnc Technology Co ltd filed Critical Nanjing Dafeng Cnc Technology Co ltd
Priority to CN202110231651.1A priority Critical patent/CN112873213B/en
Publication of CN112873213A publication Critical patent/CN112873213A/en
Application granted granted Critical
Publication of CN112873213B publication Critical patent/CN112873213B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/0081: Programme-controlled manipulators with master teach-in means
    • B25J9/16: Programme controls
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1692: Calibration of manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a method for improving the calibration precision of the tool coordinate system of a six-joint robot. Calibration alignment is first performed according to the traditional four-point calibration procedure. During alignment, the robot records the three-dimensional coordinate matrices and the posture matrices at the four points, while the 3D vision camera system of the welding seam tracking system identifies the discrete three-dimensional coordinates of the welding gun end point in the camera coordinate system at the same four points. These coordinates are then converted into three-dimensional coordinates in the robot coordinate system, and the resulting contradictory equation system is solved for its optimal solution, the calibrated tool coordinate system matrix T, which completes the calibration of the tool coordinate system. Without increasing cost, the method uses the 3D vision system already present in the welding seam tracking system to identify the position of the calibration point in the robot coordinate system accurately, so that the calibration precision of the tool coordinate system between the six-joint robot and the welding gun is greatly improved, and the welding seam tracking precision is improved accordingly.

Description

Method for improving coordinate system calibration precision of six-joint robot tool
Technical Field
The invention relates to the technical field of robot welding, in particular to a method for improving the coordinate system calibration precision of a six-joint robot tool.
Background
In welding seam tracking applications, the relationship between the end point of the six-joint robot and the end point of the welding gun, namely the tool coordinate system, needs to be calibrated; the calibration precision of the tool coordinate system directly affects the welding seam tracking precision. How to improve the calibration precision of the six-joint robot tool coordinate system is therefore of great importance.
In Document 1 (Liu et al., "A robot tool coordinate system calibration method", Shandong Science, Vol. 25, No. 1, Feb. 2012, DOI: 10.3976/j.issn.1002-4026.2012.01.015), calibration is performed by the four-point method: the end point of the robot welding gun is aligned with the tip of a calibration device in four different postures, the position at each posture is recorded, and the tool coordinate system is calculated from these records. Document 2 (Chinese patent No. 201811540756.X, "A method for improving the calibration accuracy of an industrial robot tool coordinate system") improves on the method of Document 1: the calibration points are evaluated for accuracy during the calibration process, a calibration point is reselected if it does not pass the evaluation, and an overall accuracy evaluation is given after the calibration points have been selected, thereby improving the calibration accuracy.
The conventional six-joint robot calibration method is not highly accurate. The method of Document 2 can improve the calibration accuracy to a certain extent, but it is still insufficient for applications with extremely high precision requirements such as welding seam tracking, because its evaluation criterion function has no clear theoretical basis, involves a degree of arbitrariness, and cannot eliminate the manual alignment error introduced during calibration.
Disclosure of Invention
The invention aims to provide a method for improving the coordinate system calibration precision of a six-joint robot tool, so as to solve the problems described in the background art above.
In order to achieve this object, the technical solution of the invention is as follows:
A method for improving the calibration precision of the coordinate system of a six-joint robot tool comprises the following steps:
S1, performing hand-eye calibration to obtain the coordinate transformation matrix A between the 3D vision system and the six-joint robot:

A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \\ 0 & 0 & 0 & 1 \end{bmatrix}

S2, calibrating the six-joint robot tool coordinate system according to the traditional four-point calibration procedure;
S3, recording, by the six-joint robot, the three-dimensional coordinate matrices S1, S2, S3 and S4 and the posture matrices R1, R2, R3 and R4 at the four points: the three-dimensional coordinate at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as S1; the robot is taught with the teach pendant so that the tool end point is aligned with the reference point in a changed posture, and the three-dimensional coordinate at this position is read from the coordinate monitoring interface and recorded as S2; the robot is taught again, the tool end point is aligned with the reference point in a further changed posture, and the three-dimensional coordinate at this position is recorded as S3; the robot is taught once more, the tool end point is aligned with the reference point in yet another posture, and the three-dimensional coordinate at this position is recorded as S4;
wherein:
S_i = \begin{bmatrix} x_{Si} \\ y_{Si} \\ z_{Si} \end{bmatrix}, \quad i = 1, 2, 3, 4

The posture matrix at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as R1; after the posture is changed, the posture matrix at that position is read from the coordinate monitoring interface and recorded as R2; after the posture is changed again, the posture matrix at that position is recorded as R3; after the posture is changed once more, the posture matrix at that position is recorded as R4;
wherein:
R_i = \begin{bmatrix} r_{11}^{(i)} & r_{12}^{(i)} & r_{13}^{(i)} \\ r_{21}^{(i)} & r_{22}^{(i)} & r_{23}^{(i)} \\ r_{31}^{(i)} & r_{32}^{(i)} & r_{33}^{(i)} \end{bmatrix}, \quad i = 1, 2, 3, 4

S4, acquiring, by the 3D vision system, the three-dimensional coordinate data sets P1, P2, P3 and P4 at the four points: at the initial position, the 3D vision system photographs the tool end and processes the image to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P1; after the posture is changed, the 3D vision system photographs and processes the tool end at that position to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P2; after the posture is changed again, the same procedure yields P3; after the posture is changed once more, the same procedure yields P4;
wherein:
P_i = \begin{bmatrix} x_{Pi} \\ y_{Pi} \\ z_{Pi} \\ 1 \end{bmatrix}, \quad i = 1, 2, 3, 4

S5, calculating the three-dimensional coordinate sets Q1 = A·P1, Q2 = A·P2, Q3 = A·P3 and Q4 = A·P4 at the four points in the six-joint robot coordinate system; the three-dimensional coordinates of the tool end point at the four points of steps S3 and S4, expressed in the robot coordinate system, are Q1, Q2, Q3 and Q4, where

\begin{bmatrix} Q_1 \\ 1 \end{bmatrix} = A \cdot P_1

Only the first three rows of the product are taken, namely:

Q_1 = (A \cdot P_1)_{1:3}

and likewise:

Q_2 = (A \cdot P_2)_{1:3}, \quad Q_3 = (A \cdot P_3)_{1:3}, \quad Q_4 = (A \cdot P_4)_{1:3}

S6, solving for the tool coordinate matrix T = [T_x, T_y, T_z]^T from the contradictory equation system R1·T + S1 = Q1, R2·T + S2 = Q2, R3·T + S3 = Q3 and R4·T + S4 = Q4;

namely:

R_1 \cdot T + S_1 = Q_1

which, after rearranging, gives:

R_1 \cdot T = Q_1 - S_1

and likewise:

R_2 \cdot T = Q_2 - S_2, \quad R_3 \cdot T = Q_3 - S_3, \quad R_4 \cdot T = Q_4 - S_4

Combining the above equations gives:
\begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix} T = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}

To solve the tool coordinate matrix equation system, let matrix B be:

B = \begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix}

and let matrix C be:

C = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}

The transpose B^T of matrix B is:

B^T = \begin{bmatrix} R_1^T & R_2^T & R_3^T & R_4^T \end{bmatrix}

The tool coordinate matrix T' to be solved is the optimal solution of the contradictory equation system:

T' = (B^T \cdot B)^{-1} \cdot B^T \cdot C

where (B^T·B)^{-1} denotes the inverse of the product of matrix B^T and matrix B;
Substituting the solution T' back into the original equation system gives the residual matrices E1 = R1·T' + S1 - Q1, E2 = R2·T' + S2 - Q2, E3 = R3·T' + S3 - Q3 and E4 = R4·T' + S4 - Q4, that is:

E_i = R_i \cdot T' + S_i - Q_i, \quad i = 1, 2, 3, 4

The residuals err1, err2, err3 and err4 are then taken as:

err_i = \lVert E_i \rVert, \quad i = 1, 2, 3, 4
The smaller the value of the residual error is, the higher the calibration precision is.
Compared with the prior art, the invention has the following beneficial effects: without increasing cost, the 3D vision system already present in the welding seam tracking system is used to identify the position of the calibration point in the robot coordinate system accurately, so that the calibration precision of the tool coordinate system between the six-joint robot and the welding gun is greatly improved, the welding seam tracking precision is improved accordingly, and the error caused by manual alignment during calibration is greatly reduced.
Drawings
FIG. 1 is a schematic flow chart of the present invention;
FIG. 2 is a schematic diagram of the comparison between the residual value obtained in the present invention and the residual value of the prior art;
Detailed Description
The technical solution of the present invention is further described in detail with reference to the accompanying drawings and examples.
As shown in fig. 1, a method for improving the calibration accuracy of a coordinate system of a six-joint robot tool includes the following steps:
s1, obtaining a coordinate transformation matrix equation A between the 3D vision system and the six-joint robot through hand-eye calibration:
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \\ 0 & 0 & 0 & 1 \end{bmatrix}

Hand-eye calibration is a mature technique and is not the focus of this patent; its result is used directly here and is not described in detail.
Next, the operation process is described by taking four coordinate points as an example:
The coordinate transformation matrix A from the 3D vision system to the six-joint robot obtained by hand-eye calibration in this example is: [numerical 4x4 matrix given in the original formula image]
S2, calibrating the six-joint robot tool coordinate system according to the traditional four-point calibration procedure. A calibration object with a sharp tip is prepared, the pen-like tip is taken as the reference point, and the reference point must remain fixed throughout the calibration; the robot is taught with the teach pendant so that the tool end is perpendicular to and directly facing the reference point, i.e. the robot is moved to a position right above the tip for teaching; the coordinates at the initial position are recorded, and the robot is then moved to different postures and the corresponding coordinates are recorded in turn.
S3, recording, by the six-joint robot, the three-dimensional coordinate matrices S1, S2, S3 and S4 and the posture matrices R1, R2, R3 and R4 at the four points: the three-dimensional coordinate at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as S1; the robot is taught with the teach pendant so that the tool end point is aligned with the reference point in a changed posture, and the three-dimensional coordinate at this position is read from the coordinate monitoring interface and recorded as S2; the robot is taught again, the tool end point is aligned with the reference point in a further changed posture, and the three-dimensional coordinate at this position is recorded as S3; the robot is taught once more, the tool end point is aligned with the reference point in yet another posture, and the three-dimensional coordinate at this position is recorded as S4;
wherein:
S_i = \begin{bmatrix} x_{Si} \\ y_{Si} \\ z_{Si} \end{bmatrix}, \quad i = 1, 2, 3, 4

The posture matrix at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as R1; after the posture is changed, the posture matrix at that position is read from the coordinate monitoring interface and recorded as R2; after the posture is changed again, the posture matrix at that position is recorded as R3; after the posture is changed once more, the posture matrix at that position is recorded as R4;
wherein:
R_i = \begin{bmatrix} r_{11}^{(i)} & r_{12}^{(i)} & r_{13}^{(i)} \\ r_{21}^{(i)} & r_{22}^{(i)} & r_{23}^{(i)} \\ r_{31}^{(i)} & r_{32}^{(i)} & r_{33}^{(i)} \end{bmatrix}, \quad i = 1, 2, 3, 4

S4, acquiring, by the 3D vision system, the three-dimensional coordinate data sets P1, P2, P3 and P4 at the four points: at the initial position, the 3D vision system photographs the tool end and processes the image to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P1; after the posture is changed, the 3D vision system photographs and processes the tool end at that position to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P2; after the posture is changed again, the same procedure yields P3; after the posture is changed once more, the same procedure yields P4;
wherein:
P_i = \begin{bmatrix} x_{Pi} \\ y_{Pi} \\ z_{Pi} \\ 1 \end{bmatrix}, \quad i = 1, 2, 3, 4

Combining steps S3 and S4, the data recorded at the four calibration points of this example are as follows:
Data recorded at calibration point 1 during the four-point calibration: [numerical values given in the original formula image]
Data recorded at calibration point 2 during the four-point calibration: [numerical values given in the original formula image]
Data recorded at calibration point 3 during the four-point calibration: [numerical values given in the original formula image]
Data recorded at calibration point 4 during the four-point calibration: [numerical values given in the original formula image]
S5, calculating the three-dimensional coordinate sets Q1 = A·P1, Q2 = A·P2, Q3 = A·P3 and Q4 = A·P4 at the four points in the six-joint robot coordinate system; the three-dimensional coordinates of the tool end point at the four points of steps S3 and S4, expressed in the robot coordinate system, are Q1, Q2, Q3 and Q4, where

\begin{bmatrix} Q_1 \\ 1 \end{bmatrix} = A \cdot P_1

Only the first three rows of the product are taken, namely:

Q_1 = (A \cdot P_1)_{1:3}   (1)

and likewise:

Q_2 = (A \cdot P_2)_{1:3}   (2)

Q_3 = (A \cdot P_3)_{1:3}   (3)

Q_4 = (A \cdot P_4)_{1:3}   (4)

From the data of step S4 and equations (1), (2), (3) and (4), the three-dimensional coordinates of the tool end point at the four points in the robot coordinate system are obtained: [numerical values of Q1 to Q4 given in the original formula image]
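As an illustration of the conversion in step S5, the following Python sketch (placeholder variable names and values for illustration only, not the calibration data of this embodiment, which is given in the formula images above) maps a tool end point measured in the camera frame into the robot coordinate system with the hand-eye matrix A and keeps only the first three rows of the product, i.e. Q_i = (A·P_i)_{1:3}:

import numpy as np

# Hand-eye calibration result: 4x4 homogeneous transform from the camera
# (3D vision) coordinate system to the robot coordinate system.
# Placeholder values, not the matrix A of this embodiment.
A = np.array([
    [0.0, -1.0, 0.0, 500.0],
    [1.0,  0.0, 0.0, 100.0],
    [0.0,  0.0, 1.0, 300.0],
    [0.0,  0.0, 0.0,   1.0],
])

def camera_to_robot(p_cam, A):
    # p_cam: [x, y, z] of the tool end point identified by the vision system.
    # Returns the first three rows of A * [x, y, z, 1]^T, i.e. Q_i.
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous P_i
    return (A @ p_h)[:3]

# Four tool end point measurements P1..P4 from the 3D vision system (placeholders).
P = [[10.0, 20.0, 150.0], [12.0, 18.0, 151.0],
     [9.0, 22.0, 149.0], [11.0, 21.0, 150.5]]
Q = [camera_to_robot(p, A) for p in P]  # Q1..Q4 in the robot coordinate system

The resulting Q1 to Q4, together with S1 to S4 and R1 to R4 recorded in step S3, are the inputs to step S6 below.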
s6, solving the contradictory equations to obtain T according to the formula of the tool coordinate matrix T, where R1 × T + S1 is Q1, R2 × T + S2 is Q2, R3 × T + S3 is Q3, and R4 × T + S4 is Q4;
namely:
Figure GDA0003602296150000102
finishing
Figure GDA0003602296150000103
The same can be obtained
Figure GDA0003602296150000104
Figure GDA0003602296150000105
Figure GDA0003602296150000106
Combining the above formulas to form
\begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix} T = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}

To solve the tool coordinate matrix equation system, let matrix B be:

B = \begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix}   (9)

and let matrix C be:

C = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}   (10)

Substituting the four results of step S5 into equations (9) and (10) gives the numerical matrices B and C: [numerical values given in the original formula image]

The transpose B^T of matrix B is:

B^T = \begin{bmatrix} R_1^T & R_2^T & R_3^T & R_4^T \end{bmatrix}   (11)

The tool coordinate matrix T' to be solved is the optimal solution of the contradictory equation system:

T' = (B^T \cdot B)^{-1} \cdot B^T \cdot C   (12)

where (B^T·B)^{-1} denotes the inverse of the product of matrix B^T and matrix B.
From equation (12), the tool coordinate matrix T is obtained: [numerical values of T given in the original formula image]
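For clarity, the least-squares solution of step S6 can be sketched in Python as follows (the helper name calibrate_tool and the treatment of the residual as the Euclidean norm of E_i are assumptions for illustration; the inputs are placeholders, not the numerical data of this embodiment):

import numpy as np

def calibrate_tool(R_list, S_list, Q_list):
    # R_list: four 3x3 posture matrices R1..R4 read from the teach pendant.
    # S_list: four 3-vectors S1..S4 (robot coordinates at the four calibration points).
    # Q_list: four 3-vectors Q1..Q4 (tool end point in the robot frame, from vision).
    # Returns the tool vector T' and the residuals err1..err4.
    B = np.vstack([np.asarray(R) for R in R_list])        # 12x3, stacks R1..R4
    C = np.concatenate([np.subtract(q, s) for q, s in zip(Q_list, S_list)])  # Q_i - S_i
    T = np.linalg.inv(B.T @ B) @ B.T @ C                  # T' = (B^T B)^-1 B^T C
    errs = [float(np.linalg.norm(np.asarray(R) @ T + np.asarray(s) - np.asarray(q)))
            for R, s, q in zip(R_list, S_list, Q_list)]   # err_i = ||R_i T' + S_i - Q_i||
    return T, errs

In practice numpy.linalg.lstsq(B, C) returns the same least-squares solution with better numerical conditioning than explicitly forming (B^T·B)^{-1}; the closed form above is kept only to mirror equation (12).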
the solved equation set result T ' is substituted into the original equation set to obtain the residual matrix E1 ═ R1 ═ T ' + S1-Q1, E2 ═ R2 ═ T ' + S2-Q2, E3 ═ R3 ═ T ' + S3-Q3, and E4 ═ R4 ═ T ' + S4-Q4, that is:
Figure GDA0003602296150000132
Figure GDA0003602296150000133
Figure GDA0003602296150000134
Figure GDA0003602296150000135
let the residuals err1, err2, err3 and err4 be
Figure GDA0003602296150000136
Figure GDA0003602296150000137
Figure GDA0003602296150000138
Figure GDA0003602296150000139
The smaller the value of the residual error is, the higher the calibration precision is.
From equations 17, 18, 19, and 20, the calibration accuracy evaluation can be obtained:
err1=0.007 err2=0.015 err3=0.014 err4=0.004;
by using the method of the prior art (document 2), it is possible to obtain
err1'=0.366 err2'=0.258 err3'=0.389
FIG. 2 compares the residuals obtained by the present method with those of the prior art; the abscissa represents the serial number of each equation in the equation system and the ordinate represents the residual of that equation.
In summary, the invention accurately compensates the alignment error in the calibration process by means of the 3D vision camera in the welding seam tracking system. Calibration alignment is first performed according to the traditional four-point method; during alignment the robot records the three-dimensional coordinate matrices S1, S2, S3 and S4 and the posture matrices R1, R2, R3 and R4 at the four points, while the 3D vision camera system of the welding seam tracking system identifies the discrete three-dimensional (x, y, z) coordinates P (P1, P2, P3, P4) of the welding gun end point in the camera coordinate system at the same four points. Using the camera-to-robot coordinate transformation matrix A, P is converted into the coordinate set Q (Q1, Q2, Q3, Q4) in the robot coordinate system. The calibrated tool coordinate system matrix T then satisfies R1·T + S1 = Q1, R2·T + S2 = Q2, R3·T + S3 = Q3 and R4·T + S4 = Q4, giving twelve equations in three unknowns (the three elements of T); solving this contradictory equation system for its optimal solution yields the tool coordinate matrix T and completes the calibration of the tool coordinate system.
Under the premise of not increasing cost, the method uses the 3D vision system already present in the welding seam tracking system to identify the position of the calibration point in the robot coordinate system accurately, so that the calibration precision of the tool coordinate system between the six-joint robot and the welding gun is greatly improved, the welding seam tracking precision is improved accordingly, and the error caused by manual alignment during calibration is greatly reduced.
The foregoing further describes the objects, technical solutions and advantages of the present invention in detail with reference to specific embodiments. It should be understood that the above are only specific embodiments of the present invention and are not intended to limit the scope of protection of the present invention; any modifications, equivalent substitutions, improvements and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (1)

1. A method for improving the calibration precision of a coordinate system of a six-joint robot tool is characterized by comprising the following steps:
s1, obtaining a coordinate transformation matrix equation A between the 3D vision system and the six-joint robot through hand-eye calibration:
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \\ 0 & 0 & 0 & 1 \end{bmatrix}

S2, calibrating the six-joint robot tool coordinate system according to the traditional four-point calibration procedure;
S3, recording, by the six-joint robot, the three-dimensional coordinate matrices S1, S2, S3 and S4 and the posture matrices R1, R2, R3 and R4 at the four points: the three-dimensional coordinate at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as S1; the robot is taught with the teach pendant so that the tool end point is aligned with the reference point in a changed posture, and the three-dimensional coordinate at this position is read from the coordinate monitoring interface and recorded as S2; the robot is taught again, the tool end point is aligned with the reference point in a further changed posture, and the three-dimensional coordinate at this position is recorded as S3; the robot is taught once more, the tool end point is aligned with the reference point in yet another posture, and the three-dimensional coordinate at this position is recorded as S4;
wherein:
S_i = \begin{bmatrix} x_{Si} \\ y_{Si} \\ z_{Si} \end{bmatrix}, \quad i = 1, 2, 3, 4

The posture matrix at the initial position is read from the coordinate monitoring interface of the teach pendant and recorded as R1; after the posture is changed, the posture matrix at that position is read from the coordinate monitoring interface and recorded as R2; after the posture is changed again, the posture matrix at that position is recorded as R3; after the posture is changed once more, the posture matrix at that position is recorded as R4;
wherein:
R_i = \begin{bmatrix} r_{11}^{(i)} & r_{12}^{(i)} & r_{13}^{(i)} \\ r_{21}^{(i)} & r_{22}^{(i)} & r_{23}^{(i)} \\ r_{31}^{(i)} & r_{32}^{(i)} & r_{33}^{(i)} \end{bmatrix}, \quad i = 1, 2, 3, 4

S4, acquiring, by the 3D vision system, the three-dimensional coordinate data sets P1, P2, P3 and P4 at the four points: at the initial position, the 3D vision system photographs the tool end and processes the image to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P1; after the posture is changed, the 3D vision system photographs and processes the tool end at that position to obtain the three-dimensional coordinate of the tool end point in the vision coordinate system, recorded as P2; after the posture is changed again, the same procedure yields P3; after the posture is changed once more, the same procedure yields P4;
wherein:
P_i = \begin{bmatrix} x_{Pi} \\ y_{Pi} \\ z_{Pi} \\ 1 \end{bmatrix}, \quad i = 1, 2, 3, 4

S5, calculating the three-dimensional coordinate sets Q1 = A·P1, Q2 = A·P2, Q3 = A·P3 and Q4 = A·P4 at the four points in the six-joint robot coordinate system; the three-dimensional coordinates of the tool end point at the four points of steps S3 and S4, expressed in the robot coordinate system, are Q1, Q2, Q3 and Q4, where

\begin{bmatrix} Q_1 \\ 1 \end{bmatrix} = A \cdot P_1

Only the first three rows of the product are taken, namely:

Q_1 = (A \cdot P_1)_{1:3}

and likewise:

Q_2 = (A \cdot P_2)_{1:3}, \quad Q_3 = (A \cdot P_3)_{1:3}, \quad Q_4 = (A \cdot P_4)_{1:3}

S6, solving for the tool coordinate matrix T = [T_x, T_y, T_z]^T from the contradictory equation system R1·T + S1 = Q1, R2·T + S2 = Q2, R3·T + S3 = Q3 and R4·T + S4 = Q4;

namely:

R_1 \cdot T + S_1 = Q_1

which, after rearranging, gives:

R_1 \cdot T = Q_1 - S_1

and likewise:

R_2 \cdot T = Q_2 - S_2, \quad R_3 \cdot T = Q_3 - S_3, \quad R_4 \cdot T = Q_4 - S_4

Combining the above equations gives:
\begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix} T = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}

To solve the tool coordinate matrix equation system, let matrix B be:

B = \begin{bmatrix} R_1 \\ R_2 \\ R_3 \\ R_4 \end{bmatrix}

and let matrix C be:

C = \begin{bmatrix} Q_1 - S_1 \\ Q_2 - S_2 \\ Q_3 - S_3 \\ Q_4 - S_4 \end{bmatrix}

The transpose B^T of matrix B is:

B^T = \begin{bmatrix} R_1^T & R_2^T & R_3^T & R_4^T \end{bmatrix}

The tool coordinate matrix T' to be solved is the optimal solution of the contradictory equation system:

T' = (B^T \cdot B)^{-1} \cdot B^T \cdot C

where (B^T·B)^{-1} denotes the inverse of the product of matrix B^T and matrix B;
Substituting the solution T' back into the original equation system gives the residual matrices E1 = R1·T' + S1 - Q1, E2 = R2·T' + S2 - Q2, E3 = R3·T' + S3 - Q3 and E4 = R4·T' + S4 - Q4, that is:

E_i = R_i \cdot T' + S_i - Q_i, \quad i = 1, 2, 3, 4

The residuals err1, err2, err3 and err4 are then taken as:

err_i = \lVert E_i \rVert, \quad i = 1, 2, 3, 4
The smaller the value of the residual error is, the higher the calibration precision is.
CN202110231651.1A 2021-03-02 2021-03-02 Method for improving coordinate system calibration precision of six-joint robot tool Active CN112873213B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110231651.1A CN112873213B (en) 2021-03-02 2021-03-02 Method for improving coordinate system calibration precision of six-joint robot tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110231651.1A CN112873213B (en) 2021-03-02 2021-03-02 Method for improving coordinate system calibration precision of six-joint robot tool

Publications (2)

Publication Number Publication Date
CN112873213A CN112873213A (en) 2021-06-01
CN112873213B (en) 2022-06-10

Family

ID=76055252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110231651.1A Active CN112873213B (en) 2021-03-02 2021-03-02 Method for improving coordinate system calibration precision of six-joint robot tool

Country Status (1)

Country Link
CN (1) CN112873213B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113634958A (en) * 2021-09-27 2021-11-12 西安知象光电科技有限公司 Three-dimensional vision-based automatic welding system and method for large structural part
CN114918926B (en) * 2022-07-22 2022-10-25 杭州柳叶刀机器人有限公司 Mechanical arm visual registration method and device, control terminal and storage medium
CN117226856A (en) * 2023-11-16 2023-12-15 睿尔曼智能科技(北京)有限公司 Robot self-calibration method and system based on vision

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106113035B (en) * 2016-06-16 2018-04-24 华中科技大学 A kind of Six-DOF industrial robot end-of-arm tooling coordinate system caliberating device and method
CN106502208B (en) * 2016-09-23 2018-04-27 佛山华数机器人有限公司 A kind of industrial robot TCP scaling methods
CN109465831B (en) * 2018-12-17 2021-06-01 南京埃斯顿机器人工程有限公司 Method for improving calibration precision of tool coordinate system of industrial robot
CN109976259B (en) * 2019-03-19 2022-07-08 南京工程学院 VTK-based robot free-form surface workpiece polishing offline programming method
CN111775154B (en) * 2020-07-20 2021-09-03 广东拓斯达科技股份有限公司 Robot vision system

Also Published As

Publication number Publication date
CN112873213A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
CN112873213B (en) Method for improving coordinate system calibration precision of six-joint robot tool
CN111660295B (en) Industrial robot absolute precision calibration system and calibration method
CN110276806B (en) Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
CN111531547B (en) Robot calibration and detection method based on vision measurement
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN108731591B (en) Robot tool coordinate system calibration method based on plane constraint
CN106066185B (en) A kind of line laser sensor automatic calibration device and method towards weld joint tracking
CN110136204B (en) Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera
CN113386136B (en) Robot posture correction method and system based on standard spherical array target estimation
CN109493389B (en) Camera calibration method and system based on deep learning
CN109465830B (en) Robot monocular stereoscopic vision calibration system and method
CN111941425A (en) Rapid workpiece positioning method of robot milling system based on laser tracker and binocular camera
CN109719722B (en) Method for accurately calibrating robot tail end and vision system
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN113237434B (en) Stepped calibrator-based eye-in-hand calibration method for laser profile sensor
CN112907679A (en) Robot repeated positioning precision measuring method based on vision
WO2024037174A1 (en) Robot calibration method based on pose constraint and force sensing
CN114643578A (en) Calibration device and method for improving robot vision guide precision
CN115139283B (en) Robot hand-eye calibration method based on random mark dot matrix
CN116026252A (en) Point cloud measurement method and system
CN112894814B (en) Mechanical arm DH parameter identification method based on least square method
CN114092563A (en) Photogrammetry beam method adjustment optimization method based on T-MAC
CN113916128A (en) Method for improving precision based on optical pen type vision measurement system
CN110458894B (en) Calibration method for camera and contact type measuring head of measuring machine
CN111975756A (en) Hand-eye calibration system and method of 3D vision measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Method for Improving the Calibration Accuracy of the Coordinate System of Six Joint Robot Tools

Effective date of registration: 20230927

Granted publication date: 20220610

Pledgee: Bank of Nanjing Co.,Ltd. Jiangning sub branch

Pledgor: Nanjing Dafeng CNC Technology Co.,Ltd.

Registration number: Y2023980059358