CN113386136A - Robot posture correction method and system based on standard spherical array target estimation - Google Patents


Info

Publication number: CN113386136A (application CN202110736372.0A; granted publication CN113386136B)
Authority: CN (China)
Prior art keywords: robot, coordinate system, pose, standard, transformation matrix
Legal status: Granted
Application number: CN202110736372.0A
Other languages: Chinese (zh)
Other versions: CN113386136B
Inventors: 李文龙 (Li Wenlong), 田亚明 (Tian Yaming), 王刚 (Wang Gang)
Current Assignee: Huazhong University of Science and Technology
Original Assignee: Huazhong University of Science and Technology
Events:
    • Application filed by Huazhong University of Science and Technology
    • Priority to CN202110736372.0A
    • Publication of CN113386136A
    • Application granted
    • Publication of CN113386136B
    • Legal status: Active
    • Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664: Programme controls characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention belongs to the technical field of robots and discloses a robot pose correction method and system based on standard sphere array target estimation. The method comprises the following steps. S1: establish a robot base coordinate system, a robot end coordinate system, a scanner measurement coordinate system and a standard sphere array local coordinate system. S2: the robot drives the scanner to scan the standard sphere array from multiple angles; the robot pose and the point cloud of the standard sphere array at each pose are read and recorded, and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system is calculated. S3: convert this transformation matrix into an actual six-dimensional vector, which is the actual robot pose, and calculate the error between the actual pose and the pose read in step S2 to correct the robot pose. The invention breaks through the bottleneck of existing methods and offers low cost, strong practicability, high correction efficiency and a wide application range.

Description

Robot posture correction method and system based on standard spherical array target estimation
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot posture correction method and system based on standard spherical array target estimation.
Background
Robots are flexible and highly adaptable in operation, and having industrial robots replace human workers in tasks such as palletizing, welding and assembly has become a mainstream trend in industrial robot applications. However, the low absolute positioning accuracy of industrial robots limits the accuracy of the work they can perform.
Many researchers have therefore studied accuracy compensation methods for industrial robots based on robot kinematic parameters. These methods mostly involve complex mathematical derivations such as Jacobian matrix calculation and differential motion solution; they are inefficient when only specific robot poses need to be corrected, and the measuring device they require (a laser tracker) greatly increases the cost. Accordingly, there is a need in the art for a fast robot pose correction method based on standard sphere array target estimation that is low in cost, practical and efficient in correction.
Disclosure of Invention
In view of the above defects or improvement requirements of the prior art, the invention provides a robot pose correction method and system based on standard sphere array target estimation, which can quickly correct specific robot poses, breaks through the bottleneck of existing methods, and offers low cost, strong practicability, high correction efficiency and a wide application range.
To achieve the above object, according to one aspect of the present invention, there is provided a robot pose correction method based on a standard sphere array target estimation, the method comprising the steps of:
S1, fixing the scanner to the end of the robot, placing the standard sphere array on the worktable, and establishing a robot base coordinate system {B}, a robot end coordinate system {E}, a scanner measurement coordinate system {S} and a standard sphere array local coordinate system {W};
S2, the robot drives the scanner to scan the standard sphere array from multiple angles, and the robot pose and the point cloud of the standard sphere array at each pose are read and recorded; the hand-eye matrix is calculated from the recorded poses and point clouds; the recorded point clouds are converted into the robot base coordinate system and matched against the standard sphere array design model; the standard sphere array target estimation is completed using the matching results; and the transformation matrix ^B_E T of the robot end coordinate system {E} relative to the robot base coordinate system {B} is calculated from the dimension chain transfer model;
S3, through the vector-matrix transformation relation, the transformation matrix ^B_E T of the robot end coordinate system relative to the robot base coordinate system is converted into the actual six-dimensional vector ^B ζ', which is the actual robot pose obtained by the coordinate transformation calculation; the error between this actual pose and the robot pose read in step S2 is then calculated, so that the robot pose is corrected.
Preferably, in step S2, the transformation matrix ^B_E T is obtained in the following way:
S21, establishing the standard sphere array local coordinate system {W} from the standard sphere array point cloud, and calculating the transformation matrix ^S_W T of {W} relative to the scanner measurement coordinate system {S};
S22, the hand-eye calibration is carried out on the robot, so as to obtain the hand-eye relation matrix of the robot, namely the transformation matrix between the scanning measurement coordinate system { S } relative to the robot terminal coordinate system { E }
Figure BDA0003141736380000027
S23, based on the rigidity transformation matrix and point cloud matching, the transformation matrix of the robot base coordinate system { B } relative to the standard spherical matrix local coordinate system { W } is obtained
Figure BDA0003141736380000028
S24, constructing the relation among the transformation matrices ^S_W T, ^E_S T, ^W_B T and ^B_E T, and calculating the required transformation matrix ^B_E T from this relation.
Further preferably, in step S21, the transformation matrix ^S_W T is calculated as follows:
(a) fitting the center coordinates of each standard sphere from the standard sphere array point cloud scanned at each pose;
(b) selecting the center of one standard sphere as the origin to establish the standard sphere array local coordinate system; the three-dimensional coordinates of the origin and the directions of the coordinate axes form the transformation matrix ^S_W T.
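The two sub-steps above (fit each sphere center, then build a frame from the fitted centers) can be sketched with NumPy. This is an illustrative sketch, not code from the patent; the function names `fit_sphere` and `sphere_array_frame` and the algebraic least-squares formulation are my own choices.

```python
import numpy as np

def fit_sphere(points):
    # ||p - c||^2 = r^2 rearranges to the linear system
    # 2 p.c + (r^2 - ||c||^2) = ||p||^2 in the unknowns (c, k).
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, k = x[:3], x[3]
    return center, np.sqrt(k + center @ center)

def sphere_array_frame(c0, c1, c2):
    # Origin at the first center; x-axis toward the second center;
    # z-axis normal to the plane of the three centers; y by right-hand rule.
    x = (c1 - c0) / np.linalg.norm(c1 - c0)
    z = np.cross(c1 - c0, c2 - c0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, :3] = np.column_stack([x, y, z])  # axis directions as columns
    T[:3, 3] = c0
    return T
```

Since (c1 - c0) lies in the plane of the three centers, x is automatically perpendicular to z and the resulting rotation block is orthonormal.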
Further preferably, in step S22, the transformation matrix ^E_S T is obtained by the following steps:
(a) constructing the relation between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating the transformation matrix ^B_E T of the robot end coordinate system {E} relative to the robot base coordinate system {B} from the read robot pose;
(b) constructing the relation among the transformation matrices ^B_E T, ^S_W T and ^E_S T, and calculating the transformation matrix ^E_S T from it.
Further preferably, in step (a), the transformation matrix ^B_E T_i is calculated according to the following relation:
^B_E T_i = [ R(z, Ez_i) R(y, Ey_i) R(x, Ex_i)   p_i ; 0   1 ]
wherein ^B_E T_i represents the transformation matrix of {E} relative to {B} at the i-th measurement, R(z, Ez_i) represents the rotation matrix of a rotation by Ez_i about the z-axis (and likewise R(y, Ey_i) and R(x, Ex_i)), and p_i is a three-dimensional vector representing the position coordinates of the origin of {E} in {B} at the i-th measurement.
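The relation above maps a six-dimensional pose to a homogeneous transformation matrix. A minimal NumPy sketch follows; the names `rot` and `pose_to_matrix` are mine, angles are assumed in radians, and the rotation is composed z-y-x as in the relation.

```python
import numpy as np

def rot(axis, angle):
    # Elementary rotation matrix R(axis, angle), angle in radians.
    c, s = np.cos(angle), np.sin(angle)
    if axis == 'x':
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == 'y':
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])  # axis == 'z'

def pose_to_matrix(zeta):
    # zeta = [x, y, z, Ex, Ey, Ez]; rotation composed as
    # R(z, Ez) @ R(y, Ey) @ R(x, Ex), translation p = [x, y, z].
    x, y, z, ex, ey, ez = zeta
    T = np.eye(4)
    T[:3, :3] = rot('z', ez) @ rot('y', ey) @ rot('x', ex)
    T[:3, 3] = [x, y, z]
    return T
```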
Further preferably, in step S23, the transformation matrix ^W_B T is obtained in the following way:
(a) converting each point cloud ^S P_i in the scanner measurement coordinate system into the robot base coordinate system through rigid-body transformation matrix products, giving the standard sphere array point clouds {^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n};
(b) taking each standard sphere array point cloud ^B P_i in the robot base coordinate system as the test model and the three-dimensional solid model of the standard sphere array as the reference model, matching them with the ADF algorithm, and obtaining the set of transformation matrices ^W_B T_i of the robot base coordinate system relative to the standard sphere array local coordinate system;
(c) converting the transformation matrices ^W_B T_i into the six-dimensional vector set {^W_B ζ_i}, solving the mean ^W_B ζ̄ of this vector set, and solving the transformation matrix ^W_B T from ^W_B ζ̄ through the vector-matrix transformation.
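Sub-step (c) converts each matched matrix into a six-vector and averages. A hedged NumPy sketch of those two parts is below; the Euler extraction assumes the z-y-x convention of the earlier relation and the non-degenerate case Ey away from plus or minus 90 degrees, and the function names are mine.

```python
import numpy as np

def matrix_to_pose(T):
    # Inverse of the z-y-x composition R = Rz(Ez) @ Ry(Ey) @ Rx(Ex),
    # valid away from the Ey = +/-90 deg singularity.
    R, p = T[:3, :3], T[:3, 3]
    ey = np.arcsin(-R[2, 0])
    ex = np.arctan2(R[2, 1], R[2, 2])
    ez = np.arctan2(R[1, 0], R[0, 0])
    return np.array([p[0], p[1], p[2], ex, ey, ez])

def mean_pose(T_list):
    # Average the six-vector forms of nearly identical rigid transforms,
    # as produced by repeated matches of the same static scene.
    return np.mean([matrix_to_pose(T) for T in T_list], axis=0)
```

Averaging Euler angles componentwise is reasonable here only because the matched matrices describe the same static scene and are therefore close together.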
Further preferably, in step (a), each point cloud ^S P_i in the scanner measurement coordinate system is converted into the robot base coordinate system according to the following relation:
^B P_i = ^B_E T_i · ^E_S T · ^S P_i
wherein ^B P_i is the standard sphere array point cloud in the robot base coordinate system obtained at the i-th scan, ^S P_i is the standard sphere array point cloud in the scanner measurement coordinate system obtained at the i-th scan, and ^B_E T_i is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan.
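The relation above applies two rigid-body transforms to every point of the cloud. A small NumPy sketch under the same notation; the (N, 3) row-vector layout and the name `to_base_frame` are my choices.

```python
import numpy as np

def to_base_frame(P_s, T_be_i, T_es):
    # ^B P_i = ^B_E T_i . ^E_S T . ^S P_i for an (N, 3) cloud P_s:
    # compose the two matrices once, then apply R and t to all points.
    T = T_be_i @ T_es
    R, t = T[:3, :3], T[:3, 3]
    return P_s @ R.T + t
```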
Further preferably, in step S24, the relation among the transformation matrices ^B_E T_i, ^S_W T_i, ^E_S T and ^W_B T is the following:
^B_E T_i = (^W_B T)^-1 · (^S_W T_i)^-1 · (^E_S T)^-1, i = 1, 2, ..., n
where i is the index of the scan measurement and n represents the total number of scan measurements.
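The relation can be evaluated directly as a product of matrix inverses. A sketch, with the function name mine; ^W_B T here means {B} relative to {W}, ^S_W T_i means {W} relative to {S}, and ^E_S T means {S} relative to {E}, as elsewhere in the document.

```python
import numpy as np

def end_in_base(T_wb, T_sw_i, T_es):
    # ^B_E T_i = (^W_B T)^-1 . (^S_W T_i)^-1 . (^E_S T)^-1
    return (np.linalg.inv(T_wb)
            @ np.linalg.inv(T_sw_i)
            @ np.linalg.inv(T_es))
```

Equivalently, this is the inverse of ^E_S T · ^S_W T_i · ^W_B T, i.e. the dimension chain {E} to {S} to {W} to {B}, inverted.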
Further preferably, the error is calculated according to the following relation:
^B Δζ_i = ^B ζ'_i − ^B ζ_i, i = 1, 2, ..., n
wherein ^B Δζ_i is the robot pose error in the base coordinate system for the i-th scan, ^B ζ'_i is the actual robot pose in the robot base coordinate system for the i-th scan, ^B ζ_i is the robot pose read from the robot controller for the i-th scan, i is the index of the scan, and n is the total number of scans.
According to another aspect of the present invention, there is provided a system for carrying out the above robot pose correction method, the system comprising a robot, a scanner and a standard sphere array, wherein the scanner is attached to the end of the robot, the standard sphere array comprises a plurality of standard spheres of different sizes in a nonlinear arrangement, and the standard sphere array is placed within the scanning range of the scanner.
Generally, compared with the prior art, the technical scheme of the invention has the following beneficial effects:
1. The method can correct a specific robot pose simply by adding the pose to be corrected to the hand-eye calibration procedure, and is easy to implement. Several specific robot poses can be corrected quickly using only forward/inverse matrix products and matrix-vector transformations: a new robot pose is determined from the dimension chain transfer model and the pose is corrected rapidly. The method breaks through the bottleneck of existing approaches and offers low cost, strong practicability, high correction efficiency and a wide application range;
2. The invention corrects the robot pose using a scanner and standard calibration spheres, which greatly reduces cost compared with the expensive laser tracker and target ball of the prior art; meanwhile, scanning the standard sphere array from multiple angles is quick to carry out, taking less time than the grid-type planning of robot spatial poses in the prior art;
3. The robot hand-eye calibration is completed with a standard sphere array comprising at least three standard spheres of different diameters, so the rigid-body transformation matrices are easy to solve. In addition, point cloud matching uses the ADF algorithm, and the standard sphere array target estimation is completed from several matching transformation matrices. The ADF matching algorithm combines point-to-point and point-to-plane distance functions; compared with point-to-point ICP matching it is less likely to fall into local optima, is computationally efficient and has a wide application range.
Drawings
Fig. 1 is a schematic flow chart of a robot pose rapid correction method based on standard spherical array target estimation according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a system for finishing rapid robot pose correction based on standard ball array target estimation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a standard spherical array design model according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The invention provides a robot pose rapid correction method based on standard spherical array target estimation, which is suitable for rapid correction of robot poses.
As shown in fig. 2, in the system for completing rapid robot pose correction based on standard sphere array target estimation, the robot is a six-degree-of-freedom industrial robot and the scanner is a grating binocular area-array scanner. As shown in fig. 3, in this embodiment the standard sphere array comprises three standard spheres (standard matte ceramic spheres) of different sizes in a nonlinear arrangement. Here {B} denotes the robot base coordinate system, {E} the robot end-flange coordinate system, {S} the scanner measurement coordinate system and {W} the standard sphere array local coordinate system (or workpiece coordinate system); ^B_E T denotes the transformation matrix of the robot end-flange coordinate system relative to the robot base coordinate system, ^E_S T the transformation matrix of the scanner measurement coordinate system relative to the robot end-flange coordinate system, ^S_W T the transformation matrix of the standard sphere array local coordinate system relative to the scanner measurement coordinate system, and ^W_B T the transformation matrix of the robot base coordinate system relative to the standard sphere array local coordinate system. Among these, ^E_S T and ^W_B T are constant, while ^S_W T depends on the scan: different scans correspond to different transformation matrices, which are therefore written ^S_W T_i.
As shown in fig. 1, a method for quickly correcting the pose of a robot based on standard spherical array target estimation includes the following steps:
Step one, the robot drives the scanner to scan and measure the standard sphere array from multiple angles, and the robot hand-eye calibration is completed.
Specifically, the robot is first controlled to drive the scanner to scan and measure the standard sphere array from multiple angles, and the acquired standard sphere array point clouds are recorded as {^S P_1, ^S P_2, ^S P_3, ..., ^S P_i, ..., ^S P_n}, i = 1, 2, ..., n, where n represents the number of scan measurements. Each point cloud ^S P_i in the scanner measurement coordinate system consists of N points {^S p_1, ^S p_2, ^S p_3, ..., ^S p_m, ..., ^S p_N}, m = 1, 2, ..., N, where ^S p_m = [x_m y_m z_m]^T denotes the coordinates of the m-th point. At the same time, the robot poses are recorded in sequence as ^B ζ_i = [p_i^T ψ_i^T]^T, a six-dimensional vector representing the position and orientation of the robot at the i-th measurement, where p_i = [x_i y_i z_i]^T is the robot position and ψ_i = [Ex_i Ey_i Ez_i]^T is the robot orientation. The scanning process must include at least three nonlinearly related robot poses, and the poses must be non-singular.
Secondly, the standard sphere array local coordinate system is established from the standard sphere array point cloud: the sphere centers and the plane containing them are fitted from the measured points; the center of one sphere is selected as the origin, the direction toward the center of another sphere defines the x-axis, the normal vector of the plane of the centers is taken as the z-axis, and the y-axis direction follows from the right-hand rule. Taking the origin of the local coordinate system as the position vector and the x-, y- and z-axis directions as the orientation, the transformation matrices ^S_W T_i of the standard sphere array local coordinate system relative to the scanner measurement coordinate system are calculated.
At the same time, through the vector-matrix conversion, the transformation matrix ^B_E T_i of the robot end coordinate system relative to the robot base coordinate system is solved from the six-dimensional vector ^B ζ_i (obtained by directly reading the robot pose ζ_i), i.e.
^B_E T_i = [ R(z, Ez_i) R(y, Ey_i) R(x, Ex_i)   p_i ; 0   1 ]   (1)
where R(·, E·_i) denotes the rotation matrix of the angle E·_i about the corresponding axis. In this way the transformation matrices ^B_E T_i of the robot end coordinate system relative to the robot base coordinate system are obtained for all measurements.
Finally, the hand-eye matrix X = ^E_S T is solved based on the AX = XB model, where A_j and B_j (j = 1, 2, ..., n−1) are built from consecutive measurement poses, and the robot hand-eye calibration is completed.
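The AX = XB construction can be checked numerically: for the true hand-eye matrix X = ^E_S T, the relative end-effector motion A_j and relative scanner motion B_j between consecutive measurements satisfy A_j X = X B_j. The sketch below only verifies a candidate X against recorded pose pairs; it is not the patent's solver, and the specific forms of A_j and B_j are my reconstruction from the constancy of the base-to-array transform.

```python
import numpy as np

def hand_eye_residual(T_be_list, T_sw_list, X):
    # If T_be_j @ X @ T_sw_j is the same (constant) matrix for every j,
    # then A_j X = X B_j with
    #   A_j = inv(T_be[j+1]) @ T_be[j],  B_j = T_sw[j+1] @ inv(T_sw[j]).
    worst = 0.0
    for j in range(len(T_be_list) - 1):
        A = np.linalg.inv(T_be_list[j + 1]) @ T_be_list[j]
        B = T_sw_list[j + 1] @ np.linalg.inv(T_sw_list[j])
        worst = max(worst, float(np.abs(A @ X - X @ B).max()))
    return worst
```

A small residual over all consecutive pairs indicates that the candidate X is consistent with the recorded motions; a large residual flags a bad calibration.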
Step two, several groups of standard sphere array point clouds in the robot base coordinate system are acquired from the rigid-body transformation matrices between the coordinate systems.
Specifically, each point cloud ^S P_i in the scanner measurement coordinate system is transferred into the robot base coordinate system through the rigid-body transformation matrix product, i.e.
^B P_i = ^B_E T_i · ^E_S T · ^S P_i   (2)
where ^B P_i represents the standard sphere array point cloud in the robot base coordinate system obtained at the i-th scan. By formula (2), the n groups of standard sphere array point clouds {^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n} in the robot base coordinate system are obtained.
Step three, the set of transformation matrices of the robot base coordinate system relative to the standard sphere array local coordinate system is obtained through point cloud matching.
Specifically, each standard sphere array point cloud ^B P_i in the robot base coordinate system is taken as the test model and the three-dimensional solid model of the standard sphere array as the reference model; matching with the ADF algorithm yields, in sequence, the set of transformation matrices of the robot base coordinate system relative to the standard sphere array local coordinate system, written {^W_B T_1, ^W_B T_2, ..., ^W_B T_n}.
Step four, the transformation matrix of the robot base coordinate system relative to the standard sphere array local coordinate system is determined from the matching results, completing the target estimation.
Specifically, the set of transformation matrices {^W_B T_i} of the robot base coordinate system relative to the standard sphere array local coordinate system is first converted into the six-dimensional vector set {^W_B ζ_i}, namely:
^W_B ζ_i = [x_i y_i z_i Ex_i Ey_i Ez_i]^T, i = 1, 2, ..., n   (3)
Secondly, the mean ^W_B ζ̄ of the six-dimensional vector set {^W_B ζ_i} is solved, i.e.
^W_B ζ̄ = (1/n) Σ_{i=1..n} ^W_B ζ_i   (4)
Finally, through the vector-matrix transformation, the transformation matrix ^W_B T, i.e. the transformation matrix of the robot base coordinate system relative to the standard sphere array local coordinate system, is solved from the six-dimensional vector ^W_B ζ̄, and the standard sphere array target estimation is completed.
Step five, the transformation matrices of the robot end-flange coordinate system relative to the robot base coordinate system are determined from the dimension chain transfer model.
Specifically, based on the relative pose relationships among the robot base coordinate system, the standard sphere array local coordinate system, the scanner measurement coordinate system and the robot end-flange coordinate system, the transformation matrices of the robot end-flange coordinate system relative to the robot base coordinate system are solved in sequence as
^B_E T_i = (^W_B T)^-1 · (^S_W T_i)^-1 · (^E_S T)^-1, i = 1, 2, ..., n   (5)
The orientation submatrix of each transformation matrix is an orthogonal matrix.
Step six, the new robot poses are obtained through the matrix-vector transformation, completing the rapid correction of the robot poses.
Specifically, each transformation matrix ^B_E T_i is converted into the six-dimensional vector ^B ζ'_i, so that the n new robot poses are obtained. The pose deviations of the n robot poses are then calculated to perform the robot pose correction, i.e.
^B Δζ_i = ^B ζ'_i − ^B ζ_i, i = 1, 2, ..., n   (6)
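Relation (6) is a componentwise difference of six-vectors. A small sketch follows; the name `pose_errors` is mine, and wrapping the three angle components into [−π, π) is an extra safeguard not spelled out in the patent.

```python
import numpy as np

def pose_errors(zeta_actual, zeta_read):
    # ^B dzeta_i = ^B zeta'_i - ^B zeta_i; rows are scans,
    # columns are [x, y, z, Ex, Ey, Ez]; angles wrapped to [-pi, pi).
    d = np.asarray(zeta_actual, dtype=float) - np.asarray(zeta_read, dtype=float)
    d[:, 3:] = (d[:, 3:] + np.pi) % (2 * np.pi) - np.pi
    return d
```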
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A robot posture correction method based on standard spherical array target estimation is characterized by comprising the following steps:
S1, fixing the scanner to the end of the robot, placing the standard sphere array on the worktable, and establishing a robot base coordinate system {B}, a robot end coordinate system {E}, a scanner measurement coordinate system {S} and a standard sphere array local coordinate system {W};
S2, the robot drives the scanner to scan the standard sphere array from multiple angles, and the robot pose and the point cloud of the standard sphere array at each pose are read and recorded; the transformation matrix ^B_E T of the robot end coordinate system {E} relative to the robot base coordinate system {B} is calculated from the dimension chain transfer model;
S3, through the vector-matrix transformation relation, the transformation matrix ^B_E T of the robot end coordinate system relative to the robot base coordinate system is converted into the actual six-dimensional vector ^B ζ', which is the actual robot pose obtained by the coordinate transformation calculation; the error between this actual pose and the robot pose read in step S2 is then calculated, so that the robot pose is corrected.
2. The method for correcting robot pose based on standard sphere array target estimation according to claim 1, wherein in step S2 the transformation matrix ^B_E T is obtained in the following way:
S21, establishing the standard sphere array local coordinate system {W} from the standard sphere array point cloud, and calculating the transformation matrix ^S_W T of {W} relative to the scanner measurement coordinate system {S};
S22, the hand-eye calibration is carried out on the robot, so as to obtain the hand-eye relation matrix of the robot, namely the transformation matrix between the scanning measurement coordinate system { S } relative to the robot terminal coordinate system { E }
Figure FDA0003141736370000017
S23, based on the rigidity transformation matrix and point cloud matching, the transformation matrix of the robot base coordinate system { B } relative to the standard spherical matrix local coordinate system { W } is obtained
Figure FDA0003141736370000018
S24 construction of transformation matrix
Figure FDA0003141736370000019
And
Figure FDA00031417363700000110
and transformation matrix
Figure FDA00031417363700000111
The required transformation matrix is obtained by utilizing the relation formula to calculate
Figure FDA00031417363700000112
3. The method for correcting robot pose based on standard sphere array target estimation according to claim 2, wherein in step S21 the transformation matrix ^S_W T is calculated as follows:
(a) fitting the center coordinates of each standard sphere from the standard sphere array point cloud scanned at each pose;
(b) selecting the center of one standard sphere as the origin to establish the standard sphere array local coordinate system; the three-dimensional coordinates of the origin and the directions of the coordinate axes form the transformation matrix ^S_W T.
4. The method for correcting robot pose based on standard sphere array target estimation according to claim 2, wherein in step S22 the transformation matrix ^E_S T is obtained by the following steps:
(a) constructing the relational expression between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating the transformation matrix ^B_E T of the robot end coordinate system {E} relative to the robot base coordinate system {B} from the read robot pose;
(b) constructing the relation among the transformation matrices ^B_E T, ^S_W T and ^E_S T, and calculating the transformation matrix ^E_S T from it.
5. The method for correcting robot pose based on standard sphere array target estimation according to claim 4, wherein in step (a) the transformation matrix ^B_E T_i is calculated according to the following relation:
^B_E T_i = [ R(z, Ez_i) R(y, Ey_i) R(x, Ex_i)   p_i ; 0   1 ]
wherein ^B_E T_i represents the transformation matrix of {E} relative to {B} at the i-th measurement, R(z, Ez_i) represents the rotation matrix of a rotation by Ez_i about the z-axis (and likewise R(y, Ey_i) and R(x, Ex_i)), and p_i is a three-dimensional vector representing the position coordinates of the origin of {E} in {B} at the i-th measurement.
6. The method for correcting robot pose based on standard sphere array target estimation of claim 2, wherein in step S23, the transformation matrix ${}^{W}_{B}\bar{T}$ is obtained in the following way:
(a) converting each point cloud ${}^{S}P_i$ measured in the scanner coordinate system into the robot base coordinate system through rigid-body transformation matrix products, so as to obtain the standard sphere array point clouds $\{{}^{B}P_1, {}^{B}P_2, {}^{B}P_3, \dots, {}^{B}P_i, \dots, {}^{B}P_n\}$;
(b) taking the three-dimensional solid model of the standard sphere array as the reference model, matching each standard sphere array point cloud ${}^{B}P_i$ under the robot base coordinate system against it with the ADF algorithm, and obtaining the transformation matrix ${}^{W}_{B}T_i$ of the robot base coordinate system relative to the local coordinate system {W} of the standard sphere array;
(c) converting the transformation matrices ${}^{W}_{B}T_i$ into the six-dimensional vector set $\{\eta_1, \eta_2, \dots, \eta_n\}$, solving the mean $\bar{\eta}$ of the six-dimensional vector set, and solving the transformation matrix ${}^{W}_{B}\bar{T}$ from the six-dimensional vector $\bar{\eta}$ by the vector-to-matrix transformation.
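Step (c) of claim 6 averages the n per-scan transforms by mapping each 4×4 matrix to a six-dimensional vector, averaging, and mapping back. The published claim does not specify the exact parameterization, so the sketch below assumes a common choice (axis-angle rotation vector plus translation); all names are illustrative:

```python
import numpy as np

def rot_to_vec(R: np.ndarray) -> np.ndarray:
    """Rotation matrix -> axis-angle vector (logarithm map of SO(3))."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * axis / (2.0 * np.sin(theta))

def vec_to_rot(v: np.ndarray) -> np.ndarray:
    """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(v)
    if np.isclose(theta, 0.0):
        return np.eye(3)
    k = v / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def mean_transform(Ts) -> np.ndarray:
    """Mean of rigid transforms via the six-dimensional vector set of claim 6(c)."""
    vecs = [np.concatenate([rot_to_vec(T[:3, :3]), T[:3, 3]]) for T in Ts]
    eta = np.mean(vecs, axis=0)       # mean six-dimensional vector
    T = np.eye(4)
    T[:3, :3] = vec_to_rot(eta[:3])   # vector -> matrix transformation
    T[:3, 3] = eta[3:]
    return T
```

Averaging in vector space keeps the result a valid rigid transform, which a naive element-wise mean of rotation matrices would not.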
7. The method according to claim 6, wherein in step (a), each point cloud ${}^{S}P_i$ measured in the scanner coordinate system is converted to the robot base coordinate system according to the following relation:

$${}^{B}P_i = {}^{B}_{E}T_i \; {}^{E}_{S}T \; {}^{S}P_i$$

wherein ${}^{B}P_i$ is the standard sphere array point cloud under the robot base coordinate system acquired by the i-th scan, ${}^{S}P_i$ is the standard sphere array point cloud under the scanner measurement coordinate system acquired by the i-th scan, ${}^{B}_{E}T_i$ is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan, and ${}^{E}_{S}T$ is the transformation matrix of the scanner coordinate system relative to the robot end coordinate system.
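The chain of rigid-body products in claim 7 amounts to one matrix multiply per point in homogeneous coordinates. A sketch (assuming an (N, 3) point array and a hand-eye matrix known from prior calibration; names hypothetical):

```python
import numpy as np

def cloud_to_base(P_scanner: np.ndarray,
                  T_BE_i: np.ndarray,
                  T_ES: np.ndarray) -> np.ndarray:
    """Map an (N, 3) scanner-frame cloud into the robot base frame {B}.

    T_BE_i: end frame {E} relative to base {B} at the i-th scan.
    T_ES:   scanner frame {S} relative to end frame {E} (hand-eye matrix).
    """
    n = P_scanner.shape[0]
    homog = np.hstack([P_scanner, np.ones((n, 1))])  # (N, 4) homogeneous points
    T = T_BE_i @ T_ES                                # composed {S} -> {B} transform
    return (homog @ T.T)[:, :3]                      # back to (N, 3)
```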
8. The method for correcting robot pose based on standard sphere array target estimation of claim 2, wherein in step S24, the relationship among the actual pose transformation matrix ${}^{B}_{E}T'_i$, the matrices ${}^{B}_{E}T_i$ and ${}^{W}_{B}T_i$, and the mean transformation matrix ${}^{W}_{B}\bar{T}$ is according to the following relation:

$${}^{B}_{E}T'_i = \left({}^{W}_{B}\bar{T}\right)^{-1} \; {}^{W}_{B}T_i \; {}^{B}_{E}T_i, \quad i = 1, 2, \dots, n$$

where i is the index of the scan measurement and n represents the total number of scan measurements.
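Because the published formula in claim 8 is an image, the relation can only be reconstructed from the surrounding definitions; one consistent reading is that the deviation between the i-th and the mean base-to-array transforms corrects the nominal end pose. A sketch under that assumption:

```python
import numpy as np

def actual_end_pose(T_WB_i: np.ndarray,
                    T_WB_mean: np.ndarray,
                    T_BE_i: np.ndarray) -> np.ndarray:
    """Reconstructed relation (assumption, not the verbatim patent formula):
    ^B_E T'_i = (^W_B T_mean)^-1 @ ^W_B T_i @ ^B_E T_i
    When the i-th match equals the mean, the actual pose equals the read pose.
    """
    return np.linalg.inv(T_WB_mean) @ T_WB_i @ T_BE_i
```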
9. A method for robot pose correction based on standard sphere array target estimation as claimed in claim 1, wherein the error is calculated according to the following relation:

$${}^{B}\Delta X_i = {}^{B}X'_i - {}^{B}X_i, \quad i = 1, 2, \dots, n$$

wherein ${}^{B}\Delta X_i$ is the robot pose error in the base coordinate system for the i-th scan, ${}^{B}X'_i$ is the actual pose of the robot in the robot base coordinate system for the i-th scan, ${}^{B}X_i$ is the robot pose read from the robot controller for the i-th scan, i is the scan index, and n is the total number of scans.
10. A system for correcting the pose of a robot using the method of any one of claims 1 to 9, comprising a robot, a scanner attached to the end of the robot, and a standard sphere array comprising a plurality of standard spheres of different sizes arranged non-linearly, the standard sphere array being disposed within the scanning range of the scanner.
CN202110736372.0A 2021-06-30 2021-06-30 Robot posture correction method and system based on standard spherical array target estimation Active CN113386136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110736372.0A CN113386136B (en) 2021-06-30 2021-06-30 Robot posture correction method and system based on standard spherical array target estimation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110736372.0A CN113386136B (en) 2021-06-30 2021-06-30 Robot posture correction method and system based on standard spherical array target estimation

Publications (2)

Publication Number Publication Date
CN113386136A true CN113386136A (en) 2021-09-14
CN113386136B CN113386136B (en) 2022-05-20

Family

ID=77624581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110736372.0A Active CN113386136B (en) 2021-06-30 2021-06-30 Robot posture correction method and system based on standard spherical array target estimation

Country Status (1)

Country Link
CN (1) CN113386136B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1805830A (en) * 2003-06-11 2006-07-19 Abb公司 A method for fine tuning of a robot program
CN107398901A (en) * 2017-07-28 2017-11-28 哈尔滨工业大学 The visual servo control method of robot for space maintainable technology on-orbit
CN107953336A (en) * 2017-12-27 2018-04-24 北京理工大学 Measured piece is loaded the modification method and system of deviation in manipulator Ultrasonic NDT
CN108724181A (en) * 2017-04-19 2018-11-02 丰田自动车株式会社 Calibration system
EP3402632A1 (en) * 2016-01-11 2018-11-21 KUKA Deutschland GmbH Determining an orientation of a robot relative to the direction of gravity
CN108994827A (en) * 2018-05-04 2018-12-14 武汉理工大学 A kind of robot measurement-system of processing scanner coordinate system automatic calibration method
CN109373898A (en) * 2018-11-27 2019-02-22 华中科技大学 A kind of complex parts pose estimating system and method based on three-dimensional measurement point cloud
CN110202582A (en) * 2019-07-03 2019-09-06 桂林电子科技大学 A kind of robot calibration method based on three coordinates platforms
CN110480638A (en) * 2019-08-20 2019-11-22 南京博约智能科技有限公司 A kind of self-compensating palletizing method of articulated robot position and attitude error and its palletizing system
CN111551111A (en) * 2020-05-13 2020-08-18 华中科技大学 Part feature robot rapid visual positioning method based on standard ball array
US20210039259A1 (en) * 2018-02-26 2021-02-11 Renishaw Plc Coordinate positioning machine
CN112659112A (en) * 2020-12-03 2021-04-16 合肥富煌君达高科信息技术有限公司 Robot eye calibration method based on line laser scanner


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Wenlong et al.: "Development of a robotic in-situ automatic optical inspection *** for complex parts of nuclear main pumps", 《机械工程学报》 (Journal of Mechanical Engineering) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113770577A (en) * 2021-09-18 2021-12-10 宁波博视达焊接机器人有限公司 Method for realizing generation of track of workpiece mounted on robot
CN113770577B (en) * 2021-09-18 2022-09-20 宁波博视达焊接机器人有限公司 Method for realizing generation of track of workpiece mounted on robot
CN113843792B (en) * 2021-09-23 2024-02-06 四川锋准机器人科技有限公司 Hand-eye calibration method of surgical robot
CN113843792A (en) * 2021-09-23 2021-12-28 四川锋准机器人科技有限公司 Hand-eye calibration method of surgical robot
CN114310888A (en) * 2021-12-28 2022-04-12 广东省科学院智能制造研究所 Cooperative robot variable-rigidity motor skill learning and regulating method and system
CN114310888B (en) * 2021-12-28 2024-05-31 广东省科学院智能制造研究所 Method and system for learning and regulating variable rigidity motor skills of cooperative robot
CN114347027A (en) * 2022-01-08 2022-04-15 天晟智享(常州)机器人科技有限公司 Pose calibration method of 3D camera relative to mechanical arm
CN114485468A (en) * 2022-01-28 2022-05-13 天津大学 Multi-axis linkage composite measurement system and micro-part full-profile automatic measurement method
CN114485468B (en) * 2022-01-28 2023-09-26 天津大学 Multi-axis linkage composite measurement system and micro-part full-contour automatic measurement method
CN114589692A (en) * 2022-02-25 2022-06-07 埃夫特智能装备股份有限公司 Robot zero calibration method and calibration equipment thereof
CN114589692B (en) * 2022-02-25 2024-03-26 埃夫特智能装备股份有限公司 Zero calibration method and calibration equipment for robot
CN114770517A (en) * 2022-05-19 2022-07-22 梅卡曼德(北京)机器人科技有限公司 Method for calibrating robot through point cloud acquisition device and calibration system
CN114770517B (en) * 2022-05-19 2023-08-15 梅卡曼德(北京)机器人科技有限公司 Method for calibrating robot through point cloud acquisition device and calibration system
CN115249267A (en) * 2022-09-22 2022-10-28 海克斯康制造智能技术(青岛)有限公司 Automatic detection method and device based on turntable and robot position and attitude calculation

Also Published As

Publication number Publication date
CN113386136B (en) 2022-05-20

Similar Documents

Publication Publication Date Title
CN113386136B (en) Robot posture correction method and system based on standard spherical array target estimation
CN107738254B (en) Conversion calibration method and system for mechanical arm coordinate system
CN109822574B (en) Industrial robot end six-dimensional force sensor calibration method
Wang et al. A point and distance constraint based 6R robot calibration method through machine vision
CN112833786B (en) Cabin attitude and pose measuring and aligning system, control method and application
CN109877840B (en) Double-mechanical-arm calibration method based on camera optical axis constraint
CN111660295A (en) Industrial robot absolute precision calibration system and calibration method
Zhuang et al. Robot calibration with planar constraints
CN109323650B (en) Unified method for measuring coordinate system by visual image sensor and light spot distance measuring sensor in measuring system
CN110276806A (en) Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system
CN106777656B (en) Industrial robot absolute accuracy calibration method based on PMPSD
CN112873199B (en) Robot absolute positioning precision calibration method based on kinematics and spatial interpolation
CN113160334B (en) Dual-robot system calibration method based on hand-eye camera
CN111168719B (en) Robot calibration method and system based on positioning tool
CN107817682A (en) A kind of space manipulator on-orbit calibration method and system based on trick camera
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN112109072B (en) Accurate 6D pose measurement and grabbing method for large sparse feature tray
CN112454366A (en) Hand-eye calibration method
CN115546289A (en) Robot-based three-dimensional shape measurement method for complex structural part
Dehghani et al. Vision-based calibration of a Hexa parallel robot
TW202302301A (en) Automated calibration system and method for the relation between a profile scanner coordinate frame and a robot arm coordinate frame
CN109059761B (en) EIV model-based handheld target measuring head calibration method
CN113878586B (en) Robot kinematics calibration device, method and system
CN116309879A (en) Robot-assisted multi-view three-dimensional scanning measurement method
CN115179323A (en) Machine end pose measuring device based on telecentric vision constraint and precision improving method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant