CN113386136A - Robot posture correction method and system based on standard spherical array target estimation - Google Patents
- Publication number
- CN113386136A CN113386136A CN202110736372.0A CN202110736372A CN113386136A CN 113386136 A CN113386136 A CN 113386136A CN 202110736372 A CN202110736372 A CN 202110736372A CN 113386136 A CN113386136 A CN 113386136A
- Authority
- CN
- China
- Prior art keywords
- robot
- coordinate system
- pose
- standard
- transformation matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The invention belongs to the technical field of robots and discloses a robot posture correction method and system based on standard spherical array target estimation. The method comprises the following steps: S1, establishing a robot base coordinate system, a robot end coordinate system, a scanner measurement coordinate system and a standard spherical array local coordinate system; S2, the robot drives the scanner to scan the standard spherical array from multiple angles, and the robot pose and the point cloud of the standard spherical array under each pose are read and recorded; a transformation matrix of the robot end coordinate system relative to the robot base coordinate system is then calculated; S3, converting the transformation matrix of the robot end coordinate system relative to the robot base coordinate system into an actual six-dimensional vector, which is the actual pose of the robot, and calculating the error between the actual pose of the robot and the pose read in step S2 to realize the pose correction of the robot. The invention breaks through the bottleneck of existing methods and has the advantages of low cost, strong practicability, high correction efficiency and wide application range.
Description
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot posture correction method and system based on standard spherical array target estimation.
Background
Robots are flexible in operation and highly adaptable, and the use of industrial robots in place of humans for operations such as palletizing, welding and assembly has become the mainstream trend in industrial robot applications. However, the relatively low absolute positioning accuracy of industrial robots limits the accuracy of the operations they perform.
Therefore, many scholars have studied accuracy compensation methods for industrial robots based on the robot's kinematic parameters. These methods mostly involve complex mathematical derivations such as Jacobian matrix calculation and differential motion solution; they are inefficient when only specific robot poses need to be corrected, and the required measuring device (a laser tracker) greatly increases the cost. Accordingly, there is a technical need in the art for a fast robot pose correction method based on standard spherical array target estimation that is low in cost, strong in practicability and high in correction efficiency.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a robot pose correction method and system based on standard spherical array target estimation, which can realize the rapid correction of the specific robot pose, break through the bottleneck of the existing method, and have the advantages of low cost, strong practicability, high correction efficiency, wide application range and the like.
To achieve the above object, according to one aspect of the present invention, there is provided a robot pose correction method based on a standard sphere array target estimation, the method comprising the steps of:
S1, fixedly connecting the scanner to the end of the robot, placing the standard spherical array on the workbench, and establishing a robot base coordinate system { B }, a robot end coordinate system { E }, a scanner measurement coordinate system { S } and a standard spherical array local coordinate system { W };
S2, the robot drives the scanner to scan the standard spherical array from multiple angles, and the robot pose and the point cloud of the standard spherical array under each pose are read and recorded; a hand-eye matrix is calculated based on the recorded poses and point clouds; the recorded point clouds are converted into the robot base coordinate system and matched with the standard spherical array design model; the standard spherical array target estimation is completed using the matching results; and a transformation matrix ^B_E T of the robot end coordinate system { E } relative to the robot base coordinate system { B } is calculated based on a dimension chain transfer model;
S3, converting the transformation matrix ^B_E T of the robot end coordinate system relative to the robot base coordinate system into an actual six-dimensional vector ζ' through the vector-matrix transformation relation; the actual six-dimensional vector ζ' is the actual robot pose obtained by the coordinate transformation calculation; the error between the actual robot pose and the robot pose read in step S2 is calculated, so that the robot pose is corrected.
Preferably, in step S2, the transformation matrix ^B_E T is obtained in the following way:
S21, establishing a standard spherical array local coordinate system { W } based on the standard spherical array point cloud, and calculating the transformation matrix ^S_W T of the standard spherical array local coordinate system { W } relative to the scanner measurement coordinate system { S };
S22, performing hand-eye calibration on the robot to obtain the hand-eye relation matrix of the robot, namely the transformation matrix ^E_S T of the scanner measurement coordinate system { S } relative to the robot end coordinate system { E };
S23, obtaining the transformation matrix ^W_B T of the robot base coordinate system { B } relative to the standard spherical array local coordinate system { W } based on rigid body transformation matrices and point cloud matching;
S24, constructing the relation among the transformation matrices ^S_W T_i, ^E_S T and ^W_B T and the transformation matrix ^B_E T_i, and calculating the required transformation matrix ^B_E T from this relation.
Further preferably, in step S21, the transformation matrix ^S_W T is calculated in the following way:
(a) fitting the coordinates of the center of each standard sphere from the standard spherical array point cloud scanned under each pose;
(b) selecting the center of one standard sphere as the origin to establish the standard spherical array local coordinate system, wherein the three-dimensional coordinates of the origin and the coordinate axis directions of the coordinate system form the transformation matrix ^S_W T.
Further preferably, in step S22, the transformation matrix ^E_S T is obtained according to the following steps:
(a) constructing the relation between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating the transformation matrix ^B_E T_i of the robot end coordinate system { E } relative to the robot base coordinate system { B } using the read robot poses;
(b) constructing the relation between the transformation matrices ^B_E T_i and ^S_W T_i, so as to calculate the transformation matrix ^E_S T.
Further preferably, in step (a), the transformation matrix ^B_E T_i is calculated according to the following relation:
^B_E T_i = [ R(z,Ez_i)·R(y,Ey_i)·R(x,Ex_i)   p_i ; 0 0 0 1 ]

wherein ^B_E T_i represents the transformation matrix of { E } relative to { B } at the i-th measurement; R(z,Ez_i) represents the rotation matrix for a rotation of Ez_i about the z-axis, and similarly for R(y,Ey_i) and R(x,Ex_i); and p_i = [Px_i Py_i Pz_i]^T is a three-dimensional vector representing the position coordinates of the origin of { E } under { B } at the i-th measurement.
Further preferably, in step S23, the transformation matrix ^W_B T is obtained in the following way:
(a) converting each point cloud ^S P_i under the scanner measurement coordinate system into the standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system through rigid body transformation matrix product operations;
(b) taking the standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system as test models and the three-dimensional solid model of the standard spherical array as the reference model, and matching them with the ADF algorithm to obtain the transformation matrices { ^W_B T_1, ^W_B T_2, ..., ^W_B T_n } of the robot base coordinate system relative to the standard spherical array local coordinate system;
(c) converting the transformation matrices ^W_B T_i of the robot base coordinate system relative to the standard spherical array local coordinate system into a set of six-dimensional vectors { ζ_W,1, ζ_W,2, ..., ζ_W,n }, solving the mean ζ̄_W of the six-dimensional vector set, and solving the transformation matrix ^W_B T from the six-dimensional vector ζ̄_W by vector-matrix transformation.
Further preferably, in step (a), each point cloud ^S P_i under the scanner measurement coordinate system is converted into the robot base coordinate system according to the following relation:

^B P_i = ^B_E T_i · ^E_S T · ^S P_i

wherein ^B P_i is the standard spherical array point cloud under the robot base coordinate system obtained at the i-th scan, ^S P_i is the standard spherical array point cloud under the scanner measurement coordinate system obtained at the i-th scan, and ^B_E T_i is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan.
Further preferably, in step S24, the relation among the transformation matrix ^B_E T_i, the matrices ^S_W T_i and ^E_S T, and the transformation matrix ^W_B T is according to the following relation:

^B_E T_i = (^W_B T)^(-1) · (^S_W T_i)^(-1) · (^E_S T)^(-1),  i = 1, 2, ..., n

where i is the index of the scan measurement and n represents the total number of scan measurements.
Further preferably, the error is calculated according to the following relation:

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n

wherein Δζ_i is the robot pose error in the base coordinate system at the i-th scan, ζ'_i is the actual pose of the robot in the robot base coordinate system at the i-th scan, ζ_i is the robot pose read from the robot controller at the i-th scan, i is the number of the scan, and n is the total number of scans.
According to another aspect of the present invention, there is provided a system for correcting the robot posture correction method, the system comprising a robot, a scanner and a standard ball array, wherein the scanner is connected to the end of the robot, the standard ball array comprises a plurality of standard balls with different sizes and arranged in a nonlinear manner, and the standard ball array is arranged in the scanning range of the scanner.
Generally, compared with the prior art, the technical scheme of the invention has the following beneficial effects:
1. The method can correct a specific robot pose simply by adding the robot pose to be corrected to the hand-eye calibration operation process, and is easy to implement; a plurality of specific robot poses can be corrected quickly using only forward/inverse matrix product operations and matrix-vector transformations, the new robot pose being determined based on the dimension chain transfer model; the method thus breaks through the bottleneck of the existing methods and has the advantages of low cost, strong practicability, high correction efficiency and wide application range;
2. The invention corrects the robot pose using a scanner and standard calibration spheres, which greatly reduces cost compared with the prior art that uses an expensive laser tracker and target ball; moreover, scanning the standard spherical array from multiple angles is fast, consuming less time than the grid-type planning of robot spatial poses in the prior art;
3. The robot hand-eye calibration is completed based on a standard spherical array comprising at least three standard spheres with different diameters, so the rigid body transformation matrices are easy to solve; in addition, point cloud matching is performed with the ADF algorithm, and the standard spherical array target estimation is completed based on a plurality of matching transformation matrices; the ADF matching algorithm integrates the point-to-point and point-to-plane distance functions, is less likely to fall into local optima than ICP point matching, and has high computational efficiency and a wide application range.
Drawings
Fig. 1 is a schematic flow chart of a robot pose rapid correction method based on standard spherical array target estimation according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a system for finishing rapid robot pose correction based on standard ball array target estimation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a standard spherical array design model according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The invention provides a robot pose rapid correction method based on standard spherical array target estimation, which is suitable for rapid correction of robot poses.
As shown in FIG. 2, the system for completing rapid robot pose correction based on standard spherical array target estimation comprises a six-degree-of-freedom industrial robot, and the scanner is a grating binocular area-array scanner. As shown in FIG. 3, in this embodiment the standard spherical array comprises three standard spheres (standard matte ceramic spheres) of different sizes in a nonlinear arrangement. Here { B } denotes the robot base coordinate system, { E } denotes the robot end flange coordinate system, { S } denotes the scanner measurement coordinate system, and { W } denotes the standard spherical array local coordinate system (or workpiece coordinate system); ^B_E T denotes the transformation matrix of the robot end flange coordinate system relative to the robot base coordinate system, ^E_S T the transformation matrix of the scanner measurement coordinate system relative to the robot end flange coordinate system, ^S_W T the transformation matrix of the standard spherical array local coordinate system relative to the scanner measurement coordinate system, and ^W_B T the transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system. Among these, ^E_S T and ^W_B T are constant, while ^B_E T and ^S_W T depend on the scan: different scans correspond to different transformation matrices, which are therefore denoted ^B_E T_i and ^S_W T_i.
As shown in fig. 1, a method for quickly correcting the pose of a robot based on standard spherical array target estimation includes the following steps:
Step one, the robot drives the scanner to scan and measure the standard spherical array from multiple angles, and the robot hand-eye calibration is completed.
Specifically, the robot is first controlled to drive the scanner to scan and measure the standard spherical array from multiple angles, and the acquired standard spherical array point clouds are recorded as { ^S P_1, ^S P_2, ^S P_3, ..., ^S P_i, ..., ^S P_n }, i = 1, 2, ..., n, where n represents the number of scan measurements. Each point cloud ^S P_i under the scanner measurement coordinate system consists of N points { ^S p_1, ^S p_2, ^S p_3, ..., ^S p_m, ..., ^S p_N }, m = 1, 2, 3, ..., N, where ^S p_m = [x_m y_m z_m]^T represents the coordinates of the m-th point. At the same time, the robot poses are sequentially recorded as ζ_i = [p_i^T ψ_i^T]^T, a six-dimensional vector representing the position and orientation of the robot at the i-th measurement, where p_i = [Px_i Py_i Pz_i]^T represents the robot position and ψ_i = [Ex_i Ey_i Ez_i]^T represents the robot orientation. The scanning process includes at least three nonlinearly related robot poses, and the robot poses are guaranteed to be non-singular.
Secondly, the standard spherical array local coordinate system is established based on the standard spherical array point cloud: the centers of the standard spheres and the plane containing the centers are fitted from the measured point cloud; the center of one sphere is selected as the origin; the direction from the origin to the center of another sphere establishes the x-axis; the normal vector of the plane containing the centers is taken as the z-axis; and the y-axis direction is obtained by the right-hand rule. Taking the origin of the local coordinate system as the position vector and the x-, y- and z-axis directions as the orientation, the transformation matrices ^S_W T_i of the standard spherical array local coordinate system relative to the scanner measurement coordinate system are calculated. At the same time, through vector-matrix conversion, the transformation matrix ^B_E T_i of the robot end coordinate system relative to the robot base coordinate system is solved from the six-dimensional vector ζ_i (directly read as the robot pose), namely

^B_E T_i = [ R(z,Ez_i)·R(y,Ey_i)·R(x,Ex_i)   p_i ; 0 0 0 1 ]    (1)
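The sphere-center fitting and local-frame construction described above can be sketched with numpy. This is an illustrative reconstruction rather than the patent's own code: `fit_sphere` uses the standard linear least-squares sphere fit, and `local_frame` builds { W } as described (origin at one sphere center, x-axis toward a second center, z-axis along the fitted plane normal, y-axis by the right-hand rule).

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit; returns (center, radius).

    Uses the linear form |p|^2 = 2*c.p + (r^2 - |c|^2), so one lstsq
    call recovers the center c and the radius r."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    return center, radius

def local_frame(c0, c1, normal):
    """Build the {W} frame: origin at sphere center c0, x-axis toward c1,
    z-axis along the plane normal, y-axis by the right-hand rule."""
    x = (c1 - c0) / np.linalg.norm(c1 - c0)
    z = normal - (normal @ x) * x      # force z orthogonal to x
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)                 # completes a right-handed frame
    T = np.eye(4)
    T[:3, :3] = np.column_stack([x, y, z])
    T[:3, 3] = c0
    return T
```

On a noiseless synthetic sphere the fit recovers the center to machine precision; with real scan data the residual of the least-squares solve gives a quick sanity check on the segmentation of each sphere.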
Wherein, R (·, E ·)i) Representing rotation about an axis EiThe rotation matrix of the angle can be obtained, and transformation matrixes of a plurality of robot end coordinate systems relative to a robot base coordinate system can be obtained
Finally, based on the AX = XB model, where A_j = (^B_E T_{j+1})^(-1) · ^B_E T_j and B_j = ^S_W T_{j+1} · (^S_W T_j)^(-1), j = 1, 2, ..., n−1, the hand-eye matrix X = ^E_S T is solved, and the robot hand-eye calibration is completed.
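The text does not specify which AX = XB solver is used, so the following is only a sketch of one common closed-form approach (in the style of Park and Martin): the rotation part of X is recovered from the matrix logarithms of the relative rotations, and the translation from a stacked linear system. All names are illustrative.

```python
import numpy as np

def log_so3(R):
    """Rotation matrix -> axis-angle vector (matrix logarithm of SO(3))."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def solve_ax_xb(As, Bs):
    """Closed-form hand-eye solution of A_j X = X B_j.

    As, Bs: lists of 4x4 relative motions of the end flange and of the
    target seen by the scanner.  Returns the 4x4 hand-eye transform X."""
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        a, b = log_so3(A[:3, :3]), log_so3(B[:3, :3])
        M += np.outer(b, a)            # rotation axes satisfy a = R_X b
    # R_X = (M^T M)^(-1/2) M^T, via eigen-decomposition of M^T M
    vals, vecs = np.linalg.eigh(M.T @ M)
    inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    Rx = inv_sqrt @ M.T
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

On noise-free relative motions this recovers X exactly; with measurement noise the least-squares structure still gives a usable estimate, provided the motions contain at least two non-parallel rotation axes.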
And step two, acquiring a plurality of groups of standard spherical array point clouds under the robot base coordinate system based on the rigid body transformation matrix among the coordinate systems.
Specifically, each point cloud ^S P_i under the scanner measurement coordinate system is transferred into the robot base coordinate system through rigid body transformation matrix products, namely

^B P_i = ^B_E T_i · ^E_S T · ^S P_i    (2)

wherein ^B P_i represents the standard spherical array point cloud under the robot base coordinate system obtained at the i-th scan; by formula (2), the n sets of standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system can be obtained.
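Formula (2) is one rigid-body product per scan. A minimal numpy sketch (illustrative names; point clouds stored as N×3 arrays):

```python
import numpy as np

def to_base_frame(P_scan, T_BE_i, T_ES):
    """Formula (2): map an (N,3) point cloud from the scanner frame {S}
    into the robot base frame {B} via  B_P = T_BE_i . T_ES . S_P."""
    T = T_BE_i @ T_ES                       # {S} -> {B} as one rigid transform
    return P_scan @ T[:3, :3].T + T[:3, 3]  # rotate, then translate each point
```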
And step three, obtaining a transformation matrix set of the robot base coordinate systems relative to the standard spherical array local coordinate system through point cloud matching.
Specifically, taking the standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system as test models and the three-dimensional solid model of the standard spherical array as the reference model, matching is performed with the ADF algorithm, and the transformation matrices of the robot base coordinate system relative to the standard spherical array local coordinate system are obtained in turn, recorded as { ^W_B T_1, ^W_B T_2, ..., ^W_B T_n }.
And step four, determining a transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system based on the matching result, and finishing target estimation.
Specifically, the transformation matrix set { ^W_B T_1, ^W_B T_2, ..., ^W_B T_n } of the robot base coordinate system relative to the standard spherical array local coordinate system is first converted into a set of six-dimensional vectors { ζ_W,1, ζ_W,2, ..., ζ_W,n }, and the mean of the set is solved, namely

ζ̄_W = (1/n) · Σ_{i=1}^{n} ζ_W,i    (3)

Finally, through vector-matrix transformation, the transformation matrix ^W_B T, namely the transformation matrix of the robot base coordinate system relative to the standard spherical array local coordinate system, is solved from the six-dimensional vector ζ̄_W, and the standard spherical array target estimation is completed.
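The target-estimation step can be sketched as follows. One hedge is worth stating: averaging Euler-angle vectors, as the mean of six-dimensional vectors implies, is only well-behaved when the matched transforms are close to one another, which is the expected situation after a successful matching step. Names are illustrative.

```python
import numpy as np

def matrix_to_pose(T):
    """4x4 transform -> [Px,Py,Pz,Ex,Ey,Ez] with ZYX Euler angles,
    inverting R = R(z,Ez) R(y,Ey) R(x,Ex) (assumes cos(Ey) != 0)."""
    R = T[:3, :3]
    ey = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    ez = np.arctan2(R[1, 0], R[0, 0])
    ex = np.arctan2(R[2, 1], R[2, 2])
    return np.array([T[0, 3], T[1, 3], T[2, 3], ex, ey, ez])

def estimate_target(T_WB_list):
    """Average the matched results in six-dimensional vector form and
    convert the mean back into a single transform T_WB."""
    zetas = np.array([matrix_to_pose(T) for T in T_WB_list])
    px, py, pz, ex, ey, ez = zetas.mean(axis=0)
    cz, sz = np.cos(ez), np.sin(ez)
    cy, sy = np.cos(ey), np.sin(ey)
    cx, sx = np.cos(ex), np.sin(ex)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx      # rebuild R = R(z,Ez) R(y,Ey) R(x,Ex)
    T[:3, 3] = [px, py, pz]
    return T
```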
And step five, determining the transformation matrices of the robot end flange coordinate system relative to the robot base coordinate system based on the dimension chain transfer model.
Specifically, based on the relative pose relationships among the robot base coordinate system, the standard spherical array local coordinate system, the scanner measurement coordinate system and the robot end flange coordinate system, the transformation matrices of the robot end flange coordinate system relative to the robot base coordinate system are solved in turn, namely

^B_E T_i = (^W_B T)^(-1) · (^S_W T_i)^(-1) · (^E_S T)^(-1),  i = 1, 2, ..., n    (4)
The attitude matrix in the transformation matrix is an orthogonal matrix.
And sixthly, obtaining a new robot pose through matrix-vector transformation, and finishing quick correction of the robot pose.
Specifically, the transformation matrices ^B_E T_i are converted into six-dimensional vectors ζ'_i, so that a plurality of new robot poses { ζ'_1, ζ'_2, ..., ζ'_n } are obtained; the pose deviations of the n robot poses are then calculated to perform the robot pose correction, namely

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n    (5)
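Steps five and six together reduce to a short computation per scan: invert the dimension chain to get the actual end-flange transform, convert it to a six-dimensional vector, and subtract the controller reading. A hedged numpy sketch with illustrative names (the Euler convention is restated here so the sketch stands alone):

```python
import numpy as np

def matrix_to_pose(T):
    """4x4 transform -> [Px,Py,Pz,Ex,Ey,Ez] (ZYX Euler, cos(Ey) != 0)."""
    R = T[:3, :3]
    return np.array([T[0, 3], T[1, 3], T[2, 3],
                     np.arctan2(R[2, 1], R[2, 2]),
                     np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0])),
                     np.arctan2(R[1, 0], R[0, 0])])

def pose_errors(T_WB, T_ES, T_SW_list, zetas_read):
    """Dimension chain T_BE_i = inv(T_WB) inv(T_SW_i) inv(T_ES), then
    pose deviation delta_i = zeta'_i - zeta_i against controller readings."""
    deltas = []
    for T_SW_i, zeta in zip(T_SW_list, zetas_read):
        T_BE_i = np.linalg.inv(T_WB) @ np.linalg.inv(T_SW_i) @ np.linalg.inv(T_ES)
        deltas.append(matrix_to_pose(T_BE_i) - np.asarray(zeta, dtype=float))
    return np.array(deltas)
```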
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (10)
1. A robot posture correction method based on standard spherical array target estimation is characterized by comprising the following steps:
S1, fixedly connecting the scanner to the end of the robot, placing the standard spherical array on the workbench, and establishing a robot base coordinate system { B }, a robot end coordinate system { E }, a scanner measurement coordinate system { S } and a standard spherical array local coordinate system { W };
S2, the robot drives the scanner to scan the standard spherical array from multiple angles, and the robot pose and the point cloud of the standard spherical array under each pose are read and recorded; a transformation matrix ^B_E T of the robot end coordinate system { E } relative to the robot base coordinate system { B } is calculated based on a dimension chain transfer model;
S3, converting the transformation matrix ^B_E T of the robot end coordinate system relative to the robot base coordinate system into an actual six-dimensional vector ζ' through the vector-matrix transformation relation; the actual six-dimensional vector ζ' is the actual robot pose obtained by the coordinate transformation calculation; the error between the actual robot pose and the robot pose read in step S2 is calculated, so that the robot pose is corrected.
2. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein in step S2 the transformation matrix ^B_E T is obtained in the following way:
S21, establishing a standard spherical array local coordinate system { W } based on the standard spherical array point cloud, and calculating the transformation matrix ^S_W T of the standard spherical array local coordinate system { W } relative to the scanner measurement coordinate system { S };
S22, performing hand-eye calibration on the robot to obtain the hand-eye relation matrix of the robot, namely the transformation matrix ^E_S T of the scanner measurement coordinate system { S } relative to the robot end coordinate system { E };
S23, obtaining the transformation matrix ^W_B T of the robot base coordinate system { B } relative to the standard spherical array local coordinate system { W } based on rigid body transformation matrices and point cloud matching;
S24, constructing the relation among the transformation matrices ^S_W T_i, ^E_S T and ^W_B T and the transformation matrix ^B_E T_i, and calculating the required transformation matrix ^B_E T from this relation.
3. The robot pose correction method based on standard spherical array target estimation of claim 2, wherein in step S21 the transformation matrix ^S_W T is calculated in the following way:
(a) fitting the coordinates of the center of each standard sphere from the standard spherical array point cloud scanned under each pose;
(b) selecting the center of one standard sphere as the origin to establish the standard spherical array local coordinate system, wherein the three-dimensional coordinates of the origin and the coordinate axis directions of the coordinate system form the transformation matrix ^S_W T.
4. The robot pose correction method based on standard spherical array target estimation of claim 2, wherein in step S22 the transformation matrix ^E_S T is obtained according to the following steps:
(a) constructing the relation between the robot pose and the transformation matrix of the robot end coordinate system relative to the robot base coordinate system, and calculating the transformation matrix ^B_E T_i of the robot end coordinate system { E } relative to the robot base coordinate system { B } using the read robot poses;
(b) constructing the relation between the transformation matrices ^B_E T_i and ^S_W T_i, so as to calculate the transformation matrix ^E_S T.
5. The robot pose correction method based on standard spherical array target estimation of claim 4, wherein in step (a) the transformation matrix ^B_E T_i is calculated according to the following relation:
^B_E T_i = [ R(z,Ez_i)·R(y,Ey_i)·R(x,Ex_i)   p_i ; 0 0 0 1 ]

wherein ^B_E T_i represents the transformation matrix of { E } relative to { B } at the i-th measurement; R(z,Ez_i) represents the rotation matrix for a rotation of Ez_i about the z-axis, and similarly for R(y,Ey_i) and R(x,Ex_i); and p_i = [Px_i Py_i Pz_i]^T is a three-dimensional vector representing the position coordinates of the origin of { E } under { B } at the i-th measurement.
6. The robot pose correction method based on standard spherical array target estimation of claim 2, wherein in step S23 the transformation matrix ^W_B T is obtained in the following way:
(a) converting each point cloud ^S P_i under the scanner measurement coordinate system into the standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system through rigid body transformation matrix product operations;
(b) taking the standard spherical array point clouds { ^B P_1, ^B P_2, ^B P_3, ..., ^B P_i, ..., ^B P_n } under the robot base coordinate system as test models and the three-dimensional solid model of the standard spherical array as the reference model, and matching them with the ADF algorithm to obtain the transformation matrices { ^W_B T_1, ^W_B T_2, ..., ^W_B T_n } of the robot base coordinate system relative to the standard spherical array local coordinate system;
(c) converting the transformation matrices ^W_B T_i of the robot base coordinate system relative to the standard spherical array local coordinate system into a set of six-dimensional vectors { ζ_W,1, ζ_W,2, ..., ζ_W,n }, solving the mean ζ̄_W of the six-dimensional vector set, and solving the transformation matrix ^W_B T from the six-dimensional vector ζ̄_W by vector-matrix transformation.
7. The method according to claim 6, wherein in step (a) each point cloud ^S P_i under the scanner measurement coordinate system is converted into the robot base coordinate system according to the following relation:

^B P_i = ^B_E T_i · ^E_S T · ^S P_i

wherein ^B P_i is the standard spherical array point cloud under the robot base coordinate system obtained at the i-th scan, ^S P_i is the standard spherical array point cloud under the scanner measurement coordinate system obtained at the i-th scan, and ^B_E T_i is the transformation matrix of the robot end coordinate system relative to the robot base coordinate system at the i-th scan.
8. The robot pose correction method based on standard spherical array target estimation of claim 2, wherein in step S24 the relation among the transformation matrix ^B_E T_i, the matrices ^S_W T_i and ^E_S T, and the transformation matrix ^W_B T is according to the following relation:

^B_E T_i = (^W_B T)^(-1) · (^S_W T_i)^(-1) · (^E_S T)^(-1),  i = 1, 2, ..., n

where i is the index of the scan measurement and n represents the total number of scan measurements.
9. The robot pose correction method based on standard spherical array target estimation of claim 1, wherein the error is calculated according to the following relation:

Δζ_i = ζ'_i − ζ_i,  i = 1, 2, ..., n

wherein Δζ_i is the robot pose error in the base coordinate system at the i-th scan, ζ'_i is the actual pose of the robot in the robot base coordinate system at the i-th scan, ζ_i is the robot pose read from the robot controller at the i-th scan, i is the number of the scan, and n is the total number of scans.
10. A system for correcting posture of a robot using the method of any one of claims 1 to 9, comprising a robot, a scanner attached to an end of the robot, and a standard ball array comprising a plurality of standard balls of different sizes and arranged non-linearly, the standard ball array being disposed within a scanning range of the scanner.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110736372.0A CN113386136B (en) | 2021-06-30 | 2021-06-30 | Robot posture correction method and system based on standard spherical array target estimation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110736372.0A CN113386136B (en) | 2021-06-30 | 2021-06-30 | Robot posture correction method and system based on standard spherical array target estimation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113386136A true CN113386136A (en) | 2021-09-14 |
CN113386136B CN113386136B (en) | 2022-05-20 |
Family
ID=77624581
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110736372.0A Active CN113386136B (en) | 2021-06-30 | 2021-06-30 | Robot posture correction method and system based on standard spherical array target estimation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113386136B (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113770577A (en) * | 2021-09-18 | 2021-12-10 | 宁波博视达焊接机器人有限公司 | Method for realizing generation of track of workpiece mounted on robot |
CN113843792A (en) * | 2021-09-23 | 2021-12-28 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
CN114310888A (en) * | 2021-12-28 | 2022-04-12 | 广东省科学院智能制造研究所 | Cooperative robot variable-rigidity motor skill learning and regulating method and system |
CN114347027A (en) * | 2022-01-08 | 2022-04-15 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
CN114485468A (en) * | 2022-01-28 | 2022-05-13 | 天津大学 | Multi-axis linkage composite measurement system and micro-part full-profile automatic measurement method |
CN114589692A (en) * | 2022-02-25 | 2022-06-07 | 埃夫特智能装备股份有限公司 | Robot zero calibration method and calibration equipment thereof |
CN114770517A (en) * | 2022-05-19 | 2022-07-22 | 梅卡曼德(北京)机器人科技有限公司 | Method for calibrating robot through point cloud acquisition device and calibration system |
CN115249267A (en) * | 2022-09-22 | 2022-10-28 | 海克斯康制造智能技术(青岛)有限公司 | Automatic detection method and device based on turntable and robot position and attitude calculation |
CN114310888B (en) * | 2021-12-28 | 2024-05-31 | 广东省科学院智能制造研究所 | Method and system for learning and regulating variable rigidity motor skills of cooperative robot |
Application Events
- 2021-06-30 CN CN202110736372.0A patent/CN113386136B/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1805830A (en) * | 2003-06-11 | 2006-07-19 | Abb公司 | A method for fine tuning of a robot program |
EP3402632A1 (en) * | 2016-01-11 | 2018-11-21 | KUKA Deutschland GmbH | Determining an orientation of a robot relative to the direction of gravity |
CN108724181A (en) * | 2017-04-19 | 2018-11-02 | 丰田自动车株式会社 | Calibration system |
CN107398901A (en) * | 2017-07-28 | 2017-11-28 | 哈尔滨工业大学 | The visual servo control method of robot for space maintainable technology on-orbit |
CN107953336A (en) * | 2017-12-27 | 2018-04-24 | 北京理工大学 | Measured piece is loaded the modification method and system of deviation in manipulator Ultrasonic NDT |
US20210039259A1 (en) * | 2018-02-26 | 2021-02-11 | Renishaw Plc | Coordinate positioning machine |
CN108994827A (en) * | 2018-05-04 | 2018-12-14 | 武汉理工大学 | A kind of robot measurement-system of processing scanner coordinate system automatic calibration method |
CN109373898A (en) * | 2018-11-27 | 2019-02-22 | 华中科技大学 | A kind of complex parts pose estimating system and method based on three-dimensional measurement point cloud |
CN110202582A (en) * | 2019-07-03 | 2019-09-06 | 桂林电子科技大学 | A kind of robot calibration method based on three coordinates platforms |
CN110480638A (en) * | 2019-08-20 | 2019-11-22 | 南京博约智能科技有限公司 | A kind of self-compensating palletizing method of articulated robot position and attitude error and its palletizing system |
CN111551111A (en) * | 2020-05-13 | 2020-08-18 | 华中科技大学 | Part feature robot rapid visual positioning method based on standard ball array |
CN112659112A (en) * | 2020-12-03 | 2021-04-16 | 合肥富煌君达高科信息技术有限公司 | Robot eye calibration method based on line laser scanner |
Non-Patent Citations (1)
Title |
---|
Li Wenlong et al.: "Development of a robotic in-situ automatic optical inspection *** for complex parts of nuclear main pumps" (核主泵复杂零件机器人在位自动光学检测***开发), Journal of Mechanical Engineering (《机械工程学报》) * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113770577A (en) * | 2021-09-18 | 2021-12-10 | 宁波博视达焊接机器人有限公司 | Method for realizing generation of track of workpiece mounted on robot |
CN113770577B (en) * | 2021-09-18 | 2022-09-20 | 宁波博视达焊接机器人有限公司 | Method for realizing generation of track of workpiece mounted on robot |
CN113843792B (en) * | 2021-09-23 | 2024-02-06 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
CN113843792A (en) * | 2021-09-23 | 2021-12-28 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
CN114310888A (en) * | 2021-12-28 | 2022-04-12 | 广东省科学院智能制造研究所 | Cooperative robot variable-rigidity motor skill learning and regulating method and system |
CN114310888B (en) * | 2021-12-28 | 2024-05-31 | 广东省科学院智能制造研究所 | Method and system for learning and regulating variable rigidity motor skills of cooperative robot |
CN114347027A (en) * | 2022-01-08 | 2022-04-15 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
CN114485468A (en) * | 2022-01-28 | 2022-05-13 | 天津大学 | Multi-axis linkage composite measurement system and micro-part full-profile automatic measurement method |
CN114485468B (en) * | 2022-01-28 | 2023-09-26 | 天津大学 | Multi-axis linkage composite measurement system and micro-part full-contour automatic measurement method |
CN114589692A (en) * | 2022-02-25 | 2022-06-07 | 埃夫特智能装备股份有限公司 | Robot zero calibration method and calibration equipment thereof |
CN114589692B (en) * | 2022-02-25 | 2024-03-26 | 埃夫特智能装备股份有限公司 | Zero calibration method and calibration equipment for robot |
CN114770517A (en) * | 2022-05-19 | 2022-07-22 | 梅卡曼德(北京)机器人科技有限公司 | Method for calibrating robot through point cloud acquisition device and calibration system |
CN114770517B (en) * | 2022-05-19 | 2023-08-15 | 梅卡曼德(北京)机器人科技有限公司 | Method for calibrating robot through point cloud acquisition device and calibration system |
CN115249267A (en) * | 2022-09-22 | 2022-10-28 | 海克斯康制造智能技术(青岛)有限公司 | Automatic detection method and device based on turntable and robot position and attitude calculation |
Also Published As
Publication number | Publication date |
---|---|
CN113386136B (en) | 2022-05-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113386136B (en) | Robot posture correction method and system based on standard spherical array target estimation | |
CN107738254B (en) | Conversion calibration method and system for mechanical arm coordinate system | |
CN109822574B (en) | Industrial robot end six-dimensional force sensor calibration method | |
Wang et al. | A point and distance constraint based 6R robot calibration method through machine vision | |
CN112833786B (en) | Cabin attitude and pose measuring and aligning system, control method and application | |
CN109877840B (en) | Double-mechanical-arm calibration method based on camera optical axis constraint | |
CN111660295A (en) | Industrial robot absolute precision calibration system and calibration method | |
Zhuang et al. | Robot calibration with planar constraints | |
CN109323650B (en) | Unified method for measuring coordinate system by visual image sensor and light spot distance measuring sensor in measuring system | |
CN110276806A (en) | Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system | |
CN106777656B (en) | Industrial robot absolute accuracy calibration method based on PMPSD | |
CN112873199B (en) | Robot absolute positioning precision calibration method based on kinematics and spatial interpolation | |
CN113160334B (en) | Dual-robot system calibration method based on hand-eye camera | |
CN111168719B (en) | Robot calibration method and system based on positioning tool | |
CN107817682A (en) | A kind of space manipulator on-orbit calibration method and system based on trick camera | |
CN115284292A (en) | Mechanical arm hand-eye calibration method and device based on laser camera | |
CN112109072B (en) | Accurate 6D pose measurement and grabbing method for large sparse feature tray | |
CN112454366A (en) | Hand-eye calibration method | |
CN115546289A (en) | Robot-based three-dimensional shape measurement method for complex structural part | |
Dehghani et al. | Vision-based calibration of a Hexa parallel robot | |
TW202302301A (en) | Automated calibration system and method for the relation between a profile scanner coordinate frame and a robot arm coordinate frame | |
CN109059761B (en) | EIV model-based handheld target measuring head calibration method | |
CN113878586B (en) | Robot kinematics calibration device, method and system | |
CN116309879A (en) | Robot-assisted multi-view three-dimensional scanning measurement method | |
CN115179323A (en) | Machine end pose measuring device based on telecentric vision constraint and precision improving method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||