CN110919658A - Robot calibration method based on vision and multi-coordinate system closed-loop conversion - Google Patents
- Publication number: CN110919658A (application CN201911279737.0A)
- Authority: CN (China)
- Prior art keywords: coordinate system, robot, camera, calibration, tail end
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B25J9/1697 — Vision controlled systems (B — Performing operations; transporting › B25 — Hand tools; portable power-driven tools; manipulators › B25J — Manipulators; chambers provided with manipulation devices › B25J9/00 — Programme-controlled manipulators › B25J9/16 — Programme controls › B25J9/1694 — Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion)
- B25J9/1664 — Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning (same hierarchy down to B25J9/16, via B25J9/1656 — Programme controls characterised by programming, planning systems for manipulators)
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
The invention discloses a robot calibration method based on vision and multi-coordinate-system closed-loop conversion. The method avoids the computationally expensive AX = XB equation that traditional robot calibration methods must construct, reducing the calculation burden; combining the three coordinate-system conversion closed loops a, b and c improves the accuracy of the conversion relations among the coordinate systems; and a nonlinear least-squares optimization applied to the obtained robot end-position error improves the positioning accuracy of the robot.
Description
Technical Field
The invention relates to the technical field of robots and image recognition, in particular to a robot calibration method based on vision and multi-coordinate system closed-loop conversion.
Background
With the rapid development of big data, artificial intelligence, image recognition and other technologies in the information age, robot technology has also advanced greatly. Industrial robots have the characteristics of simple structure, high flexibility and large working space, and are widely applied in the automotive, logistics, electronics, medical and aerospace fields. Robot positioning accuracy is divided into repeated positioning accuracy (repeatability) and absolute positioning accuracy; today's robots achieve high repeatability but low absolute positioning accuracy. Improving the absolute positioning accuracy of robots is therefore an important direction of robotics research.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: the uncalibrated robot has low absolute positioning precision.
In order to solve the technical problem, the technical scheme of the invention is to provide a robot calibration method based on vision and multi-coordinate system closed-loop conversion, which is characterized by comprising the following steps:
step 1: establishing an Eye-to-hand basic model with a camera fixed outside the robot body, and fixing a chessboard at the robot end as the target; the robot moves the chessboard within the range in which the camera can capture suitable images, and the chessboard images captured by the camera at different poses are used to calibrate the camera;
step 2: performing hand-target calibration, comprising the following steps:
step 201: fixing a calibration pen at the robot end as the target, and setting 9 marker points in space based on the robot end-joint coordinate system; the robot drives the calibration pen to touch each marker point in different postures until all 9 marker points are done; the points D1~D9 in the robot end-joint coordinate system project onto the image coordinate system as the points d1~d9; the mapping from D1~D9 to d1~d9 is represented by a Euclidean transformation, thereby obtaining the conversion relation between the image coordinate system and the robot end-joint coordinate system;
step 202: deriving the conversion relation Tcm between the camera coordinate system and the robot end-joint coordinate system from the relation obtained in step 201, completing the hand-eye calibration;
step 203: based on Tcm and the camera parameters obtained from the camera calibration in step 1, deriving the conversion relation Tbc between the target coordinate system and the camera coordinate system;
step 204: from Tmb = Tcm·Tbc, deriving the conversion relation Tmb between the robot end-joint coordinate system and the target coordinate system, completing the hand-target calibration;
step 3: deriving the robot end position in the robot base coordinate system by combining the pre-calibration data;
step 4: comparing the derived robot end position with the expected robot end position to obtain the error between them;
step 5: establishing an error constraint equation from the robot end-position error obtained in step 4, then optimizing with a nonlinear least-squares method; the local minimum of the error function, found by iterative computation, is taken as the optimal solution of the objective function, completing the parameter calibration.
Preferably, the step 1 specifically comprises the following steps:
step 101: with the Z-axis direction of the calibration-plate coordinate system aligned with the Z-axis direction of the flange coordinate system, the pose of the chessboard at the robot end is varied within the range in which the camera can capture suitable images, so that the fixed camera collects several chessboard images at different poses;
step 102: extracting the corner points in the chessboard images and refining them to sub-pixel precision with the OpenCV cornerSubPix() function for camera calibration; this yields the intrinsic and extrinsic parameter matrices of the camera. The intrinsic parameter matrix of the camera is
$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$
where f_x and f_y denote the focal lengths and (u_0, v_0) is the intersection of the camera optical axis with the image plane. The extrinsic parameter matrix of the camera is [R t], where R is the rotation matrix from the world coordinate system to the camera coordinate system, R = R_x R_y R_z, with R_x, R_y, R_z the rotations of the camera coordinate system about the x, y, z axes of the world coordinate system, and t = [t_x t_y t_z]^T is the translation from the world coordinate system to the camera coordinate system along the x, y, z axes. Then
The transformation formula from the camera coordinate system to the world coordinate system is as follows:
in equation (1), (X_w, Y_w, Z_w) is a point in the world coordinate system, and (X_c, Y_c, Z_c) is the corresponding point in the camera coordinate system;
the transformation relation between the image coordinate system and the world coordinate system is:
$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{2}$$
the point (X, Y, Z) in the world coordinate system corresponding to a point (x, y) in the image coordinate system is calculated from equation (2).
Preferably, the step 3 comprises the steps of:
step 301: pre-calibrating with spatial fixed points: the calibration pen is moved in turn to the 9 marker points set in the robot end-joint coordinate system; the robot need not change its posture at each marker point; a camera fixed outside the robot photographs the robot, and the robot end coordinates P′(X,Y,Z) at each marker point are recorded at the same time;
step 302: preprocessing the calibration-pen images collected by the camera at the different marker points, then processing them further to obtain the pre-calibrated T′cm(1~9), the 9 groups of pre-calibrated transformation matrices between the camera coordinate system and the robot end-joint coordinate system; combining T′cm(1~9) with the recorded robot end coordinates yields the 9 groups of conversion relations Tcw(1~9) between the camera coordinate system and the robot world coordinate system, and the average of the 9 groups of data is taken as the final conversion relation Tcw between the camera coordinate system and the robot world coordinate system;
step 303: combining the pre-calibrated Tcw, the conversion relation Tbc between the target coordinate system and the camera coordinate system, and the Tmb derived from the hand-target calibration, the robot end position can be derived and compared with the expected robot end position to obtain the error.
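Step 302's averaging of the nine Tcw estimates can be sketched as follows. The patent does not say how the rotation part is averaged, so this sketch assumes the transforms are 4×4 homogeneous matrices and projects the arithmetic mean of the rotation blocks back onto a valid rotation with an SVD (a standard choice, not stated in the source):

```python
import numpy as np

def average_transforms(Ts):
    """Average a list of 4x4 homogeneous transforms.

    Translations are averaged arithmetically; the mean of the rotation
    blocks is projected back onto SO(3) via an SVD (Procrustes step).
    """
    Ts = np.asarray(Ts)
    t_mean = Ts[:, :3, 3].mean(axis=0)
    R_mean = Ts[:, :3, :3].mean(axis=0)
    U, _, Vt = np.linalg.svd(R_mean)
    R = U @ Vt
    if np.linalg.det(R) < 0:        # keep a proper rotation (det = +1)
        U[:, -1] *= -1
        R = U @ Vt
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t_mean
    return T
```

Averaging the translations directly and re-orthogonalizing the mean rotation keeps the result a rigid transform even when the nine estimates are noisy.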
Compared with the prior art, the invention has the following advantages:
the robot calibration method based on vision and multi-coordinate-system closed-loop conversion avoids the computationally expensive AX = XB equation that traditional robot calibration methods must construct, reducing the calculation burden; combining the three coordinate-system conversion closed loops a, b and c improves the accuracy of the conversion relations among the coordinate systems; and a nonlinear least-squares optimization applied to the obtained robot end-position error improves the positioning accuracy of the robot.
Drawings
FIG. 1 is a first structural schematic diagram of the robot calibration method based on vision and multi-coordinate-system closed-loop conversion;
FIG. 2 is a second structural schematic diagram of the robot calibration method based on vision and multi-coordinate-system closed-loop conversion;
FIG. 3 is a diagram of the transformation relationship of coordinate systems of a robot calibration method based on closed-loop transformation of vision and multi-coordinate systems;
fig. 4 is a flowchart of a robot calibration method based on closed-loop conversion of vision and multiple coordinate systems.
Reference numerals:
1 - robot body; 2 - calibration plate; 3 - camera; 4 - calibration pen; {W} - robot world coordinate system; {M} - robot end-joint coordinate system; {B} - calibration-plate or calibration-pen coordinate system; {C} - camera coordinate system.
Detailed Description
The invention will be further illustrated with reference to the following specific examples. It should be understood that these examples are for illustrative purposes only and are not intended to limit the scope of the present invention. Further, it should be understood that various changes or modifications of the present invention may be made by those skilled in the art after reading the teaching of the present invention, and such equivalents may fall within the scope of the present invention as defined in the appended claims.
The invention combines a camera with a robot and provides a robot calibration method based on vision and multi-coordinate-system closed-loop conversion. The method avoids the computationally expensive AX = XB equation required by traditional robot calibration methods, reducing the calculation burden; combining the three coordinate-system conversion closed loops a, b and c improves the accuracy of the conversion relations among the coordinate systems; and a nonlinear least-squares optimization of the obtained robot end-position error improves the positioning accuracy of the robot.
The invention provides a robot calibration method based on vision and multi-coordinate system closed-loop conversion, which specifically comprises the following steps:
Step 1: an Eye-to-hand basic model with the camera fixed outside the robot body is established, and a chessboard is fixed at the robot end as the target; as shown in FIG. 1, the Z axis of the calibration-plate coordinate system coincides with the Z axis of the flange coordinate system. The robot moves the chessboard within the range in which the camera can capture suitable images, and the calibration-plate images captured by the camera at different poses are used to calibrate the camera.
In this step, the calibration of the camera's extrinsic and intrinsic parameters specifically comprises the following steps:
(1) The position and posture of the calibration plate at the robot end are changed continuously, so that the fixed camera collects several chessboard images.
(2) The corner points in the chessboard images are extracted and refined to sub-pixel precision with the OpenCV cornerSubPix() function, and the camera is calibrated to obtain its intrinsic and extrinsic parameter matrices.
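A minimal sketch of steps (1)-(2), using the OpenCV functions the text names (findChessboardCorners, cornerSubPix, calibrateCamera). The board geometry (9×6 inner corners, 20 mm squares) is an illustrative assumption not taken from the patent, and the OpenCV import is deferred so the object-point helper runs with NumPy alone:

```python
import numpy as np

def chessboard_object_points(cols=9, rows=6, square=20.0):
    """World coordinates of the inner chessboard corners (Z = 0 on the board).
    Board size and square pitch are illustrative assumptions."""
    objp = np.zeros((rows * cols, 3), np.float32)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square
    return objp

def calibrate(images, cols=9, rows=6, square=20.0):
    """Sketch of the corner-based calibration: detect corners, refine them to
    sub-pixel accuracy, then estimate intrinsics K and per-view extrinsics."""
    import cv2  # deferred so chessboard_object_points() works without OpenCV
    objp = chessboard_object_points(cols, rows, square)
    obj_pts, img_pts, size = [], [], None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, (cols, rows))
        if not found:
            continue
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
        obj_pts.append(objp)
        img_pts.append(corners)
    # Returns RMS reprojection error, K, distortion, rotation/translation vectors.
    return cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
```

Each detected view contributes one copy of the planar object points, so `calibrateCamera` can solve for the intrinsic matrix and one extrinsic pose per image.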
The transformation formula between the camera coordinate system and the world coordinate system is:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{1}$$
In equation (1), (X_w, Y_w, Z_w) is a point in the world coordinate system; (X_c, Y_c, Z_c) is the corresponding point in the camera coordinate system; [R t; 0 1] is the extrinsic parameter matrix of the camera; R is the rotation matrix from the world coordinate system to the camera coordinate system, R = R_x R_y R_z, with R_x, R_y, R_z the rotations of the camera coordinate system about the x, y, z axes of the world coordinate system; t = [t_x t_y t_z]^T is the translation from the world coordinate system to the camera coordinate system along the x, y, z axes.
The transformation relation between the image coordinate system and the world coordinate system is:
$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{2}$$
In equation (2), s is a proportionality coefficient; K is the intrinsic parameter matrix of the camera,
$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix};$$
f_x and f_y denote the focal lengths, which are generally equal; (u_0, v_0) is the intersection of the camera optical axis with the image plane, which usually lies at the image centre, so its value is usually half the resolution;
the point (X, Y, Z) in the world coordinate system corresponding to a point (x, y) in the image coordinate system can then be calculated from equation (2).
A further transformation from the world coordinate system to the pixel coordinate system is obtained:
$$x = f\,\frac{X_c}{Z_c}, \qquad y = f\,\frac{Y_c}{Z_c} \tag{4}$$
$$u = f_x\,\frac{X_c}{Z_c} + u_0, \qquad v = f_y\,\frac{Y_c}{Z_c} + v_0 \tag{5}$$
In equations (4) and (5), f is the focal length of the camera, and (u, v) is a point in the pixel coordinate system.
Step 2: the calibration plate is replaced by a calibration pen, which is fixed at the robot end as the target, and 9 marker points are set in space based on the robot end-joint coordinate system, as shown in FIG. 2. The robot drives the calibration pen to touch the marker points in different postures until all 9 marker points are done, giving the conversion relation between the image coordinate system and the robot end-joint coordinate system. From this, the conversion relation Tcm between the camera coordinate system and the robot end-joint coordinate system can be derived, completing the hand-eye calibration. As shown by the coordinate-system transformation closed loop c in FIG. 3, the conversion relation Tbc between the target coordinate system and the camera coordinate system can be derived by combining the camera calibration; the conversion relation Tmb between the robot end-joint coordinate system and the target coordinate system then follows, completing the hand-target calibration.
In this step, the hand-target calibration specifically comprises the following steps:
the calibration plate is changed into a calibration pen, and the calibration pen is used as a target, so that the coordinate system of the calibration pen and the coordinate system of the previous calibration plate are kept consistent as shown in fig. 2. A cube is set in space on a coordinate system of the tail end joint of the robot, and the cube is in a shooting range suitable for a camera, and 8 vertexes and 1 central point of the cube are usedAs a mark point D1~D9(ii) a The robot drives the calibration pen to perform fixed point at each marking point in 4 different postures until the fixed point of 9 marking points is completed; can know the point D on the robot end joint coordinate system1~D9The corresponding points projected onto the image coordinate system are respectively d1~d9The vector can be transformed by the euclidean transformation, i.e. the rotation vector and the translation vector:
$$D_i = R \cdot d_i - t, \qquad i \in \{1, \dots, 9\} \tag{6}$$
Equation (6) represents the transformation from D1~D9 to d1~d9; in equation (6), t is the translation vector. The conversion relation between the image coordinate system and the robot end-joint coordinate system is thus obtained. Combining the camera parameters obtained from the previous camera calibration with equation (7), the conversion relation Tcm between the camera coordinate system and the robot end-joint coordinate system is obtained, completing the hand-eye calibration. As shown by the coordinate-system transformation closed loop c in FIG. 3, the conversion relation Tmb between the robot end-joint coordinate system and the target is obtained as the product of the conversion relation Tcm between the camera coordinate system and the robot end-joint coordinate system and the conversion relation Tbc between the target coordinate system and the camera coordinate system (Tmb = Tcm·Tbc); computing Tmb completes the hand-target calibration.
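The patent gives the model D_i = R·d_i − t (equation (6)) but not a solver for R and t; one standard way to recover them from the nine correspondences is a least-squares rigid alignment (the Kabsch algorithm), sketched here under that assumption:

```python
import numpy as np

def fit_rigid(d, D):
    """Least-squares R, t such that D_i ≈ R @ d_i - t (sign convention of
    eq. 6), via the Kabsch algorithm on centred point sets."""
    d, D = np.asarray(d, float), np.asarray(D, float)
    cd, cD = d.mean(axis=0), D.mean(axis=0)
    H = (d - cd).T @ (D - cD)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T                        # optimal proper rotation
    t = R @ cd - cD                           # from D = R d - t at the centroids
    return R, t
```

Centring both point sets removes t from the rotation estimate; the SVD of the cross-covariance then gives the rotation that best aligns the centred sets, and t falls out of the centroid relation.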
Step 3: the robot end position in the robot base coordinate system is then derived by combining the pre-calibration data.
In this step, the pre-calibration specifically includes the following steps:
Pre-calibration uses the spatial fixed-point method. By off-line programming, the robot is controlled to move the calibration pen in turn to the 9 set marker points; the robot need not change its posture at each marker point. A camera fixed outside the robot photographs the pen, and the robot end coordinates P′(X,Y,Z) and the rotation angles Δ′(1~6) of the six joints at each marker point are recorded at the same time.
The calibration-pen images collected by the camera are preprocessed and then further processed to obtain the pre-calibrated T′cm. According to the coordinate-system transformation closed loop b in FIG. 3, T6 = Tcm·Tcw; combining this with the recorded robot end coordinates gives the conversion relation Tcw between the camera coordinate system and the robot world coordinate system.
From the coordinate-system transformation closed loop a in FIG. 3, T6 = Tmb·Tbc·Tcw. Combining the pre-calibrated Tcw, the previously obtained conversion relation Tbc between the target coordinate system and the camera coordinate system, and the Tmb derived from the hand-target calibration, the robot end position P1(X,Y,Z) can be derived and compared with the expected robot end position P0(X,Y,Z) to obtain the error.
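The closed-loop composition can be sketched with 4×4 homogeneous transforms: chaining Tmb·Tbc·Tcw yields T6, whose translation part is read off as the derived end position P1 and compared with the expected P0. The transform values below are illustrative assumptions, not data from the patent:

```python
import numpy as np

def homog(R, t):
    """Pack a rotation R and translation t into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def end_position_error(T_mb, T_bc, T_cw, P0):
    """Closed loop a of Fig. 3: compose T6 = T_mb @ T_bc @ T_cw, take the
    derived end position P1 from its translation column, return P1 - P0."""
    T6 = T_mb @ T_bc @ T_cw
    P1 = T6[:3, 3]
    return P1 - P0
```

With identity rotations the translations simply add along the chain, which makes the composition easy to check by hand.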
Step 4: the derived robot end position is compared with the expected robot end position to obtain the error between them.
Step 5: an error constraint equation is established from the robot end-position error and then optimized with a nonlinear least-squares method; the local minimum of the error function, found by iterative computation, is taken as giving the objective function its optimal (minimum) value, completing the parameter calibration.
In this step, the nonlinear least-squares optimization applied after the error is obtained specifically comprises the following steps:
the purpose of the parameter optimization by means of the least square method is to find a suitable set of geometric parameter values under which the positioning error of the robot's tip is minimized.
The objective function of the nonlinear least-squares optimization can be constructed as:
$$F = \sum_{i \neq j} \Big( \big\| P_{0(X,Y,Z)i} - P_{0(X,Y,Z)j} \big\| - \big\| P_{1(X,Y,Z)i} - P_{1(X,Y,Z)j} \big\| \Big)^{2} \tag{8}$$
In equation (8), P0(X,Y,Z)i and P0(X,Y,Z)j denote the i-th and j-th expected robot end positions, and P1(X,Y,Z)i and P1(X,Y,Z)j denote the i-th and j-th derived robot end positions.
This is a typical problem of estimating the parameters of a nonlinear static model with the minimum sum of squared errors as the criterion. The minimum of the objective function is found by an iterative optimization algorithm; here the Levenberg-Marquardt algorithm is adopted, whose update takes the form:
$$H_{LM} = -\left(J^{T} J + \mu I\right)^{-1} J^{T} e \tag{9}$$
In equation (9), H_LM is the algorithm step, J is the Jacobian matrix of the error function, e is the error function, and μ is a positive number.
The robot end position coordinates P0(X,Y,Z) and P1(X,Y,Z) are substituted into the objective function, the iteration step H_LM is computed, the parameter values are corrected, and these steps are repeated until the maximum number of iterations is reached or the objective function decreases to the required level.
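Equation (9) as a NumPy sketch, applied to a small stand-in problem (fitting y = exp(a·x)); the patent's actual kinematic error model is not spelled out, so the model here is purely illustrative:

```python
import numpy as np

def lm_step(J, e, mu):
    """Eq. (9): H_LM = -(J^T J + mu * I)^(-1) J^T e."""
    n = J.shape[1]
    return -np.linalg.solve(J.T @ J + mu * np.eye(n), J.T @ e)

def fit_exp(x, y, a=0.0, mu=1e-3, iters=50):
    """Levenberg-Marquardt fit of the illustrative model y = exp(a * x):
    compute residuals and Jacobian, take the step of eq. (9), repeat."""
    for _ in range(iters):
        r = np.exp(a * x) - y                 # residual vector e
        J = (x * np.exp(a * x))[:, None]      # Jacobian dr/da, one column
        a = a + lm_step(J, r, mu)[0]          # correct the parameter value
    return a
```

The damping term μI keeps the normal equations well conditioned when JᵀJ is near singular; with small μ the update approaches a Gauss-Newton step.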
With this technical scheme, vision is applied to robot calibration to obtain the conversion relations among the coordinate systems, and the positioning accuracy of the robot is improved.
Claims (3)
1. A robot calibration method based on vision and multi-coordinate system closed-loop conversion is characterized by comprising the following steps:
step 1: establishing an Eye-to-hand basic model with a camera fixed outside the robot body, and fixing a chessboard at the robot end as the target; the robot moves the chessboard within the range in which the camera can capture suitable images, and the chessboard images captured by the camera are used for camera calibration;
step 2: performing hand-target calibration, comprising the following steps:
step 201: fixing a calibration pen at the robot end as the target, and setting 9 marker points in space based on the robot end-joint coordinate system; the robot drives the calibration pen to touch each marker point in different postures until all 9 marker points are done; the points D1~D9 in the robot end-joint coordinate system project onto the image coordinate system as the points d1~d9; the mapping from D1~D9 to d1~d9 is represented by a Euclidean transformation, thereby obtaining the conversion relation between the image coordinate system and the robot end-joint coordinate system;
step 202: deriving the conversion relation Tcm between the camera coordinate system and the robot end-joint coordinate system from the relation obtained in step 201, completing the hand-eye calibration;
step 203: based on Tcm and the camera parameters obtained from the camera calibration in step 1, deriving the conversion relation Tbc between the target coordinate system and the camera coordinate system;
step 204: from Tmb = Tcm·Tbc, deriving the conversion relation Tmb between the robot end-joint coordinate system and the target coordinate system, completing the hand-target calibration;
step 3: deriving the robot end position in the robot base coordinate system by combining the pre-calibration data;
step 4: comparing the derived robot end position with the expected robot end position to obtain the error between them;
step 5: establishing an error constraint equation from the robot end-position error obtained in step 4, then optimizing with a nonlinear least-squares method; the local minimum of the error function, found by iterative computation, is taken as the optimal solution of the objective function, completing the parameter calibration.
2. The robot calibration method based on vision and multi-coordinate-system closed-loop conversion as claimed in claim 1, wherein step 1 specifically comprises the following steps:
step 101: with the Z-axis direction of the calibration-plate coordinate system aligned with the Z-axis direction of the flange coordinate system, the pose of the chessboard at the robot end is varied within the range in which the camera can capture suitable images, so that the fixed camera collects several chessboard images at different poses;
step 102: extracting the corner points in the chessboard images and refining them to sub-pixel precision with the OpenCV cornerSubPix() function for camera calibration; this yields the intrinsic and extrinsic parameter matrices of the camera. The intrinsic parameter matrix of the camera is
$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$
where f_x and f_y denote the focal lengths and (u_0, v_0) is the intersection of the camera optical axis with the image plane. The extrinsic parameter matrix of the camera is [R t], where R is the rotation matrix from the world coordinate system to the camera coordinate system, R = R_x R_y R_z, with R_x, R_y, R_z the rotations of the camera coordinate system about the x, y, z axes of the world coordinate system, and t = [t_x t_y t_z]^T is the translation from the world coordinate system to the camera coordinate system along the x, y, z axes. Then
the transformation formula between the camera coordinate system and the world coordinate system is:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{1}$$
in equation (1), (X_w, Y_w, Z_w) is a point in the world coordinate system, and (X_c, Y_c, Z_c) is the corresponding point in the camera coordinate system;
the transformation relation between the image coordinate system and the world coordinate system is:
$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{2}$$
the point (X, Y, Z) in the world coordinate system corresponding to a point (x, y) in the image coordinate system is calculated from equation (2).
3. The robot calibration method based on vision and multi-coordinate-system closed-loop conversion as claimed in claim 1, wherein step 3 comprises the following steps:
step 301: pre-calibrating with spatial fixed points: the calibration pen is moved in turn to the 9 marker points set in the robot end-joint coordinate system; the robot need not change its posture at each marker point; a camera fixed outside the robot photographs the robot, and the robot end coordinates P′(X,Y,Z) at each marker point are recorded at the same time;
step 302: preprocessing the calibration-pen images collected by the camera at the different marker points, then processing them further to obtain the pre-calibrated T′cm(1~9), the 9 groups of pre-calibrated transformation matrices between the camera coordinate system and the robot end-joint coordinate system; combining T′cm(1~9) with the recorded robot end coordinates yields the 9 groups of conversion relations Tcw(1~9) between the camera coordinate system and the robot world coordinate system, and the average of the 9 groups of data is taken as the final conversion relation Tcw between the camera coordinate system and the robot world coordinate system;
step 303: combining the pre-calibrated Tcw, the conversion relation Tbc between the target coordinate system and the camera coordinate system, and the Tmb derived from the hand-target calibration, the robot end position can be derived and compared with the expected robot end position to obtain the error.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911279737.0A CN110919658B (en) | 2019-12-13 | 2019-12-13 | Robot calibration method based on vision and multi-coordinate system closed-loop conversion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110919658A true CN110919658A (en) | 2020-03-27 |
CN110919658B CN110919658B (en) | 2023-03-31 |
Family
ID=69860355
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911279737.0A Active CN110919658B (en) | 2019-12-13 | 2019-12-13 | Robot calibration method based on vision and multi-coordinate system closed-loop conversion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110919658B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5523663A (en) * | 1992-05-15 | 1996-06-04 | Tsubakimoto Chain Co. | Method for controlling a manipulator relative to a moving workpiece |
US20190204084A1 (en) * | 2017-09-29 | 2019-07-04 | Goertek Inc. | Binocular vision localization method, device and system |
CN109859275A (en) * | 2019-01-17 | 2019-06-07 | 南京邮电大学 | Monocular vision hand-eye calibration method for a rehabilitation robotic arm based on an S-R-S structure |
CN110136208A (en) * | 2019-05-20 | 2019-08-16 | 北京无远弗届科技有限公司 | Joint automatic calibration method and device for a visual servoing system |
CN110355464A (en) * | 2019-07-05 | 2019-10-22 | 上海交通大学 | Visual matching method, system and medium for laser processing |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111515950A (en) * | 2020-04-28 | 2020-08-11 | 腾讯科技(深圳)有限公司 | Method, device and equipment for determining transformation relation of robot coordinate system and storage medium |
CN111768364A (en) * | 2020-05-15 | 2020-10-13 | 成都飞机工业(集团)有限责任公司 | Aircraft surface quality detection system calibration method |
CN111768364B (en) * | 2020-05-15 | 2022-09-20 | 成都飞机工业(集团)有限责任公司 | Aircraft surface quality detection system calibration method |
CN112223285A (en) * | 2020-09-30 | 2021-01-15 | 南京航空航天大学 | Robot hand-eye calibration method based on combined measurement |
CN112223285B (en) * | 2020-09-30 | 2022-02-01 | 南京航空航天大学 | Robot hand-eye calibration method based on combined measurement |
CN113119083A (en) * | 2021-03-19 | 2021-07-16 | 深圳市优必选科技股份有限公司 | Robot calibration method and device, robot and storage medium |
CN113237434A (en) * | 2021-04-25 | 2021-08-10 | 湖南大学 | Stepped calibrator-based eye-in-hand calibration method for laser profile sensor |
CN113237434B (en) * | 2021-04-25 | 2022-04-01 | 湖南大学 | Stepped calibrator-based eye-in-hand calibration method for laser profile sensor |
CN114260899A (en) * | 2021-12-29 | 2022-04-01 | 广州极飞科技股份有限公司 | Hand-eye calibration method and device, electronic equipment and computer readable storage medium |
CN116277035A (en) * | 2023-05-15 | 2023-06-23 | 北京壹点灵动科技有限公司 | Robot control method and device, processor and electronic equipment |
CN116277035B (en) * | 2023-05-15 | 2023-09-12 | 北京壹点灵动科技有限公司 | Robot control method and device, processor and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN110919658B (en) | 2023-03-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110919658B (en) | Robot calibration method based on vision and multi-coordinate system closed-loop conversion | |
CN111775146B (en) | Visual alignment method under industrial mechanical arm multi-station operation | |
CN108972559B (en) | Hand-eye calibration method based on infrared stereoscopic vision positioning system and mechanical arm | |
CN106457562B (en) | Method and robot system for calibrating a robot | |
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
CN109877840B (en) | Double-mechanical-arm calibration method based on camera optical axis constraint | |
CN110666798B (en) | Robot vision calibration method based on perspective transformation model | |
CN111012506B (en) | Robot-assisted puncture surgery end tool center calibration method based on stereoscopic vision | |
CN113386136B (en) | Robot posture correction method and system based on standard spherical array target estimation | |
CN110136204B (en) | Diaphragm dome assembly system based on machine-tool position calibration using a bilateral telecentric lens camera | |
CN114343847B (en) | Hand-eye calibration method of surgical robot based on optical positioning system | |
CN107192376A (en) | UAV multi-image target positioning correction method based on inter-frame continuity | |
CN114519738A (en) | Hand-eye calibration error correction method based on ICP algorithm | |
CN111915685B (en) | Zoom camera calibration method | |
CN112109072B (en) | Accurate 6D pose measurement and grabbing method for large sparse feature tray | |
CN108089441A (en) | Calibration algorithm and storage medium for a six-degree-of-freedom precision adjustment mechanism of a space camera secondary mirror | |
CN115546289A (en) | Robot-based three-dimensional shape measurement method for complex structural part | |
CN112700505B (en) | Binocular three-dimensional tracking-based hand and eye calibration method and device and storage medium | |
CN114161411A (en) | Vision-based multi-legged robot kinematic parameter calibration method | |
CN111899303B (en) | Novel feature matching and relative positioning method considering space inverse projection constraint | |
CN113681559A (en) | Line laser scanning robot hand-eye calibration method based on standard cylinder | |
CN111145267B (en) | 360-degree panoramic view multi-camera calibration method based on IMU assistance | |
CN115813556A (en) | Surgical robot calibration method and device, surgical robot and storage medium | |
Nissler et al. | Robot-to-camera calibration: a generic approach using 6D detections | |
Yang et al. | A closed-loop controller for a continuum surgical manipulator based on a specially designed wrist marker and stereo tracking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||