CN112116664A - Hand-eye calibration track generation method and device, electronic equipment and storage medium - Google Patents

Hand-eye calibration track generation method and device, electronic equipment and storage medium

Info

Publication number
CN112116664A
Authority
CN
China
Prior art keywords
robot
target
calibration
pose
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010923062.5A
Other languages
Chinese (zh)
Other versions
CN112116664B (en)
Inventor
许金鹏
温志庆
周德成
李伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ji Hua Laboratory filed Critical Ji Hua Laboratory
Priority to CN202010923062.5A priority Critical patent/CN112116664B/en
Publication of CN112116664A publication Critical patent/CN112116664A/en
Application granted granted Critical
Publication of CN112116664B publication Critical patent/CN112116664B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a hand-eye calibration track generation method, a hand-eye calibration track generation device, electronic equipment and a storage medium, wherein a plurality of target rotation angle values are obtained; an initial pose of the robot is acquired; a plurality of target poses are generated from the target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; the robot moves according to the plurality of target poses; at each target pose, position adjustment is performed according to the position of the calibration plate so that the calibration plate is completely exposed in the visual field of the camera; and the adjusted pose data of the robot are recorded. Therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.

Description

Hand-eye calibration track generation method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of industrial robots, in particular to a hand-eye calibration track generation method and device, electronic equipment and a storage medium.
Background
At present, the precision and intelligence of industrial robots rely on robot vision systems.
The first, and particularly important, step in applying robot vision is the calibration of the robot hand and eye. A commonly used robot vision system is the Eye-in-Hand system, in which a 3D camera is mounted on the flange at the end of the robot so that the camera moves along with the robot; the machining and manufacturing precision of the robot is then largely determined by the accuracy of the hand-eye calibration of the vision system. Robot hand-eye calibration is therefore particularly important.
For an Eye-in-Hand system, hand-eye calibration generally requires a motion track and photographing position points of the robot to be set manually; the robot then moves along the set motion track while the camera collects pictures of the calibration plate at the preset photographing position points and the pose of the robot is recorded. Designing the calibration motion track and photographing position points manually is inefficient, places high requirements on the professional level and experience of the workers, and raises the personnel cost of the user.
Disclosure of Invention
In view of the foregoing disadvantages of the prior art, an object of the embodiments of the present application is to provide a method and an apparatus for generating a hand-eye calibration trajectory, an electronic device, and a storage medium, which can improve work efficiency and have low requirements on professional level and experience of a worker.
In a first aspect, an embodiment of the present application provides a method for generating a hand-eye calibration trajectory, which is applied to a robot, and includes the steps of:
A1. obtaining a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
A4. moving according to the plurality of target poses;
A5. at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera;
A6. and recording the adjusted pose data of the robot.
In the method for generating a hand-eye calibration trajectory, step a2 includes: acquiring an initial posture and an initial position of the robot;
step a3 includes:
A301. according to each target rotation angle value, calculating the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the normalized quaternions and normalizing the result, to obtain a plurality of target attitudes;
A303. and combining the initial position with each of the target attitudes to obtain the plurality of target poses.
In the method for generating a hand-eye calibration trajectory, step a5 includes:
A501. acquiring a picture shot by a camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
A504. and moving the robot in the required direction until the calibration board is completely exposed in the visual field of the camera.
Further, step a503 includes:
acquiring the missing characteristic point information of the calibration plate in the picture;
and judging the direction in which the robot needs to move according to the positions of the missing characteristic points on the calibration plate and the relative position relationship between the calibration plate and the robot.
Further, step a504 includes:
moving the robot step by step in the required direction with a preset step length;
acquiring a picture shot by a camera after each movement and judging whether a calibration board is complete in the picture;
and if the calibration plate is complete in the picture, stopping moving.
In a second aspect, an embodiment of the present application provides a hand-eye calibration trajectory generation device, which is applied to a robot, and includes:
the first acquisition module is used for acquiring a plurality of target rotation angle values;
the second acquisition module is used for acquiring the initial pose of the robot;
the generating module is used for generating a plurality of target poses according to the plurality of target rotation angle values and the initial poses; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
the first execution module is used for moving according to the plurality of target poses;
the position adjusting module is used for adjusting, at each target pose, the position of the robot according to the position of the calibration plate, so that the calibration plate is completely exposed in the visual field of the camera;
and the recording module is used for recording the adjusted pose data of the robot.
In the hand-eye calibration track generation device, the second acquisition module acquires an initial posture and an initial position of the robot when acquiring the initial pose of the robot;
the generation module, in generating a plurality of target poses from the plurality of target rotation angle values and an initial pose,
calculates, according to each target rotation angle value, the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the normalized quaternions and normalizes the result, to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain the plurality of target poses.
In the hand-eye calibration trajectory generation device, when the position adjustment module adjusts the position of the robot,
acquiring a picture shot by a camera;
judging whether the calibration plate is complete in the picture;
if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
and moving the robot in the required direction until the calibration board is completely exposed in the visual field of the camera.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the steps of the hand-eye calibration trajectory generation method by calling the computer program stored in the memory.
In a fourth aspect, an embodiment of the present application provides a storage medium, on which a computer program is stored, where the computer program is executed by a processor to execute the steps of the hand-eye calibration trajectory generation method described in the above item.
Advantageous effects:
according to the method, the device, the electronic equipment and the storage medium for generating the hand-eye calibration track, a plurality of target rotation angle values are obtained; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; recording the adjusted pose data of the robot; therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.
Drawings
Fig. 1 is a flowchart of a hand-eye calibration trajectory generation method according to an embodiment of the present application.
Fig. 2 is a block diagram of a hand-eye calibration trajectory generation device according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, a method for generating a hand-eye calibration trajectory provided in an embodiment of the present application is applied to a robot, and includes the steps of:
A1. obtaining a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
A4. moving according to a plurality of target poses;
A5. at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera;
A6. and recording the adjusted pose data of the robot.
The method is directed to an Eye-in-Hand system, i.e. a system in which the camera is arranged at the end of the robot. The principle of the method is as follows: a plurality of target rotation angle values are given; starting from an initial pose in which the camera can capture the complete calibration plate, the robot rotates around the corresponding axis by each given target rotation angle value, so that after each rotation the pose changes and the position of the calibration plate within the picture changes; whenever the calibration plate is no longer complete in the picture, the position of the robot is adjusted so that the calibration plate is again completely exposed in the field of view of the camera; the adjusted pose, which again differs from the pose reached by the rotation alone, is taken as a node of the calibration track. Therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.
In step A1, a plurality of preset target rotation angle values (e.g. 25°, 30°, 35°, 40°, 45°, 50°, -25°, -30°, -35°, -40°, -45°, -50°) may be obtained directly. Alternatively, an angle range and a target number may be obtained, and the corresponding number of angle values is then selected within the angle range as the target rotation angle values. In the latter case, the angle values may be selected at equal intervals within the angle range according to the target number, or selected randomly within the angle range.
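As an illustration only (the patent discloses no code), the two acquisition modes of step A1 could be sketched in Python as follows; the function and parameter names are assumptions introduced here:

```python
import random

def get_target_angles(preset=None, angle_range=None, count=None, mode="uniform"):
    """Return a list of target rotation angle values in degrees.

    Either a preset list is given directly, or a (lo, hi) angle range plus a
    target count; the values are then picked at equal intervals or at random
    inside the range, as described for step A1."""
    if preset is not None:
        return list(preset)
    lo, hi = angle_range
    if count == 1:
        return [lo]
    if mode == "uniform":
        step = (hi - lo) / (count - 1)
        return [lo + i * step for i in range(count)]
    return [random.uniform(lo, hi) for _ in range(count)]

# e.g. the preset values used in the example of the description
angles = get_target_angles(preset=[25, 30, 35, 40, 45, 50,
                                   -25, -30, -35, -40, -45, -50])
```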
Since the pose of the robot is composed of an attitude and a position, where the attitude is generally expressed by three attitude angles (Euler angles) and the position by the coordinates (x, y, z) along the three axes, step A2 includes: acquiring an initial attitude and an initial position of the robot.
Further, step a3 includes:
A301. according to each target rotation angle value, calculating the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the normalized quaternions and normalizing the result, to obtain a plurality of target attitudes;
A303. and combining the initial position with each of the target attitudes to obtain the plurality of target poses.
According to the Z-Y-X Euler angle convention in robot kinematics, with α, β and γ denoting the attitude angles about the X, Y and Z axes respectively, the rotation transformation matrix can be expressed as:

$$R_{ZYX}(\alpha,\beta,\gamma)=R_Z(\gamma)\,R_Y(\beta)\,R_X(\alpha)=\begin{bmatrix} c_\gamma c_\beta & c_\gamma s_\beta s_\alpha - s_\gamma c_\alpha & c_\gamma s_\beta c_\alpha + s_\gamma s_\alpha \\ s_\gamma c_\beta & s_\gamma s_\beta s_\alpha + c_\gamma c_\alpha & s_\gamma s_\beta c_\alpha - c_\gamma s_\alpha \\ -s_\beta & c_\beta s_\alpha & c_\beta c_\alpha \end{bmatrix}$$

where $c_\theta=\cos\theta$ and $s_\theta=\sin\theta$, and α, β and γ are the three attitude angles.

Writing the resulting matrix as the coefficient matrix $m=(m_{ij})$, the corresponding quaternion $q=(q_0,q_1,q_2,q_3)$ is calculated by the following formulas:

$$q_0=\tfrac{1}{2}\sqrt{1+m_{11}+m_{22}+m_{33}},\quad q_1=\frac{m_{32}-m_{23}}{4q_0},\quad q_2=\frac{m_{13}-m_{31}}{4q_0},\quad q_3=\frac{m_{21}-m_{12}}{4q_0}$$

The normalized quaternion can then be calculated by the following formula:

$$\hat{q}=\frac{q}{\lVert q\rVert},\qquad \lVert q\rVert=\sqrt{q_0^{2}+q_1^{2}+q_2^{2}+q_3^{2}}$$

where $\hat{q}$ is the normalized quaternion and $\lVert q\rVert$ is the module length of the quaternion $q$.

In step A302, the following formula can be used for calculation:

$$q_t=\frac{q_{\mathrm{init}}\otimes\hat{q}}{\lVert q_{\mathrm{init}}\otimes\hat{q}\rVert}$$

where $q_t$ is the target attitude, $q_{\mathrm{init}}$ is the initial attitude, $q_{\mathrm{init}}\otimes\hat{q}$ is the attitude obtained by the quaternion multiplication, and $\lVert q_{\mathrm{init}}\otimes\hat{q}\rVert$ is its module length.
In step A303, the target attitude calculated in step A302 is used as the attitude of the target pose, and the initial position acquired in step A2 is used as the position of the target pose, thereby obtaining the target pose.
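A minimal Python sketch of steps A301-A303 is given below for illustration; it assumes that α, β and γ are the rotation angles about the X, Y and Z axes (as in the example that follows) and that the rotation quaternion is right-multiplied onto the initial attitude. The multiplication order, function names and interfaces are assumptions, not taken from the patent.

```python
import numpy as np

def euler_zyx_to_quaternion(alpha, beta, gamma):
    """Rotation by alpha about X, beta about Y, gamma about Z (Z-Y-X convention),
    angles in degrees; returns a normalized quaternion [w, x, y, z]."""
    a, b, g = np.radians([alpha, beta, gamma])
    cx, sx = np.cos(a / 2), np.sin(a / 2)
    cy, sy = np.cos(b / 2), np.sin(b / 2)
    cz, sz = np.cos(g / 2), np.sin(g / 2)
    q = np.array([
        cz * cy * cx + sz * sy * sx,   # w
        cz * cy * sx - sz * sy * cx,   # x
        cz * sy * cx + sz * cy * sx,   # y
        sz * cy * cx - cz * sy * sx,   # z
    ])
    return q / np.linalg.norm(q)

def quat_multiply(q1, q2):
    """Hamilton product of two [w, x, y, z] quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def target_pose(initial_quat, initial_pos, alpha, beta, gamma):
    """Steps A301-A303: rotate the initial attitude by the target angle and
    reuse the initial position as the position of the target pose."""
    q_rot = euler_zyx_to_quaternion(alpha, beta, gamma)          # A301
    q_t = quat_multiply(np.asarray(initial_quat, float), q_rot)  # A302 (order is an assumption)
    q_t /= np.linalg.norm(q_t)                                   # normalization of A302
    return q_t, np.asarray(initial_pos, float)                   # A303
```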
For example, suppose that twelve target rotation angle values of 25°, 30°, 35°, 40°, 45°, 50°, -25°, -30°, -35°, -40°, -45° and -50° are acquired in step A1. In step A301, for each value the normalized quaternion is calculated for rotation around each axis: for 25°, the normalized quaternion for rotating 25° around the X axis (α=25°, β=0°, γ=0°), around the Y axis (α=0°, β=25°, γ=0°) and around the Z axis (α=0°, β=0°, γ=25°); for 30°, the normalized quaternions for rotating 30° around the X axis (α=30°, β=0°, γ=0°), around the Y axis (α=0°, β=30°, γ=0°) and around the Z axis (α=0°, β=0°, γ=30°); and so on, finally obtaining 36 normalized quaternions. Accordingly, through step A6, 36 sets of robot pose data are finally obtained, giving at most 36 nodes of the calibration track.
In step a4, when the robot moves according to the plurality of target poses, the robot sequentially moves to each target pose.
In step A3, the target pose data may be numbered in the order in which they are calculated, so that in step A4 the robot moves to the target poses in that order. Alternatively, before step A4, a preferred order may be acquired from the plurality of generated target poses, such that the total path of the robot moving between the target poses in the preferred order is shortest; in step A4 the robot then moves to the target poses in turn in the preferred order.
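The patent does not specify how the preferred order is computed; as one possible sketch, a greedy nearest-neighbour ordering over a combined translation/rotation cost could be used. The cost weights and interfaces below are illustrative assumptions:

```python
import numpy as np

def quat_angle(q1, q2):
    """Angular distance in radians between two unit quaternions [w, x, y, z]."""
    d = abs(float(np.dot(q1, q2)))
    return 2.0 * np.arccos(min(1.0, d))

def preferred_order(poses, start_pose, pos_weight=1.0, rot_weight=0.1):
    """Greedy nearest-neighbour ordering of target poses so that the total
    travel between consecutive poses stays short. `poses` is a list of
    (position, quaternion) tuples and `start_pose` the robot's current pose."""
    def cost(a, b):
        return (pos_weight * np.linalg.norm(np.asarray(a[0]) - np.asarray(b[0]))
                + rot_weight * quat_angle(np.asarray(a[1]), np.asarray(b[1])))
    remaining = list(range(len(poses)))
    order, current = [], start_pose
    while remaining:
        nxt = min(remaining, key=lambda i: cost(current, poses[i]))
        remaining.remove(nxt)
        order.append(nxt)
        current = poses[nxt]
    return order
```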
Specifically, step a5 includes:
A501. acquiring a picture shot by a camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
A504. the robot is moved in the direction of movement required until the calibration plate is completely exposed to the field of view of the camera.
In step A502, whether the calibration plate is complete in the picture can be determined by checking whether the number of detected feature points is correct, but the determination is not limited thereto. For example, if the calibration plate is a rectangular plate whose four corner points serve as feature points, the expected number of feature points is 4; if the number of corner points found in the picture is not 4, the calibration plate is incomplete in the picture.
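For a standard checkerboard-style calibration plate, the completeness check of step A502 could look roughly like the following OpenCV sketch; the pattern size is an assumed example, and the patent's example plate (a rectangle with four colored corner points) would need a different detector:

```python
import cv2

PATTERN = (9, 6)  # inner-corner grid of the checkerboard (assumed layout)

def board_complete(image_bgr, pattern=PATTERN):
    """Step A502: the plate counts as complete only if the full pattern of
    feature points is detected in the picture."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(
        gray, pattern,
        flags=cv2.CALIB_CB_ADAPTIVE_THRESH | cv2.CALIB_CB_FAST_CHECK)
    return bool(found) and corners is not None and len(corners) == pattern[0] * pattern[1]
```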
Wherein, step a503 comprises:
acquiring the missing characteristic point information of the calibration plate in the picture;
and judging the direction in which the robot needs to move according to the positions of the missing characteristic points on the calibration plate and the relative position relationship between the calibration plate and the robot.
The feature points are points on the calibration plate, for example the corner points of the plate or dot patterns specially arranged on its upper surface; generally, different feature points carry different features (such as color and/or shape, but not limited thereto) so that the identity of a feature point appearing in the picture can be determined. Taking a rectangular calibration plate as an example, the plate has four corner points coated with different colors. If the corner point at the lower-left corner is missing from the picture (which corner is missing can be judged from the colors of the three corner points that do appear), the robot needs to move towards the lower-left corner of the calibration plate, and the direction from the center of the calibration plate to the missing corner point can be taken as the target direction. Because the transformation between the calibration plate coordinate system and the robot base coordinate system can be obtained by calibration in advance (for example, using an existing robot workpiece coordinate system calibration method), the target direction is converted into a direction in the robot base coordinate system, which is the direction in which the robot needs to move.
Therefore, the direction from the center of the calibration plate to the missing feature point (hereinafter referred to as the first target direction) can be obtained first, and then converted into a direction in the robot base coordinate system according to the transformation between the calibration plate coordinate system and the robot base coordinate system; this direction is taken as the direction in which the robot needs to move. If several feature points are missing, the position of the centroid of all missing feature points can be calculated, the direction from the center of the calibration plate to this centroid (hereinafter referred to as the second target direction) obtained, and the second target direction converted into a direction in the robot base coordinate system in the same way, this direction being taken as the direction in which the robot needs to move.
If the part of the calibration plate shown in the picture does not contain the center point of the calibration plate, the centroid of the displayed part can be obtained first (obtaining the centroid of a region of a picture is prior art); this centroid is used in place of the center point of the calibration plate in the above method to move the robot until the center of the calibration plate appears in the picture, after which the above method is carried out using the center point of the calibration plate.
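A possible sketch of the direction computation of step A503, assuming the plate-to-base transform has been calibrated in advance and is supplied as a 4x4 homogeneous matrix; the names and interfaces are illustrative:

```python
import numpy as np

def required_move_direction(missing_pts_board, T_base_board, board_center=(0.0, 0.0, 0.0)):
    """Step A503: take the direction from the plate center to the missing feature
    point (or to the centroid of several missing points), expressed in the
    calibration-plate frame, and rotate it into the robot base frame using the
    pre-calibrated plate-to-base transform. Returns a unit direction vector."""
    missing = np.atleast_2d(np.asarray(missing_pts_board, float))
    target = missing.mean(axis=0)                        # centroid of the missing points
    d_board = target - np.asarray(board_center, float)   # first/second target direction
    d_base = np.asarray(T_base_board)[:3, :3] @ d_board  # rotate only (it is a direction)
    n = np.linalg.norm(d_base)
    return d_base / n if n > 0 else d_base
```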
In step A504, the calibration plate being completely exposed in the field of view of the camera means that the calibration plate appears completely in the picture taken by the camera. Specifically, step A504 includes:
moving the robot step by step in the required direction with a preset step length (this step length may be referred to as the first step length);
acquiring a picture shot by a camera after each movement and judging whether a calibration board is complete in the picture;
and if the calibration plate is complete in the picture, stopping moving.
A picture taken by the camera is acquired after each movement by the first step length, and whether the calibration plate is complete in the picture is judged again; since the feature points missing from the picture may change after each movement, the direction in which the robot needs to move is re-evaluated accordingly. In this way, the moving direction of the robot is automatically updated as the part of the calibration plate visible in the picture changes while the robot moves, so that the calibration plate can be reliably brought completely into the field of view of the camera.
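The adjustment loop of steps A501-A504 could be sketched as below; the four callables stand in for the camera and robot interfaces, which the patent leaves unspecified, and the step length and iteration bound are illustrative assumptions:

```python
def adjust_until_complete(grab_image, is_complete, move_direction, move_by,
                          first_step=0.01, max_steps=200):
    """Steps A501-A504 as a loop: after every step of length `first_step` a new
    picture is grabbed and re-checked, so the move direction is re-evaluated as
    the visible part of the calibration plate changes.

    grab_image()          -> picture from the camera              (A501)
    is_complete(img)      -> True if the plate is complete        (A502)
    move_direction(img)   -> unit direction vector (numpy array)  (A503)
    move_by(vector)       -> translate the robot end effector     (A504)
    """
    for _ in range(max_steps):
        image = grab_image()
        if is_complete(image):
            return True            # calibration plate fully inside the field of view
        move_by(first_step * move_direction(image))
    return False                   # safety bound, not part of the patent
```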
If an acquired target rotation angle value is too large, the picture taken at some target poses may not contain the calibration plate at all. Whether a picture contains the calibration plate is generally determined by whether it contains at least one feature point of the calibration plate: if the picture contains no feature point, it is difficult to judge which part of the calibration plate (if any) is shown, so the missing part and the target direction cannot be determined; if the picture contains at least one feature point, the picture is considered to contain the calibration plate. In this case, in step A5 the robot may first be moved along a preset path until the calibration plate appears in the field of view of the camera (i.e. until at least one feature point appears in the picture taken by the camera), and the position of the robot is then adjusted according to steps A501-A504.
For example, the preset path may be: move forward by a preset distance, return to the initial position and move backward by the preset distance, return to the initial position and move left by the preset distance, then return to the initial position and move right by the preset distance, stopping as soon as the calibration plate is found in the visual field of the camera during the movement. Alternatively, the preset path may be a spiral path of gradually increasing radius, the movement stopping as soon as the calibration plate is found in the visual field of the camera while moving along the spiral. When moving along the preset path, the robot may move step by step with a preset step length (this step length may be referred to as the second step length), acquiring a picture taken by the camera after each step and judging whether the calibration plate appears in the picture. If the calibration plate cannot be made to appear in the visual field of the camera even after moving along the whole preset path, the robot moves on to the next target pose and continues the subsequent processing (i.e. that target pose is abandoned).
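As one illustration of the spiral variant of the preset path, the search points could be generated as follows; the step length, growth rate and point count are assumed values, and the points are offsets in the plane parallel to the calibration plate:

```python
import numpy as np

def spiral_search_points(second_step=0.01, growth=0.002, max_points=100):
    """Generate (x, y) offsets along an outward spiral of gradually increasing
    radius. The robot visits the points one by one and stops as soon as at
    least one feature point of the calibration plate appears in the picture."""
    pts, theta, r = [], 0.0, 0.0
    for _ in range(max_points):
        pts.append((r * np.cos(theta), r * np.sin(theta)))
        theta += second_step / max(r, second_step)  # roughly one step of arc length
        r += growth
    return pts
```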
It should be noted that, in order to ensure that the calibration plate does not deviate too far from the field of view of the camera when the robot moves to each target pose, the robot may generally be moved to an optimal initial pose before step A1, in which the calibration plate is located directly below the camera and the optical axis of the camera points downward towards the center of the calibration plate.
According to the above, the hand-eye calibration track generation method obtains a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; recording the adjusted pose data of the robot; therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.
Referring to fig. 2, an embodiment of the present application provides a hand-eye calibration trajectory generation device, which is applied to a robot and includes a first obtaining module 1, a second obtaining module 2, a generating module 3, a first executing module 4, a position adjusting module 5, and a recording module 6;
the first obtaining module 1 is configured to obtain a plurality of target rotation angle values;
the second acquisition module 2 is used for acquiring an initial pose of the robot;
the generating module 3 is configured to generate a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
the first execution module 4 is used for moving according to a plurality of target poses;
the position adjusting module 5 is used for adjusting, at each target pose, the position of the robot according to the position of the calibration plate, so that the calibration plate is completely exposed in the visual field of the camera;
and the recording module 6 is used for recording the adjusted pose data of the robot.
In some embodiments, the first acquisition module 1 directly acquires a plurality of preset target rotation angle values (e.g., 25 °, 30 °, 35 °, 40 °, 45 °, 50 °, -25 °, -30 °, -35 °, -40 °, -45 °, -50 °).
In other embodiments, the first obtaining module 1 obtains the angle range and the target number, and then selects the corresponding number of angle values within the angle range as the target rotation angle value according to the target number. The angle values can be selected at equal intervals in the angle range according to the target number, and the angle values can also be selected randomly in the angle range.
The second acquisition module 2 acquires the initial attitude and the initial position of the robot when acquiring the initial pose of the robot;
so that when the generation module 3 generates a plurality of object poses based on the plurality of object rotation angle values and the initial pose,
according to each target rotation angle value, it calculates the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the normalized quaternions and normalizes the result, to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain the plurality of target poses.
When the first execution module 4 moves according to the plurality of target poses, the robot is sequentially moved to each target pose.
In some embodiments, when the generating module 3 generates a plurality of target pose data, the target pose data are numbered according to the sequence of the calculated target pose data, so that the first executing module 4 sequentially moves the robot to the target poses according to the sequence of the numbering.
In other embodiments, the hand-eye calibration trajectory generation apparatus further includes a third obtaining module; the third acquisition module is used for acquiring a preferred sequence according to the generated data of the plurality of target poses, so that the total path of the robot moving among the target poses according to the preferred sequence is shortest; the first execution module 4 thus moves the robot to each target pose in turn in accordance with the preferred sequence.
Wherein, when the position adjusting module 5 adjusts the position of the robot,
acquiring a picture shot by a camera;
judging whether the calibration plate is complete in the picture;
if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
the robot is moved in the direction of movement required until the calibration plate is completely exposed to the field of view of the camera.
The number of the feature points can be identified to determine whether the calibration board is complete in the picture, but the invention is not limited thereto. For example, the calibration board is a rectangular board, four corner points are used as feature points, the number of the feature points is 4, and if the number of the corner points in the picture is not 4, the incomplete calibration board in the picture is represented.
Wherein, when the position adjusting module 5 judges the moving direction of the robot according to the missing condition of the calibration plate in the picture,
acquiring the missing characteristic point information of the calibration plate in the picture;
and judging the direction in which the robot needs to move according to the positions of the missing characteristic points on the calibration plate and the relative position relationship between the calibration plate and the robot.
The characteristic points are points on the calibration plate, such as the corner points of the plate or dot patterns specially arranged on its upper surface. Taking a rectangular calibration plate as an example, the plate has four corner points; if the corner point at the lower-left corner is missing from the picture, the robot needs to move towards the lower-left corner of the calibration plate, and the direction from the center of the calibration plate to the missing corner point can be taken as the target direction. Because the transformation between the calibration plate coordinate system and the robot base coordinate system can be obtained by calibration in advance (for example, using an existing robot workpiece coordinate system calibration method), the target direction is converted into a direction in the robot base coordinate system, which is the direction in which the robot needs to move.
Therefore, the direction from the center of the calibration plate to the missing feature point (hereinafter referred to as the first target direction) can be obtained first, and then converted into a direction in the robot base coordinate system according to the transformation between the calibration plate coordinate system and the robot base coordinate system; this direction is taken as the direction in which the robot needs to move. If several feature points are missing, the position of the centroid of all missing feature points can be calculated, the direction from the center of the calibration plate to this centroid (hereinafter referred to as the second target direction) obtained, and the second target direction converted into a direction in the robot base coordinate system in the same way, this direction being taken as the direction in which the robot needs to move.
If the part of the calibration plate shown in the picture does not contain the center point of the calibration plate, the centroid of the displayed part can be obtained first (obtaining the centroid of a region of a picture is prior art); this centroid is used in place of the center point of the calibration plate in the above method to move the robot until the center of the calibration plate appears in the picture, after which the above method is carried out using the center point of the calibration plate.
Wherein, when the position adjusting module 5 moves the robot in the required direction,
the robot is moved step by step with a preset step length (this step length may be referred to as the first step length);
acquiring a picture shot by a camera after each movement and judging whether a calibration board is complete in the picture;
and if the calibration plate is complete in the picture, stopping moving.
A picture taken by the camera is acquired after each movement by the first step length, and whether the calibration plate is complete in the picture is judged again; since the feature points missing from the picture may change after each movement, the direction in which the robot needs to move is re-evaluated accordingly. In this way, the moving direction of the robot is automatically updated as the part of the calibration plate visible in the picture changes while the robot moves, so that the calibration plate can be reliably brought completely into the field of view of the camera.
If the acquired target rotation angle value is too large, some images shot at the target poses may not contain the calibration plate; at this time, the position adjusting module 5 may move the robot according to a preset path until the calibration board appears in the field of view of the camera, and then perform the above steps: acquiring a picture shot by a camera; judging whether the calibration plate is complete in the picture; if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture; the robot is moved in the direction of movement required until the calibration plate is completely exposed to the field of view of the camera.
For example, the preset path may be: move forward by a preset distance, return to the initial position and move backward by the preset distance, return to the initial position and move left by the preset distance, then return to the initial position and move right by the preset distance, stopping as soon as the calibration plate is found in the visual field of the camera during the movement. Alternatively, the preset path may be a spiral path of gradually increasing radius, the movement stopping as soon as the calibration plate is found in the visual field of the camera while moving along the spiral. When moving along the preset path, the robot may move step by step with a preset step length (this step length may be referred to as the second step length), acquiring a picture taken by the camera after each step and judging whether the calibration plate appears in the picture.
In the above way, the hand-eye calibration trajectory generation device obtains a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; recording the adjusted pose data of the robot; therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.
Referring to fig. 3, an electronic device 100 according to an embodiment of the present application further includes a processor 101 and a memory 102, where the memory 102 stores a computer program, and the processor 101 is configured to execute the steps of the above-mentioned hand-eye calibration trajectory generation method by calling the computer program stored in the memory 102.
The processor 101 is electrically connected to the memory 102. The processor 101 is a control center of the electronic device 100, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or calling a computer program stored in the memory 102 and calling data stored in the memory 102, thereby performing overall monitoring of the electronic device.
The memory 102 may be used to store computer programs and data. The memory 102 stores computer programs containing instructions executable in the processor. The computer program may constitute various functional modules. The processor 101 executes various functional applications and data processing by calling a computer program stored in the memory 102.
In this embodiment, the processor 101 in the electronic device 100 loads instructions corresponding to one or more processes of the computer program into the memory 102, and the processor 101 runs the computer program stored in the memory 102 according to the following steps, so as to implement various functions: obtaining a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; and recording the adjusted pose data of the robot.
According to the above, the electronic device obtains a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; recording the adjusted pose data of the robot; therefore, the calibration track does not need to be designed manually, the working efficiency can be improved, and the requirements on the professional level and experience of workers are low.
An embodiment of the present application further provides a storage medium, on which a computer program is stored, where the computer program runs the steps of the above-mentioned hand-eye calibration trajectory generation method when being executed by a processor, so as to implement the following functions: obtaining a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value; moving according to the plurality of target poses; at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera; and recording the adjusted pose data of the robot.
The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk.
In summary, although the present invention has been described with reference to the preferred embodiments, the above preferred embodiments are not intended to limit the present invention. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and such changes and modifications shall also fall within the protection scope of the present invention.

Claims (10)

1. A hand-eye calibration track generation method is applied to a robot and is characterized by comprising the following steps:
A1. obtaining a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
A4. moving according to the plurality of target poses;
A5. at each target pose, performing position adjustment according to the position of the calibration plate to enable the calibration plate to be completely exposed in the visual field of the camera;
A6. and recording the adjusted pose data of the robot.
2. The hand-eye calibration trajectory generation method according to claim 1, wherein step a2 includes: acquiring an initial posture and an initial position of the robot;
step a3 includes:
A301. according to each target rotation angle value, calculating the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the normalized quaternions and normalizing the result, to obtain a plurality of target attitudes;
A303. and combining the initial position with each of the target attitudes to obtain the plurality of target poses.
3. The hand-eye calibration trajectory generation method according to claim 1, wherein step a5 includes:
A501. acquiring a picture shot by a camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
A504. and moving the robot in the required direction until the calibration board is completely exposed in the visual field of the camera.
4. The hand-eye calibration trajectory generation method according to claim 3, wherein the step A503 comprises:
acquiring the missing characteristic point information of the calibration plate in the picture;
and judging the direction in which the robot needs to move according to the positions of the missing characteristic points on the calibration plate and the relative position relationship between the calibration plate and the robot.
5. The method for generating a hand-eye calibration trajectory according to claim 3, wherein step A504 includes:
moving the robot step by step in the required direction with a preset step length;
acquiring a picture shot by a camera after each movement and judging whether a calibration board is complete in the picture;
and if the calibration plate is complete in the picture, stopping moving.
6. A hand-eye calibration track generation device applied to a robot is characterized by comprising:
the first acquisition module is used for acquiring a plurality of target rotation angle values;
the second acquisition module is used for acquiring the initial pose of the robot;
the generating module is used for generating a plurality of target poses according to the plurality of target rotation angle values and the initial poses; the target pose is obtained after the robot rotates from the initial pose according to the target rotation angle value;
the first execution module is used for moving according to the plurality of target poses;
the position adjusting module is used for adjusting, at each target pose, the position of the robot according to the position of the calibration plate, so that the calibration plate is completely exposed in the visual field of the camera;
and the recording module is used for recording the adjusted pose data of the robot.
7. The hand-eye calibration trajectory generation device according to claim 6, wherein the second acquisition module acquires an initial attitude and an initial position of the robot when acquiring the initial pose of the robot;
the generation module, in generating a plurality of target poses from the plurality of target rotation angle values and an initial pose,
according to each target rotation angle value, calculates the normalized quaternion for the robot rotating by the corresponding angle around each of the X, Y and Z axes, to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the normalized quaternions and normalizes the result, to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain the plurality of target poses.
8. The hand-eye calibration trajectory generation device according to claim 6, wherein the position adjustment module, when adjusting the position of the robot,
acquiring a picture shot by a camera;
judging whether the calibration plate is complete in the picture;
if the calibration plate is not complete in the picture, judging the direction in which the robot needs to move according to the condition that the calibration plate is missing in the picture;
and moving the robot in the required direction until the calibration board is completely exposed in the visual field of the camera.
9. An electronic device, comprising a processor and a memory, wherein the memory stores a computer program, and the processor is configured to execute the steps of the hand-eye calibration trajectory generation method according to any one of claims 1 to 5 by calling the computer program stored in the memory.
10. A storage medium having a computer program stored thereon, wherein the computer program, when being executed by a processor, executes the steps of the hand-eye calibration trajectory generation method according to any one of claims 1 to 5.
CN202010923062.5A 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium Active CN112116664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010923062.5A CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010923062.5A CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112116664A true CN112116664A (en) 2020-12-22
CN112116664B CN112116664B (en) 2024-05-28

Family

ID=73802260

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010923062.5A Active CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112116664B (en)


Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160059419A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Robot apparatus and method for controlling robot apparatus
CN107478223A (en) * 2016-06-08 2017-12-15 南京理工大学 A kind of human body attitude calculation method based on quaternary number and Kalman filtering
CN107063228A (en) * 2016-12-21 2017-08-18 上海交通大学 Targeted attitude calculation method based on binocular vision
CN106885585A (en) * 2016-12-30 2017-06-23 国家测绘地理信息局卫星测绘应用中心 A kind of satellite borne photography measuring system integration calibration method based on bundle adjustment
US20190015988A1 (en) * 2017-07-11 2019-01-17 Seiko Epson Corporation Robot control device, robot, robot system, and calibration method of camera for robot
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN108549322A (en) * 2018-04-11 2018-09-18 广州启帆工业机器人有限公司 Pose synchronization method and device for arc track motion of robot
CN108592950A (en) * 2018-05-17 2018-09-28 北京航空航天大学 A kind of monocular camera and Inertial Measurement Unit are with respect to established angle scaling method
JP2018202608A (en) * 2018-09-28 2018-12-27 キヤノン株式会社 Robot device, control method of robot device, program, and recording medium
CN110202573A (en) * 2019-06-04 2019-09-06 上海知津信息科技有限公司 Full-automatic hand and eye calibrating, working face scaling method and device
CN110497386A (en) * 2019-08-26 2019-11-26 中科新松有限公司 A kind of cooperation Robot Hand-eye relationship automatic calibration device and method
CN111152223A (en) * 2020-01-09 2020-05-15 埃夫特智能装备股份有限公司 Full-automatic robot hand-eye calibration method
CN111347426A (en) * 2020-03-26 2020-06-30 季华实验室 Mechanical arm accurate placement track planning method based on 3D vision
CN111515944A (en) * 2020-03-30 2020-08-11 季华实验室 Automatic calibration method for non-fixed path robot
CN111415417A (en) * 2020-04-14 2020-07-14 大连理工江苏研究院有限公司 Mobile robot topology experience map construction method integrating sparse point cloud

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
3D视觉工坊: "Basic principles of hand-eye calibration" (https://blog.csdn.net/Yong_Qi2015/article/details/83960141), CSDN, pages 1 *
JOCHEN SCHMIDT: "Data Selection for Hand-eye Calibration: A Vector Quantization Approach", THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, vol. 27, no. 9, 30 September 2008 (2008-09-30), pages 1027 - 1053 *
JOCHEN SCHMIDT: "Robust Hand–Eye Calibration of an Endoscopic Surgery Robot Using Dual Quaternions", DAGM 2003:PATTERN RECOGNITION, 12 September 2003 (2003-09-12), pages 548 - 556 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439555A (en) * 2022-08-29 2022-12-06 佛山职业技术学院 Multi-phase machine external parameter calibration method without public view field

Also Published As

Publication number Publication date
CN112116664B (en) 2024-05-28

Similar Documents

Publication Publication Date Title
CN113246135B (en) Robot hand-eye calibration method and device, electronic equipment and storage medium
CN104842352B (en) Robot system using visual feedback
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US20090118864A1 (en) Method and system for finding a tool center point for a robot using an external camera
US11014233B2 (en) Teaching point correcting method, program, recording medium, robot apparatus, imaging point creating method, and imaging point creating apparatus
CN112091971A (en) Robot eye calibration method and device, electronic equipment and system
CN106514651A (en) Measurement system and calibration method
US20080027580A1 (en) Robot programming method and apparatus with both vision and force
CN106003021A (en) Robot, robot control device, and robotic system
CN114355953B (en) High-precision control method and system of multi-axis servo system based on machine vision
CN106003020A (en) Robot, robot control device, and robotic system
CN112123341B (en) Robot double-arm coordinated motion control method and device and electronic equipment
CN113664838B (en) Robot positioning placement control method and device, electronic equipment and storage medium
CN113664835B (en) Automatic hand-eye calibration method and system for robot
CN114147728B (en) Universal robot eye on-hand calibration method and system
CN112907682B (en) Hand-eye calibration method and device for five-axis motion platform and related equipment
CN113021358A (en) Method and device for calibrating origin of coordinate system of mechanical arm tool and electronic equipment
CN112809668A (en) Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN112116664B (en) Method and device for generating hand-eye calibration track, electronic equipment and storage medium
CN114505864B (en) Hand-eye calibration method, device, equipment and storage medium
US11577400B2 (en) Method and apparatus for managing robot system
CN112743548B (en) Method, system and terminal for unifying hand-eye calibration of two mechanical arms
CN114516051B (en) Front intersection method and system for three or more degrees of freedom robot vision measurement
KR102333281B1 (en) Robot arm for teaching location where tools to move and operation method thereof
CN112184819A (en) Robot guiding method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant