CN112720474A - Pose correction method and device for robot, terminal device and storage medium


Info

Publication number
CN112720474A
CN112720474A (application CN202011523735.4A)
Authority
CN
China
Prior art keywords
robot; preset; coordinate; acquiring; included angle
Prior art date
Legal status
Pending (the legal status is an assumption and is not a legal conclusion)
Application number
CN202011523735.4A
Other languages
Chinese (zh)
Inventor
刘培超
蔡同彪
郎需林
解俊杰
刘主福
Current Assignee
Shenzhen Yuejiang Technology Co Ltd
Original Assignee
Shenzhen Yuejiang Technology Co Ltd
Priority date: 2020-12-21
Filing date: 2020-12-21
Publication date: 2021-04-30
Application filed by Shenzhen Yuejiang Technology Co Ltd filed Critical Shenzhen Yuejiang Technology Co Ltd
Priority to CN202011523735.4A
Publication of CN112720474A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The application is applicable to the technical field of robots and provides a pose correction method, a pose correction device, a terminal device and a storage medium for a robot. The method includes: when the robot is detected to have reached the target position, controlling the execution end of the mechanical arm of the robot to move above the target object; when the execution end has moved above the target object and a preset marker is detected within the visual range of the robot, acquiring the pixel coordinates of the marker point with the preset feature, wherein the preset marker includes the marker point of the preset feature; acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the marker point of the preset feature; acquiring a first coordinate of the marker point in the mechanical arm coordinate system according to the pixel coordinates of the marker point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system; and correcting the pose of the robot according to the first included angle and the first coordinate. The embodiment of the application can correct the pose of the robot simply and effectively.

Description

Pose correction method and device for robot, terminal device and storage medium
Technical Field
The application belongs to the technical field of robots, and particularly relates to a pose correction method and device for a robot, a terminal device and a storage medium.
Background
With the rapid development of artificial intelligence, a wide variety of intelligent products have emerged. The intelligent cart is one type of mobile robot: it can travel from a starting position to a target position, in an autonomous navigation mode or a runway mode, and grab a target object after arriving.
For example, in one application scenario, when the robot stops at the target position to grab an object from a storage box, the pose of the robot on the runway must first be adjusted before the object can be grabbed smoothly. The existing pose correction method is generally implemented with a sensor (such as a gyroscope), and such a sensor may accumulate errors while the robot walks for a long time, so the accuracy of the robot's pose correction is low.
Disclosure of Invention
The embodiments of the application provide a robot pose correction method and device, a terminal device and a storage medium, aiming to solve the problem that the accuracy of existing robot pose correction is not high.
In a first aspect, an embodiment of the present application provides a pose correction method for a robot, including:
when the robot is detected to reach a target position, controlling an execution tail end of a mechanical arm on the robot to move above a target object;
when the execution tail end moves above the target object and a preset marker is detected in the visual range of the robot, acquiring pixel coordinates of a marker point with a preset characteristic; wherein the preset markers comprise marker points of the preset features;
acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points of the preset characteristics;
acquiring a first coordinate of the mark point under a mechanical arm coordinate system according to the pixel coordinate of the mark point and a conversion relation between a pixel coordinate system and the mechanical arm coordinate system;
and correcting the pose of the robot according to the first included angle and the first coordinate.
In one embodiment, the acquiring of the pixel coordinates of the marker point of the preset feature when the execution end moves above the target object and the preset marker is detected within the visual range of the robot includes:
when the execution tail end moves above a target object and at least two preset markers are detected in the visual range of the robot, respectively acquiring pixel coordinates of marker points of preset features in the at least two markers;
The acquiring of the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features includes:
acquiring the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features in the at least two markers.
In one embodiment, the calculation formula for obtaining the first included angle between the robot and the preset path according to the pixel coordinates of the mark points of the preset features in the at least two markers is as follows:
$$\theta = \arctan\frac{y_2 - y_1}{x_2 - x_1}$$

where $\theta$ represents the first included angle, $(x_1, y_1)$ represents the pixel coordinates of the marker point of the preset feature in a first of the at least two markers, and $(x_2, y_2)$ represents the pixel coordinates of the marker point of the preset feature in a second of the at least two markers.
In one embodiment, the correcting the pose of the robot according to the first included angle and the first coordinate includes:
adjusting the angle of the robot according to a first difference between the first included angle and a second included angle of a preset standard position, until the first difference between the first included angle and the second included angle is smaller than or equal to a first threshold;
and adjusting the horizontal and vertical offsets of the robot according to the first coordinate and a second coordinate of the mark point of the preset feature in the preset standard position in a mechanical arm coordinate system until the spacing distance between the first coordinate and the second coordinate is less than or equal to a second threshold value.
In one embodiment, before controlling the executing end of the robot arm on the robot to move above the target object when the robot is detected to reach the target position, the method further comprises:
and calibrating the robot to obtain the conversion relation.
In one embodiment, the transformation relationship comprises a transformation matrix;
the calibrating the robot to obtain the conversion relation comprises:
acquiring pixel coordinates of preset calibration points at N different calibration positions;
acquiring third coordinates of preset calibration points at N different calibration positions under a mechanical arm coordinate system;
calculating to obtain the conversion matrix according to the pixel coordinates of the N preset calibration points and the third coordinates of the N preset calibration points under the mechanical arm coordinate system; wherein N is more than or equal to 3.
In one embodiment, before controlling the executing end of the robot arm on the robot to move above the target object when the robot is detected to reach the target position, the method further comprises:
when the robot is under a preset standard position, acquiring a second included angle between the robot and a preset path;
and when the robot is under a preset standard position, acquiring a second coordinate of the mark point of the preset characteristic under a mechanical arm coordinate system.
In a second aspect, an embodiment of the present application provides a pose correction apparatus for a robot, including:
the moving module is used for controlling the execution tail end of a mechanical arm on the robot to move above a target object when the robot is detected to reach the target position;
the first acquisition module is used for acquiring the pixel coordinates of a mark point with preset characteristics when the execution tail end moves above the target object and a preset mark is detected in the visual range of the robot; wherein the preset markers comprise marker points of the preset features;
the second acquisition module is used for acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points of the preset characteristics;
the third acquisition module is used for acquiring a first coordinate of the mark point under a mechanical arm coordinate system according to the pixel coordinate of the mark point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system;
and the correction module is used for correcting the pose of the robot according to the first included angle and the first coordinate.
In a third aspect, the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the robot pose correction method when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the steps of the pose correction method for a robot.
In a fifth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to execute the steps of the robot pose correction method described above.
Compared with the prior art, the first aspect of the embodiments of the application has the following beneficial effects. When the robot is detected to have reached the target position, it may not actually have reached that position accurately and may have deviated. The execution end of the mechanical arm is then controlled to move above the target object, and the preset marker is detected within the visual range while the execution end moves above the target object. When the preset marker is detected, the pixel coordinates of the marker point of the preset feature in the preset marker are acquired, and the first included angle between the robot and the preset path, that is, the offset angle of the robot, is obtained directly from these pixel coordinates. The first coordinate of the marker point of the preset feature in the mechanical arm coordinate system is then obtained from the pixel coordinates of the marker point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system, and the pose of the robot is corrected according to the first included angle and the first coordinate. The pose of the robot can thus be corrected simply and effectively.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of a pose correction method for a robot according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a correct pose in a preset standard position in an application scenario provided by an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a situation in which a pose in an application scene is deviated according to an embodiment of the present application;
fig. 4 is another schematic diagram of a situation in which a pose in an application scenario is deviated according to an embodiment of the present application;
fig. 5 is a schematic flowchart of step S105 according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a pose correction apparatus of a robot according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of a robot according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The pose correction method of the robot provided by the embodiments of the application can be applied to robots or other intelligent devices with a moving function. The robot may specifically be a mobile robot, and the mobile robot may specifically be an intelligent cart. The embodiments of the application do not impose any limit on the specific type of the robot.
In order to explain the technical means described in the present application, the following examples are given below.
Referring to fig. 1, a pose correction method for a robot according to an embodiment of the present application includes:
and S101, when the robot is detected to reach a target position, controlling the execution tail end of the mechanical arm on the robot to move above a target object.
Specifically, the detection that the robot has reached the target position may be a determination that the robot reaches the target position when it arrives according to a preset planned path, or a determination that the robot reaches the preset target position based on a specific auxiliary device (e.g., a runway with a magnetic stripe attached, a track, etc.). When the robot is detected to have reached the target position, the execution end of the mechanical arm on the robot is controlled to move above the preset target object. At that moment, the actual position reached by the robot may deviate from the preset standard position, so the execution end of the mechanical arm is controlled to move above the target object. The target object can be a preset storage box or another object, and it helps the robot adjust its position.
In one embodiment, before controlling the executing end of the robot arm on the robot to move above the target object when the robot is detected to reach the target position, the method further comprises: when the robot is under a preset standard position, acquiring a second included angle between the robot and a preset path; and when the robot is under a preset standard position, acquiring a second coordinate of the mark point of the preset characteristic under a mechanical arm coordinate system.
Specifically, the robot is placed in the preset standard position in advance by means of manual calibration; the preset standard position is the position the robot really needs to reach. In the preset standard position, the robot obtains the second included angle between itself and the preset path. The preset path may also be a preset reference path; for example, if the robot walks along a runway, the preset path may be a path that uses the runway as a reference. It will be understood that the second included angle is the angle between the robot in the standard position and the preset path. When the robot is in the preset standard position, the second coordinate of the marker point of the preset feature in the mechanical arm coordinate system is also obtained.
In one embodiment, the marker point of the preset feature may be a marker point on a marker that can be detected within the visual range at the preset standard position; the marker may be an AprilTag code, and the marker point may be the central point of the marker or another point with a recognizable feature. For example, in a specific application scenario, if the robot is an intelligent cart, the cart moves along a preset track to a designated position (i.e., the target position) and grabs a target object from a preset storage box, where the target object may be a corresponding piece of cargo, thereby performing a cargo-picking function. The relative position between the target object (i.e., the storage box) and the target position is stored in advance, and the execution end of the mechanical arm on the robot is controlled to move above the target object according to this pre-stored relative position.
And S102, when the execution tail end moves to the position above the target object and a preset marker is detected in the visual range of the robot, acquiring the pixel coordinates of a marker point with a preset characteristic.
Specifically, the preset marker includes the marker point of the preset feature. A preset marker may be set in advance at a certain position near the target object. When the execution end moves above the target object and the preset marker is detected within the visual range of the robot, the pixel coordinates of the marker point of the preset feature are obtained; these pixel coordinates can be understood as coordinates in the pixel coordinate system of the robot.
The marker point of the preset feature may be an identifiable point having a specific feature. For example, it may be the central point of the preset marker: when the preset marker is detected, the central point of the preset marker is located and its pixel coordinates are obtained. The marker point of the preset feature may also be a point having a certain texture, color or other specifically recognizable characteristic.
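As a rough illustration of step S102, the sketch below detects AprilTag-style markers with OpenCV's aruco module and takes the tag centre as the marker point of the preset feature. The dictionary choice, the function name and the assumption that the markers are AprilTag codes are illustrative only, not part of the embodiment.

```python
# Illustrative sketch of step S102: detect the preset markers and return the
# pixel coordinates of their centre points. Assumes AprilTag 36h11 markers and
# an OpenCV build whose aruco module provides cv2.aruco.detectMarkers.
import cv2

def detect_marker_centers(image):
    """Return {tag_id: (u, v)}: pixel coordinates of each detected marker centre."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    centers = {}
    if ids is not None:
        for tag_id, quad in zip(ids.flatten(), corners):
            u, v = quad[0].mean(axis=0)  # quad[0] is the (4, 2) corner array
            centers[int(tag_id)] = (float(u), float(v))
    return centers
```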
Step S103, acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points with the preset characteristics.
Specifically, a first included angle between the robot and the preset path may be obtained according to pixel coordinates of at least two mark points of the preset feature, for example, the first included angle may be an included angle between a connection line of the two mark points of the preset feature and the preset path (runway).
In one embodiment, the obtaining the pixel coordinates of the marker point of the preset feature when the execution terminal moves above the target object and the preset marker is detected in the visual range of the robot includes: when the execution tail end moves to the position above the target object and at least two preset markers are detected in the visual range of the robot, the pixel coordinates of the marker points of preset features in the at least two markers are respectively acquired.
For example, in one application scenario, two markers (detectable targets, for example AprilTag codes) may be attached at certain positions of the position that needs to be corrected (i.e., the preset standard position). The two markers can be distinguished when detected (for example, a different identifier is set for each marker in advance), and the pixel coordinates of the marker point (for example, the central point) of the preset feature in each marker can be detected in the pixel coordinate system. When the execution end moves above the target object (i.e., the storage box in the drawing) and at least two preset markers are detected within the visual range of the robot, the pixel coordinates of the marker point of the preset feature in each preset marker are acquired respectively. Fig. 2 is a schematic diagram of the correct pose of the intelligent cart in the preset standard position, fig. 3 is a schematic diagram of the first included angle between the intelligent cart and the preset path when the pose of the cart deviates, and fig. 4 is a schematic diagram of the markers shot by the camera on the intelligent cart when its pose deviates as in fig. 3.
The acquiring of the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features includes: acquiring the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features in the at least two markers.
In one embodiment, the calculation formula for obtaining the first included angle between the robot and the preset path according to the pixel coordinates of the mark points of the preset features in the at least two markers is as follows:
$$\theta = \arctan\frac{y_2 - y_1}{x_2 - x_1}$$

where $\theta$ represents the first included angle, $(x_1, y_1)$ represents the pixel coordinates of the marker point of the preset feature in a first of the at least two markers, and $(x_2, y_2)$ represents the pixel coordinates of the marker point of the preset feature in a second of the at least two markers. That is, $(x_1, y_1)$ denotes the pixel coordinates of the marker point of the preset feature in any one of the preset markers, and $(x_2, y_2)$ denotes the pixel coordinates of the marker point of the preset feature in any preset marker other than the one corresponding to $(x_1, y_1)$.
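A minimal sketch of this angle computation, assuming the preset path is aligned with the pixel x-axis of the image; atan2 is used so that the sign of the returned angle also indicates the direction of the deviation:

```python
# Sketch of the first-included-angle computation from two marker points.
import math

def first_included_angle(p1, p2):
    """Angle (radians) between the line through two marker points and the
    pixel x-axis, assumed here to be aligned with the preset path."""
    (x1, y1), (x2, y2) = p1, p2
    return math.atan2(y2 - y1, x2 - x1)
```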
And step S104, acquiring a first coordinate of the mark point in a mechanical arm coordinate system according to the pixel coordinate of the mark point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system.
Specifically, the first coordinate of the marker point of the preset feature in the preset marker in the mechanical arm coordinate system is obtained according to the detected pixel coordinates of that marker point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system.
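A minimal sketch of step S104, assuming the conversion relation is a 2x3 affine matrix obtained from the hand-eye calibration described below; the matrix shape is an assumption made for illustration, as the embodiment only specifies a conversion matrix:

```python
# Sketch of step S104: map a pixel coordinate to the mechanical arm
# coordinate system with an affine conversion matrix T of shape (2, 3).
import numpy as np

def pixel_to_arm(pixel_uv, T):
    """Return the (x, y) arm coordinate for pixel (u, v)."""
    u, v = pixel_uv
    return T @ np.array([u, v, 1.0])  # homogeneous pixel -> arm coordinates
```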
And S105, correcting the pose of the robot according to the first included angle and the first coordinate.
Specifically, the angular offset of the robot can be known from the relationship between the first included angle and the second included angle of the preset standard position, and the angle of the robot is adjusted accordingly. The offsets of the robot in the front-rear and left-right directions can be known from the first coordinate of the marker point of the preset feature in the mechanical arm coordinate system and the second coordinate, in the mechanical arm coordinate system, of the marker point of the preset feature corresponding to the preset standard position, and the offsets of the robot in the front-rear and left-right directions are adjusted accordingly.
In one embodiment, as shown in fig. 5, correcting the pose of the robot according to the first included angle and the first coordinate includes steps S1051 to S1052:
step S1051, adjusting the angle of the robot according to a first difference value between the first included angle and a second included angle of a preset standard position until the first difference value between the first included angle and the standard included angle is smaller than or equal to a first threshold value.
Specifically, in order to further ensure the accuracy of the correction, an angle threshold (i.e., the first threshold) is preset; the angle of the robot is adjusted until the first difference between the first included angle and the second included angle is less than or equal to the first threshold, at which point the angle adjustment stops. The angle of the robot may be adjusted in steps of a preset adjustment angle.
Step S1052, adjusting the horizontal and vertical offsets of the robot according to the first coordinate and a second coordinate of the mark point of the preset feature in the preset standard position in the mechanical arm coordinate system until a distance between the first coordinate and the second coordinate is less than or equal to a second threshold.
Specifically, the second threshold includes a second threshold in the horizontal direction and a second threshold in the vertical direction. After the angle of the robot has been adjusted, the robot may still have a horizontal or vertical offset. The horizontal and vertical offsets of the robot are adjusted according to the first coordinate and the second coordinate, in the mechanical arm coordinate system, of the marker point of the preset feature in the preset standard position, until the distance between the first coordinate and the second coordinate in the horizontal direction is less than or equal to the second threshold in the horizontal direction and the distance in the vertical direction is less than or equal to the second threshold in the vertical direction.
When the first difference between the first included angle and the second included angle is smaller than or equal to the first threshold, the distance between the first coordinate and the second coordinate in the horizontal direction is smaller than or equal to the second threshold in the horizontal direction, and the distance between them in the vertical direction is smaller than or equal to the second threshold in the vertical direction, the correction stops; otherwise, the process returns to "controlling the execution end of the mechanical arm on the robot to move above the target object" in step S101 and the subsequent steps.
Assume the first coordinate is $[x_a, y_a]$ and the second coordinate is $[x_b, y_b]$. The horizontal and vertical offsets of the robot are then calculated as

$$\Delta x = x_a - x_b, \qquad \Delta y = y_a - y_b$$

where $\Delta x$ represents the offset in the left-right direction (its sign indicating left or right) and $\Delta y$ represents the offset in the front-rear direction (its sign indicating front or rear); alternatively, $\Delta x$ may represent the offset in the front-rear direction and $\Delta y$ the offset in the left-right direction, depending on the coordinate system defined for the mechanical arm in the actual application, which is not limited here.
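An illustrative sketch of the overall correction loop of steps S1051 and S1052. The `robot` interface, its helper methods and the threshold values are assumptions made for illustration, not part of the embodiment:

```python
# Sketch of the correction loop: re-measure, compare against the standard
# pose, and adjust until the angle difference and both offsets are within
# their thresholds. `robot` is a hypothetical interface object.
def correct_pose(robot, second_angle, second_coord,
                 angle_threshold=0.01, offset_threshold=2.0):
    while True:
        # Hypothetical helper: re-detects the markers and returns the current
        # first included angle and first coordinate.
        first_angle, first_coord = robot.measure_pose()
        d_angle = first_angle - second_angle
        dx = first_coord[0] - second_coord[0]
        dy = first_coord[1] - second_coord[1]
        if (abs(d_angle) <= angle_threshold
                and abs(dx) <= offset_threshold
                and abs(dy) <= offset_threshold):
            break                      # pose within tolerance: stop correcting
        robot.rotate(-d_angle)         # reduce the angular offset first
        robot.translate(-dx, -dy)      # then reduce the horizontal/vertical offsets
```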
In one embodiment, step S101 is preceded by: and calibrating the robot to obtain the conversion relation.
Specifically, hand-eye calibration is an important step in achieving the robot's hand-eye coordination, so the hand-eye calibration of the robot is performed in advance. Its purpose is to calibrate the relative pose relationship between the mechanical arm and the camera of the robot; the camera is mounted on the mechanical arm in advance, so the camera and the mechanical arm have a fixed relative pose. With N different positions preset for the execution end of the mechanical arm, the relation matrix between the pixel coordinate system and the mechanical arm coordinate system is calculated through N-point calibration, where N >= 3, and N can generally be set to 9.
In one embodiment, the transformation relationship comprises a transformation matrix; calibrating the robot to obtain the conversion relation comprises: acquiring pixel coordinates of preset calibration points at N different calibration positions; acquiring third coordinates of preset calibration points at N different calibration positions under a mechanical arm coordinate system; calculating to obtain the conversion matrix according to the pixel coordinates of the N preset calibration points and the third coordinates of the N preset calibration points under the mechanical arm coordinate system; wherein N is more than or equal to 3.
Specifically, the pixel coordinates of the preset calibration points at the N different calibration positions and the third coordinates of the corresponding preset calibration points in the mechanical arm coordinate system are obtained, and the conversion matrix can be solved from these N point pairs. For example, the pixel coordinates of N calibration points at different positions can be obtained, the end of the mechanical arm is aligned in turn with the marker points of the calibration object in a manual or automatic mode to obtain the coordinates at the N positions in the mechanical arm coordinate system, and the coordinate conversion matrix is then solved from the N point pairs.
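A minimal sketch of this N-point calibration, under the assumption that the pixel-to-arm mapping is affine and is solved by linear least squares; the 2x3 matrix form matches the conversion sketch earlier and is likewise an assumption:

```python
# Sketch of N-point hand-eye calibration (N >= 3, commonly N = 9): fit a
# 2x3 matrix T such that arm_xy ~= T @ [u, v, 1] by linear least squares.
import numpy as np

def solve_conversion_matrix(pixel_pts, arm_pts):
    """pixel_pts, arm_pts: (N, 2) arrays of matched points; returns T of shape (2, 3)."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)            # (N, 2) pixel coordinates
    arm_pts = np.asarray(arm_pts, dtype=float)                # (N, 2) arm coordinates
    A = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])  # (N, 3) homogeneous pixels
    T, *_ = np.linalg.lstsq(A, arm_pts, rcond=None)           # least-squares solve of A @ T
    return T.T                                                # transpose to (2, 3)
```

With the commonly used 9-point calibration, `pixel_pts` and `arm_pts` would each hold the nine matched calibration points.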
In the embodiment of the application, when the robot is detected to have reached the target position, it may not actually have reached that position accurately and may have deviated. The execution end of the mechanical arm is then controlled to move above the target object, and the preset marker is detected within the visual range while the execution end moves above the target object. When the preset marker is detected, the pixel coordinates of the marker point of the preset feature in the preset marker are acquired, and the first included angle between the robot and the preset path, that is, the offset angle of the robot, is obtained directly from these pixel coordinates. The first coordinate of the marker point of the preset feature in the mechanical arm coordinate system is then obtained from the pixel coordinates of the marker point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system, and the pose of the robot is corrected according to the first included angle and the first coordinate. The pose of the robot can thus be corrected simply and effectively.
Fig. 6 shows a block diagram of a pose correction apparatus of a robot according to an embodiment of the present application, which corresponds to the pose correction method of a robot according to the above embodiment, and only shows portions related to the embodiment of the present application for convenience of description. Referring to fig. 6, the apparatus includes:
a moving module 601, configured to control an execution end of a mechanical arm on the robot to move above a target object when it is detected that the robot reaches a target position;
a first obtaining module 602, configured to obtain pixel coordinates of a mark point of a preset feature when the execution end moves above the target object and a preset mark is detected in a visual range of the robot; wherein the preset markers comprise marker points of the preset features;
a second obtaining module 603, configured to obtain a first included angle between the robot and a preset path according to the pixel coordinates of the mark point of the preset feature;
a third obtaining module 604, configured to obtain a first coordinate of the mark point in a robot arm coordinate system according to the pixel coordinate of the mark point and a conversion relationship between a pixel coordinate system and the robot arm coordinate system;
and the correcting module 605 is configured to correct the pose of the robot according to the first included angle and the first coordinate.
In one embodiment, the first obtaining module comprises:
the robot comprises a first acquisition unit, a second acquisition unit and a control unit, wherein the first acquisition unit is used for respectively acquiring the pixel coordinates of a mark point of a preset feature in at least two marks when the execution tail end moves to the position above a target object and the at least two preset marks are detected in the visual range of the robot;
the second obtaining unit is used for obtaining a first included angle between the robot and a preset path according to the pixel coordinates of the mark points of the preset features, and comprises:
and the third acquisition unit is used for acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points with preset characteristics in the at least two markers.
In one embodiment, the calculation formula for obtaining the first included angle between the robot and the preset path according to the pixel coordinates of the mark points of the preset features in the at least two markers is as follows:
$$\theta = \arctan\frac{y_2 - y_1}{x_2 - x_1}$$

where $\theta$ represents the first included angle, $(x_1, y_1)$ represents the pixel coordinates of the marker point of the preset feature in a first of the at least two markers, and $(x_2, y_2)$ represents the pixel coordinates of the marker point of the preset feature in a second of the at least two markers.
In one embodiment, the correction module comprises:
the first adjusting unit is used for adjusting the angle of the robot according to a first difference value between the first included angle and a second included angle of a preset standard position until the first difference value between the first included angle and the standard included angle is smaller than or equal to a first threshold value;
and the second adjusting unit is used for adjusting the horizontal and vertical offsets of the robot according to the first coordinate and a second coordinate of the mark point of the preset feature in the preset standard position in a mechanical arm coordinate system until the spacing distance between the first coordinate and the second coordinate is smaller than or equal to a second threshold value.
In one embodiment, the pose correction apparatus further includes:
and the calibration module is used for calibrating the robot to obtain the conversion relation.
In one embodiment, the calibration module further comprises:
the fourth acquisition unit is used for acquiring pixel coordinates of preset calibration points at N different calibration positions;
a fifth obtaining unit, configured to obtain third coordinates of the preset calibration points at the N different calibration positions in the mechanical arm coordinate system;
the calculation unit is used for calculating and obtaining the conversion matrix according to the pixel coordinates of the N preset calibration points and the third coordinates of the N preset calibration points under the mechanical arm coordinate system; wherein N is more than or equal to 3.
In one embodiment, the pose correction apparatus further includes:
the sixth acquiring unit is used for acquiring a second included angle between the robot and the preset path when the robot is under the preset standard position;
and the seventh acquiring unit is used for acquiring a second coordinate of the mark point of the preset characteristic in the mechanical arm coordinate system when the robot is under the preset standard position.
In the embodiment of the application, when the robot is detected to have reached the target position, it may not actually have reached that position accurately and may have deviated. The execution end of the mechanical arm is then controlled to move above the target object, and the preset marker is detected within the visual range while the execution end moves above the target object. When the preset marker is detected, the pixel coordinates of the marker point of the preset feature in the preset marker are acquired, and the first included angle between the robot and the preset path, that is, the offset angle of the robot, is obtained directly from these pixel coordinates. The first coordinate of the marker point of the preset feature in the mechanical arm coordinate system is then obtained from the pixel coordinates of the marker point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system, and the pose of the robot is corrected according to the first included angle and the first coordinate. The pose of the robot can thus be corrected simply and effectively.
As shown in fig. 7, an embodiment of the present invention also provides a robot 700 including: a processor 701, a memory 702 and a computer program 703, such as a pose correction program for a robot, stored in said memory 702 and executable on said processor 701. The processor 701 implements the steps in the above-described embodiments of the robot pose correction method when executing the computer program 703. The processor 701, when executing the computer program 703, implements the functions of the modules in the above-described device embodiments, such as the functions of the modules 601 to 605 shown in fig. 6.
Illustratively, the computer program 703 may be partitioned into one or more modules that are stored in the memory 702 and executed by the processor 701 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 703 in the robot 700. For example, the computer program 703 may be divided into a moving module, a first obtaining module, a second obtaining module, a third obtaining module and a correcting module, and specific functions of the modules are described in the foregoing embodiments, and are not described herein again.
The robot 700 may be a robot or other intelligent device with mobile function. The smart device may include, but is not limited to, a processor 701, a memory 702. Those skilled in the art will appreciate that fig. 7 is merely an example of a robot 700 and is not intended to be limiting of robot 700 and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the terminal device may also include input output devices, network access devices, buses, etc.
The Processor 701 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 702 may be an internal storage unit of the robot 700, such as a hard disk or a memory of the robot 700. The memory 702 may also be an external storage device of the robot 700, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the robot 700. Further, the memory 702 may also include both an internal storage unit and an external storage device of the robot 700. The memory 702 is used for storing the computer program and other programs and data required by the terminal device. The memory 702 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A robot pose correction method is characterized by comprising:
when the robot is detected to reach a target position, controlling an execution tail end of a mechanical arm on the robot to move above a target object;
when the execution tail end moves above the target object and a preset marker is detected in the visual range of the robot, acquiring pixel coordinates of a marker point with a preset characteristic; wherein the preset markers comprise marker points of the preset features;
acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points of the preset characteristics;
acquiring a first coordinate of the mark point under a mechanical arm coordinate system according to the pixel coordinate of the mark point and a conversion relation between a pixel coordinate system and the mechanical arm coordinate system;
and correcting the pose of the robot according to the first included angle and the first coordinate.
2. The pose correction method according to claim 1, wherein the acquiring pixel coordinates of a marker point of a preset feature when the execution tip moves above a target object and a preset marker is detected within a visual range of the robot, comprises:
when the execution tail end moves above a target object and at least two preset markers are detected in the visual range of the robot, respectively acquiring pixel coordinates of marker points of preset features in the at least two markers;
the acquiring of the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features comprises:
acquiring the first included angle between the robot and the preset path according to the pixel coordinates of the marker points of the preset features in the at least two markers.
3. The pose correction method according to claim 2, wherein a calculation formula for obtaining a first angle between the robot and a preset path according to pixel coordinates of a marker point of a preset feature in the at least two markers is:
$$\theta = \arctan\frac{y_2 - y_1}{x_2 - x_1}$$

where $\theta$ represents the first included angle, $(x_1, y_1)$ represents the pixel coordinates of the marker point of the preset feature in a first of the at least two markers, and $(x_2, y_2)$ represents the pixel coordinates of the marker point of the preset feature in a second of the at least two markers.
4. The pose correction method according to claim 1, wherein the correcting the pose of the robot according to the first angle and the first coordinate includes:
adjusting the angle of the robot according to a first difference value between the first included angle and a second included angle of a preset standard position until the first difference value between the first included angle and the standard included angle is smaller than or equal to a first threshold value;
and adjusting the horizontal and vertical offsets of the robot according to the first coordinate and a second coordinate of the mark point of the preset feature in the preset standard position in a mechanical arm coordinate system until the spacing distance between the first coordinate and the second coordinate is less than or equal to a second threshold value.
5. The pose correction method according to claim 1, wherein before controlling the execution tip of the robot arm on the robot to move above the target object upon detecting that the robot reaches the target position, further comprising:
and calibrating the robot to obtain the conversion relation.
6. The pose correction method according to claim 5, wherein the conversion relationship includes a conversion matrix;
the calibrating the robot to obtain the conversion relation comprises:
acquiring pixel coordinates of preset calibration points at N different calibration positions;
acquiring third coordinates of preset calibration points at N different calibration positions under a mechanical arm coordinate system;
calculating to obtain the conversion matrix according to the pixel coordinates of the N preset calibration points and the third coordinates of the N preset calibration points under the mechanical arm coordinate system; wherein N is more than or equal to 3.
7. The pose correction method according to claim 1, wherein before controlling the execution tip of the robot arm on the robot to move above the target object upon detecting that the robot reaches the target position, further comprising:
when the robot is under a preset standard position, acquiring a second included angle between the robot and a preset path;
and when the robot is under a preset standard position, acquiring a second coordinate of the mark point of the preset characteristic under a mechanical arm coordinate system.
8. A pose correction apparatus of a robot, characterized by comprising:
the moving module is used for controlling the execution tail end of a mechanical arm on the robot to move above a target object when the robot is detected to reach the target position;
the first acquisition module is used for acquiring the pixel coordinates of a mark point with preset characteristics when the execution tail end moves above the target object and a preset mark is detected in the visual range of the robot; wherein the preset markers comprise marker points of the preset features;
the second acquisition module is used for acquiring a first included angle between the robot and a preset path according to the pixel coordinates of the mark points of the preset characteristics;
the third acquisition module is used for acquiring a first coordinate of the mark point under a mechanical arm coordinate system according to the pixel coordinate of the mark point and the conversion relation between the pixel coordinate system and the mechanical arm coordinate system;
and the correction module is used for correcting the pose of the robot according to the first included angle and the first coordinate.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202011523735.4A 2020-12-21 2020-12-21 Pose correction method and device for robot, terminal device and storage medium Pending CN112720474A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011523735.4A CN112720474A (en) 2020-12-21 2020-12-21 Pose correction method and device for robot, terminal device and storage medium

Publications (1)

Publication Number Publication Date
CN112720474A 2021-04-30 (en)

Family

ID=75605261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011523735.4A Pending CN112720474A (en) 2020-12-21 2020-12-21 Pose correction method and device for robot, terminal device and storage medium

Country Status (1)

Country Link
CN (1) CN112720474A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160079223A (en) * 2014-12-26 2016-07-06 전자부품연구원 Robot control visualization apparatus
CN109623821A * 2018-12-26 2019-04-16 深圳市越疆科技有限公司 Visual guidance method for a manipulator to grasp articles
CN109807885A * 2018-12-29 2019-05-28 深圳市越疆科技有限公司 Vision calibration method and device for a manipulator, and intelligent terminal
CN109754421A * 2018-12-31 2019-05-14 深圳市越疆科技有限公司 Vision calibration method, device and robot controller
CN109848951A (en) * 2019-03-12 2019-06-07 易思维(天津)科技有限公司 Automatic processing equipment and method for large workpiece
CN110842928A (en) * 2019-12-04 2020-02-28 中科新松有限公司 Visual guiding and positioning device and method for compound robot
CN111145257A (en) * 2019-12-27 2020-05-12 深圳市越疆科技有限公司 Article grabbing method and system and article grabbing robot
CN111923053A (en) * 2020-04-21 2020-11-13 广州里工实业有限公司 Industrial robot object grabbing teaching system and method based on depth vision
CN111775146A (en) * 2020-06-08 2020-10-16 南京航空航天大学 Visual alignment method under industrial mechanical arm multi-station operation
CN111775154A (en) * 2020-07-20 2020-10-16 广东拓斯达科技股份有限公司 Robot vision system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113172636A (en) * 2021-06-29 2021-07-27 深圳市越疆科技有限公司 Automatic hand-eye calibration method and device and storage medium
CN114147725A (en) * 2021-12-21 2022-03-08 乐聚(深圳)机器人技术有限公司 Zero point adjustment method, device, equipment and storage medium for robot
CN114147725B (en) * 2021-12-21 2024-04-02 乐聚(深圳)机器人技术有限公司 Zero point adjustment method, device and equipment for robot and storage medium
CN114543669A (en) * 2022-01-27 2022-05-27 珠海亿智电子科技有限公司 Mechanical arm calibration method, device, equipment and storage medium
CN114543669B (en) * 2022-01-27 2023-08-01 珠海亿智电子科技有限公司 Mechanical arm calibration method, device, equipment and storage medium
CN115556109A (en) * 2022-10-24 2023-01-03 深圳市通用测试***有限公司 Method and device for positioning mechanical arm in test system
CN115556109B (en) * 2022-10-24 2024-06-11 深圳市通用测试***有限公司 Positioning method and device for mechanical arm in test system
CN117123520A (en) * 2023-02-06 2023-11-28 荣耀终端有限公司 Method for realizing glue wiping of target workpiece and glue wiping equipment

Similar Documents

Publication Title
CN112720474A (en) Pose correction method and device for robot, terminal device and storage medium
CN110850872A (en) Robot inspection method and device, computer readable storage medium and robot
CN109807885B (en) Visual calibration method and device for manipulator and intelligent terminal
US20150092058A1 (en) System, Vehicle and Method for Online Calibration of a Camera on a Vehicle
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN111637877B (en) Robot positioning method and device, electronic equipment and nonvolatile storage medium
CN111127497B (en) Robot and stair climbing control method and device thereof
CN111537967B (en) Radar deflection angle correction method and device and radar terminal
CN112967347B (en) Pose calibration method, pose calibration device, robot and computer readable storage medium
CN111113422A (en) Robot positioning method and device, computer readable storage medium and robot
CN114943952A (en) Method, system, device and medium for obstacle fusion under multi-camera overlapped view field
CN115546313A (en) Vehicle-mounted camera self-calibration method and device, electronic equipment and storage medium
JP7137464B2 (en) Camera calibration device, camera calibration method, and program
CN112945586B (en) Chassis deflection calibration method and device and unmanned automobile
CN113911110A (en) Parking track correction method and system, electronic device and storage medium
CN103685936A Wide field of view camera image calibration and de-warping
CN111157012B (en) Robot navigation method and device, readable storage medium and robot
CN110083158B (en) Method and equipment for determining local planning path
US11403770B2 (en) Road surface area detection device
CN115311332A (en) Automatic guided vehicle butt joint method and device
CN113635299B (en) Mechanical arm correction method, terminal device and storage medium
CN111343565A (en) Positioning method and terminal equipment
CN115482296A (en) Camera external parameter calibration method, system and non-volatile computer readable storage medium
CN109615658B (en) Method and device for taking articles by robot, computer equipment and storage medium
JP2019219204A (en) Travel trajectory estimation method and travel trajectory estimation device

Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210430)