CN115922728B - Robot pointing motion control method, apparatus, electronic device, and storage medium


Info

Publication number
CN115922728B
CN115922728B (application CN202310009464.8A)
Authority
CN
China
Prior art keywords
mechanical arm
robot
target object
arm
tail end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310009464.8A
Other languages
Chinese (zh)
Other versions
CN115922728A (en)
Inventor
朱世强
黄秋兰
王凡
谢安桓
梁定坤
顾建军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202310009464.8A priority Critical patent/CN115922728B/en
Publication of CN115922728A publication Critical patent/CN115922728A/en
Application granted granted Critical
Publication of CN115922728B publication Critical patent/CN115922728B/en
Priority to PCT/CN2023/118887 priority patent/WO2024037658A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: Performing operations; Transporting
    • B25: Hand tools; Portable power-driven tools; Manipulators
    • B25J: Manipulators; Chambers provided with manipulation devices
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application provides a robot pointing motion control method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: when a target object is beyond the reach of the end of a mechanical arm of a robot, selecting, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system; obtaining a pose that points to the target object within the reach of the end of the selected mechanical arm, according to a mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and a predetermined rule of the pointing motion; determining a trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint according to the obtained arm-end pose; and controlling the selected mechanical arm to point to the target object according to the trajectory.

Description

Robot pointing motion control method, apparatus, electronic device, and storage medium
Technical Field
The present invention relates to the field of anthropomorphic motion technology for robotic arms, and in particular to a robot pointing motion control method and apparatus, an electronic device, and a storage medium.
Background
With the development of science and technology, robots have entered many aspects of daily life; for example, service robots have been introduced into shopping malls, restaurants, and exhibition halls. In current human-robot interaction technology, however, voice and vision are relatively mature while limb motion control is comparatively underdeveloped, so robots appear rather stiff when interacting with people. Adding body language on top of voice and vision makes a robot more anthropomorphic in its interactions.
In the related art, when the target object is outside the reach of the end of a robot's mechanical arm, the robot performs no action toward it at all. The application scenarios of the mechanical arm are therefore limited.
Disclosure of Invention
The present application provides a robot pointing motion control method and apparatus, an electronic device, and a storage medium, which enable pointing at a target object in such scenarios and improve the applicability of the robot.
The application provides a robot pointing motion control method, comprising the following steps:
when a target object is beyond the reach of the end of a mechanical arm of a robot, selecting, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system;
obtaining a pose that points to the target object within the reach of the end of the selected mechanical arm, according to a mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and a predetermined rule of the pointing motion;
determining a trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint according to the obtained arm-end pose;
and controlling the selected mechanical arm to point to the target object according to the trajectory.
Further, obtaining the pose that points to the target object within the reach of the end of the selected mechanical arm according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the predetermined rule of the pointing motion includes:
mapping the target object coordinates to coordinates within the reach of the end of the selected mechanical arm;
determining the specified direction of the end of the selected mechanical arm according to the predetermined rule of the pointing motion;
and generating the attitude of the end of the selected mechanical arm according to the mapped coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the specified direction.
Further, mapping the target object coordinates to coordinates within the reach of the end of the selected mechanical arm includes: obtaining the X and Y coordinates of the end of the selected mechanical arm according to the angle between the X axis of the robot coordinate system and the vector from the shoulder joint of the selected mechanical arm to the target object in the robot coordinate system, together with the arm length of the selected mechanical arm; and obtaining the Z coordinate of the end of the selected mechanical arm according to the relation among the Z coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected mechanical arm, and the arm length;
and/or,
generating the attitude of the end of the selected mechanical arm according to the mapped coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the specified direction includes:
determining the unit vector of the vector from the target object to the arm end as the 3 Z-direction components of the attitude matrix of the end of the selected mechanical arm, the attitude matrix being a matrix of 3 rows and 3 columns;
determining that the 3 Y-direction components forming the Y-direction vector of the attitude matrix are positive when the end of the selected mechanical arm points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the right mechanical arm;
determining that the 3 Y-direction components forming the Y-direction vector of the attitude matrix are negative when the end of the selected mechanical arm points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the left mechanical arm;
and determining, according to the right-hand rule and the 3 Z-direction components and 3 Y-direction components of the attitude matrix of the end of the selected mechanical arm, the 3 X-direction components forming the X-direction vector of the attitude matrix.
Further, after controlling the selected mechanical arm to point to the target object according to the trajectory, the method further includes:
measuring the resulting pose of the arm end with a measuring device, and determining the error of the pointing motion of the end of the selected mechanical arm toward the target object.
Further, measuring the resulting pose of the arm end with the measuring device and determining the error of the pointing motion of the end of the selected mechanical arm toward the target object includes:
determining the vector from the target object to the end of the selected mechanical arm;
and determining, as the error, the angle between this vector and the Z-direction component of the attitude matrix of the end of the selected mechanical arm.
Further, determining the trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint according to the obtained arm-end pose includes:
solving inverse kinematics according to the obtained arm-end pose to obtain the target pose of each joint of the mechanical arm;
and obtaining, through trajectory interpolation, the trajectory of each joint of the selected mechanical arm from its current pose to its target pose.
Further, solving inverse kinematics according to the obtained arm-end pose to obtain the target pose of each joint of the mechanical arm includes:
solving inverse kinematics according to the arm-end pose to obtain a target configuration of the selected mechanical arm, the target configuration comprising the target angle of each joint;
and performing trajectory interpolation according to the current configuration and the target configuration of the robot, controlling the mechanical arm to move from the current configuration to the target configuration in which the robot points to the target object, the current configuration comprising the current angle of each joint.
The application provides a robot pointing motion control device, comprising:
a mechanical arm selection module, configured to select, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system when the target object is beyond the reach of the end of a mechanical arm of the robot;
an arm-end pose determining module, configured to obtain the pose that points to the target object within the reach of the end of the selected mechanical arm according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the predetermined rule of the pointing motion;
a trajectory determining module, configured to determine, according to the obtained arm-end pose, the trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint;
and a motion control module, configured to control the selected mechanical arm to point to the target object according to the trajectory.
The application provides an electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the methods described above.
The present application provides a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement any of the methods described above.
In some embodiments, according to the robot pointing motion control method, when the target object is beyond the reach of the end of the mechanical arm of the robot, the pose pointing to the target object within the reach of the end of the selected mechanical arm is obtained according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the predetermined rule of the pointing motion, so that the selected mechanical arm completes the pointing motion toward the target object. In this way, scenarios of pointing at a target object are supported and the applicability of the robot is improved.
Drawings
Fig. 1 is a schematic flow chart of a robot pointing motion control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram showing a specific flow of the robot pointing motion control method shown in FIG. 1;
FIG. 3 is a simplified schematic diagram of a robot coordinate system and a target object of the robot pointing motion control method shown in FIG. 1;
FIG. 4a is a schematic view of the current configuration of the robot in the robot pointing motion control method shown in FIG. 1;
FIG. 4b is a schematic view of the target configuration of the robot in the robot pointing motion control method shown in FIG. 1;
fig. 5 is a schematic flow chart of measuring motion accuracy of the robot pointing motion control method according to the embodiment of the present application;
fig. 6 is a schematic block diagram of a robot pointing motion control device according to an embodiment of the present application;
fig. 7 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with one or more embodiments of the present specification; rather, they are merely examples of apparatus and methods consistent with aspects of one or more embodiments of the present specification, as detailed in the appended claims.
It should be noted that in other embodiments the steps of the corresponding method are not necessarily performed in the order shown and described in this specification. In some other embodiments, the method may include more or fewer steps than described here. Furthermore, an individual step described in this specification may, in other embodiments, be split into multiple steps, and multiple steps described in this specification may be combined into a single step in other embodiments.
To address the technical problem that the application scenarios of the mechanical arm are limited, an embodiment of the present application provides a robot pointing motion control method: when a target object is beyond the reach of the end of a mechanical arm of the robot, the arm closer to the target object is selected from the robot's two mechanical arms according to the coordinates of the target object in the robot coordinate system; a pose pointing to the target object within the reach of the end of the selected mechanical arm is obtained according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and a predetermined rule of the pointing motion; the trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint is determined according to the obtained arm-end pose; and the selected mechanical arm is controlled to point to the target object according to the trajectory.
In this embodiment, even when the target object is beyond the reach of the arm end, a pose pointing to the target object within the reach of the end of the selected mechanical arm is obtained, so the selected arm still completes a pointing motion toward the target object. Pointing scenarios are thereby supported and the applicability of the robot is improved.
The robot pointing motion control method is applied to scenarios in which a robot points at a target object. Such scenarios may include, but are not limited to, a robot giving an exhibition-hall explanation or a presentation. Presentation content may include, but is not limited to, video, lectures, and instructional material; the target object may include, but is not limited to, specific items of the presentation content.
The following describes the specific implementation process of the robot pointing motion control method in detail.
Fig. 1 is a flow chart of a robot pointing motion control method according to an embodiment of the present application.
As shown in fig. 1, the robot pointing motion control method includes the following steps 110 to 140:
Step 110: when the target object is beyond the reach of the end of a mechanical arm of the robot, select, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system. Here, the target object being beyond the reach of the arm end means that it lies outside the reachable workspace of the arm end. Expressing the target object in the robot coordinate system places the target and both mechanical arms in the same frame, which makes obtaining the selected arm convenient and fast.
Step 110 performs the pointing motion with the arm closer to the target object, which improves pointing convenience, meets user requirements, and is easier to execute.
There are various implementations of step 110. In one implementation, first, the region in which the target object is located, among two regions corresponding to the robot's two mechanical arms, is determined according to the coordinates of the target object in the robot coordinate system; second, the mechanical arm corresponding to that region is selected as the arm closer to the target object.
In another implementation, first, the vector between the target object and the shoulder joint of each mechanical arm is determined, together with its projected distance on the horizontal coordinate plane of the robot coordinate system; second, the arm with the shorter projected distance is determined to be the arm closer to the target object.
In yet another implementation of step 110, the shoulder joint coordinates in the robot coordinate system can be obtained from the dimensions of the robot's joints, for example from the shoulder height h and the shoulder half-width w: the left shoulder joint coordinate is ${}^r p_{ls} = (0, w, h)$ and the right shoulder joint coordinate is ${}^r p_{rs} = (0, -w, h)$. The Euclidean distance between the target object and each shoulder joint in the robot coordinate system is computed, and the arm with the smaller distance is selected to execute the pointing motion.
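To make this selection rule concrete, the following is a minimal Python sketch of the Euclidean-distance variant of step 110 (numpy assumed; the helper name select_arm and the example dimensions are illustrative, not from the patent):

```python
import numpy as np

def select_arm(p_target, w=0.2, h=1.2):
    """Pick the arm whose shoulder joint is nearer to the target.

    p_target: target coordinates in the robot coordinate system.
    w, h: shoulder half-width and shoulder height (example values).
    """
    p_ls = np.array([0.0, w, h])     # left shoulder joint, robot frame
    p_rs = np.array([0.0, -w, h])    # right shoulder joint, robot frame
    d_left = np.linalg.norm(p_target - p_ls)
    d_right = np.linalg.norm(p_target - p_rs)
    return "left" if d_left <= d_right else "right"

# A target to the robot's right selects the right arm.
print(select_arm(np.array([1.5, -0.8, 1.0])))  # -> right
```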
Step 120: obtain the pose pointing to the target object within the reach of the end of the selected mechanical arm according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the predetermined rule of the pointing motion.
The predetermined rule of the pointing motion serves as a constraint on the pointing motion and provides a reliable basis for executing the subsequent specified motion.
Through the mapping transformation of the target object coordinates, step 120 obtains a position point, within the reach of the end of the selected mechanical arm, that points at the target object. This allows the selected mechanical arm to be subsequently controlled to point at the target object.
The pose pointing to the target object within the reach of the arm end comprises the position and the attitude of the end of the selected mechanical arm in the robot coordinate system.
There are various ways of determining the position of the end of the selected mechanical arm in the robot coordinate system in step 120. In some embodiments, a position point whose distance from the shoulder of the selected arm is smaller than the arm length may be determined along the direction in which the arm end points to the target object, and used as the position within reach to which the target object coordinates are mapped.
In other embodiments, the arm length of the selected arm may be multiplied by a reduction coefficient along the direction pointing to the target object, to obtain a position point within reach that is used as the position to which the target object coordinates are mapped. The reduction coefficient is greater than 0 and less than 1, which guarantees that the arm end stays within the reachable range. Details are given below.
In still other embodiments, the arm length of the selected arm may be reduced by a predetermined length along the direction pointing to the target object, to obtain a position point within reach that is used as the position to which the target object coordinates are mapped. The predetermined length is greater than 0 and less than the arm length; to keep the selected arm as extended as possible and to enlarge the pointing range of the arm end while staying within reach, the predetermined length should be as small as possible relative to the arm length. Details are given below.
Of course, the position of the end of the selected mechanical arm in the robot coordinate system may be determined in other ways as well, all of which fall within the scope of the embodiments of the present application and are not enumerated here.
Step 130: determine, according to the obtained arm-end pose, the trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint. This trajectory serves as the basis for controlling the selected mechanical arm to point at the target object.
Step 140: control the selected mechanical arm according to the trajectory, so that the end of the selected mechanical arm completes the pointing motion toward the target object.
Fig. 2 is a schematic flow chart of a method for controlling the pointing motion of the robot shown in fig. 1.
As shown in fig. 2, before step 110, the method may further include, but is not limited to, the following steps 101 to 104 for determining that the target object is beyond the reach of the arm end of the robot:
step 101, establishing a robot world coordinate system, a robot coordinate system and a local coordinate system of each joint of the double mechanical arms.
The robot coordinate system refers to a base coordinate system of the robot.
Step 102, obtaining the angle that the target object is in the front of the robot according to the included angle between the vector of the coordinates of the target object in the robot coordinate system, which points to the origin of the robot coordinate system, and the X axis of the robot coordinate system.
Specifically, the step 102 may be implemented by, but not limited to, the following 3 steps:
(1) Compute the vector from the robot coordinates ${}^w p_{robot}$ in the robot world coordinate system to the target object coordinates ${}^w p_{target}$,

$$\vec{v} = {}^w p_{target} - {}^w p_{robot},$$

and its angle α with the X axis of the robot world coordinate system:

$$\alpha = \operatorname{atan2}\left(\vec{v}(y),\ \vec{v}(x)\right)$$

where $\vec{v}(y)$ is the Y-direction component of $\vec{v}$ and $\vec{v}(x)$ is its X-direction component.
(2) Compute the difference between the angle α and the robot's front orientation angle θ in the world coordinate system, so as to obtain the angle β through which the robot chassis needs to rotate:

$$\beta_0 = \alpha - \theta, \qquad \beta = \begin{cases} \beta_0 - 2\pi, & \beta_0 > \pi \\ \beta_0 + 2\pi, & \beta_0 < -\pi \\ \beta_0, & \text{otherwise} \end{cases}$$

where $\beta_0$, the difference between α and θ, may fall outside $[-\pi, \pi]$; the above normalization yields β as the minimum rotation angle of the chassis.

(3) After the rotation, the robot's orientation angle γ in the world coordinate system is:

$$\gamma = \theta + \beta$$
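A short Python sketch of sub-steps (1) to (3), assuming numpy and planar positions, may clarify the normalization (the function name is illustrative only):

```python
import numpy as np

def chassis_rotation(p_robot_w, p_target_w, theta):
    """Minimum chassis rotation so the target lies in front of the robot.

    p_robot_w, p_target_w: XY positions in the world frame.
    theta: current front-facing orientation angle (rad).
    Returns (beta, gamma): rotation to perform, resulting orientation.
    """
    v = p_target_w - p_robot_w
    alpha = np.arctan2(v[1], v[0])  # angle of robot->target with world X axis
    beta = (alpha - theta + np.pi) % (2.0 * np.pi) - np.pi  # wrap to [-pi, pi)
    gamma = theta + beta
    return beta, gamma

# Robot facing +Y, target behind and to the left: the minimum turn is +3*pi/4.
beta, gamma = chassis_rotation(np.zeros(2), np.array([-1.0, -1.0]), np.pi / 2)
```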
Step 103: according to the robot coordinates ${}^w p_{robot}$ in the world coordinate system and the orientation angle γ, the target object coordinates ${}^w p_{target}$ in the world coordinate system are converted into the robot coordinate system:

$${}^r p_{target} = R_{robot}^{-1}\left({}^w p_{target} - {}^w p_{robot}\right)$$

where $R_{robot}^{-1}$ denotes the inverse of $R_{robot}$, the attitude matrix of the robot in the world coordinate system; ${}^r p_{target}$ is the coordinate of the target object in the robot coordinate system; the subscript "target" denotes the target object and "robot" denotes the robot.
Step 104: determine that the target object is beyond the reach of the arm end of the robot. In one embodiment of step 104, if the target object lies outside the maximum reachable range of the arm end, it is determined to be beyond the reach of the arm end. In another embodiment, if the distance between the target object and the shoulder joint of the robot is greater than the arm length of the mechanical arm, the target object is determined to be beyond the reach of the arm end.
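A compact sketch of steps 103 and 104 under the assumption that the robot attitude is a pure rotation about the Z axis by γ (numpy; function names illustrative):

```python
import numpy as np

def world_to_robot(p_target_w, p_robot_w, gamma):
    """Convert world-frame target coordinates into the robot base frame."""
    c, s = np.cos(gamma), np.sin(gamma)
    R_robot = np.array([[c, -s, 0.0],
                        [s,  c, 0.0],
                        [0.0, 0.0, 1.0]])        # rotation about Z by gamma
    return R_robot.T @ (p_target_w - p_robot_w)  # R^-1 = R^T for rotations

def beyond_reach(p_target_r, p_shoulder_r, arm_length):
    """Step 104 variant: target farther from the shoulder than the arm length."""
    return np.linalg.norm(p_target_r - p_shoulder_r) > arm_length
```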
Fig. 3 is a simplified schematic diagram of a robot coordinate system and a target object of the robot pointing motion control method shown in fig. 1.
As shown in fig. 3, the XY-plane coordinate system is the projection of the robot base coordinate system onto the XY plane. It includes the origin O, corresponding to the head position 21 of the robot, the left shoulder joint 22 of the left mechanical arm, the right shoulder joint 23 of the right mechanical arm, the target object 24, and a position point 25 within the reach of the end of the selected mechanical arm. How to determine the three-dimensional coordinates (X, Y, and Z) of position point 25 in the robot coordinate system is described in detail below.
With continued reference to fig. 3, step 120 may include, but is not limited to, the following steps 121 to 123. Step 121: map the target object coordinates to coordinates within the reach of the end of the selected mechanical arm.
In some embodiments of step 121, in a first sub-step, the X and Y coordinates of the end of the selected mechanical arm are obtained from the angle between the X axis of the robot coordinate system and the vector from the shoulder joint of the selected arm to the target object, together with the arm length of the selected arm.
Illustratively, based on the vector from the shoulder joint ${}^r p_s$ of the selected mechanical arm to the target object ${}^r p_{target}$ in the robot coordinate system,

$$\vec{u} = {}^r p_{target} - {}^r p_s,$$

its angle η with the X axis of the robot coordinate system, and the arm length L of the selected arm, the X and Y coordinates of the arm end ${}^r p_e$ are obtained as:

$${}^r p_e(x) = {}^r p_s(x) + K L \cos\eta, \qquad {}^r p_e(y) = {}^r p_s(y) + K L \sin\eta$$

where the coefficient K (the reduction coefficient above, 0 < K < 1) is set so that the distance between the position point 25 and the shoulder joint of the selected arm is smaller than the arm length L.

In a second sub-step, the Z coordinate of the end of the selected mechanical arm is obtained from the relation among the Z coordinate of the target object in the robot coordinate system, the shoulder joint coordinate of the selected arm, and the arm length:

$${}^r p_e(z) = \begin{cases} {}^r p_s(z) - \delta, & {}^r p_{target}(z) \le {}^r p_s(z) & (1) \\ {}^r p_{target}(z) - \delta, & {}^r p_s(z) < {}^r p_{target}(z) \le {}^r p_s(z) + L & (2) \\ {}^r p_s(z) + L - \delta, & {}^r p_{target}(z) > {}^r p_s(z) + L & (3) \end{cases}$$

where δ is a number greater than zero and less than L, which may be chosen by traversing values within a range.
Equation (1) describes the case where, in the Z direction, the target object lies below the shoulder joint: the arm end is placed below the shoulder joint by subtracting δ from ${}^r p_s(z)$, which matches how the robot should mimic pointing at a target below the shoulder.
Equation (2) describes the case where the target lies above the shoulder joint but below the total height of the shoulder joint plus the arm length, i.e. above the shoulder and within the reach of the arm: δ is subtracted from ${}^r p_{target}(z)$ so that the arm end points at the target from a position slightly below it, which better matches the simulated pointing of the arm end.
Equation (3) describes the case where the target lies above the total height of the shoulder joint plus the arm length: δ is subtracted from ${}^r p_s(z) + L$, so the arm points at the target from below. In this way, the end of the selected mechanical arm stays within the reachable range while pointing at the target from as close to the limit of that range as possible.
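The whole mapping of step 121 can be sketched in Python as follows (numpy assumed; K and δ are the free parameters named above, and the example default values are illustrative only):

```python
import numpy as np

def map_into_reach(p_target, p_shoulder, L, K=0.9, delta=0.05):
    """Map an out-of-reach target to an arm-end position within reach.

    p_target, p_shoulder: coordinates in the robot frame; L: arm length.
    0 < K < 1 is the reduction coefficient; 0 < delta < L.
    """
    u = p_target - p_shoulder
    eta = np.arctan2(u[1], u[0])        # angle with the robot X axis
    x = p_shoulder[0] + K * L * np.cos(eta)
    y = p_shoulder[1] + K * L * np.sin(eta)
    z_t, z_s = p_target[2], p_shoulder[2]
    if z_t <= z_s:                      # case (1): target below the shoulder
        z = z_s - delta
    elif z_t <= z_s + L:                # case (2): between shoulder and shoulder + L
        z = z_t - delta
    else:                               # case (3): above shoulder + arm length
        z = z_s + L - delta
    return np.array([x, y, z])
```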
Step 122: determine the specified direction of the end of the selected mechanical arm according to the predetermined rule of the pointing motion.
Step 123: generate the attitude of the end of the selected mechanical arm according to the mapped coordinates, the coordinates of the shoulder joint of the selected arm in the robot coordinate system, and the specified direction.
In some embodiments of step 123, in a first sub-step, the unit vector of the vector from the target object to the arm end is determined as the 3 Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected mechanical arm, the attitude matrix being a matrix of 3 rows and 3 columns. Illustratively, the attitude matrix R of the arm end of the selected arm is:

$$R = \begin{bmatrix} a_x & o_x & n_x \\ a_y & o_y & n_y \\ a_z & o_z & n_z \end{bmatrix}$$

where $\vec{a} = (a_x, a_y, a_z)^T$ is the X-direction vector, whose 3 X-direction components form the first column; $\vec{o} = (o_x, o_y, o_z)^T$ is the Y-direction vector, whose 3 Y-direction components form the second column; and $\vec{n} = (n_x, n_y, n_z)^T$ is the Z-direction vector, whose 3 Z-direction components form the third column.

The first sub-step takes the unit vector of the vector from ${}^r p_{target}$ to the arm end ${}^r p_e$ of the selected arm as the third column $\vec{n}$ of the attitude matrix, namely:

$$\vec{n} = \frac{{}^r p_e - {}^r p_{target}}{\left\| {}^r p_e - {}^r p_{target} \right\|}$$
In a second sub-step, when the end of the selected mechanical arm points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected arm is the right mechanical arm, the 3 Y-direction components forming the Y-direction vector of the attitude matrix are determined to be positive.
In a third sub-step, when the end of the selected mechanical arm points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected arm is the left mechanical arm, the 3 Y-direction components forming the Y-direction vector of the attitude matrix are determined to be negative.
Fig. 4a is a schematic view showing a current configuration of the robot pointing motion control method shown in fig. 1. Fig. 4b is a schematic view showing a target configuration of the robot pointing motion control method shown in fig. 1.
As shown in figs. 4a and 4b, $O_r$ is the base center position of the robot base coordinate system. The two mechanical arms comprise a left mechanical arm (not shown) and a right mechanical arm 30. Taking the right mechanical arm 30 as the selected arm, it may include, but is not limited to, a right shoulder joint 31, a right upper arm 32, a right forearm 33, and a right palm 34. The left mechanical arm is analogous and is not described again here.
As further shown in figs. 4a and 4b, the second and third sub-steps above may be implemented as follows:
According to the rule that the palm tilts upward when the mechanical arm points at the target object, the Z-direction component $o_z$ of the second-column vector $\vec{o}$ of the arm-end attitude matrix is positive when pointing, indicating that the palm faces upward, while the Y-direction component is negative when the left arm points at the target. Here the right arm, as shown in figs. 4a and 4b, is taken as the example. The second column must satisfy:

$$\vec{o} \cdot \vec{n} = o_x n_x + o_y n_y + o_z n_z = 0, \qquad o_x^2 + o_y^2 + o_z^2 = 1$$

The value of $o_z$ affects the upward tilt of the palm and can be chosen according to the desired pointing effect; this embodiment selects $o_z = 0.4$ (the value may also be floated around this number according to the actual solving result), which gives:

$$o_x n_x + o_y n_y = -0.4\, n_z, \qquad o_x^2 + o_y^2 = 1 - 0.4^2 = 0.84$$

Solving these two equations for $o_x$ and $o_y$, and taking for the right arm the solution whose Y-direction components are positive (per the second sub-step; for the left arm the negative solution), yields the second column $\vec{o}$ of the attitude matrix.
the left mechanical arm is the same and will not be described in detail here.
In a fourth sub-step, the 3 X-direction components forming the X-direction vector of the attitude matrix are determined from its 3 Z-direction components and 3 Y-direction components according to the right-hand rule. Compared with solving a nonlinear equation system, this is more convenient and does not depend on an existing library (such as MATLAB's nonlinear equation solver), which improves the computation speed.
The fourth sub-step can be implemented as follows: according to the right-hand rule, the first column $\vec{a}$ of the arm-end attitude matrix is computed from the second column $\vec{o}$ and the third column $\vec{n}$:

$$\vec{a} = \vec{o} \times \vec{n}$$
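Putting the three columns together, the construction of the attitude matrix can be sketched as follows (numpy; the closed form for $o_x, o_y$ is one way to solve the two constraint equations above and assumes $\vec{n}$ is not vertical; $o_z = 0.4$ as in this embodiment):

```python
import numpy as np

def arm_end_attitude(p_e, p_target, right_arm=True, oz=0.4):
    """Build the 3x3 arm-end attitude matrix R = [a | o | n].

    n: unit vector from target to arm end (third column).
    o: unit vector orthogonal to n with fixed Z component oz (second column);
       the sign of its Y component is chosen by arm side.
    a = o x n: first column, by the right-hand rule.
    """
    n = p_e - p_target
    n = n / np.linalg.norm(n)
    c = -oz * n[2]                 # from o . n = 0 with o_z fixed
    r2 = 1.0 - oz ** 2             # remaining norm for (o_x, o_y)
    d2 = n[0] ** 2 + n[1] ** 2     # assumes n is not vertical (d2 > 0)
    t = np.sqrt(max(r2 * d2 - c ** 2, 0.0))
    candidates = [np.array([(c * n[0] - s * n[1]) / d2,
                            (c * n[1] + s * n[0]) / d2,
                            oz]) for s in (t, -t)]
    # Right arm: positive Y component; left arm: negative.
    o = max(candidates, key=lambda v: v[1] if right_arm else -v[1])
    a = np.cross(o, n)             # X column via the right-hand rule
    return np.column_stack([a, o, n])
```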
the above step 130 may further include, but is not limited to, the following 2 steps. And 1, solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain the target pose of each joint of the mechanical arm. And 2, obtaining the track from the current pose to the target pose of each joint at the tail end of the mechanical arm of the selected mechanical arm through track interpolation.
Wherein, the step 1 may further include, but is not limited to:
according to the pose of the tail end of the mechanical arm, solving inverse kinematics to obtain a target configuration of the selected mechanical arm, wherein the target configuration comprises target angles of all joints;
and performing track interpolation according to the current configuration and the target configuration of the robot, and controlling the mechanical arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, wherein the current configuration of the robot comprises the current angles of all joints.
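The patent does not fix a particular interpolation scheme; as one plausible choice, a linear joint-space interpolation between the current and target configurations can be sketched as:

```python
import numpy as np

def interpolate_configurations(q_current, q_target, steps=100):
    """Joint-space trajectory from the current to the target configuration.

    q_current, q_target: arrays of joint angles (rad).
    Returns an array of shape (steps, n_joints). Linear interpolation is
    only one option; quintic or trapezoidal profiles are common alternatives.
    """
    s = np.linspace(0.0, 1.0, steps)[:, None]   # interpolation parameter
    return (1.0 - s) * q_current + s * q_target

# Example: stream the configurations to the arm controller one by one.
trajectory = interpolate_configurations(np.zeros(7),
                                        np.array([0.3, -0.5, 0.2, 1.1, 0.0, 0.4, -0.2]))
```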
Fig. 5 is a schematic flow chart of measuring motion accuracy of the robot pointing motion control method according to the embodiment of the present application.
As shown in figs. 1 to 5, after step 140 the method may further include, but is not limited to: step 150, measuring the resulting arm-end pose with a measuring device and determining the error of the pointing motion of the end of the selected mechanical arm toward the target object. Determining the deviation of the arm from the target after the pointing motion quantifies the accuracy of the motion and facilitates subsequent adjustment and control optimization.
In some embodiments, measuring the resulting arm-end pose with a measuring device and determining the error of the pointing motion comprises: first, determining the vector from the target object to the end of the selected mechanical arm; specifically, the arm-end pose is measured with the measuring device, and the vector is computed from the position of the target object and the measured arm-end position. Second, the angle between this vector and the Z-direction component of the arm-end attitude matrix of the selected mechanical arm is determined as the error.
Further, step 150 may include, but is not limited to, the following 1) to 3):
1) The arm-end pose is measured with a laser tracker whose measurement coordinate system coincides with the robot coordinate system; the result is the actual arm-end pose in the robot coordinate system, $({}^r p_e',\ {}^r R_e')$.
2) From this measurement and the target object coordinates ${}^r p_{target}$ in the robot coordinate system, the vector from the target object to the arm end is computed:

$$\vec{v} = {}^r p_e' - {}^r p_{target}$$

3) The angle between $\vec{v}$ and the vector $\vec{n}'$ represented by the third column of the attitude matrix ${}^r R_e'$ is the error Δ of the pointing motion:

$$\Delta = \arccos \frac{\vec{v} \cdot \vec{n}'}{\left\| \vec{v} \right\| \left\| \vec{n}' \right\|}$$
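The error computation of 2) and 3) amounts to a few lines of Python (numpy; function name illustrative):

```python
import numpy as np

def pointing_error(p_e_measured, R_e_measured, p_target):
    """Angular pointing error (rad) between the measured arm-end direction
    (third column of the measured attitude matrix) and the target-to-end vector."""
    v = p_e_measured - p_target
    n = R_e_measured[:, 2]          # Z-direction vector of the attitude matrix
    cos_delta = v @ n / (np.linalg.norm(v) * np.linalg.norm(n))
    return np.arccos(np.clip(cos_delta, -1.0, 1.0))
```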
fig. 6 is a schematic block diagram of a robot pointing motion control device according to an embodiment of the present application.
As shown in fig. 6, the robot pointing motion control device includes the following modules:
a mechanical arm selection module 41, configured to select, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system when the target object is beyond the reach of the end of a mechanical arm of the robot;
an arm-end pose determining module 42, configured to obtain the pose pointing to the target object within the reach of the arm end of the selected mechanical arm according to the mapping of the target object coordinates, the coordinates of the shoulder joint of the selected arm in the robot coordinate system, and the predetermined rule of the pointing motion;
a trajectory determining module 43, configured to determine, according to the obtained arm-end pose, the trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint;
and a motion control module 44, configured to control the selected mechanical arm to point to the target object according to the trajectory.
In some embodiments, the arm-end pose determining module includes:
a coordinate mapping sub-module, configured to map the target object coordinates to coordinates within the reach of the end of the selected mechanical arm;
a specified-direction determining sub-module, configured to determine the specified direction of the end of the selected mechanical arm according to the predetermined rule of the pointing motion;
and an attitude generating sub-module, configured to generate the attitude of the end of the selected mechanical arm according to the mapped coordinates, the coordinates of the shoulder joint of the selected arm in the robot coordinate system, and the specified direction.
In some embodiments, the coordinate mapping sub-module is specifically configured to: obtain the X and Y coordinates of the end of the selected mechanical arm according to the angle between the X axis of the robot coordinate system and the vector from the shoulder joint of the selected arm to the target object in the robot coordinate system, together with the arm length of the selected arm; and obtain the Z coordinate of the end of the selected mechanical arm according to the relation among the Z coordinate of the target object in the robot coordinate system, the shoulder joint coordinates of the selected arm, and the arm length;
and/or,
the attitude generating sub-module is specifically configured to:
determine the unit vector of the vector from the target object to the arm end as the 3 Z-direction components of the attitude matrix of the end of the selected mechanical arm, the attitude matrix being a matrix of 3 rows and 3 columns;
determine that the 3 Y-direction components forming the Y-direction vector of the attitude matrix are positive when the arm end points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected arm is the right mechanical arm;
determine that the 3 Y-direction components forming the Y-direction vector of the attitude matrix are negative when the arm end points to the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected arm is the left mechanical arm;
and determine, according to the right-hand rule and the 3 Z-direction components and 3 Y-direction components of the attitude matrix, the 3 X-direction components forming the X-direction vector of the attitude matrix.
In some embodiments, after the selected mechanical arm is controlled to point to the target object according to the trajectory, the device further includes: a pointing-motion error determining module, configured to measure the resulting arm-end pose with a measuring device and determine the error of the pointing motion of the arm end toward the target object.
In some embodiments, the pointing-motion error determining module is specifically configured to: determine the vector from the target object to the end of the selected mechanical arm; and determine, as the error, the angle between this vector and the Z-direction component of the arm-end attitude matrix of the selected mechanical arm.
In some embodiments, the trajectory determining module includes:
a target pose determining sub-module, configured to solve inverse kinematics according to the obtained arm-end pose to obtain the target pose of each joint of the mechanical arm;
and a trajectory determining sub-module, configured to obtain, through trajectory interpolation, the trajectory of each joint of the selected mechanical arm from its current pose to its target pose.
In some embodiments, the arm-end pose determining module is specifically configured to: solve inverse kinematics according to the arm-end pose to obtain a target configuration of the selected mechanical arm, the target configuration comprising the target angle of each joint; and perform trajectory interpolation according to the current configuration and the target configuration of the robot, controlling the mechanical arm to move from the current configuration to the target configuration in which the robot points to the target object, the current configuration comprising the current angle of each joint.
The implementation of the functions and roles of each module in the above device is described in the implementation of the corresponding steps in the above method and is not repeated here.
Fig. 7 is a block diagram of an electronic device 50 according to an embodiment of the present application.
As shown in fig. 7, the electronic device 50 includes one or more processors 51 for implementing the robot pointing motion control method as described above.
In some embodiments, the electronic device 50 may include a memory 59 storing programs that can be invoked by the processor 51; the memory may include a non-volatile storage medium. In some embodiments, the electronic device 50 may include a memory 58 and an interface 57. In some embodiments, the electronic device 50 may also include other hardware depending on the actual application.
The memory 59 of the embodiment of the present application stores a program which, when executed by the processor 51, implements the robot pointing motion control method described above.
The present application may take the form of a computer program product implemented on one or more memories 59 (including but not limited to disk storage, CD-ROM, and optical storage) containing program code. The memory 59 includes permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of the memory 59 include, but are not limited to: phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible to a computing device.
The functional service robot of the embodiment of the present application is used to point at display information containing the target object, and comprises a memory and a processor; the memory stores a computer program, and the processor implements the above method when invoking the computer program in the memory.
The foregoing description of the preferred embodiments is provided for illustration only and is not intended to limit the scope of the disclosure; any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the disclosure are intended to be included within its scope.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises it.

Claims (10)

1. A robot pointing motion control method, comprising:
when a target object is beyond the reach of the end of a mechanical arm of a robot, selecting, from the robot's two mechanical arms, the arm closer to the target object according to the coordinates of the target object in the robot coordinate system;
obtaining a pose that points to the target object within the reach of the end of the selected mechanical arm, according to a mapping of the target object coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and a predetermined rule of the pointing motion;
determining a trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint according to the obtained arm-end pose;
and controlling the selected mechanical arm to point to the target object according to the trajectory.
2. The robot pointing motion control method of claim 1, wherein obtaining the pose that points to the target object within the reach of the arm end according to the mapping of the target object coordinates, the shoulder joint coordinates of the selected mechanical arm in the robot coordinate system, and the predetermined rule of the pointing motion comprises:
mapping the target object coordinates to coordinates within the reach of the end of the selected mechanical arm;
determining the specified direction of the end of the selected mechanical arm according to the predetermined rule of the pointing motion;
and generating the attitude of the end of the selected mechanical arm according to the mapped coordinates, the coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the specified direction.
3. The robot pointing motion control method according to claim 2, wherein mapping the target object coordinates to coordinates within the reachable range of the end of the selected mechanical arm comprises:
obtaining the X and Y coordinates of the end of the selected mechanical arm according to the arm length of the selected mechanical arm and the angle between the X axis of the robot coordinate system and the vector pointing from the shoulder joint of the selected mechanical arm to the target object in the robot coordinate system;
and obtaining the Z coordinate of the end of the selected mechanical arm according to the relation among the Z coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected mechanical arm and the arm length;
and/or,
generating the posture of the end of the selected mechanical arm according to the mapped coordinates, the shoulder joint coordinates of the selected mechanical arm in the robot coordinate system and the specified direction comprises:
determining the unit vector of the vector pointing from the target object to the end of the mechanical arm as the three Z-direction components of the orientation matrix of the end of the selected mechanical arm, the orientation matrix being a matrix of 3 rows and 3 columns;
when the end of the selected mechanical arm points to the target object, the three Z-direction components of the orientation matrix are positive and the selected mechanical arm is the right mechanical arm, determining the three Y-direction components that form the Y-direction vector of the orientation matrix to be positive;
when the end of the selected mechanical arm points to the target object, the three Z-direction components of the orientation matrix are positive and the selected mechanical arm is the left mechanical arm, determining the three Y-direction components that form the Y-direction vector of the orientation matrix to be negative;
and determining, from the three Z-direction components and the three Y-direction components of the orientation matrix of the end of the selected mechanical arm and according to the right-hand rule, the three X-direction components that form the X-direction vector of the orientation matrix.
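For illustration only: a sketch of the orientation-matrix construction in claim 3. The claim fixes the Z column (unit vector from the target to the arm end), fixes the sign of the Y column per arm side, and closes the X column by the right-hand rule; the reference vector used to seed Y is an assumption, and all names are hypothetical.

    import numpy as np

    def end_orientation(end_xyz, target_xyz, arm_side):
        """Build the 3x3 orientation matrix [X Y Z] of the arm end."""
        z = np.asarray(end_xyz, dtype=float) - np.asarray(target_xyz, dtype=float)
        z /= np.linalg.norm(z)                 # Z column: unit target-to-end vector
        ref = np.array([0.0, 0.0, 1.0])        # assumed reference; claim fixes only Y's sign
        if abs(np.dot(ref, z)) > 0.99:
            ref = np.array([1.0, 0.0, 0.0])    # fall back when z is near-vertical
        y = np.cross(ref, z)
        y /= np.linalg.norm(y)
        sign = 1.0 if arm_side == "right" else -1.0   # right arm: positive Y components
        if np.sign(y.sum()) != sign:
            y = -y
        x = np.cross(y, z)                     # right-hand rule closes the frame
        return np.column_stack([x, y, z])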
4. The robot pointing motion control method according to claim 1, wherein after controlling the selected mechanical arm to point to the target object according to the trajectory, the method further comprises:
measuring the obtained pose of the end of the mechanical arm with a measuring device, and determining the error of the pointing action of the end of the selected mechanical arm toward the target object.
5. The robot pointing motion control method according to claim 4, wherein measuring the obtained pose of the end of the mechanical arm with the measuring device and determining the error of the pointing action of the end of the selected mechanical arm toward the target object comprises:
determining the vector pointing from the target object to the end of the selected mechanical arm;
and determining the angle between this vector and the Z-direction component of the orientation matrix of the end of the selected mechanical arm as the error.
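For illustration only: claim 5's error metric reduces to the angle between the measured target-to-end vector and the Z column of the measured orientation matrix; a minimal sketch with hypothetical names.

    import numpy as np

    def pointing_error_deg(target_xyz, measured_end_xyz, measured_rotation):
        """Angle (degrees) between the target-to-end vector and the
        Z-direction components of the end orientation matrix."""
        v = np.asarray(measured_end_xyz, dtype=float) - np.asarray(target_xyz, dtype=float)
        v /= np.linalg.norm(v)
        z = np.asarray(measured_rotation, dtype=float)[:, 2]
        return np.degrees(np.arccos(np.clip(np.dot(v, z), -1.0, 1.0)))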
6. The robot pointing motion control method according to claim 1, wherein determining, according to the obtained pose of the end of the mechanical arm, the trajectory of each joint of the selected mechanical arm from its current pose to its target pose comprises:
solving inverse kinematics according to the obtained pose of the end of the mechanical arm to obtain the target pose of each joint of the mechanical arm;
and obtaining the trajectory of each joint of the selected mechanical arm from the current pose to the target pose through trajectory interpolation.
7. The robot pointing motion control method according to claim 6, wherein solving inverse kinematics according to the obtained pose of the end of the mechanical arm to obtain the target pose of each joint of the mechanical arm comprises:
solving inverse kinematics according to the pose of the end of the mechanical arm to obtain the target configuration of the selected mechanical arm, the target configuration comprising the target angles of the joints;
and performing trajectory interpolation according to the current configuration of the robot and the target configuration, and controlling the mechanical arm to move from the current configuration to the target configuration in which the robot points to the target object, the current configuration comprising the current angles of the joints.
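For illustration only: once inverse kinematics has produced the target joint angles, claim 7's final step is interpolation between two joint configurations. The claims do not name the interpolation scheme, so straight-line joint-space interpolation is an assumption.

    import numpy as np

    def interpolate_configuration(current_angles, target_angles, steps=100):
        """Yield a joint-space trajectory from the current configuration
        to the target configuration; linear per joint, no dynamics."""
        current = np.asarray(current_angles, dtype=float)
        target = np.asarray(target_angles, dtype=float)
        for s in np.linspace(0.0, 1.0, steps):
            yield (1.0 - s) * current + s * target

Each yielded configuration would be streamed to the joint controllers in sequence, moving the selected arm from its current configuration to the pointing configuration.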
8. A robot pointing motion control device, comprising:
a mechanical arm selection module, configured to select, when the target object is beyond the reachable range of the end of the mechanical arm of the robot, the one of the robot's two mechanical arms that is closer to the target object according to the target object coordinates of the target object in the robot coordinate system;
an end pose determining module, configured to obtain, according to the mapping of the target object coordinates, the shoulder joint coordinates of the selected mechanical arm in the robot coordinate system and the preset rule of the pointing action, the pose that points to the target object within the reachable range of the end of the selected mechanical arm;
a trajectory determining module, configured to determine, according to the obtained pose of the end of the mechanical arm, the trajectory of each joint of the selected mechanical arm from its current pose to its target pose;
and a motion control module, configured to control the selected mechanical arm to point to the target object according to the trajectory.
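For illustration only: the four modules of claim 8 chain naturally into one controller object; a hypothetical skeleton that reuses the sketches above and assumes a robot object exposing shoulder coordinates, arm lengths, an inverse-kinematics solver and joint control.

    class PointingController:
        """Mirrors claim 8: arm selection, end-pose determination,
        trajectory determination, and motion control (hypothetical API)."""

        def __init__(self, robot):
            self.robot = robot

        def point_at(self, target_xyz):
            arm = select_arm(target_xyz, self.robot.left_shoulder,
                             self.robot.right_shoulder)
            position = map_into_reach(target_xyz, self.robot.shoulder(arm),
                                      self.robot.arm_length(arm))
            rotation = end_orientation(position, target_xyz, arm)
            angles = self.robot.inverse_kinematics(arm, position, rotation)
            for q in interpolate_configuration(self.robot.joint_angles(arm), angles):
                self.robot.set_joint_angles(arm, q)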
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method of any of claims 1-7.
CN202310009464.8A 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium Active CN115922728B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310009464.8A CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium
PCT/CN2023/118887 WO2024037658A1 (en) 2023-01-03 2023-09-14 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310009464.8A CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115922728A (en) 2023-04-07
CN115922728B (en) 2023-06-30

Family

ID=86654337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310009464.8A Active CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN115922728B (en)
WO (1) WO2024037658A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115922728B (en) * 2023-01-03 2023-06-30 之江实验室 Robot pointing motion control method, apparatus, electronic device, and storage medium
CN116141341B (en) * 2023-04-21 2023-08-08 之江实验室 Method for realizing pointing action of five-degree-of-freedom mechanical arm meeting Cartesian space constraint

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150142796A (en) * 2014-06-11 2015-12-23 현대자동차주식회사 Method and system for controlling elbow of robot
CN105415372A (en) * 2015-12-09 2016-03-23 常州汉迪机器人科技有限公司 Multi-joint robot track planning method under constraint of safety space
WO2018053430A1 (en) * 2016-09-16 2018-03-22 Carbon Robotics, Inc. System and calibration, registration, and training methods
CN109048890A (en) * 2018-07-13 2018-12-21 哈尔滨工业大学(深圳) Coordination method for controlling trajectory, system, equipment and storage medium based on robot
KR20190048589A (en) * 2017-10-31 2019-05-09 충남대학교산학협력단 Apparatus and method for dual-arm robot teaching based on virtual reality
WO2019119724A1 (en) * 2017-12-21 2019-06-27 东南大学 Force sense information and posture information based limb motion intention understanding and upper limb rehabilitation training robot control method
CN112775931A (en) * 2019-11-05 2021-05-11 深圳市优必选科技股份有限公司 Mechanical arm control method and device, computer readable storage medium and robot
WO2021089550A1 (en) * 2019-11-06 2021-05-14 Koninklijke Philips N.V. Robotic positioning of a device
CN112828885A (en) * 2020-12-30 2021-05-25 诺创智能医疗科技(杭州)有限公司 Hybrid master-slave mapping method, mechanical arm system and computer equipment
CN113814988A (en) * 2021-11-24 2021-12-21 之江实验室 7-degree-of-freedom SRS type mechanical arm inverse solution analysis method and device and electronic equipment
CN114310915A (en) * 2022-02-16 2022-04-12 哈尔滨工业大学 Space manipulator butt joint end tool trajectory planning method based on visual feedback
CN114952868A (en) * 2022-07-26 2022-08-30 之江实验室 7-degree-of-freedom SRS (sounding reference Signal) type mechanical arm control method and device and piano playing robot

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011224737A (en) * 2010-04-21 2011-11-10 Toyota Motor Corp Guide robot, guide method, and program for controlling guide
KR101789756B1 (en) * 2010-12-29 2017-11-20 삼성전자주식회사 Robot and method for controlling the same
CN107972026B (en) * 2016-10-25 2021-05-04 河北亿超机械制造股份有限公司 Robot, mechanical arm and control method and device thereof
JP6720950B2 (en) * 2017-11-13 2020-07-08 株式会社安川電機 Laser processing method, controller and robot system
JP2020175466A (en) * 2019-04-17 2020-10-29 アズビル株式会社 Teaching device and teaching method
CN113601504A (en) * 2021-08-04 2021-11-05 之江实验室 Robot limb action control method and device, electronic device and storage medium
CN115922728B (en) * 2023-01-03 2023-06-30 之江实验室 Robot pointing motion control method, apparatus, electronic device, and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Design, Motion Analysis and Simulation of a Medicine-Dispensing Manipulator; Li Bo, Li Chengqun; Machine Tool & Hydraulics; Vol. 47, No. 17; pp. 72-75 *
Research and Simulation Design of a DSP-Based Welding Robot Control Algorithm; Deng Shun, Zhou Kangqu; Journal of Chongqing Technology and Business University (Natural Science Edition), No. 1; pp. 87-93 *

Also Published As

Publication number Publication date
CN115922728A (en) 2023-04-07
WO2024037658A1 (en) 2024-02-22

Similar Documents

Publication Publication Date Title
CN115922728B (en) Robot pointing motion control method, apparatus, electronic device, and storage medium
CN110780285B (en) Pose calibration method, system and medium for laser radar and combined inertial navigation
CN104842352B (en) Robot system using visual feedback
US20110093119A1 (en) Teaching and playback method based on control of redundancy resolution for robot and computer-readable medium controlling the same
CN112318506A (en) Automatic calibration method, device, equipment, mechanical arm and medium for mechanical arm
CN110842901A (en) Robot hand-eye calibration method and device based on novel three-dimensional calibration block
CN116277035B (en) Robot control method and device, processor and electronic equipment
CN114310901B (en) Coordinate system calibration method, device, system and medium for robot
CN112603542B (en) Hand-eye calibration method and device, electronic equipment and storage medium
CN111216136A (en) Multi-degree-of-freedom mechanical arm control system, method, storage medium and computer
CN115824041A (en) Laser calibration method, device, equipment and medium
JPH07121214A (en) Measuring sensor device for robot, and calibration method and measuring method using the same
Zarubin et al. Caging complex objects with geodesic balls
CN112476435B (en) Calibration method and calibration device for gravity acceleration direction and storage medium
JP2023517395A (en) ADJUSTMENT METHOD, APPARATUS AND READABLE STORAGE MEDIUM OF TOOL HEAD
Astad et al. Vive for robotics: Rapid robot cell calibration
CN115164823B (en) Method and device for acquiring gyroscope information of camera
CN113450903B (en) Human body action mapping method and device, computer equipment and storage medium
Wei et al. Multisensory visual servoing by a neural network
CN111971529A (en) Method and apparatus for managing robot system
JP2019197333A (en) Path correction method and control device of multiple spindle processing machine
CN115916475B (en) Calibration method, device and system for tool center point of robot
Henriksson et al. Maximizing the use of computational resources in multi-camera feedback control
CN113705378A (en) Sample data generation method and device and electronic equipment
CN112907669A (en) Camera pose measuring method and device based on coplanar feature points

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant