CN115213883A - Robot control method, device, medium, and apparatus - Google Patents

Robot control method, device, medium, and apparatus

Info

Publication number
CN115213883A
Authority
CN
China
Prior art keywords
robot
collision
information
target action
action file
Prior art date
Legal status
Pending
Application number
CN202110729602.0A
Other languages
Chinese (zh)
Inventor
李岩刚
Current Assignee
Cloudminds Shanghai Robotics Co Ltd
Original Assignee
Cloudminds Shanghai Robotics Co Ltd
Priority date
Filing date
Publication date
Application filed by Cloudminds Shanghai Robotics Co Ltd filed Critical Cloudminds Shanghai Robotics Co Ltd
Priority to CN202110729602.0A
Publication of CN115213883A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure relates to a robot control method, apparatus, medium, and device, including: acquiring spatial information of the robot, wherein the spatial information comprises position information and contour information of objects around the robot; performing motion planning on the robot according to the current pose and the spatial information of the robot to obtain a target action file, wherein the target action file comprises collision data, and the collision data comprises one or more groups of collision information between each execution component and the objects around the robot when the robot executes the target action file; and controlling the robot to reach the target pose according to the target action file. In this way, the motion control of the robot can refer to the collision data, so that damage to the robot or to surrounding objects caused by hard collisions is avoided and both the robot body structure and the objects in the environment are protected; no new motion trajectory needs to be calculated, which reduces the cost of the robot and its data-processing pressure, and property loss is reduced.

Description

Robot control method, device, medium, and apparatus
Technical Field
The present disclosure relates to the field of robot control, and in particular, to a robot control method, apparatus, medium, and device.
Background
At present, when a robot performs a task, collisions may occur between its body parts or between a body part and the external environment, and such collisions can damage the robot. To avoid this, the approaches commonly adopted in the industry are: (1) determining the accurate positions of the various objects in the robot's environment using high-precision sensors such as depth cameras, and recalculating and changing the robot's motion trajectory whenever it is judged that a body part will collide with the environment; or (2) measuring changes in joint torque in the actuators of the respective joints by means of torque sensors or the like, and stopping or changing the operation of those actuators when a measured change indicates that a body part has collided. Both approaches rely on deployed high-precision sensors for environment sensing: when sensor precision is insufficient, collision avoidance cannot be guaranteed, and the newly collected sensor data increases the amount of data the robot must process in time, further increasing its data-processing pressure. Moreover, a torque sensor can only respond after a collision has occurred, by which time the joints or other body parts involved may already be damaged, causing a certain property loss.
Disclosure of Invention
An object of the present disclosure is to provide a robot control method, apparatus, medium, and device that can control a robot with reference to collision data, so as to avoid damage to the robot or to surrounding objects caused by hard collisions, thereby protecting both the robot body structure and the objects in the robot's operating environment.
In order to achieve the above object, the present disclosure provides a robot control method, the method including:
acquiring space information of a robot, wherein the space information comprises position information and contour information of objects around the robot;
performing motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target action file, wherein the target action file comprises collision data, the collision data comprises one or more groups of collision information of each executing component and objects around the robot when the robot executes the target action file, and each group of collision information comprises joint information and time information corresponding to the executing component which has the collision;
and controlling the robot to reach the target pose according to the target action file.
Optionally, the controlling the robot to reach the target pose according to the target action file comprises:
and controlling the robot joint corresponding to the execution component to be in a flexible working mode in a corresponding time period according to the joint information and the time information in the collision data in the target action file.
Optionally, in the flexible working mode, the actuator in the robot joint drives the robot joint to move according to the torque information converted from the received current information.
Optionally, the performing motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target action file includes:
performing motion planning on the robot according to the current pose of the robot, the space information and the URDF robot description format file of the robot to obtain a target action file;
and when the robot executes the target action file, one or more groups of collision information generated between execution components of the robot are also included in the collision data.
Optionally, the performing motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target action file includes:
and performing motion planning on the robot according to the current pose of the robot, the space information and the target task to obtain the target action file.
Optionally, the position information and the contour information of the object around the robot include contour information of a surrounding obstacle, and the performing motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target motion file further includes:
amplifying the contour information of the surrounding obstacles by a preset multiple;
and the collision data is one or more groups of collision information of the robot and the surrounding obstacles after the contour information is amplified when the robot executes the target action file.
Optionally, the method further comprises:
detecting collision events actually occurring among execution parts of the robot in the process that the robot executes the target action file;
and controlling the robot joint corresponding to the collision event to be in the flexible working mode under the condition that the collision event is detected to occur.
Optionally, the method further comprises:
when the collision event is detected, judging whether the robot joint corresponding to the collision event and the time of the collision event are contained in the collision data in the target action file;
and when the robot joint corresponding to the collision event and the time of the collision event are not contained in the collision data in the target action file, recording the collision event as an abnormal collision, and reporting collision information related to the abnormal collision to a cloud.
The present disclosure also provides a robot control apparatus, the apparatus including:
the acquisition module is used for acquiring spatial information of the robot, wherein the spatial information comprises position information and contour information of objects around the robot;
the planning module is used for planning the motion of the robot according to the current pose of the robot and the spatial information to obtain a target action file, wherein the target action file comprises collision data, the collision data comprises one or more groups of collision information of each executing component and objects around the robot when the robot executes the target action file, and each group of collision information comprises joint information and time information corresponding to the executing component which has the collision;
and the control module is used for controlling the robot to reach the target pose according to the target action file.
The present disclosure also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method described above.
The present disclosure also provides an electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method described above.
According to the technical scheme, after the positions of the objects in the space around the robot are identified, their contours are taken into account so that, during motion planning, the collisions that may occur between the robot and surrounding objects are determined in advance. When the robot's motion is then controlled according to the planned target action file, the collision data can be consulted, so that damage to the robot or to surrounding objects caused by hard collisions is avoided, and both the robot body structure and the objects in the robot's operating environment are protected.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a schematic structural diagram illustrating a robotic system according to an exemplary embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating a robot control method according to an exemplary embodiment of the present disclosure.
FIG. 3 is a flowchart illustrating a robot control method according to yet another exemplary embodiment of the present disclosure.
Fig. 4 illustrates a flowchart of a robot control method according to yet another exemplary embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating a robot control method according to still another exemplary embodiment of the present disclosure.
Fig. 6 is a flowchart illustrating a robot control method according to still another exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram illustrating a configuration of a robot control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 8 is a block diagram illustrating a configuration of a robot control apparatus according to still another exemplary embodiment of the present disclosure.
FIG. 9 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 10 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
Fig. 1 is a schematic structural view of a robot system according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the robot system deployed in the present disclosure includes a Robot Control Unit RCU1 (Robot Control Unit), which connects upward to a cloud control module 2 wirelessly (for example via WiFi/4G/5G), downward to a robot central control unit CCU3 (Center Control Unit) via a wired connection (for example Ethernet/USB), and laterally to authorized devices 4 such as a PAD or PC via WiFi/Bluetooth for functional cooperation; the RCU1 thus serves as the bridge between cloud intelligence and the robot, enabling cloud intelligence to empower the robot. The robot body consists of the robot central control unit CCU3, an actuator control unit ECU4 (Electronic Control Unit), and a plurality of actuators SCA5 (Smart Composite Actuator) controlled by the actuator control unit ECU4. The robot central control unit CCU3 completes the computation (perception and cognition) tasks in the robot body; the actuator control unit ECU4, together with the plurality of actuators SCA5 it controls, completes the action tasks computed by the CCU3; and the actuator SCA5 is the final unit that carries out the various actions performed by the robot.
When a robot, especially a humanoid robot, executes an action task, collisions between its body parts, or between its body and the external environment, may occur as a natural consequence of the required actions. For example, during a preset dance movement, the arms return to the two sides of the body and hang naturally, and an arm may touch the robot's body; or, when the robot performs a door-opening action, its fingers may touch the door handle or the door panel. In order to avoid joint damage caused by such collisions when the robot moves, a robot control method as shown in fig. 2 is provided.
Fig. 2 is a flowchart illustrating a robot control method according to an exemplary embodiment of the present disclosure. As shown in fig. 2, the method includes steps 201 to 203.
In step 201, spatial information of the robot is obtained, where the spatial information includes position information and contour information of objects around the robot. The spatial information may be acquired in various ways, for example by using robot vision technology in combination with an image acquisition device, such as a depth camera, disposed on the robot. The objects around the robot may be determined, for example, according to the position of a target object to be grasped, or according to a target pose determined from that object: an object that obstructs or limits the robot while it grasps the target object may be determined as an object around the robot, or an object that obstructs or limits the robot while it moves to the target pose may be determined as an object around the robot, and so on.
In step 202, a motion plan is performed on the robot according to the current pose of the robot and the spatial information to obtain a target action file, where the target action file includes collision data, the collision data includes one or more sets of collision information of each execution unit and objects around the robot when the robot executes the target action file, and each set of collision information includes joint information and time information corresponding to the execution unit that has the collision.
After the target object to be grasped, or some other execution instruction, is determined, a target pose can be determined. The robot can then be subjected to motion planning according to its current pose and the spatial information obtained in step 201 to obtain a target action file, so that by moving according to the target action file the robot reaches the target pose and thereby grasps the object or executes the related instruction. The target action file may include multiple sets of action data comprising multiple joint motion sequences, and the central control unit CCU3 can control each joint of the robot to move accordingly, according to the action data in the target action file, so as to reach the target pose.
In the process of this motion planning, the collision information that may arise while the robot executes the target action file is also determined in advance. The collision information may be determined by direct data calculation, or by executing the target action file in the cloud on a digital twin corresponding to the robot; the specific analysis manner is not limited in this disclosure, as long as the collision data that may occur between the robot and surrounding objects when the robot executes the target action file can be determined from the robot's current pose and the spatial information.
The data structure of the collision information included in the collision data may be as shown in table 1 below.
TABLE 1
Collision information group    Colliding joint information    Collision start time    Collision end time
Group 1                        joint 0, joint 1               A1                      B1
Group 2                        joint 0, joint 2, joint 4      A2                      B2
Group 3                        joint 1, joint 3               A3                      B3
The time information of a collision occurrence is calculated relative to the execution time of the target action file. For example, if the robot needs 5 seconds to execute the target action file, the moment the robot starts executing it may be taken as second 00, and the time then increments continuously up to second 05 at the end of execution.
The joint information corresponding to each group of collision information included in the collision data may identify one joint or multiple joints, and when the joint information of an ongoing collision changes, the collision event corresponding to that group can be regarded as ended and a new group begun. For example, if a collision between joint 0 and joint 1 is first analyzed to occur at start time A1, and the set of colliding joints changes at time B1, then the group of collision information containing joint 0 and joint 1 is recorded with start time A1 and end time B1, and a new group of collision information is opened for the new combination of joints.
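Purely as an illustrative sketch (the class and field names and the numeric times are assumptions of this description, not part of the disclosed file format), the collision data of Table 1 could be represented as follows, with the symbolic times A1/B1, A2/B2, A3/B3 replaced by example values in seconds:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CollisionInfo:
        """One group of collision information, as in Table 1.

        Times are in seconds, measured relative to the start of the
        target action file's execution (second 00 = execution start).
        """
        joints: List[int]   # joints predicted to be colliding in this group
        start_time: float   # when the collision is predicted to begin
        end_time: float     # when the set of colliding joints changes

    @dataclass
    class TargetActionFile:
        action_data: list                    # joint motion sequences (placeholder)
        collision_data: List[CollisionInfo]  # one entry per group of Table 1

    # The three groups of Table 1, with illustrative numeric times:
    collision_data = [
        CollisionInfo(joints=[0, 1],    start_time=1.0, end_time=1.8),  # A1..B1
        CollisionInfo(joints=[0, 2, 4], start_time=2.4, end_time=3.1),  # A2..B2
        CollisionInfo(joints=[1, 3],    start_time=3.9, end_time=4.6),  # A3..B3
    ]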
In step 203, the robot is controlled to reach the target pose according to the target action file.
After the target action file including the collision data is obtained, collision events that may occur can be handled in advance according to the collision data: for example, the robot may be controlled to stop moving ahead of time, or the control may be adapted to reduce damage to body parts of the robot such as joints, for example by changing the control mode of the relevant joints in advance, and so on. The specific control operations employed in the present disclosure are given below.
According to the technical scheme, after the positions of the objects in the space around the robot are identified, their contours are taken into account so that, during motion planning, the collisions that may occur between the robot and surrounding objects are determined in advance. When the robot's motion is then controlled according to the planned target action file, the collision data can be consulted, so that damage to the robot or to surrounding objects caused by hard collisions is avoided, and both the robot body structure and the objects in the robot's operating environment are protected.
Fig. 3 is a flowchart illustrating a robot control method according to an exemplary embodiment of the present disclosure. As shown in fig. 3, the method further comprises step 301.
In step 301, the robot joint corresponding to the execution component is controlled to be in a flexible working mode during the corresponding time period, according to the joint information and the time information in the collision data.
While the robot executes the target action file, the actuators SCA5 of the robot generally operate in one of three possible working modes: position mode, speed mode, and current mode. In the position mode, the actuator SCA5 receives position information sent by the actuator control unit ECU4, moves to the corresponding position, and holds that position. In the speed mode, the actuator SCA5 receives speed information sent by the actuator control unit ECU4, accelerates or decelerates to the sent speed, and holds that speed. In the current mode, the actuator SCA5 receives current information sent by the actuator control unit ECU4, drives the joint according to the torque information converted from that current information, and holds that torque. In the speed mode and the position mode, if a collision between body parts of the robot occurs, the actuator SCA5 of the relevant joint cannot reach the corresponding position or speed, yet it keeps executing the control operation of moving to that position or speed; this easily damages the body parts or the relevant joints of the robot.
The flexible working mode may be, for example, the current mode described above: compared with the position mode and the speed mode, the actuator SCA5 is not forced to hold a fixed position or a fixed speed. In the current mode, the actuator in the robot joint drives the joint to move according to the torque information converted from the received current information. That is, the actuator control unit ECU4 can command the actuator SCA5 by issuing current information representing the torque the actuator needs to exert; upon receiving it, the actuator SCA5 increases or decreases its current to the commanded value and maintains it, continuously outputting the corresponding torque. Because there is a fixed conversion relationship between current and torque, if an externally applied torque is larger than the torque output by the actuator SCA5, the actuator will simply not move in the expected direction. Thus, in the current mode, damage to body parts of the robot, such as joints, can be avoided as much as possible when a collision occurs during movement.
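As a minimal sketch of this behavior (the torque constant and function names are illustrative assumptions, since the actual current-to-torque relationship depends on the actuator): the held torque is proportional to the commanded current, and the joint yields once the external torque exceeds it.

    TORQUE_CONSTANT = 0.12  # N*m per ampere; illustrative value only

    def current_to_torque(current_a: float) -> float:
        """Torque the actuator outputs and holds for a commanded current."""
        return TORQUE_CONSTANT * current_a

    def joint_yields(commanded_current_a: float, external_torque_nm: float) -> bool:
        """In the flexible (current) mode, the joint gives way instead of
        fighting a collision once the external torque exceeds its output."""
        return external_torque_nm > current_to_torque(commanded_current_a)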
In one possible embodiment, the flexible working mode can also be another working mode, for example one in which the actuator SCA5 is given more compliant control by widening the target position range or target speed range. For example, after the actuator control unit ECU4 issues position information and controls the actuator SCA5 to move to that position, any position within a certain range around the commanded position may be regarded as acceptable: if the actuator SCA5 cannot reach the exact commanded position within a preset time, it may still be considered to have finished executing the position information as long as it has moved into that range. As another example, after the actuator control unit ECU4 issues speed information and controls the actuator SCA5 to accelerate to that speed, a time threshold may be set: if the actuator SCA5 still cannot reach the speed when the threshold expires, it may be considered to have finished executing the speed information, and the actuator is not forced to keep moving in the event of a collision.
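A sketch of this widened-target variant, assuming a periodic completion check that knows the elapsed time; all tolerances and names are illustrative:

    def position_command_done(target: float, actual: float,
                              tol: float, wide_tol: float,
                              elapsed_s: float, timeout_s: float) -> bool:
        """The position command counts as executed if the joint reaches the
        narrow tolerance, or if the preset time expires while the joint is
        at least within the widened range near the target."""
        if abs(target - actual) <= tol:
            return True
        return elapsed_s >= timeout_s and abs(target - actual) <= wide_tol

    def speed_command_done(target_v: float, actual_v: float,
                           tol: float, elapsed_s: float, timeout_s: float) -> bool:
        """The speed command counts as executed once the speed is reached, or
        once the time threshold expires without reaching it (e.g., blocked
        by a collision)."""
        return abs(target_v - actual_v) <= tol or elapsed_s >= timeout_s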
In addition, the actuator SCA5 may be referred to differently depending on the information according to which it moves: when the movement of the actuator SCA5 is controlled according to position information in the position mode or the flexible working mode, it may be called a first actuator; when controlled according to speed information in the speed mode or the flexible working mode, a second actuator; and when controlled according to current information in the flexible working mode, a third actuator.
A specific example of controlling the corresponding robot joints to be in the flexible working mode during the corresponding time periods, according to the joint information and time information in the collision data, is as follows. If the collision data is that shown in table 1, then while the robot executes the target action file: joint 0 and joint 1 are controlled to be in the flexible working mode between start time A1 and end time B1; joint 0, joint 2, and joint 4 between start time A2 and end time B2; and joint 1 and joint 3 between start time A3 and end time B3. At times not covered by the collision data, or for joints not listed in it, the movement of the actuator SCA5 can be controlled in the position mode or the speed mode, as the actual situation requires.
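Reusing the CollisionInfo sketch above, the per-joint mode selection in this example might look as follows (the mode names are illustrative):

    def select_mode(joint_id: int, t: float,
                    collision_data: List[CollisionInfo]) -> str:
        """Return the working mode for one joint at time t (seconds since
        the target action file started executing)."""
        for group in collision_data:
            if joint_id in group.joints and group.start_time <= t <= group.end_time:
                return "flexible"   # e.g., the current mode
        return "position"           # or "speed", as the actual situation requires

With the example data sketched after Table 1, select_mode(2, 2.5, collision_data) returns "flexible" because second 2.5 falls inside the second group's window, while select_mode(3, 2.5, collision_data) falls back to the position mode.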
In a possible embodiment, the motion planning of the robot according to its current pose and the spatial information to obtain the target action file includes: performing motion planning on the robot according to the target pose, the current pose of the robot, the spatial information, and a URDF (Unified Robot Description Format) file of the robot to obtain the target action file. The URDF file describes the structure of each body part of the robot in an XML-based robot description format; that is, the target action file may be determined with reference to the robot's own description file. In this case, the collision data also includes one or more groups of collision information generated between the execution components of the robot itself when the robot executes the target action file, since the URDF file also allows the collisions that may occur between the robot's own execution components during execution to be determined. Therefore, the collision data included in the target action file may contain not only one or more groups of collision information between each execution component and the objects around the robot, but also one or more groups of collision information between the robot's own execution components.
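The disclosure does not prescribe a particular tool for this URDF-based analysis. Purely as an illustrative sketch, a physics engine such as pybullet can load a URDF file and report which link pairs come into contact at each planned waypoint; the file name, the zero-distance threshold, and the assumption that joint indices match the waypoint vector are all assumptions of this sketch:

    import pybullet as p

    p.connect(p.DIRECT)  # headless instance, suitable for offline analysis
    robot = p.loadURDF("robot.urdf", useFixedBase=True,
                       flags=p.URDF_USE_SELF_COLLISION)

    def self_collision_pairs(joint_positions, threshold=0.0):
        """Pose the model at one planned waypoint and return the link pairs
        closer than `threshold` (0.0 = touching).  Adjacent links would
        normally be filtered out; that filter is omitted here for brevity."""
        for joint_id, q in enumerate(joint_positions):
            p.resetJointState(robot, joint_id, q)
        points = p.getClosestPoints(bodyA=robot, bodyB=robot, distance=threshold)
        # each returned tuple carries the two link indices at positions 3 and 4
        return {(pt[3], pt[4]) for pt in points if pt[3] != pt[4]}

A planner could call self_collision_pairs() for every waypoint of the planned trajectory and merge consecutive waypoints with the same colliding pair into one group of collision information with its start and end times.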
In a possible embodiment, the motion planning of the robot according to its current pose and the spatial information to obtain the target action file includes: performing motion planning on the robot according to the current pose of the robot, the spatial information, and the target task to obtain the target action file. That is, before the motion planning, the current target task may be determined, so that the target action file is determined according to the target task. The target task may be any task that the robot can perform, such as opening a door, dancing, or moving to a designated location.
Fig. 4 is a flowchart illustrating a robot control method according to still another exemplary embodiment of the present disclosure, in which the position information and contour information of the objects around the robot include contour information of surrounding obstacles. As shown in fig. 4, the method further includes step 401 and step 402.
In step 401, the contour information of the surrounding obstacles is enlarged by a preset multiple, and the collision data is the one or more groups of collision information between the robot and the contour-enlarged surrounding obstacles when the robot executes the target action file.
The surrounding obstacles are those objects around the robot that collide with any execution component of the robot during execution of the target action file. Since the collision information included in the collision data is computed against the contour-enlarged obstacles, the robot has not yet actually collided with an obstacle when the collision start time recorded in any piece of collision information is reached. Enlarging the surrounding obstacles and re-determining the collision information used to control execution of the target action file therefore gives the robot an extra margin of protection against serious collision events: even if control precision is insufficient and there is some error in the robot's actual direction and distance of movement, serious damage to the robot, or to the objects around it, after a collision can be avoided as much as possible.
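A minimal sketch of the contour enlargement of step 401, assuming a contour represented as an array of points and a preset multiple that scales it about its centroid (the factor 1.2 is an illustrative value):

    import numpy as np

    def inflate_contour(points: np.ndarray, factor: float = 1.2) -> np.ndarray:
        """Scale an obstacle contour about its centroid by `factor` (> 1),
        so planned collisions are flagged before real contact occurs."""
        centroid = points.mean(axis=0)
        return centroid + factor * (points - centroid)

Collision analysis during motion planning would then run against inflate_contour(obstacle_points) rather than the raw contour.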
Fig. 5 is a flowchart illustrating a robot control method according to an exemplary embodiment of the present disclosure. As shown in fig. 5, the method further comprises step 501 and step 502.
In step 501, collision events actually occurring between execution components of the robot are detected while the robot executes the target action file.
In step 502, when the occurrence of the collision event is detected, the robot joint corresponding to the collision event is controlled to be in the flexible working mode.
The actually occurring collision events may be detected by deploying sensors that detect collisions, or by deploying an image acquisition device, such as a depth camera, on the robot. A sensor can determine whether a collision has actually occurred at a joint of the robot, for example from the detected torque. A depth camera can judge whether the robot has collided by determining, from the acquired images, whether the robot is in contact with surrounding objects or with its own parts, or whether it overlaps the enlarged contour of a surrounding obstacle.
That is, while the robot is controlled to execute the target action file, the relevant joints can be put into the flexible working mode not only according to the collision data obtained in advance by analysis, but also upon detection of an actually occurring collision. This avoids collision damage in the case where the collision data is not accurate enough, so that the robot actually collides while the relevant joints have not adjusted their working mode. Likewise, when the robot's action execution deviates during the target action file and an unexpected collision event occurs, collision damage is avoided without having to calculate a new motion trajectory.
Fig. 6 is a flowchart illustrating a robot control method according to an exemplary embodiment of the present disclosure. As shown in fig. 6, the method further includes step 601 and step 602.
In step 601, when the collision event is detected, it is determined whether the robot joint corresponding to the collision event and the time when the collision event occurs are included in the collision data.
In step 602, when the robot joint corresponding to the collision event and the time when the collision event occurs are not included in the collision data, the collision event is recorded as an abnormal collision, and collision information related to the abnormal collision is reported to a cloud.
When the occurrence of a collision event is detected, the event may or may not be one recorded in the collision data. If it is not recorded there, either a normal collision event was missed because the collision data is not accurate enough, or an unexpected abnormal collision occurred because of an error in action execution. Whatever the cause, a collision event is determined to be an abnormal collision as long as it is not included in the collision data.
For example, suppose that at second 32 of executing the target action file the robot detects that joint 1, joint 2, and joint 3 have all collided, that this time falls within the time window of a certain group of collision information recorded in the collision data, but that the joint information of that group includes only joint 1 and joint 3. Then the collision detected at joint 2 at second 32 is recorded as an abnormal collision.
In a possible implementation, when a robot joint recorded in the collision data is not detected to have the corresponding collision event at the corresponding time, the relevant collision information in the collision data is recorded as abnormal information, and the abnormal information is reported to a cloud. That is, besides checking whether each detected collision event is included in the collision data, all the collision information recorded in the collision data is compared one by one against all the detected collision events. If a piece of recorded collision information is never actually detected while the robot executes the target action file, the action may not have been executed in place due to an execution error, or the collision data obtained by analysis may not be accurate enough; in either case, the undetected collision information is treated as abnormal information and reported to the cloud for exception handling.
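Both comparison directions described above, detected collisions absent from the collision data (abnormal collisions) and recorded collision information that is never actually detected (abnormal information), can be sketched as follows, reusing the CollisionInfo structure from earlier; detected is assumed to be a list of (joint_id, time) pairs observed during execution:

    def in_group(group: CollisionInfo, joint_id: int, t: float) -> bool:
        """True if this detected (joint, time) pair is covered by the group."""
        return joint_id in group.joints and group.start_time <= t <= group.end_time

    def cross_check(detected, collision_data):
        """Return (abnormal_collisions, abnormal_info) for cloud reporting."""
        abnormal_collisions = [
            (j, t) for j, t in detected
            if not any(in_group(g, j, t) for g in collision_data)
        ]
        abnormal_info = [
            g for g in collision_data
            if not any(in_group(g, j, t) for j, t in detected)
        ]
        return abnormal_collisions, abnormal_info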
In one possible embodiment, the robot control method further includes: recording a first abnormal action when a robot joint fails to reach its expected position, and reporting the first abnormal action to a cloud; and/or recording a second abnormal action when controlling a robot joint to enter or exit the flexible working mode fails, and reporting the second abnormal action to a cloud. That is, all kinds of abnormal actions can be recorded and reported while the robot is controlled to execute the target action file. Based on the reported records of abnormal collisions, abnormal information, abnormal actions, and the like, the cloud control module 2 can send corresponding commands to the robot control unit RCU1 to apply a degree of intervention control to the robot.
In a possible embodiment, the abnormal collisions, abnormal information, and abnormal actions may also be reported to the central control unit CCU3, and the central control unit CCU3 may likewise apply a degree of intervention control to the robot's execution of the target action file according to the reported abnormal data: for example, the actuator control unit ECU4 may control the corresponding actuator SCA5 to stop moving or to change its direction of movement, further ensuring safety while the robot executes the target action file.
Fig. 7 is a block diagram illustrating a configuration of a robot control apparatus according to an exemplary embodiment of the present disclosure. As shown in fig. 7, the apparatus includes: an acquisition module 10, configured to acquire spatial information of a robot, the spatial information including position information and contour information of objects around the robot; a planning module 20, configured to perform motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target action file, where the target action file includes collision data, the collision data includes one or more groups of collision information occurring between each execution component and objects around the robot when the robot executes the target action file, and each group of collision information includes joint information and time information corresponding to the execution component involved in the collision; and a control module 30, configured to control the robot to reach the target pose according to the target action file.
According to the technical scheme, after the positions of the objects in the space around the robot are identified, their contours are taken into account so that, during motion planning, the collisions that may occur between the robot and surrounding objects are determined in advance. When the robot's motion is then controlled according to the planned target action file, the collision data can be consulted, so that damage to the robot or to surrounding objects caused by hard collisions is avoided, and both the robot body structure and the objects in the robot's operating environment are protected.
Fig. 8 is a block diagram illustrating a robot control apparatus according to still another exemplary embodiment of the present disclosure. As shown in fig. 8, the control module 30 includes: the first control submodule 301 is configured to control a robot joint corresponding to the execution component to be in a flexible working mode in a corresponding time period according to the joint information and the time information in the collision data.
In one possible embodiment, in the flexible working mode, the actuator in the robot joint drives the robot joint to move according to the torque information converted from the received current information.
In one possible embodiment, the planning module 20 is further configured to: perform motion planning on the robot according to the current pose of the robot, the spatial information, and the URDF robot description format file of the robot to obtain the target action file, wherein the collision data also includes one or more groups of collision information generated between the execution components of the robot when the robot executes the target action file.
In one possible embodiment, the planning module 20 is further configured to: perform motion planning on the robot according to the current pose of the robot, the spatial information, and the target task to obtain the target action file.
In a possible embodiment, the position information and the contour information of the objects around the robot include contour information of surrounding obstacles and, as shown in fig. 8, the apparatus further includes an amplification module 40, configured to enlarge the contour information of the surrounding obstacles by a preset multiple, wherein the collision data is the one or more groups of collision information between the robot and the contour-enlarged surrounding obstacles when the robot executes the target action file.
In a possible embodiment, as shown in fig. 8, the apparatus further comprises: a detection module 50, configured to detect collision events actually occurring between execution components of the robot while the robot executes the target action file; and the control module 30 is further configured to control the robot joint corresponding to a collision event to be in the flexible working mode when the occurrence of that collision event is detected.
In a possible embodiment, as shown in fig. 8, the device further comprises: a determining module 60, configured to determine, when the collision event is detected, whether the robot joint corresponding to the collision event and the time when the collision event occurs are included in the collision data; and an exception reporting module 70, configured to record the collision event as an abnormal collision when the robot joint corresponding to the collision event and the time of the collision event are not contained in the collision data in the target action file, and to report collision information related to the abnormal collision to a cloud.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 9 is a block diagram illustrating an electronic device 900 in accordance with an example embodiment. As shown in fig. 9, the electronic device 900 may include: a processor 901 and a memory 902. The electronic device 900 may also include one or more of a multimedia component 903, an input/output (I/O) interface 904, and a communications component 905.
The processor 901 is configured to control the overall operation of the electronic device 900, so as to complete all or part of the steps of the robot control method. The memory 902 is used to store various types of data to support operation on the electronic device 900, such as instructions for any application or method operating on the electronic device 900 and application-related data, such as contact data, sent or received messages, pictures, audio, video, and the like. The memory 902 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia component 903 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; a received audio signal may further be stored in the memory 902 or transmitted through the communication component 905. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 904 provides an interface between the processor 901 and other interface modules, such as a keyboard, mouse, or buttons, where the buttons may be virtual or physical. The communication component 905 is used for wired or wireless communication between the electronic device 900 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, 5G, NB-IoT, eMTC, or a combination of one or more of them, which is not limited herein. Accordingly, the communication component 905 may include a Wi-Fi module, a Bluetooth module, an NFC module, and the like.
In an exemplary embodiment, the electronic device 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the robot control method described above.
In another exemplary embodiment, there is also provided a computer-readable storage medium including program instructions, which when executed by a processor, implement the steps of the robot control method described above. For example, the computer readable storage medium may be the above-mentioned memory 902 comprising program instructions executable by the processor 901 of the electronic device 900 to perform the above-mentioned robot control method.
Fig. 10 is a block diagram of an electronic device 1000 shown in accordance with an example embodiment. For example, the electronic device 1000 may be provided as a server. Referring to fig. 10, the electronic device 1000 includes a processor 1022, which may be one or more in number, and a memory 1032 for storing computer programs executable by the processor 1022. The computer programs stored in memory 1032 may include one or more modules that each correspond to a set of instructions. Further, the processor 1022 may be configured to execute the computer program to perform the robot control method described above.
Additionally, the electronic device 1000 may also include a power component 1026 and a communication component 1050. The power component 1026 may be configured to perform power management of the electronic device 1000, and the communication component 1050 may be configured to enable communication, e.g., wired or wireless communication, of the electronic device 1000. In addition, the electronic device 1000 may also include an input/output (I/O) interface 1058. The electronic device 1000 may operate based on an operating system stored in the memory 1032, such as Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
In another exemplary embodiment, a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the robot control method described above is also provided. For example, the non-transitory computer readable storage medium may be the memory 1032 described above that includes program instructions executable by the processor 1022 of the electronic device 1000 to perform the robot control method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the robot control method described above when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the foregoing embodiments may be combined in any suitable manner without contradiction. In order to avoid unnecessary repetition, various possible combinations will not be separately described in this disclosure.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (11)

1. A robot control method, characterized in that the method comprises:
acquiring spatial information of a robot, wherein the spatial information comprises position information and contour information of objects around the robot;
performing motion planning on the robot according to the current pose of the robot and the spatial information to obtain a target action file, wherein the target action file comprises collision data, the collision data comprises one or more groups of collision information of each executing component and objects around the robot when the robot executes the target action file, and each group of collision information comprises joint information and time information corresponding to the executing component which has the collision;
and controlling the robot to reach the target pose according to the target action file.
2. The method of claim 1, wherein the controlling the robot to a goal pose according to a goal action file comprises:
and controlling the robot joint corresponding to the executing component to be in a flexible working mode in a corresponding time period according to the joint information and the time information in the collision data in the target action file.
3. The method of claim 2, wherein in the flexible operating mode, an actuator in the robot joint drives the robot joint to move according to torque information converted from the received current information.
4. The method of claim 1, wherein the motion planning of the robot to obtain a target action file according to the current pose of the robot and the spatial information comprises:
performing motion planning on the robot according to the current pose of the robot, the space information and the URDF robot description format file of the robot to obtain a target action file;
and the collision data also comprises one or more groups of collision information generated among the execution parts of the robot when the robot executes the target action file.
5. The method of claim 1, wherein the motion planning of the robot to obtain a target action file according to the current pose of the robot and the spatial information comprises:
and performing motion planning on the robot according to the current pose of the robot, the space information and the target task to obtain the target action file.
6. The method of claim 1, wherein the position information and contour information of the objects around the robot includes contour information of surrounding obstacles, and wherein the motion planning of the robot according to the current pose of the robot and the spatial information to obtain the target action file further comprises:
amplifying the contour information of the surrounding obstacles by a preset multiple;
and the collision data is one or more groups of collision information of the robot and the surrounding obstacles after the contour information is amplified when the robot executes the target action file.
7. The method of claim 1, further comprising:
detecting collision events actually occurring in each execution component of the robot in the process of executing the target action file by the robot;
and under the condition that the collision event is detected to occur, controlling the robot joint corresponding to the collision event to be in the flexible working mode.
8. The method of claim 7, further comprising:
when the collision event is detected, judging whether the robot joint corresponding to the collision event and the time of the collision event are contained in the collision data in the target action file;
and when the robot joint corresponding to the collision event and the time of the collision event are not contained in the collision data in the target action file, recording the collision event as an abnormal collision, and reporting collision information related to the abnormal collision to a cloud.
9. A robot control apparatus, characterized in that the apparatus comprises:
an acquisition module, used for acquiring spatial information of the robot, wherein the spatial information comprises position information and contour information of objects around the robot;
the planning module is used for performing motion planning on the robot according to the current pose of the robot and the space information to obtain a target action file, wherein the target action file comprises collision data, the collision data comprises one or more groups of collision information of each execution part and objects around the robot when the robot executes the target action file, and each group of collision information comprises joint information and time information corresponding to the execution part which has the collision;
and the control module is used for controlling the robot to reach the target pose according to the target action file.
10. A non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
11. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 7.
CN202110729602.0A 2021-06-29 2021-06-29 Robot control method, device, medium, and apparatus Pending CN115213883A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110729602.0A CN115213883A (en) 2021-06-29 2021-06-29 Robot control method, device, medium, and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110729602.0A CN115213883A (en) 2021-06-29 2021-06-29 Robot control method, device, medium, and apparatus

Publications (1)

Publication Number Publication Date
CN115213883A 2022-10-21

Family

ID=83606845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110729602.0A Pending CN115213883A (en) 2021-06-29 2021-06-29 Robot control method, device, medium, and apparatus

Country Status (1)

Country Link
CN (1) CN115213883A (en)

Similar Documents

Publication Publication Date Title
JP5283622B2 (en) Monitoring method and apparatus using camera for preventing collision of machine
US9469031B2 (en) Motion limiting device and motion limiting method
US10379513B2 (en) Monitoring system, monitoring device, and monitoring method
JP2020173744A (en) Image processing method using machine learning and electronic control device using it
US20080231221A1 (en) Arm-equipped mobile robot and method for controlling the same
US8634959B2 (en) Apparatus and method detecting a robot slip
CN106471546A (en) Control robot in the presence of mobile object
KR101947825B1 (en) Robot and method for operating a robot
KR102418451B1 (en) Robot control system
US9604362B2 (en) Method and apparatus for failure handling of a robot
US20220219323A1 (en) Method and system for operating a robot
CN114800535B (en) Robot control method, mechanical arm control method, robot and control terminal
CN112008722A (en) Control method and control device for construction robot and robot
JP2023547612A (en) Safety systems and methods used in robot operation
CN110856933B (en) Control device, robot system, and control method
KR100877715B1 (en) Reactive Layer Software Architecture Containing Sensing, Actuation and Real-Time Actions for Intelligent Robots
CN117562674A (en) Surgical robot and method performed by the same
JPH07271415A (en) Cooperative robot control method
WO2021085429A1 (en) Remotely controlled device, remote control system, and remote control device
JP2786874B2 (en) Movable position control device
CN115213883A (en) Robot control method, device, medium, and apparatus
CN114845841A (en) Control method, control device, robot system, program, and recording medium
US20220105633A1 (en) Integrity and safety checking for robots
CN115213882A (en) Robot control method, device, medium, and apparatus
CN114074323B (en) Safety system for ensuring speed and momentum boundary limitation of robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination