CN109822569B - Robot control method, system and storage medium - Google Patents

Robot control method, system and storage medium

Info

Publication number
CN109822569B
CN109822569B (application CN201910094952.7A)
Authority
CN
China
Prior art keywords
point
end effector
operable control
sequence
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910094952.7A
Other languages
Chinese (zh)
Other versions
CN109822569A (en)
Inventor
胡军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MGA Technology Shenzhen Co Ltd
Original Assignee
MGA Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MGA Technology Shenzhen Co Ltd filed Critical MGA Technology Shenzhen Co Ltd
Priority to CN201910094952.7A priority Critical patent/CN109822569B/en
Publication of CN109822569A publication Critical patent/CN109822569A/en
Application granted granted Critical
Publication of CN109822569B publication Critical patent/CN109822569B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)

Abstract

The embodiment of the invention provides a robot control method, a robot control system and a storage medium. The robot comprises an end effector, and the method comprises: obtaining information about at least one first position and at least one second position of the end effector; displaying a user interface, wherein the user interface includes a first operable control and a second operable control; setting the first position as a grasping point for an object in response to user operation of the first operable control; setting the second position as a release point for the object in response to user operation of the second operable control; and determining a motion trajectory of the end effector based on the grasping point and the release point. By providing a user interface, the technical scheme makes planning the carrying path of the end effector intuitive for the user, allows the carrying path of the robot to be planned simply and flexibly, and greatly improves the user experience.

Description

Robot control method, system and storage medium
Technical Field
The present invention relates to the field of robotics, and more particularly, to a robot control method, a robot control system, and a storage medium.
Background
Currently, user control of a robot's movement path and tasks is mostly achieved through pre-measurement and fixed programming tailored to a specific operating purpose and a relatively uniform task. Such a control mode can only drive the robot along a preset movement route and/or trajectory within a single, fixed mode of operation; in other words, it serves only one machine. Once the carrying path and/or the task of the robot needs to change, reprogramming or even redesign is usually required, which repeatedly wastes cost and human resources and results in a poor user experience.
Disclosure of Invention
The present invention has been made in view of the above problems. The invention provides a robot control method, a robot control system and a storage medium.
According to an aspect of an embodiment of the present invention, there is provided a robot control method, wherein the robot includes an end effector, the method including:
obtaining information about at least one first position and at least one second position of the end effector;
displaying a user interface, wherein the user interface includes a first operable control and a second operable control;
setting the first position as a grasp point for an object in response to user manipulation of the first operable control;
setting the second position to be a release point for the object in response to user manipulation of the second operable control;
determining a motion trajectory sequence of the end effector based on the grasping point and the release point; and controlling the end effector to carry the object according to the motion trajectory sequence.
Illustratively, the user interface further comprises a third operable control,
the method further comprises the following steps:
obtaining information about at least one third position of the end effector;
determining the third position as a transition point in response to user operation of the third operable control;
the determining the sequence of motion trajectories of the end effector is further based on the transition point.
Illustratively, the user interface further comprises a fourth operable control,
the determining a sequence of motion trajectories of the end effector based on the grasping point and the release point comprises:
and responding to the operation of the user on the fourth operable control, and determining the motion trail sequence according to the setting sequence of the grabbing point and the releasing point.
Illustratively, there are a plurality of grasping points and one release point, and the user interface further comprises a fifth operable control,
the determining a sequence of motion trajectories of the end effector based on the grasping point and the release point comprises:
in response to user operation of the fifth operable control, determining a grasping point sequence in the order in which the grasping points were set, and setting the release point after each grasping point in the sequence to form the motion trajectory sequence.
Illustratively, the user interface further comprises a sixth operable control for changing the currently selected grasping point into a release point and a seventh operable control for changing the currently selected release point into a grasping point.
Illustratively, the user interface further comprises a text editing area for displaying or editing information of the grasping point and the release point.
Illustratively, the displaying the user interface includes:
when the end effector moves to any one of the grasping point or the release point while being controlled to carry the object, highlighting, in the user interface, the text editing area corresponding to the position point that has been reached.
Illustratively, the end effector is a jaw;
the method further comprises the following steps: acquiring information of opening and closing parameters of the clamping jaw at the first position and the second position;
the text editing area is also used for displaying or editing the information of the opening and closing parameters.
Exemplarily, the user interface further includes an eighth operable control for unifying the opening and closing parameters of the clamping jaw at all position points in the motion trajectory sequence to the opening and closing parameter of the current position point.
Illustratively, the information of the at least one first location and the at least one second location includes: the coordinate value of the first position and the coordinate value of the second position of the end effector under a Cartesian rectangular coordinate system;
the user interface further comprises a ninth operable control for unifying the Z-axis coordinate values, in the Cartesian rectangular coordinate system, of all position points in the motion trajectory sequence to the Z-axis coordinate value of the current position point.
According to another aspect of the embodiments of the present invention, there is also provided a robot control system including a display and a processor:
the display is to display a user interface, wherein the user interface includes a first operable control and a second operable control;
the processor is configured to obtain information about at least one first position and at least one second position of an end effector of the robot; set the first position as a grasping point for an object in response to user operation of the first operable control; set the second position as a release point for the object in response to user operation of the second operable control; determine a motion trajectory sequence of the end effector based on the grasping point and the release point; and control the end effector to carry the object according to the motion trajectory sequence.
According to still another aspect of an embodiment of the present invention, there is also provided a storage medium having stored thereon program instructions which, when executed, perform the robot control method described above.
According to the robot control method, the robot control system and the storage medium, providing a user interface makes planning the carrying path of the end effector intuitive for the user, enables the robot's carrying path to be planned simply and flexibly, and greatly improves the user experience.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 shows a schematic flow diagram of a robot control method according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a user interface according to one embodiment of the invention;
FIG. 3 shows a schematic diagram of a user interface according to another embodiment of the invention;
fig. 4a and 4b show schematic views of a user interface according to a further embodiment of the invention, respectively.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of embodiments of the invention and not all embodiments of the invention, with the understanding that the invention is not limited to the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention described herein without inventive step, shall fall within the scope of protection of the invention.
According to an embodiment of the present invention, there is provided a robot control method. A robot is a machine that performs work automatically. A robot may include a robot body and an end effector (also referred to as a tool). The body may include a plurality of joints such as a base, an upper arm, a forearm and a wrist. The end effector is, for example, a jaw that can be opened and closed, but may also be another operating tool. The end effector is controlled by the robot control system to move along a given path and to complete designated actions at the corresponding positions. Specifically, for example, the end effector is controlled by a robot control system to move in three-dimensional space and to perform actions such as grasping, releasing or other actions at specified positions. The carrying path of the end effector can be planned in advance, so that the task can be executed automatically and repeatedly along the planned path.
Fig. 1 shows a schematic flow diagram of a robot control method 100 according to an embodiment of the invention. As shown in fig. 1, the robot control method 100 includes the steps of:
step S110, information about at least one first position and at least one second position of an end effector of the robot is acquired.
When the end effector is manipulated, a coordinate system of the robot can be established to determine the position information of the end effector, so that its movement trajectory can be accurately controlled during movement and related actions can be performed at designated positions. This makes it possible to set or control the trajectory of the end effector and to perform the associated action at the desired location.
Alternatively, the robot coordinate system may be a robot body coordinate system whose origin is the center point of the base of the robot. Because the base remains stationary while the various joints of the robot operate, performing robot control in the robot body coordinate system avoids repeated coordinate-system transformations and simplifies calculation.
It is to be understood that the above-described acquisition of the position information of the end effector of the robot may be performed by using various suitable sensors such as an encoder and an angle sensor.
It is understood that an end effector is a tool that occupies space rather than a single point. For convenience of calculation, the position information of one point in the coordinate system is used as the position information of the end effector. Alternatively, the position of a certain part of the end effector, or of a certain point in the space it occupies, is taken as the position of the end effector. Specifically, for example, if the end effector is a tool shaped roughly like a cone, the position of its tip may be taken as the position of the end effector. As another example, if the end effector is a jaw that can be opened and closed, the position of the center point of the planar figure formed by the end points of the jaw's teeth may be used as the position of the end effector.
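As a minimal, non-limiting sketch of this convention (assuming, purely for illustration, that the tooth-tip coordinates of a clamping jaw are already known in the robot body coordinate system), the reference position of the end effector can be taken as the centroid of the tooth-tip points; the function name and the coordinate values below are hypothetical.
from typing import Sequence, Tuple

Point3 = Tuple[float, float, float]

def end_effector_position(tooth_tips: Sequence[Point3]) -> Point3:
    """Return the centroid of the jaw tooth-tip points, used as the single
    point that stands in for the end effector in the coordinate system."""
    n = len(tooth_tips)
    xs, ys, zs = zip(*tooth_tips)
    return (sum(xs) / n, sum(ys) / n, sum(zs) / n)

# Example with a two-tooth jaw; the coordinates are illustrative only.
print(end_effector_position([(100.0, 20.0, 50.0), (100.0, 40.0, 50.0)]))
# -> (100.0, 30.0, 50.0)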
The most basic action of an end effector performing an object-handling task is to grasp an object at one location and then move to another location to release it. Therefore, when planning the task to be performed by the end effector, it is necessary to obtain information about at least two positions of the end effector, including a first position and a second position. The first position is used as the position at which the end effector grasps the object; the second position is used as the position at which the end effector releases the object.
Optionally, position information of the end effector is acquired in real time during the movement of the end effector. Alternatively, position information of the end effector may also be acquired in response to a user's manipulation during the movement of the end effector.
Step S120, displaying a user interface, where the user interface includes a first operable control and a second operable control.
In order to control the end effector of the robot to perform tasks more conveniently and intuitively, a user interface for human-computer interaction is provided. The user's operations are acquired through the user interface to complete the corresponding functions.
Based on the user interface, the user can control the robot more intuitively and visually. The user interface includes a first operable control and a second operable control.
FIG. 2 shows a schematic diagram of a user interface according to one embodiment of the invention. The user interface provides a plurality of operable controls, such as a first operable control (cPos) and a second operable control (dPos). The first operable control and the second operable control are used for controlling the end effector to perform related operations. In this user interface the operable controls are buttons that perform different functions. It will be appreciated by those skilled in the art that other kinds of operable controls may also be employed, such as a text edit box or the like.
Step S130, in response to the user operating the first operable control, setting the first position as a grasping point of the object.
After the first position of the end effector is obtained, the user may operate the first operable control in the user interface. The first position is set as a grasping point of the object in response to user operation of the first operable control. For example, the user may operate the first operable control when the end effector has moved to the first position, and the first position is then set as the grasping point of the object in response to that operation.
It will be appreciated that the grasping point is the point at which the end effector performs a grasping action on the object. The end effector may perform various specific actions at different locations, such as a grasping action, a releasing action or other actions. Both movement and the performance of a particular action are associated with particular locations. For example, when moving from one position to another, the movement path includes a number of points at which the end effector performs no action. Likewise, if a corresponding action is performed at a certain location point, that point is the point at which the action is performed, for example a grasping point, a release point or another action point. By setting the first position as the grasping point of the object, the action of grasping the object is performed at this first position each time the robot moves to it during subsequent work.
Alternatively, the grasping point may be an initial position of the end effector or a position reached after the end effector has moved a distance.
Step S140, setting the second position as a release point of the object in response to the user operating the second operable control.
Similar to step S130, after the second position of the end effector is obtained, the user may operate a second operable control in the user interface. The second position is set to a release point of the object in response to user manipulation of the second operable control. By setting the second position as the release point of the object, the action of releasing the object is performed at this second position each time the robot is moved to this second position during the subsequent work.
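The following sketch shows one way steps S130 and S140 could be wired to the first and second operable controls; the control names cPos and dPos come from FIG. 2, while read_current_position, the PositionPoint record and the point list are hypothetical helpers introduced only for illustration.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PositionPoint:
    name: str          # e.g. "C0" for a grasping point, "D0" for a release point
    kind: str          # "grasp", "release" or "transition"
    x: float
    y: float
    z: float

def read_current_position() -> Tuple[float, float, float]:
    # Hypothetical stand-in for reading the end effector position, e.g. from encoders.
    return (100.0, 30.0, 50.0)   # illustrative values only

points: List[PositionPoint] = []   # plays the role of the data area shown in FIG. 3

def _next_name(prefix: str) -> str:
    count = sum(1 for p in points if p.name.startswith(prefix))
    return f"{prefix}{count}"

def on_cpos_clicked() -> None:
    """First operable control (step S130): set the current position as a grasping point."""
    x, y, z = read_current_position()
    points.append(PositionPoint(_next_name("C"), "grasp", x, y, z))

def on_dpos_clicked() -> None:
    """Second operable control (step S140): set the current position as a release point."""
    x, y, z = read_current_position()
    points.append(PositionPoint(_next_name("D"), "release", x, y, z))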
Step S150, determining a motion trajectory sequence of the end effector based on the grasping point and the release point.
When planning a carrying task of an end effector of a robot, it is necessary to plan the action and the execution position of each subtask. For example, subtask A may grasp the object at position P1, and subtask B may release the object at position P2. Subtasks A and B are executed in the order of A first and then B (in terms of execution positions, P1 then P2). When performing the carrying task, the end effector therefore first moves to position P1 to grasp the object and then moves from position P1 to position P2 to release the object just grasped.
Based on the grasping point and the release point of the end effector determined in steps S130 and S140 respectively, the motion trajectory sequence of the end effector can be determined.
Step S160, controlling the end effector to carry the object according to the motion trajectory sequence. The object can be carried by controlling the end effector to operate according to the motion trajectory sequence obtained in step S150.
In one example, step S160 specifically includes the following steps:
step S161: determining position point information of a current subtask in the object transporting task;
step S162: controlling the end effector to move to the position point according to the position point information, and executing the action of the current subtask at the position point;
step S163: judging whether a next subtask exists; if not, stopping the operation and ending; if a next subtask exists, continuing to the next step;
step S164: setting the next subtask as the current subtask, and returning to step S161.
After the above steps are executed, the end effector completes the corresponding actions according to the motion trajectory sequence. The end effector may grasp the object at a grasping point and then move to a release point to release it, thereby carrying the object under control.
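Steps S161 to S164 amount to iterating over the subtasks of the carrying task in order. The sketch below shows that loop; move_to and perform_action are hypothetical robot-side primitives introduced only for illustration, and transition points are represented by subtasks with no action.
from typing import Callable, List, Optional, Tuple

# A subtask is a target position plus the action to perform there
# ("grasp", "release", or None for a transition point).
Subtask = Tuple[Tuple[float, float, float], Optional[str]]

def execute_carrying_task(subtasks: List[Subtask],
                          move_to: Callable[[Tuple[float, float, float]], None],
                          perform_action: Callable[[str], None]) -> None:
    """Run steps S161-S164: take the current subtask, move to its position point,
    perform its action there, then continue with the next subtask until none remain."""
    for position, action in subtasks:      # S161/S164: current subtask, then the next one
        move_to(position)                  # S162: move to the position point
        if action is not None:             # transition points carry no action
            perform_action(action)         # S162: perform the subtask's action
    # S163: when no further subtask exists, the operation stops here.

# Illustrative usage with made-up coordinates.
log = []
execute_carrying_task(
    [((0.0, 0.0, 40.0), None),            # transition point
     ((50.0, 10.0, 40.0), "grasp"),       # grasping point
     ((0.0, 0.0, 40.0), "release")],      # release point
    move_to=lambda p: log.append(("move", p)),
    perform_action=lambda a: log.append((a,)))
print(log)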
It will be understood by those of ordinary skill in the art that although the various steps of the robot control method 100 are described in a particular order, the order is illustrative only and not limiting of the present invention. For example, step S130 and step S140 may be performed in reverse order.
The robot control method provides a user interface that makes planning the carrying path of the end effector intuitive for the user. As a result, the user can control the robot to carry objects simply and flexibly, without pre-measurement or pre-programming, which greatly improves the user experience.
Illustratively, as shown in FIG. 2, the user interface may further include a third operable control (Pos). In addition to acquiring information on at least one first position and at least one second position of the end effector at step S110, the robot control method further includes: obtaining information about at least one third position of the end effector; determining a third position as the transition point in response to user operation of the third operable control; the aforementioned step S150 determines the motion trajectory sequence of the end effector based on the transition point in addition to the grasping point and the release point.
The third position is used as a transition point in the path of motion of the robot end effector. The transition point is the point through which the path of motion of the end effector passes, and at the transition point, the end effector simply passes without performing any additional action. Like the release point and the grasp point, the transition point may also be added to the sequence of motion trajectories of the end effector.
Similar to step S130 of the method 100 described above, after the third position of the end effector is obtained, the user may operate a third operable control in the user interface. The third position is set as the transition point in response to user operation of the third operable control.
It can be understood that determining the motion trajectory sequence based on the grasping points, the transition points and the release points together allows the motion trajectory to be planned in more detail; for example, obstacles can be avoided so that the robot carries objects more smoothly.
FIG. 3 shows a schematic diagram of a user interface according to another embodiment of the invention. As shown in FIG. 3, the user interface includes a text editing area (shown as a data area) that contains the data used to control the end effector to perform a given carrying task. In other words, the data area may be used to display or edit information of the grasping points and release points. It will be appreciated that information about a third position of the end effector, i.e. a transition point, may also be obtained in the robot control method as described above. In that case the text editing area may also be used to display or edit information of the transition point.
The first line of the data area is a header. From left to right, the header gives the name of the position point, its X-axis coordinate value, its Y-axis coordinate value, its Z-axis coordinate value, and the opening and closing parameter of the end effector at that position point.
Among the position points, the grasping points are denoted C*, where * represents the index. For example, C0 is the first grasping point and C1 is the second grasping point. The release points are denoted D*, where * likewise represents the index. For example, D0 is the first release point and D1 is the second release point. It is understood that the index merely distinguishes the position points and does not indicate any order among them.
As shown in FIG. 3, the information of each position point may include coordinate values of the end effector in a Cartesian rectangular coordinate system. In other words, the information for the first position serving as the grasping point and the second position serving as the release point includes the coordinate value of the first position and the coordinate value of the second position of the end effector in the Cartesian rectangular coordinate system. Similarly, the information for the third position serving as the transition point may include the coordinate values of the third position of the end effector in the Cartesian rectangular coordinate system (not shown in FIG. 3).
The Cartesian rectangular coordinate system may be the robot body coordinate system described above, which may be defined according to the installation position of the robot. For example, if the robot is mounted on the ground, the X-axis and the Y-axis may both be defined in the horizontal plane. As another example, if the robot is mounted on a wall perpendicular to the ground, the X-axis and the Y-axis may both be defined in a vertical plane parallel to the wall. Coordinate values in the Cartesian rectangular coordinate system uniquely determine each position, so the end effector can conveniently and accurately locate the position points at which the task is executed.
It will be appreciated that the text editing area may support editing functions to edit such location information in addition to displaying such location information as described above. The editing may include modifying data, adding data, deleting data, and the like.
Alternatively, in the text editing area, a newly set position point (such as a grasping point or a release point) may by default be arranged after the position points already set, so that the setting order of all position points is determined.
The setting order of the position points can also be adjusted. For example, in response to a user operation, a new position point is inserted before the current position point in the text editing area and its attribute, such as grasping point, release point or transition point, is set. As another example, at least one position point in the text editing area is deleted in response to a user operation. As yet another example, the position of any one or more position points in the text editing area is changed in response to a user operation, for example moved forward or backward in the order.
Based on the display and editing operation of the text editing area, the user can simply and conveniently plan the carrying task of the end effector, and the working efficiency is improved.
As shown in FIG. 3, the user interface may include a fourth operable control. The fourth operable control is used for determining the motion trajectory sequence of the end effector. The motion trajectory sequence comprises the grasping points and the release point. The step S150 of determining the motion trajectory sequence of the end effector based on the grasping point and the release point includes: in response to user operation of the fourth operable control, determining the motion trajectory sequence according to the order in which the grasping points and the release point were set. It will be appreciated that the setting order corresponds to the front-to-back order of the position points in the data area of the user interface shown in FIG. 3.
Specifically, the data shown in the data area in FIG. 3 is taken as an example. As shown in the data area of FIG. 3, the object-carrying task involves four position points, D0, C1, C2 and C3. The X-, Y- and Z-axis coordinate values of the four position points in the coordinate system are also shown in FIG. 3. The front-to-back order of the position points in the data area is their setting order. In response to the user operating the fourth operable control (i.e., clicking the button), the motion trajectory sequence is determined according to the setting order of the grasping points and the release point. Thus, the motion trajectory sequence of the end effector may be D0-C1-C2-C3-D0.
When performing the object-carrying task, the end effector starts from release point D0. It then passes through the grasping points C1, C2 and C3 in sequence, performing a grasping action at each of them, and finally returns to the release point D0, where it performs the action of releasing the objects. In one example, the end effector is a suction cup: it picks up an object at each of the grasping points C1, C2 and C3 and finally releases them together at the release point D0.
It will be appreciated that transition points may also be included in the data area, in which case the motion trajectory sequence correspondingly includes the transition points. The transition points are denoted N*, where * represents the index. For example, N0 is the first transition point and N1 is the second transition point. It is to be appreciated that the end effector does not perform any additional action at a transition point.
In some applications, the user may set the grasping points, release points and transition points (if any) according to the desired motion path of the end effector. The fourth operable control then makes the subsequent determination of the motion trajectory sequence more convenient and improves the user experience.
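The following sketch reflects one plausible reading of the fourth operable control based on the worked example above: the trajectory simply follows the position points in the order they were set, and, as an assumption drawn from the D0-C1-C2-C3-D0 example, the first release point is appended at the end when the last set point is a grasping point. The function and point names are illustrative only.
from typing import List

def trajectory_in_setting_order(point_names: List[str]) -> List[str]:
    """Fourth operable control (sketch): follow the position points in the order
    they were set; if the last set point is a grasping point (name starting with
    "C"), append the first release point so the carried objects are released
    (assumption based on the D0-C1-C2-C3-D0 example)."""
    sequence = list(point_names)
    if sequence and sequence[-1].startswith("C"):
        first_release = next((p for p in sequence if p.startswith("D")), None)
        if first_release is not None:
            sequence.append(first_release)
    return sequence

print(trajectory_in_setting_order(["D0", "C1", "C2", "C3"]))
# -> ['D0', 'C1', 'C2', 'C3', 'D0']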
Illustratively, in some applications there are multiple grasping points and one release point. In these applications it may be desirable to control the end effector to grasp objects at the multiple grasping points but to release them all at the same release point. In this case, the user may operate a fifth operable control in the user interface shown in FIG. 3, which is likewise used to determine the motion trajectory sequence of the end effector.
The step S150 of determining the motion trajectory sequence of the end effector based on the grasping point and the release point includes: in response to user operation of the fifth operable control, determining a grasping point sequence in the order in which the grasping points were set, and setting the release point after each grasping point in that sequence to form the motion trajectory sequence.
Still taking the user interface shown in FIG. 3 as an example, in response to the user operating the fifth operable control (i.e., clicking the button), the grasping point sequence C1-C2-C3 is first determined in the order in which the grasping points were set, and the release point is then set after each grasping point in the sequence C1-C2-C3 to form the motion trajectory sequence. The motion trajectory sequence of the end effector is therefore D0-C1-D0-C2-D0-C3-D0.
When performing the object-carrying task, the end effector starts from release point D0. It then moves to grasping point C1 to grasp an object and returns to release point D0 to release it; next it moves to grasping point C2 to grasp an object, and so on, until all the grasping actions have been performed. Finally, the end effector returns to release point D0, where it performs the action of releasing the object.
Likewise, the handling task of the end effector described above may also involve a transition point.
In some application scenarios, it is necessary to control the end effector to grasp objects at multiple grasping points but to release them all at the same release point. The fifth operable control avoids repeatedly setting the release point, makes the subsequent determination of the motion trajectory sequence more convenient, and improves the user experience.
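The fifth operable control's interleaving rule can be sketched as follows. Starting the sequence at the release point mirrors the D0-C1-D0-C2-D0-C3-D0 example above and is an assumption rather than something the description spells out; the function name is illustrative only.
from typing import List

def trajectory_with_interleaved_release(grasp_points: List[str],
                                        release_point: str) -> List[str]:
    """Fifth operable control (sketch): keep the grasping points in the order
    they were set and place the single release point after each of them."""
    sequence: List[str] = [release_point]   # assumption: start at the release point,
                                            # as in the D0-C1-D0-... example
    for grasp in grasp_points:
        sequence.extend([grasp, release_point])
    return sequence

print(trajectory_with_interleaved_release(["C1", "C2", "C3"], "D0"))
# -> ['D0', 'C1', 'D0', 'C2', 'D0', 'C3', 'D0']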
Illustratively, as shown in FIG. 3, the user interface further comprises a sixth operable control for changing the currently selected grasping point into a release point and a seventh operable control for changing the currently selected release point into a grasping point. Using the sixth or seventh operable control, a grasping point and a release point can be conveniently switched, which facilitates the user's operation.
Illustratively, while the end effector is being controlled to carry the object, when it moves to any one of the grasping points, release points or transition points (if any), the part of the text editing area corresponding to that position point is highlighted in the user interface. FIGS. 4a and 4b show a user interface according to a further embodiment of the invention. FIG. 4a shows the user interface displayed when the end effector has moved to transition point N0, and FIG. 4b shows the user interface displayed when the end effector has moved to transition point N1. While the end effector executes the object-carrying task, the text editing area is updated synchronously and the position of the end effector is displayed in real time, so the user always knows the current movement state of the end effector.
Exemplarily, in the case where the end effector is a clamping jaw, the robot control method further comprises: acquiring information of the opening and closing parameters of the clamping jaw at the first position and the second position. When the clamping jaw grasps the object at the first position, its opening and closing parameter at the moment of grasping can be acquired; similarly, when the clamping jaw releases the object at the second position, its opening and closing parameter at the moment of release can be acquired.
Alternatively, if the clamping jaw opens and closes by rotation of its teeth, the opening and closing parameter may be the opening and closing angle between the teeth. If the clamping jaw opens and closes based on the linear distance between its teeth, the opening and closing parameter may be the opening and closing distance between the teeth.
The text editing area in the user interface can also be used for displaying or editing the opening and closing parameter information of the position points. The fifth column of the user interface shown in FIG. 3 contains the opening and closing parameter of each position point.
For robots whose end effector is a clamping jaw, the opening and closing parameters of the jaw are of great importance for successfully carrying objects. Providing the opening and closing parameters of the jaw at the grasping and release points helps the user control the robot to perform the carrying task more effectively.
Illustratively, the user interface may further include an eighth operable control, as shown in FIG. 2. The eighth operable control is used for unifying the opening and closing parameters of the clamping jaw at all position points in the motion trajectory sequence to the opening and closing parameter of the current position point.
In one example, the current position point is first determined in response to a user action, such as clicking the left mouse button at a target position point in the user interface. Then, in response to the user clicking the eighth operable control shown in FIG. 2, the opening and closing parameters of the clamping jaw at all position points in the motion trajectory sequence are unified to the opening and closing parameter of the determined current position point.
The eighth operable control allows the opening and closing parameters of the clamping jaw to be set in batches, which facilitates the user's operation.
Similarly, the user interface may further include a ninth operable control, as shown in FIG. 2. The ninth operable control is used for unifying the Z-axis coordinate values, in the Cartesian rectangular coordinate system, of all position points in the motion trajectory sequence to the Z-axis coordinate value of the current position point. As a result, the Z-axis coordinate values of all position points in the motion trajectory sequence become the Z-axis coordinate value of the determined current position point.
Setting the Z-axis coordinate values of all position points in the motion trajectory sequence to a single value means the end effector effectively moves within one plane, which reduces the difficulty of controlling the robot. In addition, the ninth operable control allows the Z-axis coordinate values of the position points to be set in batches, which facilitates the user's operation.
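The eighth and ninth operable controls both perform the same kind of batch edit on the rows of the data area. A minimal sketch follows, assuming a simple row record with the columns shown in FIG. 3; the names and values are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Row:
    name: str
    x: float
    y: float
    z: float
    open_close: float   # opening and closing parameter of the clamping jaw

def unify_open_close(rows: List[Row], current: int) -> None:
    """Eighth operable control (sketch): copy the opening and closing parameter
    of the current position point to every position point in the sequence."""
    value = rows[current].open_close
    for row in rows:
        row.open_close = value

def unify_z(rows: List[Row], current: int) -> None:
    """Ninth operable control (sketch): copy the Z-axis coordinate value of the
    current position point to every position point, so that the end effector
    effectively moves within a single plane."""
    value = rows[current].z
    for row in rows:
        row.z = value

# Illustrative data loosely matching the D0/C1/C2 example; values are made up.
rows = [Row("D0", 0.0, 0.0, 40.0, 30.0),
        Row("C1", 50.0, 10.0, 42.0, 10.0),
        Row("C2", 60.0, 20.0, 41.0, 12.0)]
unify_z(rows, current=0)            # every row now has z == 40.0
unify_open_close(rows, current=0)   # every row now has open_close == 30.0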
According to another embodiment of the invention, a robot control system is also provided. The robot control system includes a display and a processor. The display is configured to display a user interface, wherein the user interface includes a first operable control and a second operable control. The processor is configured to obtain information about at least one first position and at least one second position of an end effector of the robot; set the first position as a grasping point for an object in response to user operation of the first operable control; set the second position as a release point for the object in response to user operation of the second operable control; determine a motion trajectory sequence of the end effector based on the grasping point and the release point; and control the end effector to carry the object according to the motion trajectory sequence.
Furthermore, according to still another aspect of the present invention, there is also provided a storage medium having stored thereon program instructions which, when executed by a computer or a processor, cause the computer or the processor to execute the respective steps of the above-described robot control method of the embodiment of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, Read Only Memory (ROM), Erasable Programmable Read Only Memory (EPROM), portable compact disc read only memory (CD-ROM), USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
A person skilled in the art can understand specific implementation schemes of the robot control system and the storage medium by reading the above description related to the robot control method, and details are not described herein for brevity.
According to the robot control method, the robot control system and the storage medium provided by the embodiments of the invention, providing a user interface makes planning the carrying path of the end effector intuitive for the user, allows the robot's carrying path to be planned simply and flexibly, and greatly improves the user experience.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules used in the robot control system according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (11)

1. A method of robot control, wherein the robot includes an end effector, the method comprising:
obtaining information about at least one first position and at least one second position of the end effector;
displaying a user interface, wherein the user interface includes a first operable control, a second operable control, and a fourth operable control;
setting the first position as a grasp point for an object in response to user manipulation of the first operable control;
setting the second position to be a release point for the object in response to user manipulation of the second operable control;
determining a motion trajectory sequence of the end effector based on the grasping point and the release point, comprising: in response to user operation of the fourth operable control, determining the motion trajectory sequence according to the order in which the grasping point and the release point were set; and
controlling the end effector to carry the object according to the motion trajectory sequence.
2. The method of claim 1, wherein the user interface further comprises a third operable control,
the method further comprises the following steps:
obtaining information about at least one third position of the end effector;
determining the third position as a transition point in response to user operation of the third operable control;
the determining the sequence of motion trajectories of the end effector is further based on the transition point.
3. The method according to claim 1 or 2, wherein there are a plurality of grasping points and one release point, and the user interface further includes a fifth operable control,
the determining a sequence of motion trajectories of the end effector based on the grasping point and the release point comprises:
in response to user operation of the fifth operable control, determining a grasping point sequence in the order in which the grasping points were set, and setting the release point after each grasping point in the sequence to form the motion trajectory sequence.
4. The method of claim 1 or 2, wherein the user interface further comprises a sixth operable control for changing the currently selected grasping point into a release point and a seventh operable control for changing the currently selected release point into a grasping point.
5. The method of claim 1 or 2, wherein the user interface further comprises a text editing area for displaying or editing information of the grasping point and the release point.
6. The method of claim 5, wherein the displaying the user interface comprises:
when the end effector moves to any one of the grasping point or the release point while being controlled to carry the object, highlighting, in the user interface, the text editing area corresponding to the position point that has been reached.
7. The method of claim 5, wherein the end effector is a jaw;
the method further comprises the following steps: acquiring information of opening and closing parameters of the clamping jaw at the first position and the second position;
the text editing area is also used for displaying or editing the information of the opening and closing parameters.
8. The method of claim 7, wherein the user interface further comprises an eighth operable control for unifying the opening and closing parameters of the clamping jaw at all position points in the motion trajectory sequence to the opening and closing parameter of the current position point.
9. The method of claim 1 or 2, wherein the information of the at least one first location and the at least one second location comprises: the coordinate value of the first position and the coordinate value of the second position of the end effector under a Cartesian rectangular coordinate system;
the user interface further comprises a ninth operable control for unifying the Z-axis coordinate values, in the Cartesian rectangular coordinate system, of all position points in the motion trajectory sequence to the Z-axis coordinate value of the current position point.
10. A robotic control system comprising a display and a processor:
the display is to display a user interface, wherein the user interface includes a first operable control, a second operable control, and a fourth operable control;
the processor is configured to obtain information about at least one first position and at least one second position of an end effector of the robot; set the first position as a grasping point for an object in response to user operation of the first operable control; set the second position as a release point for the object in response to user operation of the second operable control; determine a motion trajectory sequence of the end effector based on the grasping point and the release point; and control the end effector to carry the object according to the motion trajectory sequence; wherein the determining a motion trajectory sequence of the end effector based on the grasping point and the release point comprises: in response to user operation of the fourth operable control, determining the motion trajectory sequence according to the order in which the grasping point and the release point were set.
11. A storage medium on which program instructions are stored, the program instructions being operable when executed to perform a robot control method according to any one of claims 1 to 9.
CN201910094952.7A 2019-01-30 2019-01-30 Robot control method, system and storage medium Active CN109822569B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910094952.7A CN109822569B (en) 2019-01-30 2019-01-30 Robot control method, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910094952.7A CN109822569B (en) 2019-01-30 2019-01-30 Robot control method, system and storage medium

Publications (2)

Publication Number Publication Date
CN109822569A CN109822569A (en) 2019-05-31
CN109822569B true CN109822569B (en) 2021-04-09

Family

ID=66863161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910094952.7A Active CN109822569B (en) 2019-01-30 2019-01-30 Robot control method, system and storage medium

Country Status (1)

Country Link
CN (1) CN109822569B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110238299A (en) * 2019-06-21 2019-09-17 珠海格力智能装备有限公司 Control method and device for blanking of punch press, storage medium and processor
CN115903688A (en) * 2022-11-04 2023-04-04 北京镁伽机器人科技有限公司 Control method and device for automation system and automation process

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4174342B2 (en) * 2003-02-19 2008-10-29 ファナック株式会社 Work transfer device
CN108349085B (en) * 2015-10-30 2021-12-28 株式会社安川电机 Robot teaching device, computer program, and robot teaching method
CN108621151A (en) * 2017-03-20 2018-10-09 罗德科技股份有限公司 Icon formula programmable control method for Machinery Control System
CN107186715B (en) * 2017-05-25 2020-07-31 深圳市越疆科技有限公司 Method and device for controlling movement of mechanical arm, storage medium and computer
CN107972035A (en) * 2018-01-02 2018-05-01 北京翰辰自动化***有限公司 Industrial robot programming instruction set and graphical processing method therefor
CN108268255A (en) * 2018-02-11 2018-07-10 遨博(北京)智能科技有限公司 For programming the method and apparatus of robot

Also Published As

Publication number Publication date
CN109822569A (en) 2019-05-31

Similar Documents

Publication Publication Date Title
JP5784670B2 (en) Method, apparatus, and system for automated motion for medical robots
US9878446B2 (en) Determination of object-related gripping regions using a robot
US10166673B2 (en) Portable apparatus for controlling robot and method thereof
EP3272473B1 (en) Teaching device and method for generating control information
CN109822568B (en) Robot control method, system and storage medium
CN109689310A (en) To the method for industrial robot programming
CN109648568B (en) Robot control method, system and storage medium
CN109822569B (en) Robot control method, system and storage medium
WO2019064916A1 (en) Robot simulator
JP7151713B2 (en) robot simulator
CN109605378B (en) Method, device and system for processing motion parameters and storage medium
WO2020066949A1 (en) Robot path determination device, robot path determination method, and program
JP2022500260A (en) Controls for robotic devices, robotic devices, methods, computer programs and machine-readable storage media
WO2024092922A1 (en) Robot teaching method and apparatus, electronic device, and storage medium
CN110666804B (en) Motion planning method and system for cooperation of double robots
CN109822566B (en) Robot control method, system and storage medium
CN113733107B (en) Robot drag teaching method, robot and computer storage medium
CN109551486B (en) Method, device and system for processing motion parameters and storage medium
JPH06134684A (en) Teaching method of robot track
CN109822565A (en) Robot control method, system and storage medium
CN109822567B (en) Method, device and system for processing motion parameters and storage medium
JPWO2019064915A1 (en) Robot teaching device
Sun et al. Direct virtual-hand interface in robot assembly programming
JP7099470B2 (en) Robot teaching device
JP2020175473A (en) Operation planning device and operation planning method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20191216

Address after: No.1705, building 8, Qianhai preeminent Financial Center (phase I), unit 2, guiwan District, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Mga Technology (Shenzhen) Co., Ltd

Address before: 102208 1, unit 1, 1 hospital, lung Yuan middle street, Changping District, Beijing 1109

Applicant before: Beijing magnesium Robot Technology Co., Ltd.

CB02 Change of applicant information
CB02 Change of applicant information

Address after: 518052 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen City, Guangdong Province

Applicant after: Shenzhen mga Technology Co.,Ltd.

Address before: 1705, building 8, Qianhai excellence Financial Center (phase I), unit 2, guiwan area, Nanshan street, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong 518000

Applicant before: Mga Technology (Shenzhen) Co.,Ltd.

GR01 Patent grant
GR01 Patent grant