CN116141331A - Robot end effector working space boundary generation method based on linear programming - Google Patents

Robot end effector working space boundary generation method based on linear programming

Info

Publication number
CN116141331A
Authority
CN
China
Prior art keywords
robot
joint
constraint
end effector
linear
Prior art date
Legal status
Pending
Application number
CN202310209843.1A
Other languages
Chinese (zh)
Inventor
邱蜀伟
黄坤
黄金
诸明翰
Current Assignee
Hunan Shibite Robot Co Ltd
Original Assignee
Hunan Shibite Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Hunan Shibite Robot Co Ltd filed Critical Hunan Shibite Robot Co Ltd
Publication of CN116141331A publication Critical patent/CN116141331A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/163Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1612Programme controls characterised by the hand, wrist, grip control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot end effector working space boundary generation method based on linear programming, which comprises the following steps: S1, parameterizing the joint motion according to the joint characteristics of the robot and constructing a kinematic model of the robot motion; S2, respectively constructing the linear equations of all constraints required for generating the working space boundary of the robot end effector; S3, determining the linear constraint equation set that must be satisfied to generate the working space boundary of the robot end effector; S4, obtaining all robot joint motion parameters satisfying the linear equations in the constraint equation set; S5, finding, among all robot joint motion parameters satisfying the linear equations of the constraint equation set, the boundary joint motion parameters that simultaneously satisfy part of the constraints; S6, identifying and determining the working space boundary points of the robot end effector. When laying out robots and industrial production lines, the layout is reduced to overlapping the production line with the robot working space, which greatly simplifies the robot layout work.

Description

Robot end effector working space boundary generation method based on linear programming
Technical Field
The invention relates to the technical field of automatic control, in particular to a working space boundary generation method of a robot.
Background
With the continuous development of automatic control technology, robots are widely applied in various industries. In the different scenes and tasks in which robots are applied, different requirements are placed on the positions and postures that the robot end effector must reach. When laying out robots and industrial production lines, the traditional method is to estimate, from the design drawings of the robot, the range of motion of the robot end effector that satisfies the task requirements. With such an estimate, however, the robot end effector may not fully satisfy the task requirements, and because it cannot be determined in advance whether a given robot can meet the requirements of a given task, after the robots are deployed a process technician must perform a full-process test by manually teaching points.
In the prior art, the working space of the robot end effector in Cartesian space is calculated by sampling in the joint space of the robot and then applying forward kinematics. This approach is only practical for robots with few joints; for multi-joint robots it requires a very large amount of computation, much of it redundant.
Therefore, there is a need in the art for a method of obtaining the end effector working space from the joint characteristics of a robot, so that when laying out the robot and the industrial production line it can be determined whether the robot can reach a given work site and perform a given task.
Disclosure of Invention
In order to solve at least one of the above technical problems, the present invention provides a method for generating a working space boundary of a robot end effector based on linear programming.
The aim of the invention is achieved by the following technical scheme:
the invention provides a robot end effector working space boundary generation method based on linear programming, which comprises the following steps:
s1, according to joint characteristics of a robot, representing joint movement by parameterization, and constructing a robot kinematics model;
s2, respectively constructing a linear equation of robot assembly constraint, matrix rank lack constraint, constraint on the pose of the robot end effector and constraint generated by introducing intermediate variables required by generating the working space boundary of the robot end effector;
s3, combining joint motion parameters with constructed linear equations of robot assembly constraint, matrix rank lack constraint, constraint on the pose of the end effector of the robot and constraint generated by introducing intermediate variables, and determining a linear constraint equation set which needs to be met for generating the working space boundary of the end effector of the robot;
s4, obtaining all robot joint motion parameters meeting linear equations in the constraint equation set according to the linear constraint equation set;
s5, boundary joint motion parameters which simultaneously meet the robot assembly constraint, the matrix rank lack constraint and the constraint on the pose of the robot end effector are found out from all the robot joint motion parameters which meet the linear equation of the linear constraint equation set;
s6, identifying and determining working space boundary points of the robot end effector from boundary joint motion parameters in combination with robot assembly constraints.
As a further improvement, in the step S1, the joint motion is parameterized according to the joint characteristics of the robot to construct the kinematic model, specifically comprising the following steps:
s11, taking a joint point on each joint of the robot, and abstracting all joints of the robot into an abstract model formed by points and line segments;
s12, representing the motion generated by the current joint by parameterization through the rotation gesture or position change of the next joint point connected with the current joint;
s13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization, so as to construct a kinematic model of the motion of the robot.
As a further improvement, in the step S12, the motion generated by the current joint is represented with parameters via the rotation posture or position change of the next joint point connected to the current joint, specifically comprising the following steps:
S121, when the current joint is a rotary joint, the current joint motion parameter is represented by the rotation posture change of the next joint point, using a quaternion parameter;
and S122, when the current joint is a telescopic joint, the current joint motion parameter is represented by the position change of the next joint point by adopting a three-dimensional vector parameter.
As a further improvement, in the step S2, the robot assembly constraints for generating the working space boundary are specifically: each joint of the robot is examined; when the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, when the rotation posture of the next joint point changes, the direction of the rotation axis in that rotation posture always remains consistent with the rotation axis of the current joint; when the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, when the position of the next joint point changes, the two vectors not aligned with the positive telescopic direction of the current joint always remain unchanged.
As a further improvement, in the step S2, the constructed linear equation of the matrix rank deficiency constraint is:

$$\left(\frac{\partial F_{eq}(\bar g,\hat g)}{\partial \hat g}\right)^{\!\top}\lambda = 0$$

wherein $\left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}$ represents the transposed matrix of the partial derivatives of $F_{eq}$ relative to $\hat g$, $F_{eq}$ represents the linear equation of the robot assembly constraints, $\bar g$ represents the joint motion parameters of the robot end effector, $\hat g$ represents the joint motion parameters other than those of the robot end effector, and $\lambda$ represents a random vector.
As a further improvement, in the step S3, the linear constraint equation set that must be satisfied to generate the working space boundary of the robot end effector is:

$$\begin{cases} F_{eq}(\bar g,\hat g)=0 \\ \left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}\lambda = 0 \\ F_{pose}(\bar g)=0 \\ F_{mid}(\bar g,\hat g,s,b)\le 0 \end{cases}$$

wherein $F_{pose}$ represents the constraint linear equation on the pose of the robot end effector, $F_{mid}$ represents the constraint linear equations generated by introducing the intermediate variables, $s$ represents the intermediate variables introduced by the quadratic variables composed of joint motion parameters, and $b$ represents the intermediate variables introduced by the bilinear variables composed of joint motion parameters.
As a further improvement, in the step S4, all robot joint motion parameters satisfying the linear equation in the constraint equation set are obtained according to the linear constraint equation set, and the method includes the following steps:
s41, finding out the maximum value and the minimum value which can be taken by the joint motion parameters on the premise of meeting the linear constraint equation set;
s42, setting a threshold value, judging whether the maximum value in the maximum value ranges of all the articulation parameters is higher than the set threshold value, if so, dividing the value range of the articulation parameters into two parts from the middle point, respectively storing the obtained value ranges of the two groups of parameters into a queue of the articulation parameters, and keeping the value ranges of other articulation parameters unchanged;
s43, repeating the steps S41 and S42 for the value ranges of the next group of parameters in the articulation parameter queue until the value ranges of all the articulation parameters are lower than the set threshold.
As a further improvement, in the step S5, the boundary joint motion parameters that simultaneously satisfy the robot assembly constraints, the matrix rank deficiency constraint and the constraints on the pose of the robot end effector are found by Newton's iteration method from all the robot joint motion parameters satisfying the linear equations of the linear constraint equation set.
As a further improvement, in the step S6, the working space boundary point of the robot end effector is identified and determined from the boundary joint motion parameters in combination with the robot assembly constraint, comprising the steps of:
s61, searching each group of boundary articulation parameters, and respectively finding out a Gao Weiji graph formed by a robot assembly constraint linear equation relative to an articulation normal line of the robot end effector articulation parameters at the current boundary articulation parameter position, wherein the specific formula is as follows:
Figure BDA0004112240520000041
wherein ,
Figure BDA0004112240520000042
gao Weiji, representing what is constituted by a robot assembly constraint linear equation, relative to the articulation normal of the robot end effector articulation parameter at the current boundary articulation parameter position, < >>
Figure BDA0004112240520000043
Is that
Figure BDA0004112240520000044
Relative to->
Figure BDA0004112240520000045
Partial derivative of>
Figure BDA0004112240520000046
Representing a random vector representing current boundary joint motion parameters;
s62, setting a judging function in combination with the joint motion normal, and judging the joint motion parameters of each robot end effector through the judging function to identify and determine the working space boundary points of the end effectors.
The invention provides a robot end effector working space boundary generation method based on linear programming, comprising the following steps: S1, parameterizing the joint motion according to the joint characteristics of the robot and constructing a kinematic model of the robot motion; S2, respectively constructing the linear equations, required for generating the working space boundary of the robot end effector, of the robot assembly constraints, the matrix rank deficiency constraint, the constraints on the pose of the robot end effector, and the constraints generated by introducing intermediate variables; S3, combining the joint motion parameters with the constructed linear equations of the robot assembly constraints, the matrix rank deficiency constraint, the constraints on the pose of the robot end effector and the constraints generated by introducing intermediate variables, and determining the linear constraint equation set that must be satisfied to generate the working space boundary of the robot end effector; S4, obtaining, according to the linear constraint equation set, all robot joint motion parameters satisfying the linear equations in the constraint equation set; S5, finding, among all robot joint motion parameters satisfying the linear equations of the linear constraint equation set, the boundary joint motion parameters that simultaneously satisfy the robot assembly constraints, the matrix rank deficiency constraint and the constraints on the pose of the robot end effector; S6, identifying and determining the working space boundary points of the robot end effector from the boundary joint motion parameters in combination with the robot assembly constraints. In application, all constraint conditions on the robot end effector are introduced based on the structural dimensions of the robot, the joint limits and the task targets, and the boundary points of the working space reachable by the robot end effector while satisfying the task requirements are connected to obtain the working space boundary of the robot end effector. When the robot and the industrial production line are laid out, the layout work is reduced to overlapping the industrial production line with the robot working space, which greatly simplifies the layout of an automation scene and greatly improves the accuracy of the layout.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of an abstract model according to one embodiment of the invention;
fig. 3 is a schematic view of the working space boundaries of the robotic end effector of the present invention.
Detailed Description
Referring to fig. 1, an embodiment of the present invention provides a method for generating a working space boundary of a robot end effector based on linear programming, which specifically includes the following steps:
s1, according to joint characteristics of the robot, joint movement is represented by parameterization, and a robot kinematics model is constructed. As an example, the robot of the present embodiment is optionally but not limited to an articulated robot as an example for explanation, but not limited to this. Specifically, the robot may optionally but not exclusively include a plurality of joint axes, and the specific positions, numbers, types, etc. of the joint axes may be arbitrarily set by those skilled in the art according to the application range, field, working performance, etc. of the robot. Specifically, the joint shaft may optionally, but not exclusively, include a rotation shaft or/and a telescopic shaft according to the functions that the robot can perform, and the two joint shafts are connected through a joint arm (i.e., a joint link), and the joint characteristics of the robot include rotation or telescopic. More specifically, the end effector is an actuating member mounted to the end of a joint, and is optionally, but not limited to, a multi-fingered gripper, a paint gun, a welding tool, or the like. More specifically, each joint of the robot is optionally, but not limited to, mathematically represented by a parameter to achieve mathematical modeling; more specifically, the mathematical modeling method using linear programming is optional but not limited to, because linear programming (Linear programming, LP for short) is a basic mathematical theory and method for researching extremum problem of linear objective function under linear constraint condition, but not limited to. More specifically, step S1, optionally but not limited to, includes the following:
s11, as shown in FIG. 2, a joint point is taken on each joint of the robot, the joint point taken in the embodiment is a joint axis of the robot, all joints of the robot are abstracted into an abstract model formed by points and line segments, each joint comprises a joint point and a joint arm, the joint point, namely the joint axis, is represented by a point in the abstract model, and the joint arm is represented by a line in the abstract model;
s12, representing the motion generated by the rotation gesture or position change of the next joint point connected with the current joint on the current joint by parameters; specifically, the method comprises the following steps:
s121, when the front joint is a rotary joint, the current joint motion parameter is represented by the rotation posture change of the next joint point by using the quaternion parameter because the position of the next joint point connected with the current joint is determined by the rotation posture, as in FIG. 2, the joint 1 joint motion parameter is represented by the joint point P 1 The change in the rotational attitude of the joint 2 is represented by a quaternion parameter, the articulation parameter of the joint is represented by the articulation point P 2 The change in rotational attitude of (c) is represented by a quaternion parameter, and so on for other joints.
S122, when the current joint is a telescopic joint, the rotation posture of the next joint point connected to the current joint is not changed, so that the current joint motion parameter is represented by the position change of the next joint point by using a three-dimensional vector parameter.
S13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization, and constructing a kinematic model of the robot.
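As an illustration of steps S11-S13, the following minimal Python sketch (not part of the patent; the names `Joint`, `rotation_matrix_from_quaternion` and `next_joint_point` are hypothetical) shows one way a rotary joint can be parameterized by the quaternion of its next joint point and a telescopic joint by a three-dimensional position change.

```python
import numpy as np
from dataclasses import dataclass

def rotation_matrix_from_quaternion(q):
    """Rotation matrix of a unit quaternion q = (q0, q1, q2, q3)."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ])

@dataclass
class Joint:
    """One joint of the abstract point/line-segment model (step S11)."""
    kind: str            # "rotary" or "telescopic"
    point: np.ndarray    # joint point P_i of the current joint
    arm: np.ndarray      # vector along the joint arm to the next joint point

def next_joint_point(joint, params):
    """Position of the next joint point under the parameterization of S121/S122."""
    if joint.kind == "rotary":
        # rotary joint: quaternion parameter = rotation posture of the next joint point
        R = rotation_matrix_from_quaternion(np.asarray(params, dtype=float))
        return joint.point + R @ joint.arm
    # telescopic joint: three-dimensional parameter = position change of the next joint point
    return joint.point + joint.arm + np.asarray(params, dtype=float)

# example: a rotary joint at the origin, arm of length 1 along X, identity posture
j1 = Joint("rotary", np.zeros(3), np.array([1.0, 0.0, 0.0]))
print(next_joint_point(j1, [1.0, 0.0, 0.0, 0.0]))   # -> [1. 0. 0.]
```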
S2, respectively constructing a linear equation of robot assembly constraint, matrix rank lack constraint, constraint on the pose of the robot end effector and constraint generated by introducing intermediate variables required by generating the working space boundary of the robot end effector.
Constructing the robot assembly constraints that generate the working space boundary: since the joints of the robot connect different components, and each joint only allows a limited degree of freedom of motion between adjacent components, the parameters representing the joint motion must satisfy the assembly constraints introduced by the joints, which limit how the parameters may vary. Specifically, each joint of the robot is examined. When the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, when the rotation posture of the next joint point changes, the direction of the rotation axis in that rotation posture always remains consistent with the rotation axis of the current joint, the rotation posture being defined in the coordinate system of the next joint point. As in FIG. 2, let the rotation postures of the joint points P1 and P2 be the quaternions $(q_{10},q_{11},q_{12},q_{13})$ and $(q_{20},q_{21},q_{22},q_{23})$; their corresponding rotation matrices are as follows:

$$R_{1}=\begin{bmatrix} q_{10}^{2}+q_{11}^{2}-q_{12}^{2}-q_{13}^{2} & 2(q_{11}q_{12}-q_{10}q_{13}) & 2(q_{11}q_{13}+q_{10}q_{12})\\ 2(q_{11}q_{12}+q_{10}q_{13}) & q_{10}^{2}-q_{11}^{2}+q_{12}^{2}-q_{13}^{2} & 2(q_{12}q_{13}-q_{10}q_{11})\\ 2(q_{11}q_{13}-q_{10}q_{12}) & 2(q_{12}q_{13}+q_{10}q_{11}) & q_{10}^{2}-q_{11}^{2}-q_{12}^{2}+q_{13}^{2} \end{bmatrix}$$

$$R_{2}=\begin{bmatrix} q_{20}^{2}+q_{21}^{2}-q_{22}^{2}-q_{23}^{2} & 2(q_{21}q_{22}-q_{20}q_{23}) & 2(q_{21}q_{23}+q_{20}q_{22})\\ 2(q_{21}q_{22}+q_{20}q_{23}) & q_{20}^{2}-q_{21}^{2}+q_{22}^{2}-q_{23}^{2} & 2(q_{22}q_{23}-q_{20}q_{21})\\ 2(q_{21}q_{23}-q_{20}q_{22}) & 2(q_{22}q_{23}+q_{20}q_{21}) & q_{20}^{2}-q_{21}^{2}-q_{22}^{2}+q_{23}^{2} \end{bmatrix}$$

Since joint 2 can only rotate about the Y axis of P1, the constraint introduced by joint 2 is that the Y-axis direction in the rotation posture of P2 always remains consistent with the Y axis of P1, i.e.

$$2(q_{11}q_{12}-q_{10}q_{13})=2(q_{21}q_{22}-q_{20}q_{23})$$
$$q_{10}^{2}-q_{11}^{2}+q_{12}^{2}-q_{13}^{2}=q_{20}^{2}-q_{21}^{2}+q_{22}^{2}-q_{23}^{2}$$
$$2(q_{10}q_{11}+q_{12}q_{13})=2(q_{20}q_{21}+q_{22}q_{23})$$
When the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, when the position (X, Y, Z) of the next joint point changes, the two vectors not aligned with the positive telescopic direction of the current joint always remain unchanged.
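The rotary-joint Y-axis constraint above can be checked numerically. The sketch below is an assumption of this description rather than the patent's implementation: it extracts the Y axis of each rotation posture from the quaternion and returns the residual of the three constraint equations.

```python
import numpy as np

def y_axis(q):
    """Second column (Y axis) of the rotation matrix of unit quaternion q."""
    q0, q1, q2, q3 = q
    return np.array([2*(q1*q2 - q0*q3),
                     q0*q0 - q1*q1 + q2*q2 - q3*q3,
                     2*(q0*q1 + q2*q3)])

def rotary_assembly_residual(q_prev, q_next):
    """Residual of the three constraint equations above: the Y axis of the next
    joint point's rotation posture must coincide with the Y axis of the previous one."""
    return y_axis(q_next) - y_axis(q_prev)

# identity postures on both joint points trivially satisfy the constraint
print(rotary_assembly_residual(np.array([1.0, 0.0, 0.0, 0.0]),
                               np.array([1.0, 0.0, 0.0, 0.0])))   # -> [0. 0. 0.]
```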
After deriving the assembly constraints introduced by all joints of the robot, the linear equation representing the assembly constraints of the robot is written as $F_{eq}(\bar g,\hat g)=0$, wherein $\bar g$ represents the joint motion parameters of the robot end effector (for example, the quaternion parameters of joint point P6 in FIG. 2) and $\hat g$ represents the other joint motion parameters besides those of the robot end effector (for example, the quaternion parameters of P1...P5 in FIG. 2). The joint motion parameters satisfying $F_{eq}(\bar g,\hat g)=0$ include both the joint motion parameters corresponding to working space boundary points and those corresponding to interior points of the working space; most of the joint motion parameters corresponding to interior points are removed by introducing the matrix rank deficiency constraint condition.
The constructed linear equation of the matrix rank deficiency constraint is:

$$\left(\frac{\partial F_{eq}(\bar g,\hat g)}{\partial \hat g}\right)^{\!\top}\lambda = 0$$

wherein $\left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}$ represents the transposed matrix of the partial derivatives of $F_{eq}$ relative to $\hat g$, $F_{eq}$ represents the linear equation of the robot assembly constraints, and $\lambda$ represents a random vector.
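A numerical illustration of the rank deficiency condition, assuming the Jacobian of the assembly constraints is available as a matrix: the sketch below looks for a nonzero vector lambda in the null space of the transposed Jacobian. The function name and the use of an SVD are choices of this description, not of the patent.

```python
import numpy as np

def rank_deficiency_vector(J_hat, tol=1e-9):
    """J_hat is dF_eq/dg_hat (rows: assembly constraints, columns: non-end-effector
    joint motion parameters). Return a nonzero lambda with J_hat.T @ lambda = 0
    if such a vector exists, otherwise None."""
    A = J_hat.T                                # lambda lives in the null space of A
    _, s, Vt = np.linalg.svd(A)
    s_full = np.zeros(A.shape[1])
    s_full[:s.size] = s
    for i in range(A.shape[1]):
        if s_full[i] < tol:
            return Vt[i]                       # right singular vector with zero singular value
    return None

# toy example: a 2x2 Jacobian that has lost rank admits a nonzero lambda
J = np.array([[0.0, 0.0],
              [2.0, 1.0]])
lam = rank_deficiency_vector(J)
print(lam, J.T @ lam)                          # residual is (numerically) zero
```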
The linear equation constructed for the constraints on the pose of the end effector is written as $F_{pose}(\bar g)=0$.
constructing a linear equation introducing constraints brought by intermediate variables: since linear programming requires that all constraints are linear equations, secondary variables, such as
Figure BDA00041122405200000812
And bilinear variables, e.g. g i g j . To linearize all constraint equations, we introduce intermediate variables and some constraints to specify the relationship between the newly introduced intermediate variables and the original parameters.
For the secondary variable
Figure BDA00041122405200000813
We introduce an intermediate variable s i Make->
Figure BDA00041122405200000814
Let g i The value range of (1) is i ,u i ]The following three constraint equations need to be introduced to constrain s i and gi The relationship between, i.e.)>
Figure BDA00041122405200000815
Figure BDA00041122405200000816
For the bilinear variable g i g j Introducing intermediate variable b ij Make b ij =g i g j Let g i and gj The value ranges of (1) are respectively [ l ] i ,u i] and [lj ,u j ]The following four constraint equations need to be introduced to constrain b ij G i and gj The relationship between b ij =g i g j
Figure BDA0004112240520000091
From the above description, all the linear equations introducing the constraints generated by intermediate variables are written together as $F_{mid}(\bar g,\hat g,s,b)\le 0$, wherein $s$ represents the intermediate variables introduced by the quadratic variables composed of joint motion parameters (e.g. $s_i=g_i^{2}$), and $b$ represents the intermediate variables introduced by the bilinear variables composed of joint motion parameters (e.g. $b_{ij}=g_ig_j$).
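The three and four constraint equations above can be generated mechanically. The following sketch (an assumption of this description; the row layout and function names are hypothetical) produces them as inequality rows of the linear relaxations just described.

```python
def square_relaxation(l, u):
    """Three linear constraints tying s to g**2 for g in [l, u].
    Each row (a_g, a_s, rhs) encodes a_g*g + a_s*s <= rhs."""
    return [
        (2.0 * l, -1.0, l * l),         # s >= 2*l*g - l**2
        (2.0 * u, -1.0, u * u),         # s >= 2*u*g - u**2
        (-(l + u), 1.0, -l * u),        # s <= (l + u)*g - l*u
    ]

def bilinear_relaxation(li, ui, lj, uj):
    """Four linear constraints tying b to g_i*g_j for g_i in [li, ui], g_j in [lj, uj].
    Each row (a_i, a_j, a_b, rhs) encodes a_i*g_i + a_j*g_j + a_b*b <= rhs."""
    return [
        (lj, li, -1.0, li * lj),        # b >= li*g_j + lj*g_i - li*lj
        (uj, ui, -1.0, ui * uj),        # b >= ui*g_j + uj*g_i - ui*uj
        (-lj, -ui, 1.0, -ui * lj),      # b <= ui*g_j + lj*g_i - ui*lj
        (-uj, -li, 1.0, -li * uj),      # b <= li*g_j + uj*g_i - li*uj
    ]

# sanity check for g = 0.5, s = g**2 = 0.25 on the range [-1, 1]
for a_g, a_s, rhs in square_relaxation(-1.0, 1.0):
    assert a_g * 0.5 + a_s * 0.25 <= rhs + 1e-12
```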
S3, the joint motion parameters are combined with the constructed robot assembly constraints, the matrix rank deficiency constraint, the constraints on the pose of the robot end effector and the constraints generated by introducing intermediate variables, and the linear constraint equation set that must be satisfied to generate the working space boundary of the robot end effector is determined; an initial value range is set for each parameter. For example, in this embodiment the joint motion parameters used to represent rotation postures are quaternion parameters, so the initial value range of each element can be set to [-1, 1]. The linear constraint equation set is:

$$\begin{cases} F_{eq}(\bar g,\hat g)=0 \\ \left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}\lambda = 0 \\ F_{pose}(\bar g)=0 \\ F_{mid}(\bar g,\hat g,s,b)\le 0 \end{cases}$$

wherein $F_{pose}$ represents the constraint linear equation on the pose of the robot end effector and $F_{mid}$ represents the constraint linear equations generated by introducing the intermediate variables.
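One way to hold this equation set for the linear programming steps that follow is a simple container of equality rows, inequality rows and per-parameter bounds. The sketch below is illustrative only; `LinearConstraintSet` is a hypothetical name, and the same arrays can be handed to the pruning step of S41 sketched later.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class LinearConstraintSet:
    """Equalities A_eq x = b_eq (assembly, rank deficiency, end effector pose)
    and inequalities A_ub x <= b_ub (intermediate-variable constraints) over the
    stacked variable vector x = (g_bar, g_hat, lambda, s, b)."""
    n_vars: int
    A_eq: list = field(default_factory=list)
    b_eq: list = field(default_factory=list)
    A_ub: list = field(default_factory=list)
    b_ub: list = field(default_factory=list)

    def add_equality(self, row, rhs):
        self.A_eq.append(np.asarray(row, dtype=float))
        self.b_eq.append(float(rhs))

    def add_inequality(self, row, rhs):
        self.A_ub.append(np.asarray(row, dtype=float))
        self.b_ub.append(float(rhs))

# quaternion elements start with the initial value range [-1, 1] of step S3
n_quaternion_elements = 4 * 6                       # e.g. six rotary joints
bounds = [(-1.0, 1.0)] * n_quaternion_elements
constraints = LinearConstraintSet(n_vars=n_quaternion_elements)
```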
S4, all robot joint motion parameters satisfying the linear equations in the constraint equation set are obtained according to the linear constraint equation set, comprising the following steps:
s41, adopting a Pruning method (Pruning) to narrow the value range of each joint motion parameter through a linear programming method so as to achieve the purpose of Pruning, finding out the maximum value and the minimum value which can be obtained by the joint motion parameter on the premise of meeting a linear constraint equation set, and specifically: for each joint motion parameter, two linear programming problems need to be defined to find out the minimum value and the maximum value which can be obtained by the current parameter on the premise of meeting all constraint conditions in the constraint equation set. For example, when it is desired to trim a certain articulation parameter g i When the value range of (a) is taken, we need to solve the following two linear programming problems:
Figure BDA0004112240520000101
and
Figure BDA0004112240520000102
G is updated by solving the above maximization and minimization problems, respectively i The upper limit and the lower limit of the value range.
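A minimal sketch of step S41, assuming the constraint equation set has been stacked into arrays `A_eq`, `b_eq`, `A_ub`, `b_ub` and per-parameter `bounds`; it uses `scipy.optimize.linprog` to solve the two linear programs for one parameter. This is an illustration, not the patent's solver; passing `None` for an empty constraint family is accepted by `linprog`.

```python
import numpy as np
from scipy.optimize import linprog

def prune_parameter(i, A_eq, b_eq, A_ub, b_ub, bounds):
    """Tighten the value range of joint motion parameter g_i by solving
    min g_i and max g_i subject to the linear constraint equation set."""
    c = np.zeros(len(bounds))
    c[i] = 1.0
    lo = linprog(c,  A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    if not (lo.success and hi.success):
        return None                                  # infeasible: this box holds no solution
    new_bounds = list(bounds)
    new_bounds[i] = (max(bounds[i][0], lo.fun),      # updated lower limit of g_i
                     min(bounds[i][1], -hi.fun))     # updated upper limit of g_i
    return new_bounds
```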
S42, a threshold is set, and it is judged whether the largest value-range width among all the joint motion parameters is higher than the set threshold. Specifically, after all the joint motion parameters have been pruned, if the largest value-range width among all the joint motion parameters is higher than the set threshold, a branching method is adopted: the value range of that joint motion parameter is divided into two parts at its midpoint, the two resulting groups of parameter value ranges are stored in a queue of joint motion parameters, and the value ranges of the other joint motion parameters are kept unchanged;
s43, repeating the steps S41 and S42 for the value ranges of the next group of parameters in the articulation parameter queue until the value ranges of all the articulation parameters are lower than the set threshold.
S5, after the pruning and branching operations, the value ranges of all the joint motion parameters (namely $\bar g$ and $\hat g$) satisfying the linear constraint equation set have been found. Newton's iteration method is then used to find, within these ranges, the boundary joint motion parameters that simultaneously satisfy the robot assembly constraints, the matrix rank deficiency constraint and the constraints on the pose of the robot end effector (i.e. $F_{pose}(\bar g)=0$). Newton's iteration method is a method for approximately solving equations in the real and complex number domains. In this step, because the value range of each joint motion parameter has been reduced below the set threshold in step S4, the solution process using Newton's iteration method quickly obtains a solution meeting all constraint conditions.
S6, the working space boundary points of the robot end effector are identified and determined from the boundary joint motion parameters in combination with the robot assembly constraints. Because the matrix rank deficiency constraint condition is only a necessary condition for a group of joint motion parameter values to correspond to a working space boundary point, each group of joint motion parameter values needs to be examined more precisely in this step to determine whether it is a working space boundary point of the robot end effector. The method specifically comprises the following steps:
s61, searching each group of boundary joint motion parameters, wherein the embodiment uses one group of edgesInterface joint motion parameters
Figure BDA0004112240520000114
For example, find the constraint linear equation by robot assembly, respectively +.>
Figure BDA0004112240520000115
Gao Weiji which graph is composed at the current boundary articulation parameter +.>
Figure BDA0004112240520000116
Articulation parameters of the end effector of the robot in position +.>
Figure BDA0004112240520000117
Is the joint motion normal of (2)
Figure BDA0004112240520000118
The specific formula is as follows: />
Figure BDA0004112240520000119
wherein ,
Figure BDA00041122405200001110
is->
Figure BDA00041122405200001111
Relative to->
Figure BDA00041122405200001112
Partial derivative of>
Figure BDA00041122405200001113
A random vector representing the current boundary joint motion parameters, represented by the formula +.>
Figure BDA0004112240520000121
Obtaining;
s62, setting a judging function in combination with the normal of the joint motion, in the embodimentIs provided with
Figure BDA0004112240520000122
Is any group of joint movement parameters satisfying the robot assembly constraint linear equation, and +.>
Figure BDA0004112240520000123
Is adjacent to the joint point of the model through a judging function
Figure BDA0004112240520000124
Judging the articulation parameters of each robot end effector to identify and determine the working space boundary points of the end effector, wherein the embodiment is characterized by judging the function ∈ ->
Figure BDA0004112240520000125
Determine->
Figure BDA0004112240520000126
Whether it is a working space boundary point of a robot end effector, if +.>
Figure BDA0004112240520000127
Is positive or negative, then +.>
Figure BDA0004112240520000128
Is a working space boundary point of the robot end effector; if->
Figure BDA0004112240520000129
The sign of (2) cannot be determined, then +.>
Figure BDA00041122405200001210
Not a working space boundary point of a robotic end effector.
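Steps S61-S62 can be summarized in a short routine, under the assumptions made above about the joint motion normal and the judging function (both reconstructed here, not quoted from the patent): the candidate is kept only when every admissible neighbouring displacement of the end effector parameters lies on one side of the normal.

```python
import numpy as np

def is_workspace_boundary_point(J_ee, lam, neighbour_deltas, eps=1e-9):
    """J_ee is dF_eq/dg_bar at the candidate parameters, lam the random vector
    from the rank deficiency constraint.  The joint motion normal is
    n = J_ee.T @ lam; the candidate is kept as a boundary point only if the
    judging function keeps one sign over all admissible neighbouring displacements."""
    n = J_ee.T @ lam
    signs = [np.sign(float(n @ d)) for d in neighbour_deltas if abs(float(n @ d)) > eps]
    if not signs:
        return False
    return all(s > 0 for s in signs) or all(s < 0 for s in signs)
```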
In application, all constraint conditions on the robot end effector are introduced based on the structural dimensions of the robot, the joint limits and the task targets, and the boundary points of the working space reachable by the robot end effector while satisfying the task requirements are connected to obtain the working space boundary of the robot end effector. When the robot and the industrial production line are laid out, the layout work is reduced to overlapping the industrial production line with the robot working space, which greatly simplifies the layout of an automation scene and greatly improves the accuracy of the layout.
The invention is also applicable to any type of robot, including but not limited to industrial robots, parallel robots, redundant robots, dual-arm robots or multi-fingered robots.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above-described embodiments are described; however, as long as there is no contradiction between the combined technical features, the combinations should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the invention; although they are described in detail, they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.

Claims (9)

1. A robot end effector workspace boundary generation method based on linear programming, comprising the steps of:
s1, according to joint characteristics of a robot, representing joint movement by parameterization, and constructing a robot dynamics model;
s2, respectively constructing a linear equation of robot assembly constraint, matrix rank lack constraint, constraint on the pose of the robot end effector and constraint generated by introducing intermediate variables required by generating the working space boundary of the robot end effector;
s3, combining joint motion parameters with constructed linear equations of robot assembly constraint, matrix rank lack constraint, constraint on the pose of the end effector of the robot and constraint generated by introducing intermediate variables, and determining a linear constraint equation set which needs to be met for generating the working space boundary of the end effector of the robot;
s4, obtaining robot joint motion parameters meeting linear equations in the constraint equation set according to the linear constraint equation set;
s5, boundary joint motion parameters which simultaneously meet the robot assembly constraint, the matrix rank lack constraint and the constraint on the pose of the robot end effector are found out from the robot joint motion parameters which meet the linear equation of the linear constraint equation set;
s6, identifying and determining working space boundary points of the robot end effector from boundary joint motion parameters in combination with robot assembly constraints.
2. The method for generating a working space boundary of a robot end effector based on linear programming according to claim 1, wherein in the step S1, the joint motion is modeled with a parameterized representation to construct a kinematic model of the robot motion according to the joint characteristics of the robot, specifically comprising the steps of:
s11, taking a joint point on each joint of the robot, and abstracting all joints of the robot into an abstract model formed by points and line segments;
s12, representing the motion generated by the current joint by parameterization through the rotation gesture or position change of the next joint point connected with the current joint;
s13, repeating the step S12 for each joint of the robot until the motion of each joint is represented by parameterization, and constructing a kinematic model of the motion of the robot.
3. The method for generating a working space boundary of a robot end effector based on linear programming according to claim 2, wherein in the step S12, the motion generated by the current joint is represented with parameters via the rotation posture or position change of the next joint point connected to the current joint, specifically comprising the steps of:
s121, when the current joint is a rotary joint, the current joint motion parameter is represented by the rotation gesture change of the next joint point by adopting a quaternion parameter;
and S122, when the current joint is a telescopic joint, the current joint motion parameter is represented by the position change of the next joint point by adopting a three-dimensional vector parameter.
4. The method for generating a working space boundary of a robot end effector based on linear programming according to claim 3, wherein in the step S2, the robot assembly constraints for generating the working space boundary are specifically: each joint of the robot is examined; when the current joint is a rotary joint, the robot assembly constraint condition to be met by the current joint is that, when the rotation posture of the next joint point changes, the direction of the rotation axis in that rotation posture always remains consistent with the rotation axis of the current joint; when the current joint is a telescopic joint, the robot assembly constraint condition to be met by the current joint is that, when the position of the next joint point changes, the two vectors not aligned with the positive telescopic direction of the current joint always remain unchanged.
5. The method for generating a working space boundary of a robot end effector based on linear programming according to claim 1, wherein in the step S2, the constructed linear equation of the matrix rank deficiency constraint is:

$$\left(\frac{\partial F_{eq}(\bar g,\hat g)}{\partial \hat g}\right)^{\!\top}\lambda = 0$$

wherein $\left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}$ represents the transposed matrix of the partial derivatives of $F_{eq}$ relative to $\hat g$, $F_{eq}$ represents the linear equation of the robot assembly constraints, $\bar g$ represents the joint motion parameters of the robot end effector, $\hat g$ represents the joint motion parameters other than those of the robot end effector, and $\lambda$ represents a random vector.
6. The method for generating a working space boundary of a robot end effector based on linear programming according to claim 1, wherein in the step S3, the linear constraint equation set that must be satisfied to generate the working space boundary of the robot end effector is:

$$\begin{cases} F_{eq}(\bar g,\hat g)=0 \\ \left(\frac{\partial F_{eq}}{\partial \hat g}\right)^{\!\top}\lambda = 0 \\ F_{pose}(\bar g)=0 \\ F_{mid}(\bar g,\hat g,s,b)\le 0 \end{cases}$$

wherein $F_{pose}$ represents the constraint linear equation on the pose of the robot end effector, $F_{mid}$ represents the constraint linear equations generated by introducing the intermediate variables, $s$ represents the intermediate variables introduced by the quadratic variables composed of joint motion parameters, and $b$ represents the intermediate variables introduced by the bilinear variables composed of joint motion parameters.
7. The method for generating a boundary of a working space of a robot end effector based on linear programming according to claim 1, wherein in the step S4, all robot articulation parameters satisfying linear equations in the constraint equation set are obtained according to the linear constraint equation set, comprising the steps of:
s41, finding out the maximum value and the minimum value which can be taken by the joint motion parameters on the premise of meeting the linear constraint equation set;
s42, setting a threshold value, judging whether the maximum value in the maximum value ranges of all the articulation parameters is higher than the set threshold value, if so, dividing the value range of the articulation parameters into two parts from the middle point, respectively storing the obtained value ranges of the two groups of parameters into a queue of the articulation parameters, and keeping the value ranges of other articulation parameters unchanged;
s43, repeating the steps S41 and S42 for the value ranges of the next group of parameters in the articulation parameter queue until the value ranges of all the articulation parameters are lower than the set threshold.
8. The method according to claim 7, wherein in the step S5, the boundary joint motion parameters that simultaneously satisfy the robot assembly constraints, the matrix rank deficiency constraint and the constraints on the pose of the robot end effector are found by Newton's iteration method from all the robot joint motion parameters satisfying the linear equations of the linear constraint equation set.
9. The method for generating a working space boundary of a robot end effector based on the linear programming according to claim 7, wherein the step S6 of identifying and determining the working space boundary point of the robot end effector from the boundary joint motion parameters in combination with the robot assembly constraint comprises the steps of:
s61, searching each group of boundary articulation parameters, and respectively finding out a Gao Weiji graph formed by a robot assembly constraint linear equation relative to an articulation normal line of the robot end effector articulation parameters at the current boundary articulation parameter position, wherein the specific formula is as follows:
Figure FDA0004112240510000041
wherein ,
Figure FDA0004112240510000042
gao Weiji, representing what is constituted by a robot assembly constraint linear equation, relative to the articulation normal of the robot end effector articulation parameter at the current boundary articulation parameter position, < >>
Figure FDA0004112240510000043
Is that
Figure FDA0004112240510000044
Relative to->
Figure FDA0004112240510000045
Partial derivative of>
Figure FDA0004112240510000047
Representing a random vector representing current boundary joint motion parameters;
s62, setting a judging function in combination with the joint motion normal, and judging the joint motion parameters of each robot end effector through the judging function to identify and determine the working space boundary points of the end effectors.
CN202310209843.1A 2022-10-10 2023-03-07 Robot end effector working space boundary generation method based on linear programming Pending CN116141331A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2022112362065 2022-10-10
CN202211236206.5A CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming

Publications (1)

Publication Number Publication Date
CN116141331A true CN116141331A (en) 2023-05-23

Family

ID=84309033

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211236206.5A Withdrawn CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming
CN202310209843.1A Pending CN116141331A (en) 2022-10-10 2023-03-07 Robot end effector working space boundary generation method based on linear programming

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202211236206.5A Withdrawn CN115454097A (en) 2022-10-10 2022-10-10 Robot end effector working space boundary generation method based on linear programming

Country Status (1)

Country Link
CN (2) CN115454097A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117207200A (en) * 2023-11-09 2023-12-12 湖南视比特机器人有限公司 Method and device for generating working space of mechanical arm and computer equipment


Also Published As

Publication number Publication date
CN115454097A (en) 2022-12-09

Similar Documents

Publication Publication Date Title
CN108908327B (en) Robot positioning error grading compensation method
Chirikjian et al. Pose changes from a different point of view
JP2008546099A (en) Kinematic singularity compensation system and method
CN105382835A (en) Robot path planning method for passing through wrist singular point
JP3349652B2 (en) Offline teaching method
CN111775145B (en) Control system of serial-parallel robot
CN116141331A (en) Robot end effector working space boundary generation method based on linear programming
Celikag et al. Cartesian stiffness optimization for serial arm robots
CN113715016A (en) Robot grabbing method, system and device based on 3D vision and medium
CN116038702B (en) Seven-axis robot inverse solution method and seven-axis robot
Khatamian Solving kinematics problems of a 6-dof robot manipulator
CN113910218A (en) Robot calibration method and device based on kinematics and deep neural network fusion
Wiese et al. Kinematic modeling of a soft pneumatic actuator using cubic hermite splines
CN109366486B (en) Flexible robot inverse kinematics solving method, system, equipment and storage medium
Kamali et al. A novel method for direct kinematics solution of fully parallel manipulators using basic regions theory
JP3840973B2 (en) Robot teaching data correction method
CN112847441B (en) Six-axis robot coordinate offset detection method and device based on gradient descent method
Song et al. Efficient formulation approach for the forward kinematics of the 3-6 Stewart-Gough platform
CN115933374A (en) Industrial robot load parameter static identification and pose identification optimization method
Vijayan et al. Integrating visual guidance and feedback for an industrial robot
Ye et al. Stiffness optimized multi-robot behavior planning using reduced hessian method
ElMaraghy Kinematic and geometric modelling and animation of robots
Benotsmane et al. Calculation methodology for trajectory planning of a 6-axis manipulator arm
Fraczek et al. Calibration of multi-robot system without and under load using electronic theodolites
Lai A fast task planning system for 6R articulated robots based on inverse kinematics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination