CN113752250A - Method and device for controlling robot joint, robot and storage medium - Google Patents

Method and device for controlling robot joint, robot and storage medium Download PDF

Info

Publication number
CN113752250A
CN113752250A (application CN202110604994.8A)
Authority
CN
China
Prior art keywords
joint
parameter
model
robot
expected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110604994.8A
Other languages
Chinese (zh)
Inventor
徐佳锋
郑宇�
王帅
来杰
陈科
姜鑫洋
王海涛
张竞帆
张东胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110604994.8A priority Critical patent/CN113752250A/en
Publication of CN113752250A publication Critical patent/CN113752250A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a method and device for controlling a robot joint, a robot, and a storage medium, and relates to the field of robot control. The method for controlling the robot joint comprises the following steps: acquiring an expected control torque of a robot joint; determining a motor current corresponding to the expected control torque according to an expected mapping model, wherein the expected mapping model is generated according to a nonlinear relation between the expected control torque and the motor current, and includes compensation of the joint friction of the robot joint according to joint motion data of the robot joint; and driving a motor of the robot joint according to the motor current. The parameters relating the expected control torque to the motor current in the expected mapping model are corrected through the compensation of the joint friction, so that the motor current obtained from the model is more accurate and closer to the actually required value, and the control accuracy of the robot joint is improved.

Description

Method and device for controlling robot joint, robot and storage medium
Technical Field
The present disclosure relates to the field of robot control, and more particularly, to a method and an apparatus for controlling a robot joint, a robot, and a storage medium.
Background
Due to the complexity of the structure of the robot, complex friction phenomena, including rolling friction and sliding friction, exist between transmission structures such as gears and bearings arranged inside the joints of the robot. These friction phenomena will affect the movement of the robot joints and thus the control of the robot.
In the control process of a robot joint, the desired control torque of the joint is generally converted into a control quantity of the motor current. In the related art, the desired control torque and the motor current are treated as having a linear relationship, so the corresponding motor current can be obtained from the required desired control torque and then output through the low-level controller to drive the motor of the robot joint, thereby realizing control of the robot joint.
However, the relationship between the desired control torque and the motor current is in practice nonlinear, and the joint friction arising from these friction phenomena makes the relationship even more complicated. A single linear relationship therefore cannot adequately describe the mapping between the two, so the motor of the robot joint cannot obtain the motor current required to accurately execute the desired motion, and the control precision of the robot joint is affected.
Disclosure of Invention
The embodiment of the application provides a control method and device for a robot joint, a robot and a storage medium, and the control precision of the robot joint is improved by compensating joint friction. The technical scheme is as follows:
according to an aspect of the present application, there is provided a control method of a robot joint, the method including:
acquiring an expected control moment of a robot joint;
determining a motor current corresponding to the expected control torque according to an expected mapping model, wherein the expected mapping model is generated according to a nonlinear relation between the expected control torque and the motor current, and the expected mapping model comprises compensation of joint friction of the robot joint according to joint motion data of the robot joint;
and driving a motor of the robot joint according to the motor current.
According to an aspect of the present application, there is provided a method for identifying parameters of a desired mapping model, the method including:
constructing a joint friction model of the robot joint according to the joint motion data of the robot joint;
constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
constructing a parameter identification model according to the joint friction model and the theoretical mapping model;
and performing parameter identification on the parameter identification model, and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
According to an aspect of the present application, there is provided a control apparatus of a robot joint, the apparatus including:
the acquisition module is used for acquiring the expected control torque of the robot joint;
the determining module is used for determining motor current corresponding to the expected control torque according to an expected mapping model, the expected mapping model is generated according to the nonlinear relation between the expected control torque and the motor current, and the expected mapping model comprises compensation of joint friction force of the robot joint according to joint motion data of the robot joint;
and the driving module is used for driving a motor of the robot joint according to the motor current.
According to an aspect of the present application, there is provided a parameter identification apparatus for a desired mapping model, the apparatus including:
the construction module is used for constructing a joint friction model of the robot joint according to the joint motion data of the robot joint;
the construction module is also used for constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
the construction module is also used for constructing a parameter identification model according to the joint friction model and the theoretical mapping model;
and the determining module is used for carrying out parameter identification on the parameter identification model and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
According to an aspect of the application, a robot is provided, comprising a processor and a memory, in which at least one program code is stored, which is loaded by the processor and which performs the method of controlling a robot joint as described above.
According to an aspect of the present application, there is provided a computer device comprising a processor and a memory, the memory having stored therein at least one program code, the program code being loaded by the processor and performing the method of parameter identification of a desired mapping model as described above.
According to an aspect of the present application, there is provided a computer-readable storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the method for controlling a robot joint as described above, or the method for recognizing parameters of a desired mapping model.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
parameters between the expected control torque and the motor current related in the expected mapping model are corrected through compensation of joint friction force, so that the accuracy of determining the motor current corresponding to the expected control torque according to the expected mapping model is improved, the motor current provided by the motor to the robot joint is closer to an actually required value, and the control accuracy of the robot joint is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic diagram of an application scenario provided by an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a method of controlling a robotic joint provided in an exemplary embodiment of the present application;
FIG. 3 is a flow chart for determining parameter values in a desired mapping model provided by an exemplary embodiment of the present application;
FIG. 4 is a flow chart for determining parameter values in a desired mapping model provided by an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method of controlling a robotic joint provided in an exemplary embodiment of the present application;
FIG. 6 is a flow chart of a method for identifying parameters of a desired mapping model provided by an exemplary embodiment of the present application;
fig. 7 is a block diagram of a control apparatus for a robot joint according to an exemplary embodiment of the present application;
FIG. 8 is a block diagram of a parameter identification device for a desired mapping model provided in an exemplary embodiment of the present application;
FIG. 9 is a block diagram of a robot provided in an exemplary embodiment of the present application;
fig. 10 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Artificial Intelligence (AI) is a theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge and use the knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the capabilities of perception, reasoning and decision making.
Artificial intelligence technology is a comprehensive discipline that covers a wide range of fields, involving both hardware-level and software-level technologies. The basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics and the like. Artificial intelligence software technology mainly includes computer vision, speech processing, natural language processing and machine learning/deep learning.
In the case that the artificial intelligence technology is applied to control of a robot, fig. 1 shows an application scenario diagram of a control method of a robot joint provided in an embodiment of the present application.
Wherein the robot 100 has a communication connection with a computer device 200, the computer device 200 sends a desired mapping model to the robot 100, and the robot 100 controls the movements of the robot joints according to the desired mapping model.
Specifically, the robot 100 may be any configuration of robot, including but not limited to at least one of a serial robot, a parallel robot, a legged robot, a wheeled robot, and a tracked robot.
Hereinafter, the robot 100 is exemplified as a wheel-legged robot.
A wheel-legged robot is a robot whose body motion is driven through wheel structures. Since its only contact with the ground is through the wheels, balance control becomes an issue when the arrangement of the wheel structure itself is unstable.
In the embodiments of the present application, the wheel-legged robot is exemplified as a wheeled biped robot: it includes two wheels for locomotion, each wheel is connected to leg structures, and the leg structures are connected to the robot body, so that the two wheels drive the robot body under motion control. It should be understood that the wheel-legged robot in the present application is not limited to this structure; a wheel-legged robot should be understood as any robot that includes a wheel-like structure.
As schematically shown in fig. 1, the robot 100 includes a base portion 110 and a wheel leg portion 120.
The base portion 110 is connected to the wheel leg portion 120. The wheel leg portion 120 includes two wheels 121 and leg structures 122 connecting the wheels 121 to the base portion 110. As shown in fig. 1, the robot 100 includes four leg structures 122 in total, and each wheel 121 is connected to two of the four leg structures 122.
Illustratively, if there are a leg structure A, a leg structure B, a leg structure C and a leg structure D, then leg structure A and leg structure B are connected with the left wheel, and leg structure C and leg structure D are connected with the right wheel.
Leg structure A, leg structure B and the left wheel, together with leg structure C, leg structure D and the right wheel, form the two planar parallel legs of the wheel-legged robot. Each parallel leg has five rotational joints and two translational degrees of freedom, in the transverse and vertical directions respectively. Compared with a serial mechanism, a parallel mechanism is compact, stiff and has a high load-bearing capacity, allowing the robot to jump higher and negotiate obstacles flexibly.
Illustratively, the desired mapping model may be applied to whole body dynamic control (WBC) of the robot, or to a part of the robot such as a mechanical arm or a mechanical leg. For example, the desired mapping model may be used to control the walking of a wheel-legged robot, or to control the four leg structures 122 of a wheel-legged robot.
In conjunction with the above, a method for controlling a robot joint according to an embodiment of the present application is described, and fig. 2 is a flowchart of a method for controlling a robot joint according to an embodiment of the present application, which may be implemented in the robot shown in fig. 1. As schematically shown in fig. 2, the method comprises the steps of:
step 102: and acquiring the expected control moment of the robot joint.
Illustratively, the desired control torque refers to an output torque required to control the robot joint to perform a desired motion.
In the control process of the robot, the desired control torque of a robot joint is generally converted into a control quantity of the motor current of that joint. According to the mapping relationship between the desired control torque and the motor current, one desired control torque corresponds to a unique motor current. Optionally, the mapping between the desired control torque and the motor current may be nonlinear or linear; the embodiments of the present application take the nonlinear relationship between the desired control torque and the motor current as an example.
Once the required desired control torque is determined, the corresponding motor current can be obtained. The robot then outputs this motor current to the motor of the robot joint, and the motor is driven with a current of that magnitude, so that the robot joint outputs the desired control torque and control of the robot joint is realized.
In this process, according to the principle of energy conservation, part of the output torque of the robot joint is consumed by friction and part is used to produce motion. That is, the output torque of the robot joint needs to satisfy the following constraint: τ = τ_m′ + τ_f,
where τ_m′ is the expected output torque of the robot joint, τ is the theoretical output torque of the robot joint, and τ_f is the joint friction of the robot joint.
From this constraint, it is known that the joint friction has a certain cancellation effect on the desired output torque. Specifically, the theoretical output torque is obtained according to a theoretical mapping model and can be regarded as a theoretical value, and the expected output torque includes compensation for joint friction. Therefore, in some cases, the desired output torque may be regarded as the desired control torque.
Step 104: and determining the motor current corresponding to the expected control torque according to the expected mapping model.
Illustratively, the desired mapping model is generated according to a nonlinear relationship between the desired control torque and the motor current, and the desired mapping model includes compensation of joint friction of the robot joint according to joint motion data of the robot joint.
The desired mapping model is used to describe the mapping relationship between the desired control torque of the robot joint and the motor current. Optionally, the desired mapping relationship may be represented by the following formula: f(i) = k_1·i + k_2·i² + k_3·i³, where k_i (i = 1, 2, 3) are parameters of the desired mapping model, which may also be referred to as torque coefficients, used to describe the nonlinear relationship between the desired control torque and the motor current. That is, once the torque coefficients are determined (i.e., constant), a unique motor current value can be obtained from a specific value of the desired control torque.
The desired mapping model includes compensation of the joint friction of the robot joint based on the joint motion data of the robot joint. Equivalently, the determination of the parameters k_i (i = 1, 2, 3) in the desired mapping model takes into account the influence of the joint motion data on the joint friction.
Illustratively, the joint motion data of the robot joint includes at least one of: a joint angle of the robot joint, a joint angular velocity of the robot joint, and a joint angular acceleration of the robot joint. The joint angular velocity corresponds to a first derivative of the joint angle, and the joint angular acceleration corresponds to a second derivative of the joint angle.
Taking the desired mapping model f(i) = k_1·i + k_2·i² + k_3·i³ as an example, once the parameters k_i (i = 1, 2, 3) are determined, the corresponding motor current can be uniquely determined from the desired control torque. For example, if the desired control torque is 10 N·m, the motor current i obtained from the desired mapping model is the current value capable of providing 10 N·m, where the 10 N·m comprises the 9 N·m required by the robot joint motion and 1 N·m of compensation for the joint friction.
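To make the use of the cubic mapping concrete, the following sketch inverts f(i) = k_1·i + k_2·i² + k_3·i³ for the motor current by root finding. The coefficient values, the root-selection policy and the function name are illustrative assumptions, not values or interfaces from this application.

```python
import numpy as np

def current_for_torque(tau_desired, k1, k2, k3):
    """Invert the cubic mapping f(i) = k1*i + k2*i**2 + k3*i**3 for the motor current.

    Returns the real root closest to the linear estimate tau_desired / k1.
    Coefficient values and the root-selection policy are illustrative assumptions.
    """
    # Roots of k3*i^3 + k2*i^2 + k1*i - tau_desired = 0
    roots = np.roots([k3, k2, k1, -tau_desired])
    real_roots = roots[np.abs(roots.imag) < 1e-9].real
    if real_roots.size == 0:
        raise ValueError("no real solution for the requested torque")
    linear_guess = tau_desired / k1
    return real_roots[np.argmin(np.abs(real_roots - linear_guess))]

# Example with made-up torque coefficients; identified values would be used in practice.
k1, k2, k3 = 1.2, 0.05, -0.001
i = current_for_torque(10.0, k1, k2, k3)   # desired control torque of 10 N·m
print(f"motor current ≈ {i:.3f} A")
```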
Step 106: and driving a motor of the robot joint according to the motor current.
According to step 104, the desired control torque is substituted into the desired mapping model, and the corresponding motor current can be obtained.
The robot then outputs this motor current to the motor of the robot joint, and the motor is driven with a current of that magnitude, so that the robot joint outputs the desired control torque and control of the robot joint is realized.
In summary, in the control method of the robot joint provided in the embodiment of the present application, parameters between the expected control torque and the motor current involved in the expected mapping model are corrected by compensating the joint friction force, so that the accuracy of determining the motor current corresponding to the expected control torque according to the expected mapping model is improved, and thus the motor current provided by the motor to the robot joint is closer to the actually required value, and the control accuracy of the robot joint is improved.
According to the foregoing, the desired mapping model includes parameters, and when the parameter values are determined, the motor current corresponding to the desired control torque can be acquired. Therefore, when the control method of the robot joint provided by the embodiment of the application is used, the parameter values in the desired mapping model need to be determined.
Fig. 3 shows a determination process of a parameter value in a desired mapping model, comprising the steps of:
step 202: and constructing a joint friction model of the robot joint according to the joint motion data.
Illustratively, the articulation data includes at least one of the following: a joint angle of the robot joint, a joint angular velocity of the robot joint, and a joint angular acceleration of the robot joint.
According to the different configurations of the robot, the friction force of the joints corresponding to the joints of the robot is different. Specifically, the joint friction force is generated according to factors such as the transmission mode of the robot, the machining and assembling errors of joint parts, the use abrasion degree and the like. The above factors all affect the joint angle, the joint angular velocity and the joint angular acceleration of the robot joint.
Therefore, it can be seen from the cause of the joint friction force that the joint friction force is related to the joint angle, the joint angular velocity, and the joint angular acceleration of the robot joint. That is, joint friction is affected by the joint motion data.
Based on the data, the joint friction model is constructed according to the joint motion data and is used for describing the joint friction force of the robot joint, and the joint friction force is influenced by the joint motion data.
Alternatively, the joint friction model may be represented by the following formula:
τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈},
where τ_f is the joint friction, τ_{f,q} is the angle parameter term determined from the joint angle, τ_{f,q̇} is the angular velocity parameter term determined from the joint angular velocity, and τ_{f,q̈} is the angular acceleration parameter term determined from the joint angular acceleration.
Step 204: and constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint.
Illustratively, the theoretical mapping model is generated according to the mapping relation between the theoretical output torque and the motor current. The theoretical output torque can be regarded as a theoretical value.
According to the foregoing, it is desirable that the mapping model is generated from a mapping of the output torque required by the robot joint and the motor current, and the mapping may be non-linear or linear. Similarly, the theoretical mapping model is generated according to the mapping relationship between the theoretical output torque and the motor current of the robot joint, and the mapping relationship can be nonlinear or linear.
Meanwhile, based on the difference between the output torque required by the robot joint (i.e., the desired control torque) and the theoretical output torque of the robot joint (i.e., the theoretical output torque), the parameters in the theoretical mapping model and the parameters in the desired mapping model are also different.
Optionally, in order to reduce the influence of the theoretical output torque on the control precision of the robot and make the relationship between the theoretical output torque and the motor current more trend to the actual situation, a cubic polynomial is used to describe the relationship between the theoretical output torque and the motor current.
That is, the theoretical mapping model may be represented by the following formula: τ = k_1·i + k_2·i² + k_3·i³, where τ is the theoretical output torque, k_i are the torque coefficients, and i is the motor current. Here the torque coefficients describe the nonlinear relationship between the theoretical output torque and the motor current and are not the same coefficients as those involved in step 104. That is, the torque coefficients involved in step 104 may be regarded as first torque coefficients and those involved in step 204 as second torque coefficients; the first torque coefficients are fixed values, while the second torque coefficients are unknown parameters.
Illustratively, the nonlinear relationship of the theoretical mapping model can also be obtained by training with a neural network method.
Step 206: and constructing a parameter identification model according to the joint friction model and the theoretical mapping model.
According to the foregoing, the output torque of the robot joint needs to satisfy the following constraint: τ = τ_m′ + τ_f, where τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction.
To reduce the effect of joint friction on the robot joint, it is necessary to add compensation to the joint friction in the desired control torque (which may be considered as the desired output torque) that is used to counteract the joint friction. Equivalently, due to the existence of joint friction force, the expected control torque of the robot joint is different from the theoretical output torque.
Based on this, a parameter identification model may be constructed for describing the difference between the desired control torque and the theoretical output torque.
The theoretical output torque can also be determined according to a dynamic model, which is also referred to as the current desired torque. In the embodiment of the present application, the kinetic model of the robot refers to a kinetic model corresponding to a robot joint. The dynamic model has various expression forms, such as those obtained according to lagrangian equation or newton euler equation.
That is, the parameter identification model may be constructed from the difference between the current desired torque and the desired output torque.
Alternatively, the parameter identification model may be represented by the following formula: f = τ_m − τ_m′ = τ_m − τ + τ_f, where τ_m is the current desired torque, τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction.
Step 208: and performing parameter identification on the parameter identification model, and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
Schematically, parameter identification means that an unknown parameter in a theoretical model is identified according to the theoretical model and experimental data to obtain a determined value of the unknown parameter, so that a numerical result obtained through the theoretical model can achieve a better fitting effect.
In other words, the parameter identification performed on the parameter identification model can obtain the identification value of the model parameter, and the identification value can make the numerical result obtained according to the parameter identification model closer to the true value.
Illustratively, the model parameters in the parameter identification model include all unknown parameters.
According to the foregoing, the parameter identification model is constructed based on the joint friction model and the theoretical mapping model, so it includes both the parameters related to the joint friction and the parameters of the desired mapping model. Since parameter identification is performed on the parameter identification model as a whole, the obtained identification values cover all unknown parameters. That is, the identified values of the parameters in the desired mapping model are consistent with the identified joint-friction-related parameters, which is equivalent to the desired mapping model including compensation for the joint friction.
Taking the desired mapping model f(i) = k_1·i + k_2·i² + k_3·i³ and the parameter identification model f = τ_m − τ_m′ = τ_m − τ + τ_f as an example, the model parameters in the parameter identification model include the joint-friction-related parameters and the parameters k_i (i = 1, 2, 3), and the identification values of k_i (i = 1, 2, 3) are the parameter values required in the desired mapping model. Parameter identification is performed on the parameter identification model to obtain the identification values of the joint-friction-related parameters and of k_i (i = 1, 2, 3), and the identification values of k_i (i = 1, 2, 3) are determined as the parameter values in the desired mapping model.
Optionally, after the identification value of the model parameter is obtained, the accuracy of the identification value of the model parameter may be determined, so as to improve the accuracy of the parameter value in the expected mapping model.
In summary, the embodiment of the present application provides a determination method for parameter values in an expected mapping model, a parameter identification model is constructed according to a joint friction model and a theoretical mapping model, parameters in the expected mapping model are placed in the parameter identification model, and the parameter values in the expected mapping model are obtained through parameter identification of the parameter identification model.
FIG. 4 illustrates another determination of parameter values in a desired mapping model, including the steps of:
step 301: at least one parameter term of the joint friction model is generated from the joint motion data.
Illustratively, the articulation data includes at least one of the following: a joint angle of the robot joint, a joint angular velocity of the robot joint, and a joint angular acceleration of the robot joint.
According to the foregoing, the joint friction force is related to the joint angle, the joint angular velocity, and the joint angular acceleration of the robot joint. Based on the data, the joint friction model is constructed according to the joint motion data and is used for describing the joint friction force of the robot joint, and the joint friction force is influenced by the joint motion data.
Different parameter items may be generated depending on the joint motion data. Illustratively, the joint friction model includes at least one of the following parameters: angle parameter item, angular velocity parameter item, angular acceleration parameter item. The angle parameter item is determined according to the joint angle, the angular velocity parameter item is determined according to the joint angular velocity, and the angular acceleration parameter item is determined according to the joint angular acceleration.
Take the joint friction τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈} as an example, where τ_f is the joint friction, τ_{f,q} is the angle parameter term, τ_{f,q̇} is the angular velocity parameter term, and τ_{f,q̈} is the angular acceleration parameter term.
Optionally, step 301 may be implemented as at least one of the following steps:
generating an angle parameter item according to the joint angle of the robot joint;
generating an angular velocity parameter item according to the joint angular velocity of the robot joint;
an angular acceleration parameter term is generated according to joint angular acceleration of the robot joint.
The following three steps are explained in detail in turn:
1. and (4) angle parameter items.
The degree of wear and/or lubrication between the transmission mechanisms inside the robot joint varies depending on the size of the joint angle. In particular, the degree of wear and/or lubrication of the robot joints may be described by parameters that vary according to the structure of the robot. Based on this, the influence of the joint friction force is related to the range in which the joint angle of the robot joint is located.
Illustratively, the step of generating the angle parameter item according to the joint angle of the robot joint may be implemented as follows:
when the joint angle is in a first preset interval, generating a first angle parameter item;
when the joint angle is in a second preset interval, generating a second angle parameter item;
wherein, the first preset interval and the second preset interval are two intervals without intersection.
Alternatively, the angle parameter term may be represented by the following formula:
τ_{f,q} = sgn(q̇)·s(q),
where sgn(·) is the sign function and s(q) is a piecewise function related to the joint angle.
Depending on the interval in which the joint angle lies, the piecewise function may be represented as
s(q) = α_0 when q ∈ (q_min, q_max), and s(q) = α_1 otherwise,
where α_0 and α_1 are parameters of the piecewise function. That is, when the joint angle of the robot joint is in the interval (q_min, q_max), there is a first angle parameter term expressed as sgn(q̇)·α_0; otherwise, when the joint angle of the robot joint is outside the interval (q_min, q_max), there is a second angle parameter term expressed as sgn(q̇)·α_1.
Optionally, the interval (q_min, q_max) is selected as the range of the joint angle of the robot under most working conditions, and the degree of wear and lubrication of the robot joint within this range is described by the parameter α_0. The joint angles under a few working conditions lie outside this interval and are rarely reached by the robot, so the degree of wear and lubrication at those angles is described by the parameter α_1.
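As an illustration of the interval-dependent angle parameter term described above, the sketch below assumes the form sgn(q̇)·s(q) with s(q) switching between α_0 and α_1; the exact expression in the application is given only as a formula image, so this functional form and all numeric values are assumptions.

```python
import numpy as np

def angle_parameter_term(q, q_dot, q_min, q_max, alpha0, alpha1):
    """Angle-dependent friction term: sgn(q_dot) * s(q).

    s(q) equals alpha0 inside the commonly used joint-angle interval (q_min, q_max)
    and alpha1 outside it. The functional form is an assumption based on the text;
    the application gives the exact formula only as an image.
    """
    s_q = alpha0 if q_min < q < q_max else alpha1
    return np.sign(q_dot) * s_q

# Example: a joint mostly operating between -1.0 rad and 1.0 rad.
print(angle_parameter_term(q=0.3, q_dot=0.5, q_min=-1.0, q_max=1.0,
                           alpha0=0.8, alpha1=1.5))   # -> 0.8
```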
2. An angular velocity parameter term.
The angular velocity parameter term comprises a first parameter term and a second parameter term: the first parameter term corresponds to the friction curve of the robot joint, and the second parameter term has a nonlinear relationship with the joint angular velocity.
Illustratively, the step of generating an angular velocity parameter term from joint angular velocities of the robot joint may be implemented as follows:
generating a first parameter item and a second parameter item according to the joint angular velocity, wherein the first parameter item corresponds to a friction curve of the robot joint, and the second parameter item has a nonlinear relation with the joint angular velocity;
an angular velocity parameter term is generated from the first parameter term and the second parameter term.
The friction curve of the robot joint, also called Stribeck (Stribeck) curve, is used to describe the friction manner between the transmission mechanisms of the robot, including the friction of the robot joint.
Alternatively, the angular velocity parameter term may be written as the sum of the first parameter term and the second parameter term. In this expression, σ_0 and σ_1 are parameters in the angular velocity parameter term, and the implicit variable z describes the average deformation of the bristles; z is an intermediate variable in the angular velocity parameter term that is used for equation transformation and does not participate directly in the calculation of the joint friction.
Depending on the specific settings of the first parameter term and the second parameter term, there are various ways to generate them from the joint angular velocity; the embodiments of the present application give the following manners.
First, the step of generating the first parameter item according to the joint angular velocity may be implemented as follows:
determining an exponential term corresponding to the joint angular velocity, wherein the exponential term contains an angular velocity parameter, the exponential term is used to correct the first parameter term, and the angular velocity parameter is a correction parameter of the exponential term determined according to the joint angular velocity;
a first parameter term is generated based on the joint angular velocity and the exponential term.
Optionally, the first parameter term is constructed according to the original friction model (i.e., the LuGre model). The expression of the first parameter term involves a friction curve of the robot joint, which may be called the Stribeck curve.
Specifically, in the expression of the friction curve, β_0, β_1 and β_2 are model parameters, exp denotes an exponential term, sgn(·) is the sign function, and two further angular velocity parameters appear, which are two different parameters determined from the joint angular velocity.
Secondly, the step of generating the second parameter item according to the joint angular velocity can be realized as follows:
and generating a second parameter item according to the nonlinear relation between the joint angular velocity and the second parameter item.
Alternatively, the second parameter term may be represented by a formula in which β_3 and β_4 are the model parameters of the second parameter term.
According to the foregoing, the angular velocity parameter item can be obtained from the first parameter item and the second parameter item.
Illustratively, the construction of the angular velocity parameter item can also be obtained by training through a neural network method.
3. An angular acceleration parameter term.
Illustratively, the step of generating an angular acceleration parameter term from joint angular accelerations of the robot joint may be implemented as follows:
and generating an angular acceleration parameter item according to the linear relation between the joint angular acceleration and the angular acceleration parameter item.
Alternatively, the angular acceleration parameter term may be represented by the following formula: τ_{f,q̈} = γ_0·q̈, where γ_0 is the model parameter of the angular acceleration parameter term.
Step 302: and constructing a joint friction model according to the at least one parameter item.
According to step 301, at least one of the angle parameter term, the angular velocity parameter term and the angular acceleration parameter term can be obtained; a joint friction model can then be constructed from these terms.
Illustratively, the joint friction model includes at least one of an angular parameter term, an angular velocity parameter term, and an angular acceleration parameter term.
For example, the joint friction model includes an angular parameter term and an angular velocity parameter term; as another example, the joint friction model includes an angular velocity parameter term and an angular acceleration parameter term; as another example, the joint friction model includes at least one of an angular parameter term and an angular acceleration parameter term; as another example, the joint friction model includes an angular parameter term, an angular velocity parameter term, and an angular acceleration parameter term.
Taking as an example an angle parameter term of the form sgn(q̇)·s(q), an angular velocity parameter term composed of a first parameter term and a second parameter term, and an angular acceleration parameter term γ_0·q̈, the joint friction can be represented, based on these three parameter terms, as
τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈},
where sgn(·) is the sign function, s(q) is the piecewise function related to the joint angle, the first parameter term corresponds to the friction curve, the second parameter term is the nonlinear term in the joint angular velocity, z is the implicit variable, σ_0 and σ_1 are parameters in the angular velocity parameter term, and γ_0 is the model parameter of the angular acceleration parameter term.
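The following sketch assembles the three parameter terms into one friction evaluation. Since the application gives the exact expressions only as formula images, the angular velocity term is filled in with the standard LuGre formulation (bristle state z, Stribeck curve with β_0..β_2) and a simple nonlinear viscous term standing in for the second parameter term; these stand-ins, the parameter names and the numeric values are assumptions.

```python
import numpy as np

def joint_friction(q, q_dot, q_ddot, z, dt, p):
    """One evaluation of the three-term friction model tau_f = tau_{f,q} + tau_{f,qdot} + tau_{f,qddot}.

    The angular velocity term follows the standard LuGre formulation with an
    exponential Stribeck curve, and the second parameter term is modelled as a
    nonlinear viscous contribution; both are stand-ins for the formula images
    in the application.

    p: dict with alpha0, alpha1, q_min, q_max, sigma0, sigma1,
       beta0, beta1, beta2, beta3, beta4, gamma0.
    Returns (tau_f, z_next), where z is the LuGre bristle state.
    """
    # Angle parameter term: sgn(q_dot) * s(q) with interval-dependent magnitude.
    s_q = p["alpha0"] if p["q_min"] < q < p["q_max"] else p["alpha1"]
    tau_q = np.sign(q_dot) * s_q

    # Assumed Stribeck curve g(q_dot) with parameters beta0..beta2.
    g = p["beta0"] + p["beta1"] * np.exp(-(q_dot / p["beta2"]) ** 2)
    # LuGre bristle dynamics, discretised with a simple Euler step.
    z_dot = q_dot - p["sigma0"] * abs(q_dot) / g * z
    z_next = z + z_dot * dt
    first_term = p["sigma0"] * z + p["sigma1"] * z_dot
    # Second parameter term: assumed nonlinear viscous contribution.
    second_term = p["beta3"] * q_dot + p["beta4"] * q_dot ** 3
    tau_qdot = first_term + second_term

    # Angular acceleration term: linear in q_ddot.
    tau_qddot = p["gamma0"] * q_ddot

    return tau_q + tau_qdot + tau_qddot, z_next

# Example call with made-up parameter values.
params = dict(alpha0=0.8, alpha1=1.5, q_min=-1.0, q_max=1.0,
              sigma0=100.0, sigma1=2.0,
              beta0=0.3, beta1=0.2, beta2=0.05, beta3=0.4, beta4=0.01,
              gamma0=0.001)
tau_f, z = joint_friction(q=0.2, q_dot=0.5, q_ddot=1.0, z=0.0, dt=0.001, p=params)
```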
Step 303: and constructing a theoretical mapping model according to the theoretical output torque and the torque coefficient.
Illustratively, the theoretical output torque and the motor current have a nonlinear relation, and the torque coefficient is a parameter item corresponding to the motor current. Specifically, the theoretical mapping model is generated according to a mapping relation between a theoretical output torque and a motor current. The theoretical output torque can be regarded as a theoretical value.
Optionally, in order to reduce the influence of the theoretical output torque on the control precision of the robot and make the relationship between the theoretical output torque and the motor current more trend to the actual situation, a cubic polynomial is used to describe the relationship between the theoretical output torque and the motor current.
That is, the theoretical mapping model may be represented by the following formula: τ = k_1·i + k_2·i² + k_3·i³, where τ is the theoretical output torque, k_i are the torque coefficients, and i is the motor current. Here the torque coefficients describe the nonlinear relationship between the theoretical output torque and the motor current and are not the same coefficients as those involved in step 104. That is, the torque coefficients involved in step 104 may be regarded as first torque coefficients and those involved in step 204 as second torque coefficients; the first torque coefficients are fixed values, while the second torque coefficients are unknown parameters.
Step 304: and determining the current expected moment according to the dynamic model corresponding to the robot joint.
According to the foregoing, the output torque of the robot joint needs to satisfy the following constraint: τ = τ_m′ + τ_f, where τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction.
To reduce the effect of joint friction on the robot joint, it is necessary to add compensation to the joint friction in the desired control torque (which may be considered as the desired output torque) that is used to counteract the joint friction. Based on this, a parameter identification model may be constructed for describing the difference between the desired control torque and the theoretical output torque.
The theoretical output torque can also be determined according to a dynamic model, which is also referred to as the current desired torque. In the embodiment of the present application, the kinetic model of the robot refers to a kinetic model corresponding to a robot joint. The dynamic model has various expression forms, such as those obtained according to lagrangian equation or newton euler equation.
Optionally, the dynamic model corresponding to the robot joint is as follows:
τ_m = H(q)·q̈ + C(q, q̇)·q̇ + g(q),
where τ_m is the current desired torque, q is the joint angle, q̇ is the joint angular velocity, q̈ is the joint angular acceleration, H(q) is the inertia matrix of the robot, C(q, q̇) is the centrifugal force matrix of the robot, and g(q) is the gravity matrix of the robot.
Illustratively, the inertia matrix, the centrifugal force matrix and the gravity matrix may be set according to actual needs and are not limited here.
The value of τ_m can be obtained from the dynamic model together with the values of q, q̇ and q̈.
Step 305: and constructing a parameter identification model according to the current expected torque, the joint friction model and the theoretical mapping model.
Referring to the foregoing, the current desired torque τ_m obtained in step 304 is a constant value; the joint friction model obtained in step 302 includes the model parameters corresponding to the joint friction τ_f; and the theoretical mapping model obtained in step 303 includes the model parameters corresponding to the parameters in the desired mapping model.
Based on the method, a parameter identification model can be constructed according to the current expected torque, the joint friction model and the theoretical mapping model and used for parameter identification.
In light of the foregoing, a parameter identification model may be constructed from the difference between the current desired torque and the desired output torque, and may be represented by the following formula: f = τ_m − τ_m′ = τ_m − τ + τ_f, where τ_m is the current desired torque, τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction.
Illustratively, step 305 may be implemented as follows:
determining expected output torque of the robot joint according to the joint friction model and the theoretical mapping model;
and constructing a parameter identification model according to the difference value of the current expected torque and the expected output torque.
The specific parameters are as follows:
According to the foregoing, the current desired torque τ_m is a fixed value.
The joint friction model is τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈}, in which sgn(·) is the sign function, q is the joint angle, q̇ is the joint angular velocity, q̈ is the joint angular acceleration, exp denotes the exponential term, and α_0, α_1, σ_0, σ_1, β_i (i = 1, 2, 3, 4) and γ_0 are all model parameters.
The theoretical mapping model is τ = k_1·i + k_2·i² + k_3·i³, where τ is the theoretical output torque, k_i are the torque coefficients, and i is the motor current.
The desired output torque obtained from the joint friction model and the theoretical mapping model is τ_m′ = τ − τ_f.
Based on the difference between the current desired torque and the desired output torque, the parameter identification model is expressed as f = τ_m − τ_m′ = τ_m − τ + τ_f, in which the current desired torque τ_m, the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value of the motor current i are constant values, and the following model parameters exist: k_i (i = 1, 2, 3), α_0, α_1, σ_0, σ_1, β_i (i = 1, 2, 3, 4) and γ_0. Illustratively, the model parameters may form a parameter set [k_1, k_2, k_3, α_0, α_1, σ_0, σ_1, β_1, β_2, β_3, β_4, γ_0]; parameter identification of the parameter identification model is identification of this parameter set.
Through parameter identification of the parameter identification model, the identification values of the parameter set can be obtained, among which the identification values of k_i (i = 1, 2, 3) are the parameter values required in the desired mapping model. Therefore, it is necessary to perform parameter identification on the parameter identification model to obtain the identification values, so that the identification values of k_i (i = 1, 2, 3) can be determined as the parameter values in the desired mapping model.
Based on this, step 208 can be implemented as step 306, step 307, and step 308, which are specifically set forth below:
step 306: and collecting the joint motion parameters and the current values in the theoretical mapping model.
In the parameter identification model, the current desired torque τ_m, the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value of the motor current i are constant values. The current desired torque τ_m is determined according to the dynamic model of the robot, while the joint angle q, the joint angular velocity q̇ and the joint angular acceleration q̈ constitute the joint motion parameters.
To perform parameter identification on the parameter set [k_1, k_2, k_3, α_0, α_1, σ_0, σ_1, β_1, β_2, β_3, β_4, γ_0], the values of the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value of the motor current i need to be determined.
Illustratively, step 306 may be implemented as follows:
and collecting joint motion parameters and current values according to the motion track of the robot joint.
The motion trajectory of the robot joint is also called an excitation trajectory. Optionally, the excitation trajectory includes, but is not limited to, one of the following trajectories: sinusoidal excitation tracks and Fourier series excitation tracks.
Specifically, during execution of the excitation trajectory by the robot joint, the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value of the motor current i can be collected. That is, according to the motion trajectory of the robot joint, the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value i can be acquired.
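As a sketch of data collection along an excitation trajectory, the code below generates a Fourier-series trajectory and samples q, q̇ and q̈. The parameterisation of the trajectory and the sampling settings are assumptions, and on a real robot the motor current i would be read from the joint drive rather than mocked.

```python
import numpy as np

def fourier_excitation(t, a, b, w0, q0=0.0):
    """Fourier-series excitation trajectory and its first two derivatives.

    q(t) = q0 + sum_k [ a_k/(k*w0) * sin(k*w0*t) - b_k/(k*w0) * cos(k*w0*t) ],
    a common parameterisation for identification experiments (an assumption here,
    not the specific trajectory of this application).
    """
    q, q_dot, q_ddot = q0, 0.0, 0.0
    for k, (ak, bk) in enumerate(zip(a, b), start=1):
        wk = k * w0
        q += ak / wk * np.sin(wk * t) - bk / wk * np.cos(wk * t)
        q_dot += ak * np.cos(wk * t) + bk * np.sin(wk * t)
        q_ddot += -ak * wk * np.sin(wk * t) + bk * wk * np.cos(wk * t)
    return q, q_dot, q_ddot

# Sample the trajectory; on the real robot the motor current i would be logged
# from the joint drive at each sample instead of being mocked.
a, b, w0 = [0.5, 0.2], [0.1, 0.05], 2 * np.pi * 0.1
samples = []
for t in np.arange(0.0, 10.0, 0.01):
    q, q_dot, q_ddot = fourier_excitation(t, a, b, w0)
    i_measured = 0.0  # placeholder for the measured motor current
    samples.append((q, q_dot, q_ddot, i_measured))
```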
Step 307: and carrying out nonlinear optimization processing on the model parameters in the parameter identification model according to the joint motion parameters and the current values to obtain identification values of the model parameters.
Illustratively, the model parameters include parameters in the expected mapping model, and the nonlinear optimization processing is identification processing of the parameter set to obtain identification values of the parameters.
Wherein, the nonlinear optimization processing may be to adopt a nonlinear optimization algorithm.
Illustratively, step 307 may be implemented as follows:
substituting the joint motion parameters and the current values into the parameter identification model to obtain an updated parameter identification model;
and processing the updated parameter identification model by adopting a nonlinear optimization algorithm to obtain the identification value of the model parameter.
Wherein the nonlinear optimization algorithm comprises at least one of the following algorithms: maximum likelihood estimation method, iterative algorithm, variable scale method, least square method, simplex searching method, complex searching method, random searching method.
That is, the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value i are substituted into the parameter identification model to obtain an updated parameter identification model, in which only the parameter set remains to be determined.
Then, the updated parameter identification model is processed with a nonlinear optimization algorithm; equivalently, the identification values of the parameter set are obtained by using the argmin(f) function.
Illustratively, the updated parameter identification model may be represented by the following formula:
g = argmin_{k_1, k_2, k_3, α_0, α_1, σ_0, σ_1, β_1, β_2, β_3, β_4, γ_0} (τ_m − τ + τ_f),
where the argmin(f) function returns the parameter, or parameter set, at which the objective function f takes its minimum value.
That is, based on the updated parameter identification model, the identification value of each parameter in the parameter set is the value taken when the objective function f = τ_m − τ + τ_f reaches its minimum. The parameter set comprises the following parameters: k_i (i = 1, 2, 3), α_0, α_1, σ_0, σ_1, β_i (i = 1, 2, 3, 4) and γ_0.
Equivalently, according to the argmin(f) function, the parameter set at which the objective function f = τ_m − τ + τ_f takes its minimum value can be acquired.
The objective function f = τ_m − τ + τ_f taking its minimum value means that the difference between the current desired torque and the desired output torque (corresponding to the desired control torque) is minimal. That is, the desired control torque approaches the current desired torque as closely as possible, so that the control accuracy of the robot is improved.
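A sketch of the identification step using the least squares method (one of the nonlinear optimization algorithms listed above) via scipy.optimize.least_squares, which minimizes the squared residual f = τ_m − τ + τ_f over the collected samples. The friction expressions reuse the assumed functional forms of the earlier sketch, and the joint-angle interval, initial guess and parameter ordering are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(theta, data):
    """Residual f = tau_m - tau + tau_f for each collected sample.

    theta = [k1, k2, k3, alpha0, alpha1, sigma0, sigma1,
             beta0, beta1, beta2, beta3, beta4, gamma0]
    data  = iterable of (tau_m, q, q_dot, q_ddot, i) tuples.
    The friction terms reuse the assumed functional forms of the earlier
    sketch (steady-state LuGre term, nonlinear viscous term).
    """
    k1, k2, k3, a0, a1, s0, s1, b0, b1, b2, b3, b4, g0 = theta
    res = []
    for tau_m, q, q_dot, q_ddot, i in data:
        tau = k1 * i + k2 * i**2 + k3 * i**3                  # theoretical mapping model
        g_v = b0 + b1 * np.exp(-(q_dot / b2) ** 2)            # assumed Stribeck curve
        z_ss = g_v * np.sign(q_dot) / max(s0, 1e-9)           # steady-state bristle state
        tau_f = (np.sign(q_dot) * (a0 if -1.0 < q < 1.0 else a1)  # angle term (assumed interval)
                 + s0 * z_ss                                  # LuGre term at steady state
                 + b3 * q_dot + b4 * q_dot**3                 # second parameter term (assumed)
                 + g0 * q_ddot)                               # angular acceleration term
        res.append(tau_m - tau + tau_f)
    return np.asarray(res)

# data would be the samples collected along the excitation trajectory.
theta0 = np.ones(13) * 0.1
# result = least_squares(residuals, theta0, args=(data,))
# k1, k2, k3 = result.x[:3]   # parameter values for the desired mapping model
```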
Step 308: and determining the parameter value in the expected mapping model under the condition that the identification value of the model parameter meets the preset precision.
Illustratively, the predetermined accuracy is used to determine the accuracy of the identification value of the model parameter.
According to step 307, after the identification value of each parameter in the parameter set has been determined, k_i (i = 1, 2, 3) is obtained. The parameters k_i (i = 1, 2, 3) are then substituted into the desired mapping model f(i) = k_1 i + k_2 i² + k_3 i³, and the mapping relation between the expected control torque and the motor current is obtained.
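With k_1, k_2 and k_3 fixed, determining the motor current for a given expected control torque amounts to inverting f(i) = k_1 i + k_2 i² + k_3 i³. A minimal sketch of that inversion is given below; the coefficient values and the root-selection rule (smallest-magnitude real root) are illustrative assumptions.

```python
import numpy as np

def current_for_torque(tau_d, k1, k2, k3):
    """Solve k1*i + k2*i**2 + k3*i**3 = tau_d for the motor current i."""
    roots = np.roots([k3, k2, k1, -tau_d])            # cubic coefficients, descending powers
    real = roots[np.isclose(roots.imag, 0.0)].real    # a cubic always has at least one real root
    return real[np.argmin(np.abs(real))]              # pick the smallest-magnitude real solution

# Placeholder coefficients for illustration only.
i_cmd = current_for_torque(tau_d=1.5, k1=0.8, k2=0.05, k3=0.01)
```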
In summary, the embodiment of the present application provides a method for determining the parameter values in the expected mapping model: a parameter identification model is constructed according to a joint friction model and a theoretical mapping model, the parameters of the expected mapping model are placed in the parameter identification model, and the parameter values in the expected mapping model are obtained through parameter identification of the parameter identification model.
Specifically, the joint friction model is constructed according to the joint motion data of the robot joint, so that the friction characteristics of the joint are fully considered; the theoretical mapping model is constructed as a polynomial, so that the nonlinear relation between the theoretical output torque and the motor current is fully considered. On this basis, the constructed parameter identification model sufficiently reflects the mapping relation between the expected control torque and the motor current, so that an expected mapping model including compensation of the joint friction force is obtained.
Fig. 5 shows a flow chart of another method of controlling a robot joint, comprising the steps of:
step 401: and constructing a joint friction model.
Illustratively, the joint friction model is used to describe the joint friction of the robot joint, which is influenced by the joint motion data.
That is, a joint friction model of the robot joint is constructed from the joint motion data. Wherein the articulation data comprises at least one of: a joint angle of the robot joint, a joint angular velocity of the robot joint, and a joint angular acceleration of the robot joint.
Alternatively, the joint friction model may be represented by the following formula:
τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈}
wherein τ_f is the joint friction force, τ_{f,q} is the angle parameter term determined from the joint angle, τ_{f,q̇} is the angular velocity parameter term determined from the joint angular velocity, and τ_{f,q̈} is the angular acceleration parameter term determined from the joint angular acceleration.
The specific construction of the joint friction force model can refer to the foregoing, and will not be described herein.
Step 402: and constructing a theoretical mapping model.
Illustratively, the theoretical mapping model is generated according to the mapping relation between the theoretical output torque and the motor current. The theoretical output torque can be regarded as a theoretical value.
Alternatively, the theoretical mapping model may be represented by the following formula: τ = k_1 i + k_2 i² + k_3 i³, wherein k_i (i = 1, 2, 3) is the torque coefficient and i is the motor current.
For the specific construction of the theoretical mapping model, reference may be made to the foregoing description, and further description is omitted here.
For example, a parameter identification model may be constructed according to the joint friction model and the theoretical mapping model, which is specifically referred to above and will not be described herein again.
Step 403: collecting identification data and verification data.
Illustratively, the identification data is used for identifying parameters of the parameter identification model, and the verification data is used for verifying the identification values of the model parameters.
Specifically, the identification data and the verification data can be collected along the motion trajectory of the robot joint.
The motion trajectory of the robot joint is also called an excitation trajectory. Optionally, the excitation trajectory includes, but is not limited to, one of the following: a sinusoidal excitation trajectory and a Fourier-series excitation trajectory.
Specifically, during the execution of the excitation trajectory by the robot joint, multiple sets of data of the joint angle q, the joint angular velocity q̇, the joint angular acceleration q̈ and the current value of the motor current i can be acquired; the identification data and the verification data are each one or more of these sets.
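The Fourier-series excitation trajectory mentioned in step 403 is commonly parameterized as a truncated Fourier series per joint; the exact parameterization below is an assumption rather than taken from the text. The sketch returns consistent q, q̇, q̈ samples that can be logged together with the motor current and then split into identification and verification sets.

```python
import numpy as np

def fourier_excitation(t, a, b, wf, q0=0.0):
    """Truncated Fourier-series excitation trajectory (assumed parameterization).

    q(t) = q0 + sum_l [ a_l/(wf*l)*sin(wf*l*t) - b_l/(wf*l)*cos(wf*l*t) ],
    with qd and qdd obtained by analytic differentiation.
    """
    l = np.arange(1, len(a) + 1)[:, None]                     # harmonic indices as a column
    a_, b_ = np.asarray(a)[:, None], np.asarray(b)[:, None]
    s, c = np.sin(wf * l * t), np.cos(wf * l * t)
    q = q0 + np.sum(a_ / (wf * l) * s - b_ / (wf * l) * c, axis=0)
    qd = np.sum(a_ * c + b_ * s, axis=0)                      # dq/dt
    qdd = np.sum(-a_ * wf * l * s + b_ * wf * l * c, axis=0)  # d2q/dt2
    return q, qd, qdd

# Example: 5 harmonics, base frequency 0.1 Hz, sampled at 200 Hz for 10 s.
t = np.arange(0.0, 10.0, 1.0 / 200.0)
q, qd, qdd = fourier_excitation(t,
                                a=[0.30, 0.20, 0.10, 0.05, 0.02],
                                b=[0.10, 0.10, 0.05, 0.02, 0.01],
                                wf=2 * np.pi * 0.1)
```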
Step 404: and (5) performing parameter identification by adopting a nonlinear optimization method.
The nonlinear optimization method is a method for performing nonlinear optimization processing on the parameter identification model. Specifically, the nonlinear optimization processing is identification processing performed on model parameters in the parameter identification model to obtain identification values of the parameters.
The nonlinear optimization processing may be performed with a nonlinear optimization algorithm. Illustratively, the nonlinear optimization algorithm includes at least one of the following algorithms: the maximum likelihood estimation method, an iterative algorithm, the variable metric method, the least squares method, the simplex search method, the complex search method, and the random search method.
Step 405: and judging whether the identification value of the model parameter meets the precision requirement or not.
After parameter identification is performed on the parameter identification model, the identification values of the model parameters can be obtained. To improve the accuracy of the model parameters, the accuracy of these identification values needs to be checked.
Based on this, in step 405, if the identification values of the model parameters meet the accuracy requirement, parameter identification is completed and the identification values are determined as the parameter values in the expected mapping model; if the identification values of the model parameters do not meet the accuracy requirement, step 404 is executed again.
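Steps 404 and 405 can be organized as an identify-then-check loop: fit the parameters on the identification data, evaluate the residual on the verification data, and repeat (here simply from a new initial guess) until the error is below a threshold. The sketch below reuses the residual function from the earlier identification sketch; the RMS criterion, the tolerance and the retry policy are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def identify_with_validation(residual, id_data, val_data, n_params=12,
                             rms_tol=0.05, max_tries=5, seed=0):
    """Repeat identification until the RMS residual on the verification data is small enough."""
    rng = np.random.default_rng(seed)
    theta, rms = None, np.inf
    for _ in range(max_tries):
        theta0 = rng.uniform(-0.5, 0.5, n_params)              # fresh initial guess per attempt
        sol = least_squares(residual, theta0, args=id_data)    # step 404: parameter identification
        rms = float(np.sqrt(np.mean(residual(sol.x, *val_data) ** 2)))
        theta = sol.x
        if rms <= rms_tol:                                     # step 405: accuracy requirement met
            break
    return theta, rms
```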
In summary, the embodiment of the present application provides a method for determining the parameter values in the expected mapping model: a parameter identification model is constructed according to the joint friction model and the theoretical mapping model, the parameters of the expected mapping model are placed in the parameter identification model, and the parameter values in the expected mapping model are obtained through parameter identification of the parameter identification model.
The embodiment of the present application further provides a parameter identification method of an expected mapping model, which is used for determining parameter values in the expected mapping model, and the method may be executed by the computer device 200 in fig. 1, and after determining the parameter values, the computer device 200 sends the parameter values or the expected mapping model including the parameter values to the robot 100. As shown in fig. 6, a method for identifying parameters of a desired mapping model provided in an embodiment of the present application includes the following steps:
step 502: and constructing a joint friction model of the robot joint according to the joint motion data of the robot joint.
Illustratively, the articulation data includes at least one of the following: a joint angle of the robot joint, a joint angular velocity of the robot joint, and a joint angular acceleration of the robot joint.
Robot joints of different configurations exhibit different joint friction forces. Specifically, the joint friction force arises from factors such as the transmission mode of the robot, machining and assembly errors of the joint parts, and the degree of wear. These factors all affect the joint angle, the joint angular velocity and the joint angular acceleration of the robot joint.
Therefore, it can be seen from the cause of the joint friction force that the joint friction force is related to the joint angle, the joint angular velocity, and the joint angular acceleration of the robot joint. That is, joint friction is affected by the joint motion data.
Based on this, the joint friction model is constructed according to the joint motion data and is used to describe the joint friction force of the robot joint, which is influenced by the joint motion data.
Alternatively, the joint friction model may be represented by the following formula:
τ_f = τ_{f,q} + τ_{f,q̇} + τ_{f,q̈}
wherein τ_f is the joint friction force, τ_{f,q} is the angle parameter term determined from the joint angle, τ_{f,q̇} is the angular velocity parameter term determined from the joint angular velocity, and τ_{f,q̈} is the angular acceleration parameter term determined from the joint angular acceleration.
Step 504: and constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint.
Illustratively, the theoretical mapping model is generated according to the mapping relation between the theoretical output torque and the motor current. The theoretical output torque can be regarded as a theoretical value.
According to the foregoing, the desired mapping model is generated from the mapping relationship between the output torque required by the robot joint and the motor current, and this mapping relationship may be nonlinear or linear. Similarly, the theoretical mapping model is generated according to the mapping relationship between the theoretical output torque of the robot joint and the motor current, and this mapping relationship can be nonlinear or linear.
Meanwhile, because the output torque required by the robot joint (i.e., the desired control torque) differs from the theoretical output torque of the robot joint, the parameters in the theoretical mapping model and the parameters in the desired mapping model are also different.
Optionally, in order to reduce the influence of the theoretical output torque on the control precision of the robot and make the relationship between the theoretical output torque and the motor current more trend to the actual situation, a cubic polynomial is used to describe the relationship between the theoretical output torque and the motor current.
That is, the theoretical mapping model may be represented by the following formula: τ = k_1 i + k_2 i² + k_3 i³, wherein k_i (i = 1, 2, 3) is the torque coefficient, which describes the nonlinear relation between the theoretical output torque and the motor current, and i is the motor current.
Step 506: and constructing a parameter identification model according to the joint friction model and the theoretical mapping model.
According to the foregoing, the output torque of the robot joint needs to satisfy the following constraint condition: τ = τ_m′ + τ_f, wherein τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction force.
To reduce the effect of joint friction on the robot joint, it is necessary to add compensation to the joint friction in the desired control torque (which may be considered as the desired output torque) that is used to counteract the joint friction. Equivalently, due to the existence of joint friction force, the expected control torque of the robot joint is different from the theoretical output torque.
Based on this, a parameter identification model may be constructed for describing the difference between the desired control torque and the theoretical output torque.
The required output torque can also be determined according to a dynamic model; the torque determined in this way is referred to as the current desired torque. In the embodiment of the present application, the dynamic model of the robot refers to the dynamic model corresponding to the robot joint. The dynamic model has various expression forms, and may be obtained, for example, according to the Lagrange equation or the Newton-Euler equation.
That is, the parameter identification model may be constructed from the difference between the current desired torque and the desired output torque.
Alternatively, the parameter identification model may be represented by the following formula: f = τ_m - τ_m′ = τ_m - τ + τ_f, wherein τ_m is the current desired torque, τ_m′ is the desired output torque, τ is the theoretical output torque, and τ_f is the joint friction force.
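To make the construction of step 506 concrete, the sketch below assembles f = τ_m - τ + τ_f for a single joint. A one-link pendulum stands in for the dynamic model that yields the current desired torque τ_m; the pendulum parameters and the callables tau_mapping and friction are illustrative assumptions (see the earlier sketches for possible implementations).

```python
import numpy as np

def current_desired_torque(q, qd, qdd, J=0.02, m=1.0, g=9.81, l=0.1):
    """Stand-in dynamic model for one joint (one-link pendulum): tau_m = J*qdd + m*g*l*sin(q)."""
    return J * qdd + m * g * l * np.sin(q)

def identification_residual(q, qd, qdd, i, tau_mapping, friction):
    """f = tau_m - tau + tau_f, the quantity driven to zero during parameter identification.

    tau_mapping(i) returns the theoretical output torque tau; friction(q, qd, qdd)
    returns tau_f. Both are supplied by the caller.
    """
    tau_m = current_desired_torque(q, qd, qdd)
    return tau_m - tau_mapping(i) + friction(q, qd, qdd)
```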
Step 508: and performing parameter identification on the parameter identification model, and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
Schematically, parameter identification means that an unknown parameter in a theoretical model is identified according to the theoretical model and experimental data to obtain a determined value of the unknown parameter, so that a numerical result obtained through the theoretical model can achieve a better fitting effect.
In other words, the parameter identification performed on the parameter identification model can obtain the identification value of the model parameter, and the identification value can make the numerical result obtained according to the parameter identification model closer to the true value.
Illustratively, the model parameters in the parameter identification model include all unknown parameters.
According to the foregoing, the parameter identification model is constructed based on the joint friction model and the theoretical mapping model. Therefore, the parameter identification model includes both the parameters related to the joint friction force and the parameters of the expected mapping model. Since parameter identification is performed on the parameter identification model as a whole, the obtained identification values cover all of these unknown parameters. That is, the identification values of the parameters in the expected mapping model are determined jointly with the parameters of the joint friction force, which is equivalent to the expected mapping model including compensation for the joint friction force.
Take as an example the desired mapping model f(i) = k_1 i + k_2 i² + k_3 i³ and the parameter identification model f = τ_m - τ_m′ = τ_m - τ + τ_f. The model parameters in the parameter identification model include the parameters related to the joint friction force and the parameters k_i (i = 1, 2, 3), and the identification values of the parameters k_i (i = 1, 2, 3) are the parameter values required in the desired mapping model. Parameter identification is performed on the parameter identification model to obtain the identification values of the parameters related to the joint friction force and of the parameters k_i (i = 1, 2, 3), and the identification values of k_i (i = 1, 2, 3) are determined as the parameter values in the desired mapping model.
Optionally, after the identification value of the model parameter is obtained, the accuracy of the identification value of the model parameter may be determined, so as to improve the accuracy of the parameter value in the expected mapping model.
For the above steps, reference may be made to the above contents, which are not described again.
In summary, the embodiment of the present application constructs the parameter identification model according to the joint friction model and the theoretical mapping model, puts the parameters in the expected mapping model into the parameter identification model, and obtains the parameter values in the expected mapping model through the parameter identification of the parameter identification model.
The joint friction model is constructed according to at least one of the angle parameter item, the angular velocity parameter item and the angular acceleration parameter item, so that the joint friction model captures the effect of the joint motion data on the joint friction force. The characteristics and influencing factors of the joint friction force are thus fully considered, and the constructed parameter identification model can more accurately reflect both the parameters in the expected mapping model and the influence of the joint friction force.
Through parameter identification of the parameter identification model, the parameter values obtained for the expected mapping model compensate for the joint friction force, so that the expected mapping model accurately reflects the nonlinear relation between the expected control torque and the motor current, and the control accuracy of the robot joint is improved.
Fig. 7 is a block diagram showing a configuration of a control device for a robot joint according to an embodiment of the present invention. The device includes:
an obtaining module 720, configured to obtain a desired control moment of the robot joint;
a determining module 740, configured to determine a motor current corresponding to the expected control torque according to an expected mapping model, where the expected mapping model is generated according to a nonlinear relationship between the expected control torque and the motor current, and the expected mapping model includes compensation for joint friction of the robot joint according to joint motion data of the robot joint;
and the driving module 760 is used for driving the motor of the robot joint according to the motor current.
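Functionally, the three modules correspond to a very small control loop: read the expected control torque, convert it to a motor current through the identified mapping, and command the joint motor. A sketch of that flow is shown below; the MotorDriver interface and its method names are hypothetical placeholders, not part of the original disclosure.

```python
class JointController:
    """Sketch of the acquire -> determine -> drive flow of modules 720/740/760."""

    def __init__(self, mapping_inverse, motor_driver):
        self.mapping_inverse = mapping_inverse     # expected control torque -> motor current
        self.motor = motor_driver                  # hypothetical joint driver interface

    def step(self, tau_desired):
        i_cmd = self.mapping_inverse(tau_desired)  # determine the motor current (module 740)
        self.motor.set_current(i_cmd)              # drive the joint motor (module 760)
        return i_cmd
```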
In an optional embodiment, the apparatus further comprises a parameter identification module 780 for obtaining parameter values in the desired mapping model according to the following manner: constructing a joint friction model of the robot joint according to the joint motion data; constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint; constructing a parameter identification model according to the joint friction model and the theoretical mapping model; and performing parameter identification on the parameter identification model, and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
Illustratively, the parameter identification module 780 is disposed in the control device of the robot joint, or in another device connected to the control device of the robot joint.
In an alternative embodiment, the parameter identification module 780 is configured to generate at least one parameter item of the joint friction model from the joint motion data; and constructing a joint friction model according to the at least one parameter item.
In an alternative embodiment, the parameter identification module 780 is configured to implement at least one of the following steps: generating an angle parameter item according to the joint angle of the robot joint; generating an angular velocity parameter item according to the joint angular velocity of the robot joint; an angular acceleration parameter term is generated according to joint angular acceleration of the robot joint.
In an optional embodiment, the parameter identification module 780 is configured to generate a first angle parameter item when the joint angle is within a first preset interval; when the joint angle is in a second preset interval, generating a second angle parameter item; wherein, the first preset interval and the second preset interval are two intervals without intersection.
In an optional embodiment, the parameter identification module 780 is configured to generate a first parameter item and a second parameter item according to the joint angular velocity, where the first parameter item corresponds to a friction curve of a robot joint, and the second parameter item has a non-linear relationship with the joint angular velocity; an angular velocity parameter term is generated from the first parameter term and the second parameter term.
In an alternative embodiment, the parameter identification module 780 is configured to generate the angular acceleration parameter term according to a linear relationship between the joint angular acceleration and the angular acceleration parameter term.
In an alternative embodiment, the parameter identification module 780 is configured to construct a theoretical mapping model according to a theoretical output torque and a torque coefficient, where the theoretical output torque has a nonlinear relationship with the motor current, and the torque coefficient is a parameter term corresponding to the motor current.
In an alternative embodiment, the parameter identification module 780 is configured to determine the current expected torque according to a dynamic model corresponding to the robot joint; and constructing a parameter identification model according to the current expected torque, the joint friction model and the theoretical mapping model.
In an alternative embodiment, the parameter identification module 780 is configured to collect the joint motion parameters and the current values in the theoretical mapping model; according to the joint motion parameters and the current values, carrying out nonlinear optimization processing on model parameters in the parameter identification model to obtain identification values of the model parameters, wherein the model parameters comprise parameters in an expected mapping model; and determining the parameter value in the expected mapping model under the condition that the identification value of the model parameter meets the preset precision.
Fig. 8 is a block diagram of a parameter identification apparatus of a desired mapping model according to an embodiment of the present application. The device includes:
a construction module 820, configured to construct a joint friction model of a robot joint according to joint motion data of the robot joint;
constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
a construction module 820, further configured to construct a parameter identification model according to the joint friction model and the theoretical mapping model;
the determining module 840 is configured to perform parameter identification on the parameter identification model, and determine an identification value of a model parameter in the parameter identification model as a parameter value in the expected mapping model.
In an alternative embodiment, the construction module 820 is configured to generate at least one parameter term of a joint friction model from the joint motion data; and constructing a joint friction model according to the at least one parameter item.
In an alternative embodiment, the construction module 820 is configured to implement at least one of the following steps: generating an angle parameter item according to the joint angle of the robot joint; generating an angular velocity parameter item according to the joint angular velocity of the robot joint; and generating an angular acceleration parameter item according to the joint angular acceleration of the robot joint.
In an alternative embodiment, the constructing module 820 is configured to generate a first angle parameter item when the joint angle is within a first preset interval; when the joint angle is in a second preset interval, generating a second angle parameter item; wherein, the first preset interval and the second preset interval are two intervals without intersection.
In an optional embodiment, the building module 820 is configured to generate a first parameter item and a second parameter item according to the joint angular velocity, where the first parameter item corresponds to a friction curve of a robot joint, and the second parameter item has a non-linear relationship with the joint angular velocity; an angular velocity parameter term is generated from the first parameter term and the second parameter term.
In an alternative embodiment, the construction module 820 is configured to generate the angular acceleration parameter term according to a linear relationship between the joint angular acceleration and the angular acceleration parameter term.
In an optional embodiment, the constructing module 820 is configured to construct a theoretical mapping model according to a theoretical output torque and a torque coefficient, where the theoretical output torque has a non-linear relationship with the motor current, and the torque coefficient is a parameter term corresponding to the motor current.
In an alternative embodiment, the construction module 820 is configured to determine the current desired moment from a dynamic model corresponding to the robot joint; and constructing a parameter identification model according to the current expected torque, the joint friction model and the theoretical mapping model.
In an alternative embodiment, the determining module 840 is configured to collect the joint motion parameters and the current values in the theoretical mapping model; according to the joint motion parameters and the current values, carrying out nonlinear optimization processing on model parameters in the parameter identification model to obtain identification values of the model parameters, wherein the model parameters comprise parameters in an expected mapping model; and determining the parameter value in the expected mapping model under the condition that the identification value of the model parameter meets the preset precision.
It should be noted that: the apparatus provided in the foregoing embodiment is only illustrated by dividing the functional modules, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the apparatus provided in the foregoing embodiment has the same concept as the method embodiment in the foregoing, and specific implementation processes thereof are described in the method embodiment and are not described herein again.
Fig. 9 shows a block diagram of a robot 900 according to an exemplary embodiment of the present application. The robot 900 includes: a wheel leg portion 910, a base portion 920 connected to the wheel leg portion 910, wherein each of the wheel leg portion 910 and the base portion 920 includes at least one joint.
The wheel leg part refers to a wheel type part used for realizing movement of the robot, and comprises a wheel part and a leg part which is connected to a central shaft of the wheel part and used for realizing movement control of the wheel part. Embodiments of the present disclosure are not limited by the particular type of construction of the wheel legs and the number of wheel portions thereof.
The base portion 920 refers to a main body of the robot, and may be a body of the robot, for example, and the embodiment of the present disclosure is not limited by the specific shape and composition of the base portion 920.
Illustratively, the base portion 920 is connected to the wheel leg portion 910 via a bracket, or the connection of the base portion 920 and the wheel leg portion 910 may be achieved via other means. Embodiments of the present disclosure are not limited by the particular manner of attachment of the base portion 920 to the wheel leg portion 910.
In some embodiments, the base portion 920 includes 2 joints and the wheel leg portion 910 includes 6 joints, and embodiments of the present disclosure are not limited by the specific number of joints included in the base portion 920 and the wheel leg portion 910, nor by the specific joint configuration that the robot has.
The robot further includes a controller 930, and the controller 930 is provided on the robot and can execute the control method of the robot joint as described above, and has the functions as described above.
Illustratively, the controller 930 includes a processing device. The processing device may include a microprocessor, digital signal processor ("DSP"), application specific integrated circuit ("ASIC"), field programmable gate array, state machine, or other processing device for processing electrical signals received from the sensor wires. Such processing devices may include programmable electronic devices such as PLCs, programmable interrupt controllers ("PICs"), programmable logic devices ("PLDs"), programmable read only memories ("PROMs"), electronically programmable read only memories, and the like.
Illustratively, the robot further comprises at least one of the following components: the device comprises a bus, a memory, a sensor assembly, a communication module and an input and output device.
A bus may be a circuit that interconnects the various components of the robot and passes communication information (e.g., control messages or data) among the various components.
The sensor assembly may be used to sense the physical world, including, for example, a camera, an infrared sensor, an ultrasonic sensor, and the like. Furthermore, the sensor assembly may also comprise means for measuring the current operating and movement state of the robot, such as hall sensors, laser position sensors, or strain force sensors, etc.
The communication module may be connected to a network, for example, by wire or wirelessly, to facilitate communication with the physical world (e.g., a server). The communication module may be wireless and may include a wireless interface, such as an IEEE 802.11, Bluetooth, or wireless local area network ("WLAN") transceiver, or a radio interface for accessing a cellular telephone network (e.g., a transceiver/antenna for accessing a CDMA, GSM, UMTS, or other mobile communication network). In another example, the communication module may be wired and may include an interface such as Ethernet, USB, or IEEE 1394.
The input-output means may transmit, for example, commands or data input from a user or any other external device to one or more other components of the robot, or may output commands or data received from one or more other components of the robot to the user or other external device.
A plurality of robots may form a robot system to cooperatively perform a task; the plurality of robots are communicatively coupled to a server and receive cooperative robot instructions from the server.
In some embodiments, the robot further comprises an attachment member 940 arranged on the base portion, the attachment member 940 comprising, for example, at least one joint.
The additional member 940 is a member that is connected to the base portion 920 and has a motion process independent from the base portion 920 and the wheel leg portion 910, and may have a motion degree of freedom of its own and a motion control process independent from the posture control of the base portion 920 and the balance motion control process of the wheel leg portion 910, for example. Embodiments of the present disclosure are not limited by the particular placement of the additional components and their articulation configuration.
Based on the above, in the present application, by further providing the additional component 940 on the base portion 920 of the robot and decoupling the component motion control of the additional component 940, the posture position control of the base portion 920 and the balance motion control of the wheel leg portion 910 from each other, the motion control of the base portion 920, the additional component 940 and the wheel leg portion 910 can be independently achieved, which is beneficial to flexibly adjusting the corresponding motion process of the additional component of the robot (for example, adjusting the three-dimensional coordinate position and speed thereof) according to actual needs, so that the robot can be suitable for different task scenarios, and has various overall posture configurations, thereby being beneficial to better human-computer interaction experience.
Those skilled in the art will appreciate that the configuration shown in fig. 9 is not limiting to robot 900 and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be used.
Fig. 10 is a block diagram illustrating a structure of an electronic device 1000 according to an exemplary embodiment of the present application. The electronic device 1000 may be a portable mobile terminal, such as: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The electronic device 1000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so forth. In the embodiment of the present application, the electronic device 1000 is implemented as the control device portion of a robot.
In general, the electronic device 1000 includes: a processor 1001 and a memory 1002.
Processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1002 is configured to store at least one instruction for execution by the processor 1001 to implement a method of controlling a robotic joint, or a method of identifying parameters of a desired mapping model, provided by method embodiments herein.
In some embodiments, the electronic device 1000 may further include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, display screen 1005, camera assembly 1006, audio circuitry 1007, positioning assembly 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1001 and the memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 also has the ability to capture touch signals on or over the surface of the display screen 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this point, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 1005 may be one, disposed on the front panel of the electronic device 1000; in other embodiments, the display screens 1005 may be at least two, respectively disposed on different surfaces of the electronic device 1000 or in a folded design; in other embodiments, the display 1005 may be a flexible display, disposed on a curved surface or on a folded surface of the electronic device 1000. Even more, the display screen 1005 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the electronic device 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of the electronic device 1000 to implement navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1009 is used to supply power to the respective components in the electronic device 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 1000 also includes one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyro sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
The acceleration sensor 1011 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the electronic apparatus 1000. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1001 may control the display screen 1005 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the electronic device 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to acquire a 3D motion of the user on the electronic device 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1013 may be disposed on a side bezel of the electronic device 1000 and/or on a lower layer of the display screen 1005. When the pressure sensor 1013 is disposed on a side frame of the electronic device 1000, a user's holding signal of the electronic device 1000 can be detected, and the processor 1001 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the user according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1014 may be disposed on the front, back, or side of the electronic device 1000. When a physical button or vendor Logo is provided on the electronic device 1000, the fingerprint sensor 1014 may be integrated with the physical button or vendor Logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the display screen 1005 according to the ambient light intensity collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the display screen 1005 is turned down. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
A proximity sensor 1016, also known as a distance sensor, is typically disposed on the front panel of the electronic device 1000. The proximity sensor 1016 is used to capture the distance between the user and the front of the electronic device 1000. In one embodiment, when the proximity sensor 1016 detects that the distance between the user and the front surface of the electronic device 1000 gradually decreases, the processor 1001 controls the display screen 1005 to switch from the screen-on state to the screen-off state; when the proximity sensor 1016 detects that the distance between the user and the front of the electronic device 1000 gradually increases, the processor 1001 controls the display screen 1005 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 10 does not constitute a limitation of the electronic device 1000, and may include more or fewer components than those shown, or combine certain components, or employ a different arrangement of components.
Embodiments of the present application also provide a robot comprising a processor and a memory, the memory having at least one program code stored therein, the program code being loaded and executed by the processor to implement the method of controlling a robot joint as described above.
Embodiments of the present application also provide a computer device comprising a processor and a memory, the memory having stored therein at least one program code, the program code being loaded and executed by the processor to implement the method for parameter identification of a desired mapping model as described above.
Embodiments of the present application also provide a computer-readable storage medium having at least one program code stored thereon, the program code being loaded and executed by a processor to implement the method for controlling a robot joint as described above, or the method for recognizing parameters of a desired mapping model.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the control method of the robot joint as described above, or the parameter recognition method of the desired mapping model.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM).
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method of controlling a robot joint, the method comprising:
acquiring an expected control moment of the robot joint;
determining a motor current corresponding to the expected control torque according to an expected mapping model, wherein the expected mapping model is generated according to a nonlinear relation between the expected control torque and the motor current, and the expected mapping model comprises compensation of joint friction of the robot joint according to joint motion data of the robot joint;
and driving a motor of the robot joint according to the motor current.
2. The method of claim 1, wherein the parameter values in the desired mapping model are obtained by:
constructing a joint friction model of the robot joint according to the joint motion data;
constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
constructing a parameter identification model according to the joint friction model and the theoretical mapping model;
and performing parameter identification on the parameter identification model, and determining the identification value of the model parameter in the parameter identification model as the parameter value in the expected mapping model.
3. The method of claim 2, wherein constructing a joint friction model of the robot joint from the joint motion data comprises:
generating at least one parameter term of the joint friction model from the joint motion data;
and constructing the joint friction model according to the at least one parameter item.
4. A method according to claim 3, wherein said generating at least one parameter term of said joint friction model from said articulation data comprises at least one of the following steps:
generating an angle parameter item according to the joint angle of the robot joint;
generating an angular velocity parameter item according to the joint angular velocity of the robot joint;
and generating an angular acceleration parameter item according to the joint angular acceleration of the robot joint.
5. The method of claim 4, wherein generating an angle parameter term from joint angles of the robotic joints comprises:
when the joint angle is in a first preset interval, generating a first angle parameter item;
when the joint angle is in a second preset interval, generating a second angle parameter item;
wherein the first preset interval and the second preset interval are two intervals without intersection.
6. The method of claim 4, wherein generating an angular velocity parameter term as a function of joint angular velocity of the robotic joint comprises:
generating a first parameter item and a second parameter item according to the joint angular velocity, wherein the first parameter item corresponds to a friction curve of the robot joint, and the second parameter item has a nonlinear relation with the joint angular velocity;
generating the angular velocity parameter item from the first parameter item and the second parameter item.
7. The method of claim 4, wherein generating an angular acceleration parameter term as a function of joint angular acceleration of the robotic joint comprises:
and generating the angular acceleration parameter item according to the linear relation between the joint angular acceleration and the angular acceleration parameter item.
8. The method of claim 2, wherein constructing the theoretical mapping model of the robot joint based on the theoretical output moments of the robot joint comprises:
and determining the theoretical mapping model according to the theoretical output torque and a torque coefficient, wherein the theoretical output torque and the motor current have a nonlinear relation, and the torque coefficient is a parameter item corresponding to the motor current.
9. The method of claim 2, wherein constructing a parameter recognition model from the joint friction model and the theoretical mapping model comprises:
determining a current expected moment according to a dynamic model corresponding to the robot joint;
and constructing the parameter identification model according to the current expected torque, the joint friction model and the theoretical mapping model.
10. A method for identifying parameters of an expected mapping model, the method comprising:
constructing a joint friction model of the robot joint according to joint motion data of the robot joint;
constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
constructing a parameter identification model according to the joint friction model and the theoretical mapping model;
and performing parameter identification on the parameter identification model, and determining the identification value of the parameter identification model as the parameter value in the expected mapping model.
11. A control device for a robot joint, characterized in that the device comprises:
an acquisition module for acquiring an expected control moment of the robot joint;
the determining module is used for determining a motor current corresponding to the expected control torque according to an expected mapping model, the expected mapping model is generated according to a nonlinear relation between the expected control torque and the motor current, and the expected mapping model comprises compensation of joint friction of the robot joint according to joint motion data of the robot joint;
and the driving module is used for driving the motor of the robot joint according to the motor current.
12. An apparatus for identifying parameters of a desired mapping model, the apparatus comprising:
the construction module is used for constructing a joint friction model of the robot joint according to joint motion data of the robot joint;
the construction module is further used for constructing a theoretical mapping model of the robot joint according to the theoretical output torque of the robot joint;
the construction module is also used for constructing a parameter identification model according to the joint friction model and the theoretical mapping model;
and the determining module is used for carrying out parameter identification on the parameter identification model and determining the identification value of the parameter identification model as the parameter value in the expected mapping model.
13. A robot, characterized in that it comprises a processor and a memory, in which at least one program code is stored, which is loaded and executed by the processor to implement a method of controlling a robot joint according to any of claims 1-9.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one program code, the program code being loaded and executed by the processor to implement the method of parameter identification of a desired mapping model as claimed in claim 10.
15. A computer-readable storage medium, wherein at least one program code is stored therein, which is loaded and executed by a processor to implement the method for controlling a robot joint according to any one of claims 1 to 9 or the method for parameter identification of a desired mapping model according to claim 10.
CN202110604994.8A 2021-05-31 2021-05-31 Method and device for controlling robot joint, robot and storage medium Pending CN113752250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110604994.8A CN113752250A (en) 2021-05-31 2021-05-31 Method and device for controlling robot joint, robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110604994.8A CN113752250A (en) 2021-05-31 2021-05-31 Method and device for controlling robot joint, robot and storage medium

Publications (1)

Publication Number Publication Date
CN113752250A true CN113752250A (en) 2021-12-07

Family

ID=78787297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110604994.8A Pending CN113752250A (en) 2021-05-31 2021-05-31 Method and device for controlling robot joint, robot and storage medium

Country Status (1)

Country Link
CN (1) CN113752250A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114260892A (en) * 2021-12-17 2022-04-01 深圳市优必选科技股份有限公司 Elastic joint torque control method and device, readable storage medium and robot
CN114260892B (en) * 2021-12-17 2023-12-15 深圳市优必选科技股份有限公司 Elastic joint moment control method and device, readable storage medium and robot
CN115674190A (en) * 2022-09-30 2023-02-03 深圳市越疆科技有限公司 Cooperative mechanical arm and motion control method, collision detection method and control system thereof
WO2024067440A1 (en) * 2022-09-30 2024-04-04 腾讯科技(深圳)有限公司 Two-wheeled robot control method and apparatus, medium, program, and robot
CN115674190B (en) * 2022-09-30 2024-05-07 深圳市越疆科技股份有限公司 Cooperative mechanical arm and motion control method, collision detection method and control system thereof
CN116442220A (en) * 2023-03-30 2023-07-18 之江实验室 Parameter identification method and device for robot joint friction model and moment estimation method and device

Similar Documents

Publication Publication Date Title
US11151773B2 (en) Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
CN113752250A (en) Method and device for controlling robot joint, robot and storage medium
CN108245893B (en) Method, device and medium for determining posture of virtual object in three-dimensional virtual environment
CN109712224B (en) Virtual scene rendering method and device and intelligent device
KR102582863B1 (en) Electronic device and method for recognizing user gestures based on user intention
CN110986930B (en) Equipment positioning method and device, electronic equipment and storage medium
CN111354434B (en) Electronic device and method for providing information thereof
CN110920631B (en) Method and device for controlling vehicle, electronic equipment and readable storage medium
CN103473804A (en) Image processing method, device and terminal equipment
WO2020151594A1 (en) Viewing angle rotation method, device, apparatus and storage medium
CN112947474A (en) Method and device for adjusting transverse control parameters of automatic driving vehicle
CN112527104A (en) Method, device and equipment for determining parameters and storage medium
CN108196701A (en) Determine the method, apparatus of posture and VR equipment
WO2023130824A1 (en) Motion control method for under-actuated system robot, and under-actuated system robot
CN112150078A (en) Material handling method, system, device and central management equipment
US11014621B2 (en) Electronic device and method for operating same
CN115480483A (en) Method, device, equipment and medium for identifying kinetic parameters of robot
KR20200067446A (en) Electronic device including spherical mobile device and second device movable thereon, and attitude conrol method of second devcie
CN115480594A (en) Jump control method, apparatus, device, and medium
CN114764241A (en) Motion state control method, device and equipment and readable storage medium
CN111179628B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN114791729A (en) Wheeled robot control method, device, equipment and readable storage medium
CN115480560A (en) Method and device for controlling motion state, wheel-legged robot and storage medium
CN113490576A (en) Electronic device for providing feedback corresponding to input to housing
CN113962138B (en) Method, device, equipment and storage medium for determining parameter value of mobile platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination