CN111338287A - Robot motion control method, device and system, robot and storage medium - Google Patents


Info

Publication number
CN111338287A
Authority
CN
China
Prior art keywords
robot
virtual
acting force
force
virtual environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010177523.9A
Other languages
Chinese (zh)
Inventor
张明明
褚开亚
刘昱东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southern University of Science and Technology
Original Assignee
Southern University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southern University of Science and Technology filed Critical Southern University of Science and Technology
Priority to CN202010177523.9A priority Critical patent/CN111338287A/en
Publication of CN111338287A publication Critical patent/CN111338287A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/18: Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/19: Numerical control [NC] characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/30: Nc systems
    • G05B2219/39: Robotics, robotics to robotics hand
    • G05B2219/39254: Behaviour controller, robot have feelings, learns behaviour

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the invention discloses a robot motion control method, device, system, robot and storage medium. The method comprises: acquiring a first acting force of a first robot; sending the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force and determines motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is the resultant force on the second robot in the virtual environment; and receiving the motion information fed back by the physics engine and controlling the first robot to move according to the motion information. The embodiment of the invention controls the motion of a haptic robot through physics-engine modeling; a haptic robot under this control mode can also be used for special training such as limb training and rehabilitation training, improving the user's sense of realism.

Description

Robot motion control method, device and system, robot and storage medium
Technical Field
The embodiment of the invention relates to the field of robot control, in particular to a robot motion control method, a device and a system, a robot and a storage medium.
Background
An ideal haptic robot is a robot capable of reproducing the human sense of touch. Applications of haptic robots are increasing, especially in the game industry: more and more companies combine haptic robots with games, particularly large games, which brings users better visual effects and a better gaming experience. How to control the motion of a haptic robot is therefore a crucial issue.
In the existing way of controlling a haptic robot's motion, a planned trajectory or route is generally set in the robot's control system and the robot moves along it; changing the robot's motion requires changing the program in the control system, so this approach has low flexibility and low control efficiency. There is also a way of controlling the robot's motion through manual modeling, but the quality of the model then depends on the skill of the modeler, its adaptability is limited, and it is hard to popularize.
Disclosure of Invention
In view of this, embodiments of the present invention provide a robot motion control method, apparatus, system, robot and storage medium, so as to control the motion of a haptic robot through a physics engine, increase the flexibility of robot motion control, and improve the user's sense of realism when using the haptic robot.
In a first aspect, an embodiment of the present invention provides a robot motion control method, including:
acquiring a first acting force of a first robot;
sending the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force, and determines motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is the resultant force on the second robot in the virtual environment;
and receiving the motion information fed back by the physics engine and controlling the first robot to move according to the motion information.
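As an illustration, the three steps above can be sketched as one cycle of a control loop. `PhysicsEngineStub`, its mass and time-step values, and the `actuate` hook are hypothetical stand-ins for this sketch, not any real engine's API:

```python
from dataclasses import dataclass

@dataclass
class MotionInfo:
    position: tuple  # metres, per axis
    velocity: tuple  # m/s, per axis

class PhysicsEngineStub:
    """Stand-in for the physics engine: adds the virtual environment's force
    (here just virtual gravity) to the first acting force, integrates the
    resultant, and returns the second robot's motion information."""
    def __init__(self, mass=2.0, dt=0.01):
        self.mass, self.dt = mass, dt
        self.pos = [0.0, 0.0, 0.0]
        self.vel = [0.0, 0.0, 0.0]
        self.env_force = [0.0, 0.0, -9.8 * mass]  # second virtual stress

    def step(self, first_force):
        # second acting force = first virtual stress + second virtual stress
        net = [f + e for f, e in zip(first_force, self.env_force)]
        for i in range(3):  # integrate F = m*a over one time step
            self.vel[i] += net[i] / self.mass * self.dt
            self.pos[i] += self.vel[i] * self.dt
        return MotionInfo(tuple(self.pos), tuple(self.vel))

def control_step(measured_force, engine, actuate):
    """One cycle: acquire the first acting force, send it to the engine,
    and drive the real robot with the motion information fed back."""
    motion = engine.step(measured_force)
    actuate(motion)
    return motion
```

For example, `control_step([0.0, 0.0, 21.6], PhysicsEngineStub(), print)` pushes the virtual robot upward against 19.6 N of virtual gravity.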
Further, confirming the second acting force of the second robot according to the first acting force comprises:
determining a first virtual stress of the second robot according to the first acting force;
determining a second virtual stress of the virtual environment to the second robot according to the configuration parameters, wherein the second virtual stress is an acting force exerted by the virtual environment to the second robot;
and determining the second acting force according to the first virtual stress and the second virtual stress.
Further, the configuration parameters are used to determine the properties of the objects in the virtual environment, including the following properties of each object: physical properties, material properties, geometric properties, and connection relationships between objects.
Further, the second virtual stress includes: one or more of a virtual gravitational force borne by the second robot in the virtual environment, a virtual elastic force borne by the second robot in the virtual environment, a virtual frictional force generated by the virtual environment on the second robot, and a virtual electromagnetic force generated by the virtual environment on the second robot.
Further, the motion information includes one or more of a position, a velocity, an acceleration, a rotation angle, a rotation speed, and a rotation acceleration.
In a second aspect, an embodiment of the present invention provides a robot motion control apparatus, including:
the first acting force acquisition module is used for acquiring a first acting force of the first robot;
the first acting force sending module is used for sending the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force and determines motion information of the second robot according to the second acting force, where the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is the resultant force on the second robot in the virtual environment;
and the control module is used for receiving the motion information fed back by the physics engine and controlling the first robot to move according to the motion information.
In a third aspect, an embodiment of the present invention provides a robot, including a robot body and a controller, where the controller includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the robot motion control method provided in any embodiment of the present invention.
Furthermore, the robot is a robot with a three-degree-of-freedom bilateral motion mechanism connected with the force sensor assembly.
In a fourth aspect, an embodiment of the present invention provides a robot motion control system, including a robot and a computer device provided in any embodiment of the present invention, where the computer device is configured to run the physics engine and display the virtual environment.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the robot motion control method provided in any embodiment of the present invention.
The embodiment of the invention achieves control of the motion of a haptic robot through physics-engine modeling; since a physics-engine model has high accuracy, the accuracy and flexibility of robot motion control are improved. A haptic robot under this control mode can also be used for special training such as limb training and rehabilitation training, improving the user's sense of realism. Compatibility with existing large physics-engine-based games is also improved, so the haptic robot can be quickly combined with such games, improving their realism and playability.
Drawings
Fig. 1 is a schematic flow chart of a robot motion control method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a robot motion control apparatus according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a controller of a robot according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a bilateral robot according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a robot motion control system according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc.
Furthermore, the terms "first," "second," and the like may be used herein to describe various orientations, actions, steps, elements, or the like, but the orientations, actions, steps, or elements are not limited by these terms. These terms are only used to distinguish one direction, action, step or element from another direction, action, step or element. For example, a first robot may be referred to as a second robot, and similarly, a second robot may be referred to as a first robot, without departing from the scope of the present application. Both the first and second robots are robots, but they are not the same robot. The terms "first", "second", etc. are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "plurality", "batch" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Example one
Fig. 1 is a flowchart illustrating a robot motion control method according to an embodiment of the present invention; the embodiment is applicable to controlling the motion of a haptic robot through physics-engine modeling. As shown in fig. 1, a robot motion control method according to an embodiment of the present invention includes:
s110, acquiring a first acting force of the first robot.
Specifically, the first robot is a real haptic robot in the real environment, and the first acting force is the human-robot interaction force sensed by the first robot, i.e. the force a human applies to the haptic robot. The first acting force may be measured by the robot's electronic skin, force sensors, torque sensors, and the like.
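As a sketch of this acquisition step, a raw sensor reading might be converted to newtons and lightly smoothed before being sent on; the calibration constants (`scale_n_per_count`, `offset_counts`) are illustrative values, not from any particular sensor:

```python
from collections import deque

class ForceSensorReader:
    """Converts raw force-sensor ADC counts to newtons and smooths them
    with a short moving average (all constants are illustrative)."""
    def __init__(self, scale_n_per_count=0.05, offset_counts=2048, window=4):
        self.scale = scale_n_per_count   # newtons per ADC count
        self.offset = offset_counts      # zero-force ADC reading
        self.buf = deque(maxlen=window)  # recent converted samples

    def read(self, raw_counts):
        force = (raw_counts - self.offset) * self.scale
        self.buf.append(force)
        return sum(self.buf) / len(self.buf)
```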
And S120, sending the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force, and determines motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is the resultant force on the second robot in the virtual environment.
Specifically, the physics engine may be regarded as a set of operation rules that gives virtual objects real physical attributes and computes their motion, rotation, and collision response, so that the laws of motion and interaction of objects in the real world can be simulated in the physics engine. The virtual environment is built in the physics engine in advance, and a model of the real robot, referred to here as the second robot, is built in that virtual environment. The physics engine may be any engine capable of such virtual modeling, for example Havok, Novodex, Bullet, or Newton. It should be noted that the virtual robot may be a model identical to the real robot, a model of only part of the real robot, or a model different from the real robot but adapted to the virtual environment.
The first acting force is sent to the physics engine; the physics engine applies a force identical to the first acting force to the second robot and combines it with the virtual forces the virtual environment exerts on the second robot, yielding the resultant force on the second robot in the virtual environment, i.e. the second acting force. An object changes its state of motion when subjected to a force, so the motion information of the second robot can be determined from the second acting force. The motion information includes one or more of position, velocity, acceleration, rotation angle, rotation speed, and rotation acceleration, and the second robot moves accordingly.
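The rotational part of the motion information follows the same pattern as the translational part. A single-axis sketch (the torque and inertia values in the test below are made up):

```python
def rotational_motion(torque, inertia, omega, angle, dt):
    """One integration step of rigid-body rotation about a single axis:
    torque (N*m) and moment of inertia (kg*m^2) give the angular
    acceleration, which is integrated into angular velocity (rad/s)
    and rotation angle (rad), the rotational analogue of F = m*a."""
    alpha = torque / inertia
    omega = omega + alpha * dt
    angle = angle + omega * dt
    return alpha, omega, angle
```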
Further, a method for confirming the second acting force of the second robot according to the first acting force comprises steps S121 to S123 (not shown in the figure).
And S121, determining a first virtual stress of the second robot according to the first acting force.
Specifically, after receiving the first acting force, the physics engine applies a force identical to the first acting force to the second robot; this force applied to the second robot is referred to as the first virtual stress.
And S122, determining a second virtual stress of the virtual environment to the second robot according to the configuration parameters, wherein the second virtual stress is an acting force exerted by the virtual environment to the second robot.
Specifically, the configuration parameters are the parameters that must be set when the virtual environment is constructed, and are used to determine the properties of each object in the virtual environment: physical properties, material properties, geometric properties, and connection relationships between objects. Physical properties represent the mass, position, rotation angle, velocity, damping, and similar properties of an object in the virtual environment; material properties represent the object's material characteristics, such as density, friction coefficient, and coefficient of restitution; geometric properties represent the object's geometry; and the connection relationships represent the associations between objects in the virtual environment.
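As an illustration only, the configuration parameters described above might be grouped per object roughly as follows; the field names and default values are hypothetical, not any engine's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectConfig:
    # physical properties
    mass: float = 1.0             # kg
    damping: float = 0.0
    # material properties
    density: float = 1000.0       # kg/m^3
    friction: float = 0.5         # friction coefficient
    restitution: float = 0.2      # coefficient of restitution
    # geometric properties
    shape: str = "box"
    size: tuple = (1.0, 1.0, 1.0)  # metres
    # connection relationships to other objects
    joints: list = field(default_factory=list)

# e.g. a virtual box attached to both virtual hand models
box = ObjectConfig(mass=5.0, friction=0.6, joints=["left_hand", "right_hand"])
```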
The second virtual stress is the actual acting force of the virtual environment on the second robot. Classified by the nature of the force, the virtual stress may include virtual gravity, virtual gravitational force, virtual elastic force, virtual frictional force, virtual molecular force, virtual electromagnetic force, virtual nuclear force, and the like: for example, the virtual gravity borne by the second robot in the virtual environment, the virtual elastic force borne by the second robot, the virtual friction the virtual environment exerts on the second robot, the virtual electromagnetic force the virtual environment exerts on the second robot, and the forces other objects in the virtual environment exert on the second robot.
Depending on the effect of the force, the virtual stress may include: virtual tension, virtual pressure, virtual support force, virtual power, virtual resistance, virtual centripetal force, virtual restoring force, and the like.
Depending on the subject, the virtual stress may include a virtual external force and a virtual internal force.
Depending on whether the force requires contact, the virtual stress may include a virtual contact force and a virtual non-contact force.
According to the fundamental interactions, the virtual stress may include: a virtual gravitational interaction force, a virtual electromagnetic interaction force, a virtual strong interaction force, and a virtual weak interaction force.
Further, the virtual stress on the second robot, automatically calculated by the physics engine, may be one or more of the above virtual stresses, and may be configured as actually needed by changing the configuration parameters of the virtual environment.
And S123, determining the second acting force according to the first virtual stress and the second virtual stress.
Specifically, the second acting force is the sum of the first virtual stress and the second virtual stress, so the second acting force is the resultant force on the second robot in the virtual environment. The addition of the first and second virtual stresses is a vector addition, which may use the parallelogram rule or the triangle rule.
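Component-wise vector addition gives the same result as the geometric parallelogram construction; a minimal sketch:

```python
import math

def resultant(f1, f2):
    """Vector sum of two 3-D forces; component-wise addition is
    equivalent to the parallelogram (or triangle) construction."""
    return tuple(a + b for a, b in zip(f1, f2))

def magnitude(f):
    """Euclidean magnitude of a force vector in newtons."""
    return math.sqrt(sum(c * c for c in f))

# Two perpendicular forces of 3 N and 4 N combine into a 5 N resultant.
```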
S130, receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
Specifically, the first robot is controlled to reproduce the motion information of the second robot calculated by the physics engine, thereby achieving control of the real robot's motion through physics-engine modeling. For example, if the received motion information is a velocity of 1 m/s, an acceleration of 0, and a direction of due west, the first robot is controlled to reproduce that motion and moves due west at 1 m/s.
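A minimal sketch of mirroring the fed-back motion information onto the real robot; `send_command` is a hypothetical driver hook, and the east-pointing x-axis convention is an assumption for illustration:

```python
def mirror_motion(motion, send_command):
    """Forward the virtual robot's motion information to the real robot
    as a velocity command (send_command is a hypothetical driver hook)."""
    send_command({
        "velocity": motion["velocity"],  # m/s per axis
        "angular_velocity": motion.get("angular_velocity", (0.0, 0.0, 0.0)),
    })

sent = []
# 1 m/s due west: assuming x points east, the velocity is (-1.0, 0.0, 0.0)
mirror_motion({"velocity": (-1.0, 0.0, 0.0)}, sent.append)
```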
The application of the method is described below by way of example, taking the case of a user carrying a box through the haptic robot.
The first robot is a haptic robot with two handles (a left handle and a right handle); the left and right handles can here be regarded as the first robot, and the user controls their motion by moving the left and right hands. Configuration parameters are set in the physics engine in advance to construct a virtual environment and a virtual robot: the virtual environment contains a virtual ground with a virtual box placed on it, and the virtual robot consists of two virtual hand models (a left-hand model and a right-hand model), which can be regarded as the second robot. The user holds the left handle with the left hand and the right handle with the right hand, so the user's hand movements are in fact motions of the haptic robot's left and right handles.
The user applies an upward force to each handle through the left and right hands; the upward forces sensed by the handles are the first acting forces. According to the first acting force, the physics engine applies an identical first virtual stress to the left-hand and right-hand models, and calculates from the configuration parameters the second virtual stress exerted on them by the virtual environment, such as the gravity of the virtual box, the friction of the virtual box against the hand models, and the supporting force of the virtual box on the hand models. Vector addition of the first and second virtual stresses gives the resultant force on the left-hand and right-hand models (assume it points upward), and from this resultant force the motion information of the hand models is determined; for example, they move upward at a certain speed, carrying the virtual box with them. The left and right handles are then controlled to reproduce this motion information, so they move upward at the same speed and perform the box-carrying action.
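The force balance in this example can be checked with made-up numbers (30 N of upward force per hand and a 4 kg virtual box; none of these values come from the embodiment itself):

```python
# Hypothetical numbers: each hand pushes up with 30 N, and the 4 kg virtual
# box presses down on each hand model with half its weight; z points up.
g = 9.8
hand_up = 30.0                      # first virtual stress per hand (N, up)
box_load = -(4.0 * g) / 2           # box weight carried per hand (N, down)
net_per_hand = hand_up + box_load   # second acting force on each hand model

# A positive net force means the hand models (and the box) accelerate
# upward, and the real handles are then driven with the same upward motion.
assert net_per_hand > 0
```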
Through this control mode, the haptic robot can be used for special training, for example limb training and rehabilitation training, giving the user a strong sense of realism when using the haptic robot.
The robot motion control method provided by the embodiment of the invention acquires a first acting force of a first robot; sends the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force and determines motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physics engine and the second acting force is the resultant force on the second robot in the virtual environment; and receives the motion information fed back by the physics engine and controls the first robot to move according to it. This achieves control of a haptic robot's motion through physics-engine modeling; since a physics-engine model has high accuracy, the accuracy and flexibility of robot motion control are improved. A haptic robot under this control mode can also be used for special training such as limb training and rehabilitation training, improving the user's sense of realism, and compatibility with existing large physics-engine-based games is improved, so the haptic robot can be quickly combined with such games, improving their realism and playability.
Example two
Fig. 2 is a schematic structural diagram of a robot motion control apparatus according to a second embodiment of the present invention; the apparatus is suited to controlling a haptic robot to move in response to a force by means of physics-engine modeling. The robot motion control apparatus provided by the second embodiment can carry out the robot motion control method provided by any embodiment of the present invention, with the corresponding functional modules and beneficial effects; it can be implemented in software or hardware and integrated on a terminal device, such as a robot controller. For content not described in detail in this embodiment, reference may be made to the description of any method embodiment of the invention.
As shown in fig. 2, a robot motion control apparatus according to a second embodiment of the present invention includes: a first force acquisition module 210, a first force transmission module 220, and a control module 230.
The first acting force acquiring module 210 is used for acquiring a first acting force of the first robot;
the first acting force sending module 220 is configured to send the first acting force to a physics engine, so that the physics engine confirms a second acting force of a second robot according to the first acting force, and confirms motion information of the second robot according to the second acting force, where the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is a resultant force received by the second robot in the virtual environment;
the control module 230 is configured to receive the motion information fed back by the physics engine and control the first robot to move according to the motion information.
Further, the physics engine is specifically configured to: determining a first virtual stress of the second robot according to the first acting force; determining a second virtual stress of the virtual environment to the second robot according to the configuration parameters, wherein the second virtual stress is an acting force exerted by the virtual environment to the second robot; and determining the second acting force according to the first virtual stress and the second virtual stress.
Further, the configuration parameters are used to determine the properties of the objects in the virtual environment, including the following properties of the objects in the virtual environment: physical properties, material properties, geometric properties, and connection relationships between objects.
Further, the second virtual stress includes: one or more of a virtual gravitational force borne by the second robot in the virtual environment, a virtual elastic force borne by the second robot in the virtual environment, a virtual frictional force generated by the virtual environment on the second robot, and a virtual electromagnetic force generated by the virtual environment on the second robot.
Further, the motion information includes one or more of a position, a velocity, an acceleration, a rotation angle, a rotation speed, and a rotation acceleration.
The robot motion control apparatus provided by the second embodiment of the invention achieves, through the first acting force acquisition module, the first acting force sending module and the control module, control of a haptic robot's motion through physics-engine modeling; since a physics-engine model has high accuracy, the accuracy and flexibility of robot motion control are improved. A haptic robot under this control mode can also be used for special training such as limb training and rehabilitation training, improving the user's sense of realism, and compatibility with existing large physics-engine-based games is improved, so the haptic robot can be quickly combined with such games, improving their realism and playability.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a controller of a robot according to a third embodiment of the present invention, as shown in fig. 3, the controller includes a processor 310, a memory 320, an input device 330, and an output device 340; the number of the processors 310 in the controller may be one or more, and one processor 310 is taken as an example in fig. 3; the processor 310, the memory 320, the input device 330 and the output device 340 in the controller may be connected by a bus or other means, and the connection by the bus is exemplified in fig. 3.
The memory 320 is a computer-readable storage medium that can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the robot motion control method in the embodiment of the present invention (for example, the first acting force acquisition module, the first acting force sending module, and the control module in the robot motion control apparatus). By executing the software programs, instructions and modules stored in the memory 320, the processor 310 performs the various functional applications and data processing of the robot, that is, implements the robot motion control method provided by any embodiment of the present invention, which may include:
acquiring a first acting force of a first robot;
sending the first acting force to a physics engine, so that the physics engine determines a second acting force of a second robot according to the first acting force, and determines motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physics engine, and the second acting force is the resultant force on the second robot in the virtual environment;
and receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
The memory 320 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 320 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 320 may further include memory located remotely from the processor 310, which may be connected to the controller via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the robot; for example, it may be a signal acquisition device that acquires information from the various sensors on the robot. The output device 340 may include a display device such as a display screen, and may also include a signal output device that outputs control signals.
Example four
Fig. 4 is a schematic structural diagram of a bilateral robot according to a fourth embodiment of the present invention. As shown in fig. 4, a bilateral robot provided in the fourth embodiment of the present invention includes a robot body 40 and a controller 41, where the robot body 40 includes a three-degree-of-freedom bilateral motion mechanism, and the three-degree-of-freedom bilateral motion mechanism includes: a pair of x-axis motion modules 13013, a pair of y-axis motion modules 13014, a pair of z-axis motion modules 13015, and an end effector 13011, the x, y, and z axes may represent three coordinate axes of the robot coordinate system of the robot body 40.
Alternatively, the end effector 13011 may be an operating handle that is movable under the control of the controller 41. In the present embodiment, an operating handle is taken as an example of the end effector. It should be noted that the operating handle is merely an example, and the end effector is not limited thereto.
Optionally, the operating handles 13011 include a left handle and a right handle, each screwed to the end of one z-axis motion module 13015. When each pair of x-axis motion modules 13013, y-axis motion modules 13014, and z-axis motion modules 13015 is at the origin of its respective coordinate axis, a Cartesian left-handed coordinate system is established with the center point of the top end of the left operating handle 13011 as the origin, with the positive x-axis pointing toward the right operating handle; and a Cartesian right-handed coordinate system is established with the center point of the top end of the right operating handle 13011 as the origin, with the positive x-axis pointing toward the left operating handle.
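A minimal sketch of the mirrored handle frames described above. It assumes the two handle origins are separated by a distance d along their common x line and that the two frames share the same y and z directions; reversing the x direction is exactly what makes one frame left-handed and the other right-handed. The function name and the parameter d are illustrative, not from the patent.

```python
def left_to_right_frame(point, d):
    """Convert a point (x, y, z) expressed in the left handle's
    left-handed frame to the right handle's right-handed frame,
    where d is the distance between the two handle origins."""
    x, y, z = point
    # x is re-measured from the right origin, pointing back at the left
    # handle; the shared y and z coordinates are unchanged.
    return (d - x, y, z)
```

For example, a point 0.1 m from the left handle along the common axis lies 0.4 m from the right handle when the handles are 0.5 m apart.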
The controller 41 controls the robot body 40 and executes various functional applications and data processing of the robot, that is, it implements the robot motion control method provided by any embodiment of the present invention; the method may include:
acquiring a first acting force of a first robot;
sending the first acting force to a physical engine, so that the physical engine confirms a second acting force of a second robot according to the first acting force, and confirms motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physical engine, and the second acting force is a resultant force of the second robot in the virtual environment;
and receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
Example five
Fig. 5 is a schematic structural diagram of a robot motion control system according to a fifth embodiment of the present invention. As shown in fig. 5, the robot motion control system includes a robot 510 and a computer device 520, between which data may be transmitted. The robot 510 may be a robot provided in any embodiment of the present invention and may implement the robot motion control method provided in any embodiment of the present invention. The computer device 520 includes a display module 521 and a physics engine 522; the physics engine 522 is capable of constructing a virtual environment and a virtual robot, determining the second acting force of the virtual robot from the first acting force of the robot 510, and calculating the motion information of the virtual robot from the second acting force. The display module 521 is used to display the virtual environment and the virtual robot constructed by the physics engine 522, and may be a display screen, a VR device, an AR device, or the like.
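On the computer-device side, the physics engine composes the second acting force as the resultant of the operator's first acting force and the forces the virtual environment exerts on the virtual robot (the embodiments and claims mention virtual gravity, elastic force, friction, and electromagnetic force). The sketch below illustrates such a composition for a one-dimensional case; the function name and the simple force models are assumptions for illustration only.

```python
import math

def resultant_force(first_force, mass, velocity, position,
                    g=9.8, k=50.0, mu=0.2):
    """Hypothetical composition of the second acting force as the
    resultant of the first acting force and the virtual environment's
    forces acting on the virtual robot (1-D illustration)."""
    gravity = -mass * g                # virtual gravity on the second robot
    elastic = -k * position           # virtual elastic (spring) force
    # Coulomb-style virtual friction, opposing the current velocity.
    friction = -mu * mass * g * math.copysign(1.0, velocity) if velocity else 0.0
    return first_force + gravity + elastic + friction
```

For instance, with the virtual robot at rest at the spring's neutral position, a 10 N operator force yields a resultant of about 0.2 N once virtual gravity is subtracted.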
Example six
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a robot motion control method according to any embodiment of the present invention, where the method may include:
acquiring a first acting force of a first robot;
sending the first acting force to a physical engine, so that the physical engine confirms a second acting force of a second robot according to the first acting force, and confirms motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physical engine, and the second acting force is a resultant force of the second robot in the virtual environment;
and receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or terminal. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A robot motion control method, comprising:
acquiring a first acting force of a first robot;
sending the first acting force to a physical engine, so that the physical engine confirms a second acting force of a second robot according to the first acting force, and confirms motion information of the second robot according to the second acting force, wherein the second robot is a virtual robot in a virtual environment constructed by the physical engine, and the second acting force is a resultant force of the second robot in the virtual environment;
and receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
2. The method of claim 1, wherein confirming the second acting force of the second robot according to the first acting force comprises:
determining a first virtual stress of the second robot according to the first acting force;
determining a second virtual stress of the virtual environment to the second robot according to the configuration parameters, wherein the second virtual stress is an acting force exerted by the virtual environment to the second robot;
and determining the second acting force according to the first virtual stress and the second virtual stress.
3. The method of claim 2, wherein the configuration parameters are used to determine properties of the objects in the virtual environment, including, for each object in the virtual environment: physical properties, material properties, geometric properties, and connection relationships with other objects.
4. The method of claim 2, wherein the second virtual stress comprises: one or more of a virtual gravitational force borne by the second robot in the virtual environment, a virtual elastic force borne by the second robot in the virtual environment, a virtual frictional force generated by the virtual environment on the second robot, and a virtual electromagnetic force generated by the virtual environment on the second robot.
5. The method of claim 1, wherein the motion information comprises one or more of a position, a velocity, an acceleration, a rotation angle, an angular velocity, and an angular acceleration.
6. A robot motion control apparatus, comprising:
the first acting force acquisition module is used for acquiring a first acting force of the first robot;
the first acting force sending module is used for sending the first acting force to a physical engine so that the physical engine confirms a second acting force borne by a second robot according to the first acting force and confirms motion information of the second robot according to the second acting force, the second robot is a virtual robot in a virtual environment constructed by the physical engine, and the second acting force is a resultant force borne by the second robot in the virtual environment;
and the control module is used for receiving the motion information fed back by the physical engine and controlling the first robot to move according to the motion information.
7. A robot comprising a robot body and a controller, wherein the controller comprises:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the robot motion control method of any of claims 1-5.
8. The robot of claim 7, wherein the robot is a robot with a three-degree-of-freedom bilateral motion mechanism coupled to a force sensor assembly.
9. A robot motion control system comprising a robot according to any of claims 7-8 and a computer device for running a physics engine and displaying a virtual environment.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the robot motion control method according to any one of claims 1-5.
CN202010177523.9A 2020-03-13 2020-03-13 Robot motion control method, device and system, robot and storage medium Pending CN111338287A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010177523.9A CN111338287A (en) 2020-03-13 2020-03-13 Robot motion control method, device and system, robot and storage medium


Publications (1)

Publication Number Publication Date
CN111338287A true CN111338287A (en) 2020-06-26

Family

ID=71180155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010177523.9A Pending CN111338287A (en) 2020-03-13 2020-03-13 Robot motion control method, device and system, robot and storage medium

Country Status (1)

Country Link
CN (1) CN111338287A (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104706499A (en) * 2013-12-12 2015-06-17 中国科学院宁波材料技术与工程研究所 Upper limb cranial nerve rehabilitation training system and training method
CN105892626A (en) * 2014-12-15 2016-08-24 普瑞深视科技(北京)有限公司 Lower limb movement simulation control device used in virtual reality environment
CN106779045A (en) * 2016-11-30 2017-05-31 东南大学 Rehabilitation training robot system and its application method based on virtual scene interaction
CN107049702A (en) * 2017-03-29 2017-08-18 东南大学 A kind of lower limbs rehabilitation training robot system based on virtual reality
US20180164982A1 (en) * 2016-12-09 2018-06-14 International Business Machines Corporation Method and system for generating a holographic image having simulated physical properties
US10022628B1 (en) * 2015-03-31 2018-07-17 Electronic Arts Inc. System for feature-based motion adaptation
CN108369478A (en) * 2015-12-29 2018-08-03 微软技术许可有限责任公司 Hand for interaction feedback tracks
CN108873875A (en) * 2017-05-08 2018-11-23 深圳光启合众科技有限公司 Robot divertical motion control method and device, robot, storage medium
CN110075486A (en) * 2019-05-31 2019-08-02 东北大学 A kind of rehabilitation training of upper limbs system and method using virtual reality technology
CN110292748A (en) * 2019-07-02 2019-10-01 南方科技大学 bilateral coordination training system and control method
CN110812105A (en) * 2018-08-07 2020-02-21 深圳二十一天健康科技有限公司 Active three-degree-of-freedom upper limb rehabilitation robot based on virtual reality technology


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Qi Binbin et al., "Multimodal interaction method for digital cultural relics incorporating force and haptic feedback", Journal of Image and Graphics (《中国图象图形学报》) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112338920A (en) * 2020-11-04 2021-02-09 中国联合网络通信集团有限公司 Data processing method, device and equipment
CN112338920B (en) * 2020-11-04 2022-04-15 中国联合网络通信集团有限公司 Data processing method, device and equipment
CN113771043A (en) * 2021-09-30 2021-12-10 上海傅利叶智能科技有限公司 Control method and device for enabling robot to follow virtual object and rehabilitation robot
CN113829348A (en) * 2021-09-30 2021-12-24 上海傅利叶智能科技有限公司 Robot control method and device based on physical engine and rehabilitation robot
CN113829347A (en) * 2021-09-30 2021-12-24 上海傅利叶智能科技有限公司 Robot control method and device based on physical engine and rehabilitation robot
CN113843796A (en) * 2021-09-30 2021-12-28 上海傅利叶智能科技有限公司 Data transmission method and device, control method and device of online robot and online robot
CN113771043B (en) * 2021-09-30 2023-02-24 上海傅利叶智能科技有限公司 Control method and device for enabling robot to follow virtual object and rehabilitation robot
WO2023051108A1 (en) * 2021-09-30 2023-04-06 上海傅利叶智能科技有限公司 Robot control method and device based on physical engine and rehabilitation robot
CN113829348B (en) * 2021-09-30 2023-08-15 上海傅利叶智能科技有限公司 Robot control method and device based on physical engine and rehabilitation robot
CN113829347B (en) * 2021-09-30 2023-08-15 上海傅利叶智能科技有限公司 Robot control method and device based on physical engine and rehabilitation robot
CN114770511A (en) * 2022-05-09 2022-07-22 上海傅利叶智能科技有限公司 Robot control method and device based on physical touch and robot
CN114770511B (en) * 2022-05-09 2023-06-23 上海傅利叶智能科技有限公司 Robot control method and device based on physical touch sense and robot

Similar Documents

Publication Publication Date Title
CN111251305B (en) Robot force control method, device, system, robot and storage medium
CN111338287A (en) Robot motion control method, device and system, robot and storage medium
Wang et al. Real-virtual components interaction for assembly simulation and planning
Borst et al. Realistic virtual grasping
Birglen et al. SHaDe, a new 3-DOF haptic device
Borst et al. A spring model for whole-hand virtual grasping
US20090306825A1 (en) Manipulation system and method
KR20200082449A (en) Apparatus and method of controlling virtual model
US20020123812A1 (en) Virtual assembly design environment (VADE)
US10895950B2 (en) Method and system for generating a holographic image having simulated physical properties
Tsai et al. Unity game engine: Interactive software design using digital glove for virtual reality baseball pitch training
CN111665933A (en) Method and device for operating object in virtual or augmented reality
KR100934391B1 (en) Hand-based Grabbing Interaction System Using 6-DOF Haptic Devices
RU2308764C2 (en) Method for moving a virtual jointed object in virtual space with prevention of collisions of jointed object with elements of environment
Nasim et al. Physics-based interactive virtual grasping
Nandikolla et al. Teleoperation Robot Control of a Hybrid EEG‐Based BCI Arm Manipulator Using ROS
Akahane et al. Two-handed multi-finger string-based haptic interface SPIDAR-8
Du et al. A novel natural mobile human-machine interaction method with augmented reality
Choi et al. Haptic display in the virtual collaborative workspace shared by multiple users
Guda et al. Introduction of a Cobot as Intermittent Haptic Contact Interfaces in Virtual Reality
Yoshikawa et al. A touch/force display system for haptic interface
Steger et al. Design of a passively balanced spatial linkage haptic interface
CN107219918A (en) A kind of design method of data glove interface module
Lin et al. Design of force-reflection joystick system for VR-based simulation
CN110968183A (en) Method and device for providing real physical feedback when contacting virtual object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200626