CN115122342B - Software system for controlling robot and control method of robot - Google Patents

Software system for controlling robot and control method of robot

Info

Publication number: CN115122342B
Application number: CN202211068327.3A
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN115122342A
Inventor: 魏晓晨
Original and current assignee: Beijing Yidian Lingdong Technology Co., Ltd.
Legal status: Active (granted)

Application filed by Beijing Yidian Lingdong Technology Co., Ltd.
Priority to CN202211068327.3A
Publication of CN115122342A
Application granted
Publication of CN115122342B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture


Abstract

The application discloses a software system for controlling a robot and a control method of the robot, relating to the field of artificial intelligence. The software system comprises a client and a control end. The client comprises at least an interaction layer, an interface layer and a communication layer: the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to the control end. The control end comprises at least a logic layer and an execution layer: the logic layer performs logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result. The application thereby solves the problem in the related art that the lack of a software system for navigation-type surgical robots makes developing such robots inefficient.

Description

Software system for controlling robot and control method of robot
Technical Field
The application relates to the field of artificial intelligence, in particular to a software system for controlling a robot and a control method of the robot.
Background
With the development of medical science, surgical robots have been developed to assist operations and improve their accuracy and precision. Using a surgical robot to assist an operation can improve both surgical efficiency and surgical quality. A surgical robot generally includes hardware such as a control system, a robot arm, and a camera. However, currently developed surgical robots are usually suitable for only one type of surgery, and the prior art lacks a development framework for surgical robots, especially navigation-type surgical robots. To meet the needs of robot-based navigation surgery across surgery types such as hip joint, knee joint, unicondylar, spine, and dental procedures, it is necessary to develop a surgical robot software system.
No effective solution has yet been proposed for the problem in the related art that the lack of a software system for navigation-type surgical robots makes developing such robots inefficient.
Disclosure of Invention
The main purpose of the present application is to provide a software system for controlling a robot and a control method of the robot, so as to solve the problem that the efficiency of developing a navigation type surgical robot is relatively low due to the lack of a software system of the navigation type surgical robot in the related art.
To achieve the above object, according to one aspect of the present application, there is provided a software system for controlling a robot, wherein the robot is used to assist in performing surgical operations. The software system comprises: a client, the client comprising at least: an interaction layer, an interface layer and a communication layer, wherein the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to a control end; and the control end, the control end comprising at least: a logic layer and an execution layer, wherein the logic layer performs logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result.
Further, the logical layer includes: the navigation positioning module constructs a pose conversion relation between software position information in the configuration information and a visual coordinate system through a first output interface, converts the software position information into target pose information in the visual coordinate system through a second output interface according to the pose conversion relation, and transmits the target pose information to the visual tracking module; the vision tracking module receives the target pose information through a third output interface, converts the target pose information into first target position information under a robot coordinate system through a fourth output interface, and transmits the first target position information to the motion control module; the motion control module receives the first target position information through a fifth output interface, and the motion control module restrains and optimizes the first target position information through a sixth output interface based on preset planning restraint and preset force control parameters to obtain second target position information; and the execution layer controls the robot to move to the target position corresponding to the second target position information so as to execute corresponding operation.
Further, the software system further includes: the tool function module sets working names of a plurality of surgical tools through a seventh output interface when surgical operation is performed through the surgical tools, receives third target position information of a working center point of a target surgical tool in a robot coordinate system through an eighth output interface, verifies the third target position information through a ninth output interface, and controls the robot to move the working center point of the target surgical tool to a target position corresponding to the third target position information through the execution layer under the condition that the third target position information is verified.
Further, the motion control module further comprises: a planning control interface for setting the preset planning constraint; and the interactive control interface is used for setting the preset force control parameters.
Furthermore, the navigation positioning module further comprises a plurality of navigation positioning sub-modules, wherein each navigation positioning sub-module corresponds to a different surgery type; the visual tracking module further comprises a plurality of visual tracking sub-modules, wherein each visual tracking sub-module corresponds to a different tracking strategy; the motion control module further comprises a plurality of motion control sub-modules, wherein each motion control sub-module corresponds to a different motion strategy; and the tool function module further comprises a plurality of tool function sub-modules, wherein each tool function sub-module corresponds to a different type of surgical tool.
In order to achieve the above object, according to another aspect of the present application, there is provided a control method of a robot, the method including: receiving first target software position information; establishing a pose conversion relation between the first target software position information and a visual coordinate system, and converting the software position information into target pose information under the visual coordinate system according to the pose conversion relation; converting the target pose information into first target position information under a robot coordinate system; constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information; and controlling the robot to move to the target position corresponding to the second target position information.
Further, the method further comprises: when the robot performs a surgical operation through a surgical tool, acquiring the target surgical tool to be used and second target software position information; converting the second target software position information into third target position information in the robot coordinate system; verifying the third target position information to obtain a verification result; and if the verification result is a pass, controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
In order to achieve the above object, according to another aspect of the present application, there is provided a control apparatus of a robot. The device includes: the receiving unit is used for receiving the position information of the first target software; the construction unit is used for constructing a pose conversion relation between the first target software position information and a visual coordinate system and converting the software position information into target pose information in the visual coordinate system according to the pose conversion relation; the first conversion unit is used for converting the target pose information into first target position information under a robot coordinate system; the optimization unit is used for constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information; and the first control unit is used for controlling the robot to move to the target position corresponding to the second target position information.
Further, the apparatus further comprises: an acquisition unit for acquiring the target surgical tool to be used and second target software position information when the robot performs a surgical operation through a surgical tool; a second conversion unit for converting the second target software position information into third target position information in the robot coordinate system; a verification unit for verifying the third target position information to obtain a verification result; and a second control unit for controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information if the verification result is a pass.
In order to achieve the above object, according to one aspect of the present application, there is provided a processor for running a program, wherein the program, when running, performs the control method of the robot described in any one of the above.
To achieve the above object, according to one aspect of the present application, there is provided an electronic device comprising one or more processors and a memory storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the control method of the robot described in any one of the above.
By this application, the following is adopted: a client comprising at least an interaction layer, an interface layer and a communication layer, wherein the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to a control end; and a control end comprising at least a logic layer and an execution layer, wherein the logic layer performs logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result. This solves the problem in the related art that the lack of a software system for navigation-type surgical robots makes developing such robots inefficient. The software system for controlling the robot comprises, from the top layer to the bottom layer: the interaction layer, the interface layer, the communication layer, the logic layer and the execution layer. These layers cover the basic functions of a surgical navigation robot, so that software programs for robots performing different types of surgery can be developed quickly on the basis of this software system, thereby improving the efficiency of developing navigation-type surgical robots.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 is a schematic diagram of a software system for controlling a robot provided in accordance with an embodiment of the present application;
FIG. 2 is a schematic diagram of a navigational positioning module provided in accordance with an embodiment of the present application;
FIG. 3 is a schematic diagram of a visual tracking module provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic diagram of a motion control module provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic diagram of a tool function module provided in accordance with an embodiment of the present application;
FIG. 6 is a development flow diagram of a software system for controlling a robot provided according to an embodiment of the present application;
fig. 7 is a flowchart of a control method of a robot according to an embodiment of the present application;
fig. 8 is a schematic diagram of a control device of a robot provided according to an embodiment of the present application;
fig. 9 is a schematic diagram of an electronic device provided according to an embodiment of the application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the accompanying drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
A surgical robot is used to assist surgery and can improve both surgical efficiency and surgical quality, but currently developed surgical robots target only one type of surgery, lack openness, have weak extensibility, and are difficult to maintain and upgrade.
The basic operation process of a current navigation-type surgical robot comprises the following steps:
1. Configuration selection: determine the surgery type, such as hip surgery, shoulder surgery, etc.; determine the tracking control mode, including open-loop tracking, closed-loop tracking, and the like; determine the type of tool used in the operation, including acetabular files, cutters, etc.
2. Planning and adjustment: import the surgical plan, and adjust the surgical tools to be used, the type of prosthesis to be used, and the like according to the patient's preoperative 3D images.
3. Coordinate system calibration: including robot coordinate system calibration, tool coordinate system calibration, and the like.
4. Feature registration: establish a position conversion relation between a local 3D image of the human body and the visual coordinate system.
5. Intraoperative motion: assist the doctor in moving the tool according to surgical needs; motion forms include free motion, positioning motion, following motion, and the like.
6. Intraoperative adjustment: according to the situation during the operation, perform staged verification combined with analysis of data information (including depth measurement, angle measurement, gap measurement, and the like) to provide a reference for the next operation step.
7. Postoperative detection: compare the postoperative state with the planned target to evaluate the surgical effect.
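The configuration-selection step above can be sketched as a small data structure. The enumeration values and field names below are illustrative assumptions for this sketch, not identifiers from the patent:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical enumerations for the choices made during configuration selection.
class SurgeryType(Enum):
    HIP = "hip"
    KNEE = "knee"
    SPINE = "spine"

class TrackingMode(Enum):
    OPEN_LOOP = "open_loop"
    CLOSED_LOOP = "closed_loop"

@dataclass
class SurgeryConfig:
    """Configuration information gathered before the operation."""
    surgery_type: SurgeryType
    tracking_mode: TrackingMode
    tool_type: str  # e.g. "acetabular_file", "cutter"

# Example: a hip operation with closed-loop tracking and an acetabular file.
config = SurgeryConfig(SurgeryType.HIP, TrackingMode.CLOSED_LOOP, "acetabular_file")
```

Such a structure would be filled in by the interaction layer and forwarded to the control end as the configuration information.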
Under the above technical background and basic surgical procedures, a software system for controlling a robot is proposed, as shown in fig. 1, the software system for controlling a robot includes the following contents:
the client side at least comprises: the system comprises an interaction layer, an interface layer and a communication layer, wherein the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to a control end; the control end, the control end includes at least: the robot comprises a logic layer and an execution layer, wherein the logic layer carries out logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result.
The logic layer comprises: the navigation positioning module is used for establishing a pose conversion relation between software position information in the configuration information and a visual coordinate system through a first output interface, converting the software position information into target pose information in the visual coordinate system through a second output interface according to the pose conversion relation, and transmitting the target pose information to the visual tracking module; the vision tracking module receives the target pose information through a third output interface, converts the target pose information into first target position information under a robot coordinate system through a fourth output interface, and transmits the first target position information to the motion control module; the motion control module receives the first target position information through a fifth output interface, and the motion control module restrains and optimizes the first target position information through a sixth output interface based on preset planning restraint and preset force control parameters to obtain second target position information; and the execution layer controls the robot to move to the target position corresponding to the second target position information so as to execute the corresponding operation.
Specifically, the software system is applied to a navigation type surgical robot, and the navigation type surgical robot is mainly used for moving the position of a surgical tool to assist a doctor to complete a surgical operation, so that as shown in fig. 1, the software system for controlling the robot provided by the application mainly comprises a client and a control end, wherein the client is mainly used for man-machine interaction, receives an operation and a control instruction of the doctor, and the control end performs logic judgment and data processing based on the operation and the control instruction and controls the robot to move to a corresponding position.
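The division of labor between the client layers and the control-end layers can be sketched as a simple pipeline. All class and method names below are illustrative assumptions, not the patent's actual API:

```python
# A minimal sketch of the layered flow described above: interaction ->
# interface -> communication -> logic -> execution.

class InterfaceLayer:
    def to_instruction(self, operation_info: dict) -> dict:
        # Convert raw operation info from the interaction layer into instruction info.
        return {"cmd": operation_info["action"], "args": operation_info.get("args", {})}

class LogicLayer:
    def process(self, instruction: dict, config: dict) -> dict:
        # Logic judgment and data processing; here it simply extracts a target position.
        return {"move_to": instruction["args"].get("target", [0.0, 0.0, 0.0])}

class ExecutionLayer:
    def execute(self, result: dict) -> str:
        return f"robot moved to {result['move_to']}"

def handle_request(operation_info: dict, config: dict) -> str:
    instruction = InterfaceLayer().to_instruction(operation_info)
    # In the real system, the communication layer would forward the instruction
    # and configuration info to the control end over TCP at this point.
    result = LogicLayer().process(instruction, config)
    return ExecutionLayer().execute(result)

status = handle_request({"action": "move", "args": {"target": [0.1, 0.2, 0.3]}}, {})
```

The sketch collapses the client/control-end boundary into one process; in the described system the communication layer sits between `to_instruction` and `process`.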
As shown in fig. 1, the client may include an interaction layer, an interface layer, and a communication layer. The operation information and configuration information input by a doctor (i.e., the target object described above) such as the type of operation, the tracking manner, the motion control mode, and the target position to which the robot is to be moved are received through the interaction layer. And converting the corresponding operation information into an operation instruction by using the interface layer, and transmitting the operation instruction and the configuration information to the control end through the communication layer. The interaction layer may include a plurality of operation interfaces, such as a planning interface, a calibration interface, a feature registration interface, an intra-operative interface, an post-operative detection interface, and the like, through which the target object may perform different operations. For example, in the calibration interface, the target object issues a calibration instruction for the robot coordinate system, and the like, and in the post-operation detection interface, the operation result is analyzed and the operation effect is evaluated, and the like.
The interface layer can also comprise the following parts: the motion mode command is used for issuing a robot motion control mode in different interfaces; the real-time pose instruction is used for issuing/acquiring the pose of the robot target in real time; the log instruction is used for recording the operation process; and the state instruction is used for recording the current mode state of the robot.
In the communication layer, the control end may be connected using a TCP/IP communication protocol.
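The patent only states that the communication layer connects to the control end via TCP/IP. One plausible way to forward instruction and configuration information over such a connection is length-prefixed JSON framing; the framing scheme below is an assumption for illustration, not the patent's protocol:

```python
import json
import struct

def encode_message(instruction: dict, config: dict) -> bytes:
    """Serialize instruction + config info into a frame ready to send over TCP."""
    payload = json.dumps({"instruction": instruction, "config": config}).encode("utf-8")
    # 4-byte big-endian length prefix so the receiver knows where the frame ends.
    return struct.pack(">I", len(payload)) + payload

def decode_message(data: bytes):
    """Parse one frame received from the socket back into instruction + config."""
    (length,) = struct.unpack(">I", data[:4])
    msg = json.loads(data[4 : 4 + length].decode("utf-8"))
    return msg["instruction"], msg["config"]

frame = encode_message({"cmd": "calibrate"}, {"surgery": "hip"})
instruction, config = decode_message(frame)
```

The encoded bytes would be written to a TCP socket on the client side and read back with the same framing on the control end.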
As shown in fig. 1, the control end at least comprises a logic layer and an execution layer, and the logic layer at least comprises a navigation positioning module, a visual tracking module and a motion control module. The logic layer carries out logic judgment and data processing based on the instruction information and the configuration information input by the target object to obtain a processing result, and the execution layer controls the robot to move based on the processing result. For example, if the target object wants to control the robot to move to the target position, the position information is transmitted to the logic layer, the logic layer converts the target position into the position information under the robot angle, and then the execution layer controls the robot to move to the corresponding position.
The navigation positioning module is shown in fig. 2. It comprises at least a first output interface and a second output interface. The first output interface constructs a pose transformation relation between the software position information in the configuration information (input by the target object) and the visual coordinate system. The second output interface converts the software position information into target pose information in the visual coordinate system according to the pose transformation relation, with the calculation formula:

P_target = T · P_software

where P_target is the target pose information, P_software is the software position information, and T is the pose transformation relation. After the conversion is finished, the target pose information is sent to the visual tracking module.
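The pose conversion performed by the second output interface can be illustrated with a homogeneous transform. The 4x4 matrix representation is a standard robotics convention and an assumption here, since the patent does not specify the parameterization:

```python
# Apply a 4x4 homogeneous pose transformation T to a 3D software position p,
# yielding the corresponding position in the visual coordinate system.
# Plain lists are used to keep the sketch dependency-free.

def apply_transform(T, p):
    x, y, z = p
    v = [x, y, z, 1.0]  # homogeneous coordinates
    # Only the first three rows are needed for the transformed position.
    return [sum(T[i][j] * v[j] for j in range(4)) for i in range(3)]

# Example: a transform that translates by (10, 0, 5) with no rotation.
T = [
    [1.0, 0.0, 0.0, 10.0],
    [0.0, 1.0, 0.0,  0.0],
    [0.0, 0.0, 1.0,  5.0],
    [0.0, 0.0, 0.0,  1.0],
]
target_pose = apply_transform(T, [1.0, 2.0, 3.0])
```

In the described system, `T` would be the pose transformation relation constructed by the first output interface during feature registration.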
Visual tracking module, as shown in fig. 3, the visual tracking module includes at least a third output interface and a fourth output interface, the third output interface: receiving target pose information sent by a navigation positioning module; a fourth output interface: and converting the position and posture information of the target under the visual coordinate system into the coordinate system of the robot to obtain the position information of the first target, so that the robot can be conveniently controlled to move to the position corresponding to the position information of the first target in the follow-up process. Planning and constraining is also performed by the motion control module in order to better control the robot movements.
The motion control module further comprises the following interfaces: the planning control interface is used for setting preset planning constraints; and the interactive control interface is used for setting preset force control parameters.
As shown in fig. 4, the planning class control interface: setting planning constraint for updating and constraining the first target position information according to the motion range of the operation type; the interactive control interface comprises: and setting a force control parameter for carrying out degree of freedom limit value and force control compliance degree processing on the first target position information according to the interactive constraint of the operation type.
The motion control module, as shown in fig. 4, includes at least a fifth output interface and a sixth output interface. A fifth output interface: acquiring first target position information under a robot coordinate system; a sixth output interface: and is configured to output second target position information after constraint in the current motion mode (the corresponding first target position information is constrained and optimized based on the preset planning constraint and the preset force control parameter to obtain the second target position information).
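How the sixth output interface might constrain and optimize a target position is sketched below. The box-shaped workspace constraint and the per-axis step limit are illustrative stand-ins for the patent's unspecified planning constraints and force control parameters:

```python
def constrain_target(pos, current, workspace, max_step):
    """Clamp a target position into the planning workspace, then limit the
    per-axis displacement from the current position (a crude stand-in for
    force-control compliance limits)."""
    constrained = []
    for p, c, (lo, hi) in zip(pos, current, workspace):
        p = min(max(p, lo), hi)  # planning constraint: stay inside the workspace
        delta = min(max(p - c, -max_step), max_step)  # force control: limit step size
        constrained.append(c + delta)
    return constrained

# Hypothetical box workspace of +/-100 mm per axis and a 20 mm step limit.
workspace = [(-100.0, 100.0)] * 3
second_target = constrain_target([150.0, 10.0, -5.0], [0.0, 0.0, 0.0], workspace, 20.0)
```

Here the first-target x-coordinate of 150 mm is first clamped to the workspace (100 mm) and then limited to a 20 mm step, giving the constrained second target position.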
After the second target position information is obtained, the execution layer controls the robot to move to the target position corresponding to the second target position information, and the corresponding operation is executed.
In summary, the software system explicitly sets the functions of each module and can meet the basic function requirements of various operation types and operation tools, so that when the robot is developed later, only improvement and optimization are needed on the basis, and in the subsequent use process, maintenance can be performed by taking the modules as units. Therefore, the technical effect of improving the development efficiency of the navigation surgical robot is achieved through the software system.
Navigation-type surgical robots are also generally used to assist in moving a surgical tool to a corresponding target location, and therefore the software system for controlling the robot further comprises: the tool function module sets working names of a plurality of surgical tools through a seventh output interface when surgical operation is performed through the surgical tools, the tool function module receives third target position information of a working central point of a target surgical tool under a robot coordinate system through an eighth output interface, the tool function module verifies the third target position information through a ninth output interface, and under the condition that the third target position information is verified, the robot is controlled through an execution layer to move the working central point of the target surgical tool to a target position corresponding to the third target position information.
Specifically, as shown in fig. 5, the tool function module comprises at least a seventh output interface, an eighth output interface, and a ninth output interface. The seventh output interface: TCP (working center point) setting, used to set the working name of the surgical tool to be used. The eighth output interface: calibration setting, which receives third target position information of the working center point of the target surgical tool in the robot coordinate system. The ninth output interface: position detection, which detects the third target position information of the surgical tool and performs a secondary verification of it to improve accuracy. Finally, the execution layer moves the working center point of the target surgical tool to the target position corresponding to the third target position information.
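The three tool-function interfaces can be sketched as follows. The method names and the tolerance-based secondary verification are assumptions for illustration; the patent does not specify how the check is performed:

```python
import math

class ToolFunctionModule:
    """Illustrative sketch of the tool function module's three interfaces:
    working-name setting (seventh), calibration input (eighth), and
    position verification (ninth)."""

    def __init__(self):
        self.tool_names = {}
        self.calibrated = {}

    def set_tool_name(self, tool_id, name):  # seventh output interface
        self.tool_names[tool_id] = name

    def set_calibration(self, tool_id, position):  # eighth output interface
        self.calibrated[tool_id] = position

    def verify_position(self, tool_id, detected, tol=0.5):  # ninth output interface
        # Secondary verification: the detected position must agree with the
        # calibrated third target position within a tolerance (mm, assumed).
        return math.dist(self.calibrated[tool_id], detected) <= tol

tools = ToolFunctionModule()
tools.set_tool_name("t1", "acetabular_file")
tools.set_calibration("t1", [10.0, 20.0, 30.0])
ok = tools.verify_position("t1", [10.1, 20.0, 30.0])
```

Only when `verify_position` passes would the execution layer be asked to move the tool's working center point to the verified position.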
The tool function modules described above ensure that the software system used to control the robot can be used to control the movement of a variety of surgical tools to target locations to assist in performing surgical procedures.
In order to enable rapid development of robots for various types of surgery, the navigation positioning module, the visual tracking module, the motion control module and the tool function module are further organized as follows: the navigation positioning module further comprises a plurality of navigation positioning sub-modules, each corresponding to a different surgery type; the visual tracking module further comprises a plurality of visual tracking sub-modules, each corresponding to a different tracking strategy; the motion control module further comprises a plurality of motion control sub-modules, each corresponding to a different motion strategy; and the tool function module further comprises a plurality of tool function sub-modules, each corresponding to a different type of surgical tool.
Specifically, as shown in fig. 2, the navigation positioning module may include a plurality of navigation positioning sub-modules for matching different surgery types, for example, a hip navigation positioning sub-module, a knee navigation positioning sub-module, a spine navigation positioning sub-module, and other navigation positioning sub-modules. Each navigation positioning sub-module inherits the two output interfaces of the navigation positioning module, but because the registration strategies and target positions differ across surgery types, the sub-modules' implementations of the same interface are not identical, and the corresponding implementations must be developed according to actual requirements.
As shown in fig. 3, the visual tracking module may include a plurality of visual tracking sub-modules for matching different tracking strategies, such as an open-loop visual tracking sub-module, a closed-loop visual tracking sub-module, and other visual tracking sub-modules.
As shown in FIG. 4, the motion control module may also include a plurality of motion control sub-modules for matching different motion strategies, such as a calibration motion control sub-module, a free motion control sub-module, a fixed plane motion control sub-module, and the like.
As shown in fig. 5, the tool function module may further include a plurality of tool function sub-modules for matching different surgical tool types, for example, including an acetabular file tool sub-module, a hip implant tool sub-module, a knee guide tool sub-module, and other tool sub-modules.
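The sub-module arrangement described above — a base module whose output interfaces are inherited and re-implemented per surgery type, with the constructor choosing the matching sub-module — can be sketched as below. The class names, return values, and registry dictionary are hypothetical illustrations, not the patent's code:

```python
class NavigationPositioningModule:
    """Base module: every sub-module inherits the same two output interfaces
    but supplies a surgery-specific registration strategy."""

    def build_pose_transform(self, software_position):
        raise NotImplementedError   # first output interface (registration)

    def target_pose(self, software_position):
        raise NotImplementedError   # second output interface (target pose)

class HipNavigation(NavigationPositioningModule):
    def build_pose_transform(self, software_position):
        return ("hip_registration", software_position)

    def target_pose(self, software_position):
        return ("hip_target_pose", software_position)

class KneeNavigation(NavigationPositioningModule):
    def build_pose_transform(self, software_position):
        return ("knee_registration", software_position)

    def target_pose(self, software_position):
        return ("knee_target_pose", software_position)

# The constructor picks the sub-module that matches the configured surgery type.
NAVIGATION_SUBMODULES = {"hip": HipNavigation, "knee": KneeNavigation}

def make_navigation_module(surgery_type: str) -> NavigationPositioningModule:
    return NAVIGATION_SUBMODULES[surgery_type]()
```

The visual tracking, motion control, and tool function sub-modules would follow the same inherit-and-select pattern, keyed on tracking strategy, motion strategy, and tool type respectively.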
In an alternative embodiment, the development process of the software system is shown in fig. 6. In the first step, the configuration information of the surgical patient (including the surgery type and the tracking mode) is input to the constructor to initialize the corresponding modules;
in the second step, the constructor constructs the required navigation positioning module, visual tracking module, and tool function module according to the configuration information;
in the third step, the registration function in the navigation positioning module is called to obtain the pose conversion relation, and the target pose method is called to obtain the target pose information in the visual coordinate system;
in the fourth step, the target surgical tool selected by the client is input to the visual tracking module, and the target pose method is called to obtain the target position information of the target surgical tool in the robot coordinate system;
in the fifth step, the motion control module is constructed according to the motion mode selected by the client, and the constrained target position information and the compliance stiffness are issued according to the motion mode, so as to control the robot to move.
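The five-step flow above can be sketched end to end; every name, field, and the stiffness rule below is a placeholder assumption used only to show how the configuration drives module construction and motion planning:

```python
from dataclasses import dataclass

@dataclass
class SurgeryConfig:
    surgery_type: str    # e.g. "hip", "knee", "spine"
    tracking_mode: str   # e.g. "open_loop" or "closed_loop"

def construct_modules(config: SurgeryConfig) -> dict:
    """Steps 1-2: the constructor instantiates the modules matching the
    patient's configuration information."""
    return {
        "navigation": f"{config.surgery_type}_navigation",
        "tracking": f"{config.tracking_mode}_tracking",
        "tools": f"{config.surgery_type}_tools",
    }

def development_flow(config: SurgeryConfig, tool: str, motion_mode: str) -> dict:
    modules = construct_modules(config)
    # Step 3: registration yields the pose conversion; the target pose method
    # returns target pose information in the visual coordinate system.
    target_pose_vision = {"frame": "vision", "via": modules["navigation"]}
    # Step 4: the tracking module converts the pose into the robot frame
    # for the tool selected by the client.
    target_pose_robot = {"frame": "robot", "tool": tool,
                         "via": modules["tracking"],
                         "from": target_pose_vision["frame"]}
    # Step 5: the motion control module issues the constrained target and a
    # compliance stiffness appropriate to the chosen motion mode.
    return {"target": target_pose_robot,
            "motion_mode": motion_mode,
            "stiffness": "compliant" if motion_mode == "free" else "stiff"}
```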
In the software system for controlling a robot provided in an embodiment of the present application, the client includes at least an interaction layer, an interface layer, and a communication layer: the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to a control end. The control end includes at least a logic layer and an execution layer: the logic layer performs logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result. This solves the problem in the related art that developing navigation-type surgical robots is inefficient for lack of a suitable software system. The software system for controlling the robot comprises, in order from the top layer to the bottom layer, the interaction layer, the interface layer, the communication layer, the logic layer, and the execution layer. These layers cover the basic functions of a surgical navigation robot, so that software for robots performing different types of surgery can be developed quickly on top of the system, thereby improving the efficiency of developing navigation-type surgical robots.
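The five-layer client/control-end split can be sketched as follows; the class and method names and the message shapes are assumptions made for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    command: str
    payload: dict

class ControlEnd:
    """Control side: logic layer plus execution layer (hypothetical sketch)."""

    def logic_layer(self, instruction: Instruction, config_info: dict) -> dict:
        # Logic judgment and data processing on instruction + configuration.
        return {"move_to": instruction.payload.get("target"),
                "surgery": config_info.get("surgery_type")}

    def execution_layer(self, result: dict) -> str:
        # Drive the robot according to the processing result.
        return f"robot moving to {result['move_to']} for {result['surgery']} surgery"

    def receive(self, instruction: Instruction, config_info: dict) -> str:
        return self.execution_layer(self.logic_layer(instruction, config_info))

class Client:
    """Client side: interaction, interface, and communication layers."""

    def __init__(self, control_end: ControlEnd):
        self.control_end = control_end

    def interface_layer(self, operation_info: dict) -> Instruction:
        # Convert raw operation information into instruction information.
        return Instruction(command=operation_info["action"],
                           payload=operation_info.get("args", {}))

    def communication_layer(self, instruction: Instruction, config_info: dict) -> str:
        # Forward instruction + configuration to the control end.
        return self.control_end.receive(instruction, config_info)

    def handle_input(self, operation_info: dict, config_info: dict) -> str:
        # Interaction layer: receives input from the target object (the user).
        instruction = self.interface_layer(operation_info)
        return self.communication_layer(instruction, config_info)
```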
Example 2
The present invention is described below with reference to preferred implementation steps, and fig. 7 is a flowchart of a control method for a robot according to an embodiment of the present application, and as shown in fig. 7, the method includes the following steps:
step S701, receiving first target software location information.
Step S702, a pose conversion relation between the first target software position information and a visual coordinate system is established, and the software position information is converted into target pose information in the visual coordinate system according to the pose conversion relation.
Step S703, converting the target pose information into first target position information in the robot coordinate system.
Step S704, constraining and optimizing the first target position information based on a preset planning constraint and preset force control parameters to obtain second target position information.
Step S705, the robot is controlled to move to the target position corresponding to the second target position information.
Specifically, the control method of the robot described above is applied to the software system for controlling the robot described in the first embodiment. The target object sets the software position information to which the robot needs to move (i.e., the first target software position information described above). A pose conversion relation between the first target software position information and the visual coordinate system is constructed through the navigation positioning module, and the software position information is converted into target pose information in the visual coordinate system according to the pose conversion relation. The target pose information is transmitted to the visual tracking module and converted into first target position information in the robot coordinate system. The first target position information is then constrained and optimized by the motion control module, based on the preset planning constraint and the preset force control parameters, to obtain the second target position information. Finally, the execution layer controls the robot to move to the target position corresponding to the second target position information, so that the operation is performed with the robot's assistance.
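Steps S702 to S704 amount to two frame changes followed by a constraint step. A minimal numeric sketch, assuming 4x4 homogeneous transforms and a simple axis-aligned workspace box standing in for the planning constraint (the real constraint and force-control optimization are not specified here):

```python
import numpy as np

def to_vision_frame(T_software_to_vision: np.ndarray,
                    p_software: np.ndarray) -> np.ndarray:
    """Step S702: apply the registered pose conversion (a 4x4 homogeneous
    transform, software frame -> vision frame) to a software-space point."""
    return (T_software_to_vision @ np.append(p_software, 1.0))[:3]

def to_robot_frame(T_vision_to_robot: np.ndarray,
                   p_vision: np.ndarray) -> np.ndarray:
    """Step S703: map the target from the vision frame into the robot frame."""
    return (T_vision_to_robot @ np.append(p_vision, 1.0))[:3]

def apply_planning_constraint(p_robot: np.ndarray,
                              lower: np.ndarray,
                              upper: np.ndarray) -> np.ndarray:
    """Step S704 (stand-in): clamp the first target position into the allowed
    workspace box to obtain the second target position."""
    return np.clip(p_robot, lower, upper)

def control_step(T_sv, T_vr, p_software, lower, upper):
    """Steps S702-S704 chained; the result is the command executed in S705."""
    p_vision = to_vision_frame(T_sv, np.asarray(p_software, dtype=float))
    p_robot = to_robot_frame(T_vr, p_vision)
    return apply_planning_constraint(p_robot, np.asarray(lower), np.asarray(upper))
```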
When the robot performs a surgical operation with a surgical tool, the robot is controlled by: acquiring a target surgical tool to be used and second target software position information; converting the second target software position information into third target position information in the robot coordinate system; verifying the third target position information to obtain a verification result; and, if the verification result is a pass, controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
Generally, a navigation-type surgical robot assists in completing a surgical operation by moving a surgical tool to a target position. Therefore, when the robot performs a surgical operation through a surgical tool, the process includes: the target object sets the target surgical tool to be used and the second target software position information; the second target software position information is then converted into third target position information in the robot coordinate system using the conversion relation; the third target position information is verified through the tool function module to improve position accuracy; and, when the verification result is a pass, the execution layer controls the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
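The convert-verify-move sequence above can be sketched as a single function; the transform representation, the tolerance value, and the detection callable are assumptions for illustration:

```python
import numpy as np

def tool_control_flow(T_software_to_robot: np.ndarray,
                      p_second_software: np.ndarray,
                      detect_tcp,
                      tolerance_mm: float = 0.5):
    """Hypothetical sketch of the tool-function flow: convert the second target
    software position into the robot frame (the third target position), verify
    it against a detected TCP position, and only then issue the move command."""
    p_third = (T_software_to_robot @ np.append(p_second_software, 1.0))[:3]
    measured = np.asarray(detect_tcp(), dtype=float)  # position-detection interface
    if np.linalg.norm(measured - p_third) <= tolerance_mm:
        return ("move_tcp", p_third)   # execution layer moves the working center point
    return ("reject", None)            # verification failed: do not move
```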
According to the control method of the robot, first target software position information is received; a pose conversion relation between the first target software position information and the visual coordinate system is constructed, and the software position information is converted into target pose information in the visual coordinate system according to the pose conversion relation. The target pose information is converted into first target position information in the robot coordinate system. The first target position information is constrained and optimized based on the preset planning constraint and the preset force control parameters to obtain second target position information. The robot is then controlled to move to the target position corresponding to the second target position information. This solves the problem in the related art that developing navigation-type surgical robots is inefficient for lack of a suitable software system. The software system for controlling the robot comprises, in order from the top layer to the bottom layer, the interaction layer, the interface layer, the communication layer, the logic layer, and the execution layer; these layers cover the basic functions of a surgical navigation robot, so that software for robots performing different types of surgery can be developed quickly on top of the system, thereby improving the efficiency of developing navigation-type surgical robots.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order different from the one presented herein.
Example 3
The embodiment of the present application further provides a control device for a robot, and it should be noted that the control device for a robot according to the embodiment of the present application may be used to execute the control method for a robot according to the embodiment of the present application. The following describes a control device for a robot according to an embodiment of the present application.
Fig. 8 is a schematic diagram of a control device of a robot according to an embodiment of the present application. As shown in fig. 8, the apparatus includes: a receiving unit 801, a construction unit 802, a first conversion unit 803, an optimization unit 804 and a first control unit 805.
a receiving unit 801, configured to receive first target software position information;
a construction unit 802, configured to construct a pose conversion relation between the first target software position information and a visual coordinate system and to convert the software position information into target pose information in the visual coordinate system according to the pose conversion relation;
a first conversion unit 803, configured to convert the target pose information into first target position information in the robot coordinate system;
an optimization unit 804, configured to constrain and optimize the first target position information based on a preset planning constraint and preset force control parameters to obtain second target position information;
and a first control unit 805, configured to control the robot to move to the target position corresponding to the second target position information.
The control device of the robot provided by the embodiment of the application receives the first target software position information through the receiving unit 801; the construction unit 802 constructs a pose conversion relation between the first target software position information and a visual coordinate system and converts the software position information into target pose information in the visual coordinate system according to the pose conversion relation; the first conversion unit 803 converts the target pose information into first target position information in the robot coordinate system; the optimization unit 804 constrains and optimizes the first target position information based on a preset planning constraint and preset force control parameters to obtain second target position information; and the first control unit 805 controls the robot to move to the target position corresponding to the second target position information. This solves the problem in the related art that developing navigation-type surgical robots is inefficient for lack of a suitable software system. The software system for controlling the robot comprises, in order from the top layer to the bottom layer, the interaction layer, the interface layer, the communication layer, the logic layer, and the execution layer; these layers cover the basic functions of a surgical navigation robot, so that software for robots performing different types of surgery can be developed quickly on top of the system, thereby improving the efficiency of developing navigation-type surgical robots.
Optionally, in the control device for a robot provided in an embodiment of the present application, the device further includes: an acquisition unit for acquiring a target surgical tool to be used and second target software position information when the robot performs a surgical operation through a surgical tool; a second conversion unit for converting the second target software position information into third target position information in the robot coordinate system; a checking unit for verifying the third target position information to obtain a verification result; and a second control unit for controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information if the verification result is a pass.
The control device of the robot comprises a processor and a memory, wherein the receiving unit 801, the constructing unit 802, the first converting unit 803, the optimizing unit 804, the first control unit 805 and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided, and the control of the robot is realized by adjusting kernel parameters.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM), and includes at least one memory chip.
The embodiment of the invention provides a processor, which is used for running a program, wherein the program executes a control method of a robot when running.
As shown in fig. 9, an embodiment of the present invention provides an electronic device, where the device includes a processor, a memory, and a program stored in the memory and executable on the processor, and the processor executes the program to implement the following steps: receiving first target software position information; constructing a pose transformation relation between the first target software position information and a visual coordinate system, and transforming the software position information into target pose information under the visual coordinate system according to the pose transformation relation; converting the target pose information into first target position information under a robot coordinate system; constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information; and controlling the robot to move to the target position corresponding to the second target position information.
Optionally, the method further comprises: when the robot performs a surgical operation through a surgical tool, acquiring a target surgical tool to be used and second target software position information; converting the second target software position information into third target position information in the robot coordinate system; verifying the third target position information to obtain a verification result; and, if the verification result is a pass, controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
The device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device: receiving first target software position information; establishing a pose conversion relation between the first target software position information and a visual coordinate system, and converting the software position information into target pose information in the visual coordinate system according to the pose conversion relation; converting the target pose information into first target position information under a robot coordinate system; constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information; and controlling the robot to move to the target position corresponding to the second target position information.
Optionally, the method further comprises: when the robot performs a surgical operation through a surgical tool, acquiring a target surgical tool to be used and second target software position information; converting the second target software position information into third target position information in the robot coordinate system; verifying the third target position information to obtain a verification result; and, if the verification result is a pass, controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus comprising the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (8)

1. A software system for controlling a robot, wherein the robot is adapted to assist in performing a surgical procedure, comprising:
a client, the client comprising at least: the system comprises an interaction layer, an interface layer and a communication layer, wherein the interaction layer receives operation information and configuration information input by a target object, the interface layer converts the operation information into instruction information, and the communication layer forwards the instruction information and the configuration information to a control end;
the control end, the control end includes at least: the robot comprises a logic layer and an execution layer, wherein the logic layer carries out logic judgment and data processing based on the instruction information and the configuration information to obtain a processing result, and the execution layer controls the robot to move based on the processing result;
wherein the logic layer comprises:
the navigation positioning module constructs a pose conversion relation between software position information in the configuration information and a visual coordinate system through a first output interface, converts the software position information into target pose information in the visual coordinate system through a second output interface according to the pose conversion relation, and transmits the target pose information to the visual tracking module;
the vision tracking module receives the target pose information through a third output interface, converts the target pose information into first target position information under a robot coordinate system through a fourth output interface, and transmits the first target position information to the motion control module;
the motion control module receives the first target position information through a fifth output interface, and the motion control module restrains and optimizes the first target position information through a sixth output interface based on preset planning restraint and preset force control parameters to obtain second target position information;
and the execution layer controls the robot to move to the target position corresponding to the second target position information so as to execute corresponding operation.
2. The software system according to claim 1, further comprising:
the tool function module sets working names of a plurality of surgical tools through a seventh output interface when surgical operation is performed through the surgical tools, receives third target position information of a working center point of a target surgical tool in a robot coordinate system through an eighth output interface, verifies the third target position information through a ninth output interface, and controls the robot to move the working center point of the target surgical tool to a target position corresponding to the third target position information through the execution layer under the condition that the third target position information is verified.
3. The software system of claim 1, wherein the motion control module further comprises:
a planning control interface for setting the preset planning constraint;
and the interactive control interface is used for setting the preset force control parameters.
4. The software system according to claim 1,
the navigation positioning module also comprises a plurality of navigation positioning sub-modules, wherein each navigation positioning sub-module corresponds to different operation types;
the visual tracking module also comprises a plurality of visual tracking sub-modules, wherein each visual tracking sub-module corresponds to a different tracking strategy;
the motion control module also comprises a plurality of motion control sub-modules, wherein each motion control sub-module corresponds to a different motion strategy;
the tool function module further comprises a plurality of tool function sub-modules, wherein each tool function sub-module corresponds to a different type of surgical tool.
5. A control method of a robot, characterized in that the method is applied to the software system for controlling a robot of any one of claims 1 to 4, comprising:
receiving first target software position information;
establishing a pose conversion relation between the first target software position information and a visual coordinate system, and converting the software position information into target pose information under the visual coordinate system according to the pose conversion relation;
converting the target pose information into first target position information under a robot coordinate system;
constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information;
controlling the robot to move to a target position corresponding to the second target position information;
when the robot performs a surgical operation through a surgical tool, acquiring a target surgical tool to be used and second target software position information;
converting the second target software position information into third target position information under the robot coordinate system;
verifying the third target position information to obtain a verification result;
and if the verification result is that the third target position information passes, controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information.
6. A control device for a robot, comprising:
the receiving unit is used for receiving the position information of the first target software;
the construction unit is used for constructing a pose conversion relation between the first target software position information and a visual coordinate system and converting the software position information into target pose information in the visual coordinate system according to the pose conversion relation;
the first conversion unit is used for converting the target pose information into first target position information under a robot coordinate system;
the optimization unit is used for constraining and optimizing the first target position information based on preset planning constraint and preset force control parameters to obtain second target position information;
the first control unit is used for controlling the robot to move to a target position corresponding to the second target position information;
wherein the apparatus further comprises: an acquisition unit for acquiring a target surgical tool to be used and second target software position information when the robot performs a surgical operation through a surgical tool;
the second conversion unit is used for converting the second target software position information into third target position information under the robot coordinate system;
the checking unit is used for checking the third target position information to obtain a checking result;
and the second control unit is used for controlling the working center point of the target surgical tool to move to the target position corresponding to the third target position information if the verification result is a pass.
7. A processor, characterized in that the processor is configured to run a program, wherein the program is configured to execute the control method of the robot according to claim 5 when running.
8. An electronic device comprising one or more processors and memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the control method of the robot of claim 5.
CN202211068327.3A 2022-09-02 2022-09-02 Software system for controlling robot and control method of robot Active CN115122342B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211068327.3A CN115122342B (en) 2022-09-02 2022-09-02 Software system for controlling robot and control method of robot


Publications (2)

Publication Number Publication Date
CN115122342A CN115122342A (en) 2022-09-30
CN115122342B true CN115122342B (en) 2022-12-09

Family

ID=83386945

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211068327.3A Active CN115122342B (en) 2022-09-02 2022-09-02 Software system for controlling robot and control method of robot

Country Status (1)

Country Link
CN (1) CN115122342B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388159A (en) * 2018-04-10 2018-08-10 大连理工大学 Design architecture of control software for a micro-parts automatic assembly system
CN109806004A (en) * 2019-03-18 2019-05-28 汪俊霞 Surgical robot system and operating method based on cloud data technology
CN109807900A (en) * 2019-03-19 2019-05-28 西北工业大学 Software architecture for a networked control system of industrial robot components
CN111467036A (en) * 2020-04-15 2020-07-31 上海电气集团股份有限公司 Surgical navigation system, surgical robot system for acetabular osteotomy and control method thereof
CN111652175A (en) * 2020-06-11 2020-09-11 山东大学 Real-time surgical tool detection method applied to robot-assisted surgical video analysis
CN113538522A (en) * 2021-08-12 2021-10-22 广东工业大学 Instrument vision tracking method for laparoscopic minimally invasive surgery

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT1399456B1 (en) * 2009-09-11 2013-04-19 Sr Labs S R L METHOD AND APPARATUS FOR THE USE OF GENERIC SOFTWARE APPLICATIONS THROUGH EYE CONTROL AND INTERACTION METHODS IS APPROPRIATE.


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Extended functions and software development of an industrial robot teaching ***; Ding Jianwei et al.; Industrial Control Computer (《工业控制计算机》); 2019-05-25 (Issue 05); pp. 103-105 *

Also Published As

Publication number Publication date
CN115122342A (en) 2022-09-30

Similar Documents

Publication Publication Date Title
WO2018194965A1 (en) Mixed reality assisted spatial programming of robotic systems
CN109712189B (en) A kind of method and apparatus of sensor combined calibrating
Pedram et al. Autonomous suturing framework and quantification using a cable-driven surgical robot
CN105082161A (en) Robot vision servo control device of binocular three-dimensional video camera and application method of robot vision servo control device
US9452533B2 (en) Robot modeling and positioning
WO2024027647A1 (en) Robot control method and system and computer program product
CN106061427A (en) Robot arm apparatus, robot arm control method, and program
AU2021201644A1 (en) Automated cut planning for removal of diseased regions
JP2008006519A (en) Robot device and method for controlling robot device
CN105592818A (en) System and method for interaction with an object
CN113766997A (en) Method for guiding a robot arm, guiding system
CN114347033B (en) Robot character grabbing method and device, robot and storage medium
CN109806004A (en) A kind of surgical robot system and operating method based on cloud data technique
US12011839B1 (en) Three-dimensional scanning system and scanning path planning method thereof
CN113164215A (en) Techniques for patient-specific virtual boundary deformation
CN115122342B (en) Software system for controlling robot and control method of robot
CN113146634A (en) Robot attitude control method, robot and storage medium
CN112168197B (en) Positioning method and navigation system for elbow joint external fixation rotating shaft
Dagioglou et al. Smoothing of human movements recorded by a single rgb-d camera for robot demonstrations
CN116672031A (en) Robot control method and device, processor and electronic equipment
Sun et al. Adaptive fusion-based autonomous laparoscope control for semi-autonomous surgery
CN117257346A (en) Ultrasonic probe guiding method and device based on image recognition
CN111113430B (en) Robot and tail end control method and device thereof
WO2019012121A1 (en) A method and apparatus for providing an adaptive self-learning control program for deployment on a target field device
Sun et al. Development of a novel intelligent laparoscope system for semi‐automatic minimally invasive surgery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant