CN116672031A - Robot control method and device, processor and electronic equipment - Google Patents

Robot control method and device, processor and electronic equipment

Info

Publication number
CN116672031A
Authority
CN
China
Prior art keywords
robot
coordinate system
target
joint
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310964547.2A
Other languages
Chinese (zh)
Other versions
CN116672031B (en)
Inventor
张靖
魏晓晨
李文龙
王远
尹政顺
李文彦
朱海燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yidian Lingdong Technology Co ltd
Original Assignee
Beijing Yidian Lingdong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yidian Lingdong Technology Co ltd filed Critical Beijing Yidian Lingdong Technology Co ltd
Priority to CN202310964547.2A priority Critical patent/CN116672031B/en
Publication of CN116672031A publication Critical patent/CN116672031A/en
Application granted granted Critical
Publication of CN116672031B publication Critical patent/CN116672031B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/16 Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
    • A61B 17/1657 Bone breaking devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/16 Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
    • A61B 17/1613 Component parts
    • A61B 17/1626 Control means; Display units
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/305 Details of wrist mechanisms at distal ends of robotic arms
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a robot control method and device, a processor and electronic equipment. The method comprises the following steps: calibrating the robot based on conversion relations among a visual coordinate system of the robot's visual acquisition device, the axis joint coordinate systems of the robot and a motion terminal coordinate system to obtain a calibration result, wherein the conversion relations comprise position information and/or posture information corresponding to knee joint surgery requirement information; determining a target motion plane of the robot based on the calibration result; determining, based on the target motion plane, a target position of each axis joint of the robot corresponding to the knee joint surgery requirement information; and controlling the axis joints to move to the target positions to execute the osteotomy operation, and, in response to an error in the motion trajectory of an axis joint during the osteotomy operation, performing error processing on the motion trajectory to obtain a processing result. The application solves the technical problem of low accuracy of knee joint surgery performed by a robot.

Description

Robot control method and device, processor and electronic equipment
Technical Field
The present application relates to the field of medical technologies, and in particular, to a method and apparatus for controlling a robot, a processor, and an electronic device.
Background
Currently, robots for performing knee joint surgery mostly adopt seven-degree-of-freedom general-purpose robots to determine the osteotomy plane. However, when a seven-degree-of-freedom general-purpose robot is used for such specific work, the effective utilization of its workspace and its degree of automation are very low, and the large volume and mass of the robot affect the layout of the various instruments in the knee surgery operating room. It is therefore difficult to meet the actual requirements of knee joint surgery, and the technical problem of low accuracy of knee joint surgery performed by robots remains.
For the technical problem in the related art that robots perform knee joint surgery with low accuracy, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment of the application provides a control method, a control device, a processor and electronic equipment of a robot, which are used for at least solving the technical problem of low accuracy of knee joint operation of the robot.
In order to achieve the above object, according to an aspect of an embodiment of the present application, there is provided a control method of a robot. The method may include: calibrating the robot based on a conversion relation among a visual coordinate system of visual acquisition equipment of the robot, an axis joint coordinate system of the robot and a motion terminal coordinate system to obtain a calibration result, wherein the conversion relation comprises position information and/or posture information corresponding to knee joint operation requirement information; determining a target motion plane of the robot based on the calibration result, wherein the target motion plane is a plane where the position of executing the osteotomy operation behavior meeting the knee joint operation requirement information is located; determining a target position of an axial joint of the robot corresponding to knee joint operation requirement information based on a target motion plane, wherein the target position is used for representing the position and the angle of the axial joint when the osteotomy operation is executed; and controlling the shaft joint to move to a target position to execute the osteotomy operation, and responding to the error of the movement track of the shaft joint in the process of executing the osteotomy operation, and performing error processing on the movement track to obtain a processing result.
Optionally, after determining the target motion plane of the robot based on the calibration result, the method further comprises: determining, for the target motion plane, a first conversion relationship of the osteotomy target position or of the target motion plane in the visual coordinate system, wherein the osteotomy target position is used to represent the position at which the osteotomy operation is performed.
Optionally, determining the target position of the axis joint of the robot corresponding to the knee joint surgery requirement information based on the target motion plane includes: determining a first pose conversion matrix between the target motion plane and the femur/tibia marker based on the first conversion relation and on the pose relation of the femur/tibia marker calibrated under the visual coordinate system, wherein the first pose conversion matrix is used to represent the pose conversion relation between the target motion plane and the femur/tibia marker; determining a position conversion matrix of the target motion plane in the base coordinate system based on the first pose conversion matrix, the pose relation and the calibrated conversion relation between the base coordinate system (among the axis joint coordinate systems) and the visual coordinate system; and determining the target position based on the position conversion matrix.
Optionally, determining the target location based on the location transformation matrix includes: simplifying the position conversion matrix to obtain the plane conversion relation of the simplified target motion plane in the base coordinate system; in the process of controlling a motion terminal of a robot to a target motion plane, determining a positive kinematic model of a plane conversion relation based on positive kinematics; determining a general solution of the positive kinematic model when the motion terminal is overlapped with the target motion plane; and selecting a target solution meeting preset conditions from the general solutions, wherein the target solution is used for representing the angle of the shaft joint when the target position is positioned or the target motion plane where the target position is positioned.
Optionally, selecting a target solution satisfying a preset condition from the general solutions includes: determining a solution in the general solutions to be the target solution in response to the solution not causing collision interference with the target object during the controlled movement of the robot, the solution not exceeding the axis boundary limits, and the speed of moving to the target position based on the solution being greater than or equal to a speed threshold while the energy consumed is less than or equal to an energy threshold.
Optionally, controlling the shaft joint to move to the target position to execute the osteotomy operation includes: acquiring start and end state data of the shaft joint from before the osteotomy operation is executed and after the osteotomy operation is executed; performing polynomial planning of target times based on the start and end state data to obtain a planning model, wherein the planning model is used for planning the trajectory that controls the movement of the robot; substituting the planning model into constraint conditions to obtain a planned target motion trajectory; and controlling the shaft joint to move to the target position according to the target motion trajectory.
Optionally, the method may include: determining actual position data of an end tool of the robot under the visual coordinate system based on a second conversion relation and a conversion relation between a flange mark of the robot and the visual coordinate system, wherein the second conversion relation is used for representing the conversion condition between a tool center point of the end tool of the robot and a base coordinate system in an axis joint coordinate system of the robot, and the flange mark is used for being deployed on the end tool of the robot; the actual position data is displayed on a display interface of the robot.
Optionally, in the process of executing the osteotomy operation, responding to the error of the motion track of the axial joint, and performing error processing on the motion track to obtain a processing result, wherein the processing result comprises: in the process of executing osteotomy operation behaviors, acquiring actual pose data of an end tool of a robot under a visual coordinate system, wherein the actual pose data are used for representing a motion trail; comparing the actual pose data with the expected pose data to obtain a comparison result, wherein the comparison result is used for indicating whether the actual pose data is identical with the expected pose data; and responding to the comparison result that the actual pose data is different from the expected pose data, and performing error processing on the actual pose data to obtain a processing result.
Optionally, in response to the comparison result being that the actual pose data is different from the expected pose data, performing error processing on the actual pose data to obtain a processing result, including: determining a rotation matrix after angle error processing based on the rotation axis and rotation angle of the robot and the actual plane normal vector and the expected plane normal vector of the end tool; and determining a pose issuing matrix after distance error processing based on the expected matrix of the robot and the error parameter in the previous cycle period, wherein the pose issuing matrix is a processing result.
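The angle-error processing described above can be illustrated with a short sketch. It is only a minimal illustration, assuming the standard axis-angle (Rodrigues) construction in which the rotation axis is the cross product of the actual and expected plane normal vectors and the rotation angle is the angle between them; the function and variable names are illustrative and not taken from the patent.

```python
import numpy as np

def angle_error_rotation(actual_normal: np.ndarray, expected_normal: np.ndarray) -> np.ndarray:
    """Rotation matrix that turns the actual plane normal onto the expected one (Rodrigues formula)."""
    n_a = actual_normal / np.linalg.norm(actual_normal)
    n_e = expected_normal / np.linalg.norm(expected_normal)
    axis = np.cross(n_a, n_e)                      # rotation axis
    s = np.linalg.norm(axis)                       # sine of the rotation angle
    c = np.clip(np.dot(n_a, n_e), -1.0, 1.0)       # cosine of the rotation angle
    if s < 1e-9:                                   # normals already aligned (anti-parallel normals would need special handling)
        return np.eye(3)
    k = axis / s
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])             # skew-symmetric matrix of the unit axis
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)
```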
Optionally, calibrating the robot based on a conversion relation among a visual coordinate system of a visual acquisition device of the robot, an axis joint coordinate system of the robot and a motion terminal coordinate system to obtain a calibration result, including: based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system, calibrating the robot by hand and eye to obtain a hand and eye calibration result, and calibrating the robot by a tool to obtain a tool calibration result, wherein the calibration result comprises the hand and eye calibration result and the tool calibration result.
Optionally, performing hand-eye calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a hand-eye calibration result, including: determining a first transformation matrix between a base and a first shaft joint of the robot, determining a second transformation matrix between the first shaft joint and a second shaft joint, and determining a third transformation matrix between the second shaft joint and a third shaft joint of the robot based on parameters to be calibrated of the robot, wherein the shaft joint comprises the first shaft joint, the second shaft joint and the third shaft joint, the first transformation matrix is used for representing a conversion relation between the first shaft joint and the base, the second transformation matrix is used for representing a conversion relation between the first shaft joint and the second shaft joint, and the third transformation matrix is used for representing a conversion relation between the second shaft joint and the third shaft joint; and (3) calibrating the hand and the eye of the robot based on the first transformation matrix, the second transformation matrix and the third transformation matrix, and determining the conversion relation between the base coordinate system and the visual coordinate system in the axis joint coordinate system after the hand and the eye are calibrated and the conversion relation between the motion terminal coordinate system and the flange mark of the robot as hand and eye calibration results.
Optionally, based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system, performing tool calibration on the robot to obtain a tool calibration result, including: determining a second conversion relation between a tool center point of an end tool of the robot and a flange mark of the robot and a third conversion relation between the tool center point and a base coordinate system in an axis joint coordinate system based on parameters to be calibrated of the robot; and based on the second conversion relation and the third conversion relation, calibrating the tool of the robot to obtain a tool calibration result.
Optionally, after performing hand-eye calibration on the robot based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a hand-eye calibration result, and performing tool calibration on the robot to obtain a tool calibration result, the method further includes: determining the image feature position of a target object on which the osteotomy operation is to be executed; and registering the robot based on the conversion relation between the image feature position and the visual coordinate system to obtain a registration result.
In order to achieve the above object, according to another aspect of the embodiments of the present application, there is also provided a control device of a robot. The apparatus may include: the calibration unit is used for calibrating the robot based on the conversion relation among the visual coordinate system of the visual acquisition equipment of the robot, the axis joint coordinate system of the robot and the motion terminal coordinate system to obtain a calibration result, wherein the conversion relation comprises position information and/or posture information corresponding to knee joint operation requirement information; the first determining unit is used for determining a target motion plane of the robot based on the calibration result, wherein the target motion plane is a plane where a position for executing osteotomy operation behaviors meeting knee joint operation requirement information is located; a second determining unit, configured to determine, based on a target movement plane, a target position of an axial joint of the robot corresponding to the knee joint operation requirement information, where the target position is used to represent a position and an angle at which the axial joint is located when performing an osteotomy operation; the processing unit is used for controlling the shaft joint to move to the target position to execute the osteotomy operation, responding to the error of the movement track of the shaft joint in the process of executing the osteotomy operation, and carrying out error processing on the movement track to obtain a processing result.
In order to achieve the above object, according to still another aspect of the embodiments of the present application, there is also provided a processor for executing a program stored in a memory, wherein the program executes the control method of the robot according to any one of the above.
In order to achieve the above object, according to still another aspect of the embodiments of the present application, there is further provided an electronic device including one or more processors and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the control method of the robot according to any one of the above.
In the embodiment of the application, the robot is calibrated based on the conversion relations among the visual coordinate system of the robot's visual acquisition device, the axis joint coordinate systems of the robot and the motion terminal coordinate system to obtain a calibration result, wherein the conversion relations comprise position information and/or posture information corresponding to the knee joint surgery requirement information; a target motion plane of the robot is determined based on the calibration result, wherein the target motion plane is the plane in which the position for executing the osteotomy operation satisfying the knee joint surgery requirement information is located; a target position of each axis joint of the robot corresponding to the knee joint surgery requirement information is determined based on the target motion plane, wherein the target position is used to represent the position and angle of the axis joint when the osteotomy operation is executed; and the axis joints are controlled to move to the target positions to execute the osteotomy operation, with error processing performed on the motion trajectory, in response to an error in the motion trajectory of an axis joint during the osteotomy operation, to obtain a processing result. That is, in the embodiment of the application, the parameters of the robot to be calibrated can be calibrated by determining the relative position conversion relations and/or posture conversion relations, satisfying the knee joint surgery requirement information, between the visual coordinate system of the visual acquisition device and the axis joint coordinate systems and motion terminal coordinate system of the robot, so as to obtain the calibration result. After the robot is calibrated, the target motion plane on which the osteotomy operation is to be executed can be determined, and the target positions that the axis joints must reach for the motion terminal of the robot to coincide with the target motion plane can be solved, so that each axis joint of the robot can be controlled to move to its corresponding target position to execute the osteotomy operation. During execution of the osteotomy operation, whether the motion trajectory of an axis joint deviates can be detected in real time, and if a deviation occurs, the motion trajectory can be re-planned for error processing. By taking into account the error arising while the osteotomy operation is executed, the accuracy of the target position is improved, thereby solving the technical problem of low accuracy of knee joint surgery performed by a robot and achieving the technical effect of improving the accuracy of the robot's knee joint surgery.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application. In the drawings:
fig. 1 is a flowchart of a control method of a robot according to an embodiment of the present application;
FIG. 2 is a flow chart of a control method of a three degree of freedom surface planning robot applied to a total knee replacement surgery according to an embodiment of the present application;
FIG. 3 is a schematic view of the configuration of each axis joint and coordinate system definition of a robot according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a tool calibration fixture according to an embodiment of the present application;
FIG. 5 is a schematic illustration of a pre-osteotomy condition of a knee joint, in accordance with an embodiment of the present application;
FIG. 6 is a schematic illustration of a post-knee osteotomy condition in accordance with an embodiment of the present application;
FIG. 7 is a schematic diagram of tracking control logic according to an embodiment of the present application;
fig. 8 is a schematic view of a control device of a robot according to an embodiment of the present application;
fig. 9 is an electronic device according to an embodiment of the application.
Detailed Description
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without making any inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the application herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that, related information (including, but not limited to, user equipment information, user personal information, etc.) and data (including, but not limited to, data for presentation, analyzed data, etc.) related to the present disclosure are information and data authorized by a user or sufficiently authorized by each party. For example, an interface is provided between the system and the relevant user or institution, before acquiring the relevant information, the system needs to send an acquisition request to the user or institution through the interface, and acquire the relevant information after receiving the consent information fed back by the user or institution.
Example 1
According to an embodiment of the present application, there is provided an embodiment of a control method of a robot, it being noted that the steps shown in the flowcharts of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that herein.
Fig. 1 is a flowchart of a control method of a robot according to an embodiment of the present application, and as shown in fig. 1, the method may include the steps of:
step S102, calibrating the robot based on a conversion relation among a visual coordinate system of visual acquisition equipment of the robot, an axis joint coordinate system of the robot and a motion terminal coordinate system to obtain a calibration result, wherein the conversion relation comprises position information and/or posture information corresponding to knee joint operation requirement information.
In the technical scheme provided in the step S102, a corresponding visual coordinate system can be established for the vision acquisition device, a corresponding axis joint coordinate system can be established for each axis joint of the robot, a corresponding motion terminal coordinate system can be established for a motion terminal of the robot, and a conversion relation between the visual coordinate system and the three of the axis joint coordinate system and the motion terminal coordinate system and related to knee joint operation requirements can be determined, so that the robot is calibrated based on the conversion relation, and a calibration result can be obtained, wherein the robot can at least comprise three axis joints. The three axis joints correspond to respective coordinate systems. The axis joint coordinate system may include coordinate systems of three axis joints and a base coordinate system of the robot. The motion terminal may be the end position of the system to which the robot is to be moved. The motion terminal coordinate system can be a flange coordinate system, also can be called a triaxial coordinate system, and the motion terminal of the robot can be the origin of the flange coordinate system. The conversion relationship may include position information and/or posture information corresponding to knee surgery requirement information. The knee joint surgery may be total knee replacement surgery by replacing the diseased joint of the patient by implantation of an artificial joint prosthesis. The knee surgery requirement information may be a surgery action that needs to be performed in connection with knee surgery. The robot may be a tandem robot, and may be referred to as a knee joint robot, a knee joint navigation robot, a knee joint surgical robot, a three-degree-of-freedom surface planning robot, or the like. It should be noted that the above robot is merely illustrative, and is not particularly limited herein.
In the embodiment of the application, the knee joint surgery requirement information requires that the robot be able to position the end tool on the motion terminal onto the target plane on which the osteotomy operation is performed. Six degrees of freedom exist in space, which may include three translational degrees of freedom and three rotational degrees of freedom. When a plane is uniquely determined, the plane locks three of these degrees of freedom, namely the translational degree of freedom along the Y axis, the rotational degree of freedom about the X axis and the rotational degree of freedom about the Z axis, so that the pose of the unique plane can be controlled based on these three degrees of freedom.
Optionally, to meet knee surgery requirement information, the robot requires at least three degrees of freedom to perform positioning of different planes. By combining space utilization and execution efficiency, a serial topology of three mutually perpendicular rotational shaft joints can be designed.
Alternatively, the center of the visual acquisition device may be taken as the origin of the visual coordinate system and the visual coordinate system established according to the right-hand rule; likewise, the center of the base of the robot may be taken as the origin of the base coordinate system and the base coordinate system established according to the right-hand rule, and a point in the base coordinate system may be expressed as ^0P. According to the right-hand rule, the first axis joint coordinate system is established with the center of the first axis joint of the robot as its origin, and a point in this coordinate system may be expressed as ^1P. In the same way, a second axis joint coordinate system may be established for the second axis joint of the robot, in which a point may be expressed as ^2P, and a third axis joint coordinate system for the third axis joint of the robot, in which a point may be expressed as ^3P. The origin of the base coordinate system and the origin of the first axis joint coordinate system may be determined to be the same point, and the origin of the second axis joint coordinate system and the origin of the third axis joint coordinate system may also be determined to be the same point. It should be noted that this is given by way of example only; the origin of each coordinate system and the directions of the coordinate axes are not specifically limited here.
Optionally, an optical positioner may be deployed around the robot, where the optical positioner may obtain a pose matrix detected by a flange Marker (flange Marker), where the flange Marker may be fixed to an end tool, and the end tool may be fixed to a flange at the end of the movement of the robot.
Optionally, through hand-eye calibration, the conversion relation between the base coordinate system and the visual coordinate system in the axis joint coordinate system of the robot can be determined, and the conversion relation between the flange Marker and the motion terminal coordinate system can also be determined.
Optionally, through tool calibration, a conversion relationship between a tool center point of an end tool of the robot and the flange Marker can be determined, and a conversion relationship between the tool center point and a base coordinate system in an axis joint coordinate system can be determined.
Optionally, through registration, determining the conversion relation between the characteristic position of the human body local stereoscopic image and the visual coordinate system.
In the embodiment of the application, three methods of hand-eye calibration, tool calibration and registration can be adopted for the robot to be calibrated, so that the conversion relation among various coordinate systems can be determined, and in the calibration process, the conversion relation among various coordinate systems can be continuously and accurately obtained, and the calibration result after the final calibration is finished, namely, the conversion relation among the final coordinate systems after the calibration is finished, and the technical effect of improving the accuracy of knee joint operation of the robot is realized by considering the common calibration of various calibration methods.
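The calibration results described above amount to a set of 4x4 homogeneous transformation matrices that are composed and inverted in the derivations that follow. Below is a minimal Python/NumPy sketch of the two helpers, assuming rigid-body transforms; the names are illustrative only and are not identifiers from the patent.

```python
import numpy as np

def make_transform(R: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a 3-vector translation p."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

def invert_transform(T: np.ndarray) -> np.ndarray:
    """Invert a rigid-body transform without a general matrix inverse."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

# Example composition: a point expressed in the camera (visual) frame mapped into the base frame,
# assuming T_base_camera was obtained from hand-eye calibration (hypothetical variable name):
# p_base_h = T_base_camera @ np.append(p_camera, 1.0)
```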
Step S104, determining a target motion plane of the robot based on the calibration result, wherein the target motion plane is a plane where the position of the osteotomy operation behavior meeting the knee joint operation requirement information is located.
In the technical solution provided in the above step S104, after the calibration result is obtained based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system, the target motion plane in which the osteotomy operation satisfying the calibrated knee joint surgery requirement is to be executed may be determined based on the calibration result. There may be six target motion planes, which may include a tibial plane, a posterior femoral chamfer plane, a distal femoral plane, an anterior femoral chamfer plane, an anterior femoral condyle plane and a posterior femoral condyle plane. It should be noted that the above target motion planes are merely illustrative and are not specifically limited here.
Optionally, a transformation relationship of the osteotomy target position or the osteotomy plane on the visual coordinate system for performing the osteotomy operation behavior may be defined, so as to calculate a transformation relationship of the osteotomy target position or the osteotomy plane on the base coordinate system of the robot.
Step S106, determining a target position of an axial joint of the robot corresponding to the knee joint operation requirement information based on the target motion plane, wherein the target position is used for representing the position and the angle of the axial joint when the osteotomy operation is executed.
In the technical solution provided in the above step S106 of the present application, after determining the target movement plane of the osteotomy operation behavior corresponding to the calibrated knee joint operation requirement based on the calibration, the target position of the axial joint of the robot corresponding to the knee joint operation information may be determined based on the target movement plane, where the target position may be used to represent the position and angle of the axial joint when the osteotomy operation behavior is performed.
Optionally, after the conversion relation, in the visual coordinate system, of the osteotomy target position or osteotomy plane at which the osteotomy operation is to be executed has been defined, the inverse kinematics solution can be computed and one of the multiple inverse solutions selected. The analytic inverse kinematics solution of the robot, i.e., the angle of each axis joint when positioned at the osteotomy target position or on the osteotomy plane, can be computed quickly, and the optimal solution, i.e., the optimal target position of the axis joints, can be selected from the multiple groups of inverse solutions.
In the embodiment of the application, determining the target positions of the axis joints corresponding to the osteotomy target position by inverse kinematics reduces the number of parameters in the homogeneous transformation matrix used by a conventional robot system by 75%, which reduces the amount of real-time computation and thereby improves the computational efficiency of real-time tracking of the robot.
Step S108, controlling the shaft joint to move to the target position to execute the osteotomy operation, and responding to the error of the movement track of the shaft joint in the process of executing the osteotomy operation, and performing error processing on the movement track to obtain a processing result.
In the technical scheme provided in the step S108, after the target position of the shaft joint corresponding to the calibrated knee joint operation requirement information is determined, each shaft joint of the robot can be controlled to move to the corresponding target position, so that the osteotomy operation behavior is executed, whether the motion track of the shaft joint deviates or not can be detected in real time in the process of executing the osteotomy operation behavior, and if the deviation occurs, the motion track can be re-planned to perform error processing.
Optionally, after the target position of the shaft joint is obtained, trajectory planning can be performed according to the inverse kinematics solution, so that the trajectory along which the shaft joint moves from its current position to the target position can be planned, thereby driving the corresponding shaft joint of the robot to move to the target position to execute the osteotomy operation.
Optionally, the position of the end tool of the robot may be displayed in real time while moving to the target position, and the current positions of the robot's axis joints during the movement may also be displayed. The error between the current position and the expected position while moving along the trajectory can be determined and error processing performed, thereby improving the accuracy of the robot's knee joint surgery.
In the embodiment of the application, through the above steps S102 to S108, the robot is calibrated based on the conversion relations among the visual coordinate system of the robot's visual acquisition device, the axis joint coordinate systems of the robot and the motion terminal coordinate system to obtain a calibration result, wherein the conversion relations comprise position information and/or posture information corresponding to the knee joint surgery requirement information; a target motion plane of the robot is determined based on the calibration result, wherein the target motion plane is the plane in which the position for executing the osteotomy operation satisfying the knee joint surgery requirement information is located; a target position of each axis joint of the robot corresponding to the knee joint surgery requirement information is determined based on the target motion plane, wherein the target position is used to represent the position and angle of the axis joint when the osteotomy operation is executed; and the axis joints are controlled to move to the target positions to execute the osteotomy operation, with error processing performed on the motion trajectory, in response to an error in the motion trajectory of an axis joint during the osteotomy operation, to obtain a processing result. That is, in the embodiment of the application, the parameters of the robot to be calibrated can be calibrated by determining the relative position conversion relations and/or posture conversion relations, satisfying the knee joint surgery requirement information, between the visual coordinate system of the visual acquisition device and the axis joint coordinate systems and motion terminal coordinate system of the robot, so as to obtain the calibration result. After the robot is calibrated, the target motion plane on which the osteotomy operation is to be executed can be determined, and the target positions that the axis joints must reach for the motion terminal of the robot to coincide with the target motion plane can be solved, so that each axis joint of the robot can be controlled to move to its corresponding target position to execute the osteotomy operation. During execution of the osteotomy operation, whether the motion trajectory of an axis joint deviates can be detected in real time, and if a deviation occurs, the motion trajectory can be re-planned for error processing. By taking into account the error arising while the osteotomy operation is executed, the accuracy of the target position is improved, thereby solving the technical problem of low accuracy of knee joint surgery performed by a robot and achieving the technical effect of improving the accuracy of the robot's knee joint surgery.
The above-described method of this embodiment is further described below.
As an optional embodiment, in step S104, after determining the target motion plane of the robot based on the calibration result, the method further includes: determining, for the target motion plane, a first conversion relationship of the osteotomy target position or of the target motion plane in the visual coordinate system, wherein the osteotomy target position is used to represent the position at which the osteotomy operation is performed.
In this embodiment, the target motion plane for executing the osteotomy operation that satisfies the knee joint surgery requirement information may be acquired, and a first conversion relationship of the osteotomy target position or of the target motion plane in the visual coordinate system may be determined, wherein the osteotomy target position may be used to represent the position at which the osteotomy operation is performed. The first conversion relationship may be used to represent the conversion relation of the target motion plane in the visual coordinate system.
Alternatively, the conversion relation of the osteotomy target position or target motion plane in the visual coordinate system, i.e., the first conversion relationship, may be defined for each osteotomy target position or target motion plane, where an index i denotes the sequence number of the osteotomy target position or target motion plane. Based on the first conversion relationship, the conversion relation of the osteotomy target position or target motion plane in the base coordinate system of the robot can be determined.
As an optional embodiment, in step S106, determining the target position of the axis joint of the robot corresponding to the knee joint surgery requirement information based on the target motion plane includes: determining a first pose conversion matrix between the target motion plane and the femur/tibia marker based on the first conversion relation and on the pose relation of the femur/tibia marker calibrated under the visual coordinate system, wherein the first pose conversion matrix is used to represent the pose conversion relation between the target motion plane and the femur/tibia marker; determining a position conversion matrix of the target motion plane in the base coordinate system based on the first pose conversion matrix, the pose relation and the calibrated conversion relation between the base coordinate system (among the axis joint coordinate systems) and the visual coordinate system; and determining the target position based on the position conversion matrix.
In this embodiment, in the process of determining the target position of the axis joint of the robot corresponding to the calibrated knee joint surgery requirement information, the first pose conversion matrix between the target motion plane and the femur/tibia marker may be determined based on the first conversion relation, together with the pose relation of the femur/tibia marker calibrated under the visual coordinate system; the position conversion matrix of the target motion plane in the base coordinate system may then be determined based on the first pose conversion matrix, the pose relation and the calibrated conversion relation between the base coordinate system and the visual coordinate system, where the first pose conversion matrix may be used to represent the pose conversion relation between the target motion plane and the femur/tibia marker. The femur/tibia marker may also be referred to as the femur/tibia Marker.
Optionally, based on the first conversion relationship, the conversion relation of the osteotomy target position or the target motion plane in the base coordinate system of the robot can be further calculated. In the embodiment of the application, for the three-degree-of-freedom plane planning robot system, the number of parameters in the homogeneous transformation matrix of a conventional robot system can be reduced by 75%, so that the amount of real-time computation can be reduced and the computational efficiency of real-time tracking of the robot can be improved.
Alternatively, the robot may need to determine six target motion planes at the knee joint to guide the physician in performing the osteotomy operation. The six target motion planes and their pose conversion matrices relative to the femur/tibia Marker are obtained through registration and surgical planning. The conversion relation between the robot base coordinate system and the visual coordinate system can be determined through hand-eye calibration, and the pose of the femur/tibia Marker in the visual coordinate system can be obtained in real time.
Optionally, based on the first pose conversion matrix, the real-time Marker pose and the calibrated conversion relation between the base coordinate system and the visual coordinate system, the position conversion matrix of the target motion plane in the base coordinate system can be calculated by composing these transformations, i.e. T(base←plane) = T(base←camera) · T(camera←marker) · T(marker←plane).
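A short sketch of how the composition above might be evaluated, assuming each conversion relation is available as a 4x4 homogeneous matrix; the matrix names are assumptions, not identifiers from the patent.

```python
import numpy as np

def plane_pose_in_base(T_base_camera: np.ndarray,
                       T_camera_marker: np.ndarray,
                       T_marker_plane: np.ndarray) -> np.ndarray:
    """Pose of an osteotomy target plane in the robot base frame.

    T_base_camera   : hand-eye calibration result (visual/camera frame -> base frame)
    T_camera_marker : real-time pose of the femur/tibia Marker in the visual frame
    T_marker_plane  : planned plane pose relative to the femur/tibia Marker
    """
    return T_base_camera @ T_camera_marker @ T_marker_plane
</code>
```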
As an optional embodiment, step S106, determining the target location based on the location conversion matrix includes: simplifying the position conversion matrix to obtain the plane conversion relation of the simplified target motion plane in the base coordinate system; in the process of controlling a motion terminal of a robot to a target motion plane, determining a positive kinematic model of a plane conversion relation based on positive kinematics; determining a general solution of the positive kinematic model when the motion terminal is overlapped with the target motion plane; and selecting a target solution meeting preset conditions from the general solutions, wherein the target solution is used for representing the angle of the shaft joint when the target position is positioned or the target motion plane where the target position is positioned.
In this embodiment, in the process of determining the target position based on the position conversion matrix, the position conversion matrix may be simplified to obtain the plane conversion relation of the simplified target motion plane in the base coordinate system. In the process of controlling the motion terminal of the robot to the target motion plane, a positive kinematic model (i.e., a forward kinematic model) of the plane conversion relation may be determined based on positive kinematics, the general solution of the positive kinematic model for the case in which the motion terminal coincides with the target motion plane may be determined, and a target solution satisfying a preset condition may be selected from the general solution, where the target solution may be used to characterize the angles of the axis joints when positioned at the target position or on the target motion plane in which the target position is located. The general solution consists of the angles of the axis joints of the robot's mechanical arm, and may include the angles of the first, second and third axis joints. Each solution may also be referred to as an inverse solution.
Alternatively, the position conversion matrix may be simplified. A 4x4 homogeneous transformation matrix contains 16 parameters in total, of which 12 are effective; the position and attitude information of all spatial degrees of freedom can be fully expressed by these 12 effective parameters, in the form T = [R, p; 0, 1], where the three-vector p may be used to represent the position information and the 3x3 rotation block R (nine parameters) may be used to represent the attitude information. If the XOZ plane is taken as the target motion plane, the 12 effective parameters can be processed so that the parameters irrelevant to the plane are removed and only the information expressing the target motion plane is retained, giving a new plane conversion relation of the target motion plane in the base coordinate system. This plane conversion relation contains four parameters, which can be taken as the components (A, B, C) of the plane's unit normal vector and the plane offset D, so that the plane satisfies A·x + B·y + C·z + D = 0.
Optionally, the simplified plane conversion relation of the target motion plane in the base coordinate system is used as the input data for robot motion control, for further positioning or tracking. Compared with single positioning, the effect of this data simplification is more pronounced in the tracking mode, which demands high real-time performance.
Optionally, after obtaining the real-time plane conversion relation between the target motion plane and the base coordinate system, the robot can bring its tip onto the target motion plane by controlling the angles of the respective axis joints. Taking the robot's axis joint angles (θ1, θ2, θ3) as parameters, the pose of the tip of the robot can be obtained from the positive kinematic model defined by the configuration design of the robot.
Alternatively, the positive kinematic model may be expressed as the pose of the tip in the base coordinate system as a function of (θ1, θ2, θ3), i.e., as the product of the successive per-joint transformation matrices described above.
alternatively, the principle of solving the positive kinematic model is as follows: obtaining a robot axis joint angle when the tip plane (set as the tip XOZ plane) of the robot is overlapped with the target motion planeIs a general solution expression of (2). The two planes can be said to coincide when the end XOZ plane of the robot is parallel to the target movement plane and the distance of the end point from the target movement plane is equal to zero.
Alternatively, the process may be carried out in a single-stage,the vector of the first three rows and the second column of the matrix may represent the normal vector of the terminal XOZ plane.Is->The normal vector may be a target motion plane, and the two normal vectors may be unit vectors, and if the two unit vectors are equal, it may be described that the terminal XOZ plane is parallel to the target motion plane. The distance between the end point and the target motion plane can be equal to zero, and the following formulas (A) - (D) are obtained:
(A)
(B)
(C)
(D)
The above equations are transcendental equations and are solved as follows:
From (A), the following is obtained:
(E)
Combining (B) and (D) yields:
(F)
(G)
Combining (C) and (G) yields:
(H)
(I)
Combining (F) and (I) yields:
(J)
In summary, based on (E), (H) and (J), the final result, i.e., the general solution for the three axis joint angles, is obtained.
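The closed-form general solution above depends on the specific link geometry of the robot. As a hedged alternative (not the patent's analytic solution), the same two coincidence conditions, equal unit normals and zero point-to-plane distance, can be enforced numerically for any forward kinematic model supplied by the caller.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_plane_coincidence(fk, plane_abcd, theta0):
    """Numerically find joint angles so the tip XOZ plane coincides with the target plane.

    fk(theta) -> 4x4 tip pose in the base frame (forward kinematic model, supplied by the caller).
    plane_abcd = (A, B, C, D): target plane in the base frame, A*x + B*y + C*z + D = 0.
    theta0: initial guess for the three axis joint angles.
    """
    n_target = np.asarray(plane_abcd[:3], dtype=float)
    n_target = n_target / np.linalg.norm(n_target)
    d_target = float(plane_abcd[3])

    def residuals(theta):
        T = fk(theta)
        n_end = T[:3, 1]                    # normal of the tip XOZ plane (Y-axis column)
        p_end = T[:3, 3]                    # tip point
        dist = n_target @ p_end + d_target  # signed point-to-plane distance
        return np.concatenate([n_end - n_target, [dist]])  # the four conditions (A)-(D)

    return least_squares(residuals, theta0).x
```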
as an optional embodiment, step S106, selecting a target solution from the general solutions, where the target solution meets a preset condition, includes: in response to the solution in the general solution, collision interference with the target object is not generated in the process of controlling the robot to move, the solution does not exceed the shaft boundary limit, and in response to the solution, the speed is greater than or equal to the speed threshold and the consumed energy is less than or equal to the energy threshold when the robot moves to the target position, and the solution is determined to be the target solution.
In this embodiment, in the process of controlling the robot to perform movement, it may be determined whether a solution in the full solution collides with the target object, the magnitude of the limit between the solution and the axis boundary may be determined, and the speed and the consumed energy when moving to the target position based on the solution may be determined. When the robot is controlled to move based on a certain solution, collision interference does not occur with a target object, the solution does not exceed the axis boundary limit, and the solution can be determined to be the target solution based on the fact that the speed of the solution when the solution moves to the target position is greater than or equal to a speed threshold and the consumed energy is less than or equal to an energy threshold, wherein the target object can be the robot or other objects. The target solution may be an optimal solution.
Alternatively, the rule for selecting the target solution from the general solutions is as follows: (1) According to the current guidance, the robot motion can not collide and interfere with the robot or other physical objects; (2) the current solution should not exceed a preset shaft boundary limit; (3) A set of solutions is selected that move from the current location to the target location at the fastest speed and consume the least energy.
Optionally, after acquiring the current axis joint angles of the mechanical arm, weight coefficients for the first, second and third axis joints of the robot are determined according to the combined efficiency and energy-consumption principle, and the group of inverse solutions whose weighted deviation from the current axis joint angles is smallest, among the different groups of inverse solutions, can be selected as the optimal solution.
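A minimal sketch of this selection rule, assuming the optimal solution is the feasible group of inverse solutions with the smallest weighted joint travel from the current angles; the weight values and the feasibility callbacks stand in for the axis-boundary and collision-interference checks and are assumptions.

```python
import numpy as np

def select_optimal_solution(candidate_solutions, theta_current, weights=(1.0, 1.0, 1.0),
                            within_limits=lambda theta: True, collision_free=lambda theta: True):
    """Pick the inverse solution closest (in weighted joint motion) to the current axis joint angles.

    candidate_solutions : iterable of joint-angle triples (the groups of inverse solutions)
    theta_current       : current angles of the first, second and third axis joints
    weights             : per-joint weight coefficients (efficiency / energy-consumption weighting)
    within_limits, collision_free : caller-supplied feasibility checks
    """
    w = np.asarray(weights, dtype=float)
    theta_current = np.asarray(theta_current, dtype=float)
    best, best_cost = None, np.inf
    for theta in candidate_solutions:
        theta = np.asarray(theta, dtype=float)
        if not (within_limits(theta) and collision_free(theta)):
            continue
        cost = float(np.sum(w * np.abs(theta - theta_current)))  # weighted joint travel
        if cost < best_cost:
            best, best_cost = theta, cost
    return best
```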
as an alternative embodiment, step S108, controlling the shaft articulation to the target location performs an osteotomy maneuver, comprising: acquiring start and end state data of the shaft joint before the osteotomy operation is executed and after the osteotomy operation is executed; based on the start-end state data, performing polynomial programming of target times to obtain a programming model, wherein the programming model is used for programming a track for controlling the movement of the robot; the planning model is brought into constraint conditions, and a planned target motion track is obtained; the control shaft joint moves to a target position according to a target movement track.
In this embodiment, before the robot is controlled to move to the target position to perform the osteotomy operation, start and end state data before the robot performs the osteotomy operation and after the robot performs the osteotomy operation may be obtained, and polynomial planning of the target times may be performed on the start and end state data to obtain a planning model, the planning model may be brought into constraint conditions to obtain a planned target movement track, and the axis joint may be controlled to move to the target position according to the target movement track, where the planning model may be used to plan the track for controlling the robot to move. The target motion profile may be used to control the movement of the axial joint to a target position. The target number of times may be three or five times. It should be noted that the above target number is merely illustrative, and is not particularly limited herein. The start-end state data may include position, velocity, acceleration, etc. data of the initial state and the end state. It should be noted that the start-end state data is merely illustrative, and is not particularly limited herein.
Optionally, the three-degree-of-freedom surface planning robot for total knee replacement surgery designed in the embodiments of the present application requires smooth, collision-free motion. Compared with Cartesian space planning, planning the trajectory in the axis joint space uses a simpler algorithm, moves more efficiently, and avoids the motion-singularity problem of the mechanism, so the trajectory planning should be carried out in the axis joint space.
Optionally, for a system in which the multiple axes of the robot are mutually coupled, the load inertia of each axis joint changes continuously during motion, so conventional trapezoidal or parabolic planning cannot meet the requirement well; cubic/quintic polynomial planning can be used instead, which accommodates, to a certain extent, the continuously changing load inertia of each axis joint.
For example, conventional quintic polynomial planning can be adopted, and the six boundary conditions comprising the position, velocity and acceleration of the start state and the end state are used to solve the following planning model:
Optionally, the above planning model may be substituted into the following constraints:
where the six boundary quantities denote, respectively, the position, velocity and acceleration of the initial state and the position, velocity and acceleration of the end state.
Optionally, after the planning model is substituted into the constraint conditions, the planned target motion track can be solved to obtain the following formula:
optionally, after obtaining the target motion trail, the target motion trail may be issued to each axis joint of the robot, so as to guide the robot to perform the motion.
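As a hedged illustration of this quintic planning (scalar joint coordinates, a hypothetical motion time T and illustrative names, none of which are prescribed by the patent), the six coefficients can be solved from the start and end boundary conditions and the trajectory sampled for issuing to the axis joints:

```python
import numpy as np

def quintic_coefficients(q0, v0, a0, qf, vf, af, T):
    """Solve the six coefficients of q(t) = c0 + c1*t + ... + c5*t**5 from the
    start/end position, velocity and acceleration over a motion time T."""
    M = np.array([
        [1, 0, 0,    0,       0,        0],
        [0, 1, 0,    0,       0,        0],
        [0, 0, 2,    0,       0,        0],
        [1, T, T**2, T**3,    T**4,     T**5],
        [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4],
        [0, 0, 2,    6*T,     12*T**2,  20*T**3],
    ], dtype=float)
    b = np.array([q0, v0, a0, qf, vf, af], dtype=float)
    return np.linalg.solve(M, b)

def sample_trajectory(coeffs, T, steps=100):
    """Sample joint positions along the planned trajectory, e.g. for streaming
    set-points to each axis joint."""
    t = np.linspace(0.0, T, steps)
    powers = np.vstack([t**i for i in range(6)])   # shape (6, steps)
    return coeffs @ powers
```

Calling quintic_coefficients once per shaft joint and streaming the sampled values would reproduce the planning-then-issuing flow described above.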
As an alternative embodiment, the method may include step S108: determining actual position data of an end tool of the robot under the visual coordinate system based on a second conversion relation and a conversion relation between a flange mark of the robot and the visual coordinate system, wherein the second conversion relation is used for representing the conversion condition between a tool center point of the end tool of the robot and a base coordinate system in an axis joint coordinate system of the robot, and the flange mark is used for being deployed on the end tool of the robot; the actual position data is displayed on a display interface of the robot.
In this embodiment, the actual position data of the end tool in the visual coordinate system may be determined based on the second conversion relation and the conversion relation between the flange mark of the robot and the visual coordinate system, and the actual position data may be displayed on the display interface, where the second conversion relation may be used to represent the conversion situation between the tool center point of the end tool and the base coordinate system of the robot. The actual location data may also be referred to as an actual location.
Optionally, the conversion matrix from the tool center point to the flange Marker has already been obtained, and the conversion relation of the flange Marker in the visual coordinate system can be determined in real time, so the real-time position of the end tool in the visual coordinate system may be calculated by the following formula and displayed in the display interface:
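A minimal sketch of this chaining (the function and argument names are illustrative only) combines the real-time camera measurement of the flange Marker with the calibrated Marker-to-TCP transform and extracts the translation for display:

```python
import numpy as np

def tool_pose_in_vision(T_vision_flange_marker, T_flange_marker_tcp):
    """Chain the real-time flange-Marker pose measured by the camera with the
    calibrated flange-Marker -> tool-center-point transform; inputs and result
    are 4x4 homogeneous matrices (an assumed convention)."""
    return T_vision_flange_marker @ T_flange_marker_tcp

def tool_position_for_display(T_vision_tcp):
    """Return the (x, y, z) translation of the end tool for the display interface."""
    return T_vision_tcp[:3, 3]
```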
as an optional embodiment, step S108, in the process of performing the osteotomy, performs error processing on the motion track in response to the error existing in the motion track of the axial joint, to obtain a processing result, where the processing result includes: in the process of executing osteotomy operation behaviors, acquiring actual pose data of an end tool of a robot under a visual coordinate system, wherein the actual pose data are used for representing a motion trail; comparing the actual pose data with the expected pose data to obtain a comparison result, wherein the comparison result is used for indicating whether the actual pose data is identical with the expected pose data; and responding to the comparison result that the actual pose data is different from the expected pose data, and performing error processing on the actual pose data to obtain a processing result.
In this embodiment, the actual pose data of the end tool of the robot in the visual coordinate system may be obtained, the actual pose data may be compared with the expected pose data, it may be determined whether the two pose data are the same, if the actual pose data are different from the expected pose data, it may be indicated that the actual pose data are inaccurate, and at this time, the error processing may be performed on the actual pose data. If the actual pose data is the same as the expected pose data, the fact that the actual pose data is accurate can be indicated, and the robot can be directly controlled to move without error processing, wherein the expected pose data can be the pose of the expected end tool in a visual coordinate system.
In the embodiment of the application, aiming at the situation that human body micro motion can occur in the total knee joint replacement operation, the three-degree-of-freedom plane planning robot system is required to adjust the position in real time to ensure the positioning accuracy, so that the embodiment of the application can design a tracking algorithm to perform error processing on inaccurate actual pose data.
Optionally, in the tracking control logic of the tracking algorithm, the first loop does not contain an error processing module. In the moving process of the robot, the actual pose data of the end tool in the visual coordinate system can be acquired in real time and compared with the expected pose data, thereby determining whether error handling is required.
As an optional embodiment, step S108, in response to the comparison result being that the actual pose data is different from the expected pose data, performs error processing on the actual pose data to obtain a processing result, including: determining a rotation matrix after angle error processing based on the rotation axis and rotation angle of the robot and the actual plane normal vector and the expected plane normal vector of the end tool; and determining a pose issuing matrix after distance error processing based on the expected matrix of the robot and the error parameter in the previous cycle period, wherein the pose issuing matrix is a processing result.
In this embodiment, when the actual pose data is different from the expected pose data, in the process of performing error processing on the actual pose data, a rotation matrix after angle error processing may be obtained based on the rotation axis, the rotation angle, and the actual plane normal vector and the expected plane normal vector of the end tool of the robot, and a pose issuing matrix after distance error processing may be obtained based on the expected matrix of the robot and an error parameter in the previous cycle, where the error parameter may be a distance error parameter.
Optionally, for the three-degree-of-freedom plane planning robot in the embodiment of the present application, the design goal is to determine a plane, so two measures of error are designed: the angle error and the distance error. The angle error may be the space angle between the desired plane normal vector and the normal vector of the plane in which the end tool is currently located. The distance error may be the projection distance, onto the normal vector of the desired plane, between a feature point of the desired plane and the current plane of the end tool.
Alternatively, the angle error may be processed by the error processing module in the tracking control logic as follows: the rotation axis and the rotation angle between the desired plane normal vector and the current plane normal vector are calculated according to the following formula, and the rotation matrix can then be calculated from the Rodrigues rotation formula:
where the antisymmetric matrix generated by the rotation axis is used in the Rodrigues formula.
Alternatively, the distance error may be processed by the error processing module in the tracking control logic as follows: since the measured distance is strongly influenced by the plane angle, the smaller the angle error, the more accurate the distance error, and the corresponding quantity is taken as the distance error parameter. From the input expected matrix and the error parameter obtained in the previous cycle period, the pose issuing matrix of the current cycle can be calculated according to the following formula:
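As an illustrative sketch of the angle-error step (the names and the handling of the already-aligned case are assumptions, not taken from the patent), the rotation aligning the current plane normal with the desired one can be built with the Rodrigues rotation formula:

```python
import numpy as np

def rotation_from_normals(n_desired, n_actual):
    """Angle-error correction: rotation taking the current plane normal onto the
    desired plane normal, built with the Rodrigues rotation formula."""
    n_d = n_desired / np.linalg.norm(n_desired)
    n_a = n_actual / np.linalg.norm(n_actual)
    axis = np.cross(n_a, n_d)
    s = np.linalg.norm(axis)
    c = float(np.clip(np.dot(n_a, n_d), -1.0, 1.0))
    if s < 1e-9:
        # Normals already (anti-)parallel; the anti-parallel case would need a
        # dedicated 180-degree rotation and is left out of this sketch.
        return np.eye(3)
    k = axis / s
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])              # antisymmetric matrix generated by the axis
    theta = np.arctan2(s, c)                        # rotation angle between the two normals
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```

The distance correction could then be weighted by a term that shrinks as the angle error grows, in line with the observation above that the measured distance is only reliable once the plane angle is small.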
as an optional embodiment, step S102, calibrating the robot based on the conversion relation among the visual coordinate system of the visual acquisition device of the robot, the axis joint coordinate system of the robot, and the motion terminal coordinate system, to obtain a calibration result, includes: based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system, calibrating the robot by hand and eye to obtain a hand and eye calibration result, and calibrating the robot by a tool to obtain a tool calibration result, wherein the calibration result comprises the hand and eye calibration result and the tool calibration result.
In this embodiment, in the process of calibrating the robot based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain the calibration result, the hand-eye calibration may be performed on the robot based on the conversion relations among the three coordinate systems to obtain the hand-eye calibration result, and the tool calibration may be performed on the robot to obtain the tool calibration result, where the calibration result may include the hand-eye calibration result and the tool calibration result.
In the embodiment of the application, three methods, namely hand-eye calibration, tool calibration and registration, can be adopted for the robot to be calibrated, so that the conversion relations among the various coordinate systems can be determined. During calibration the conversion relations among the coordinate systems are refined continuously, and the result obtained when calibration is finished, namely the final conversion relations among the coordinate systems, constitutes the calibration result. By combining the several calibration methods, the technical effect of improving the accuracy of knee joint surgery performed by the robot is achieved.
As an optional embodiment, step S102, performing hand-eye calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a hand-eye calibration result, includes: determining a first transformation matrix between a base and a first shaft joint of the robot, determining a second transformation matrix between the first shaft joint and a second shaft joint, and determining a third transformation matrix between the second shaft joint and a third shaft joint of the robot based on parameters to be calibrated of the robot, wherein the shaft joint comprises the first shaft joint, the second shaft joint and the third shaft joint, the first transformation matrix is used for representing a conversion relation between the first shaft joint and the base, the second transformation matrix is used for representing a conversion relation between the first shaft joint and the second shaft joint, and the third transformation matrix is used for representing a conversion relation between the second shaft joint and the third shaft joint; and (3) calibrating the hand and the eye of the robot based on the first transformation matrix, the second transformation matrix and the third transformation matrix, and determining the conversion relation between the base coordinate system and the visual coordinate system in the axis joint coordinate system after the hand and the eye are calibrated and the conversion relation between the motion terminal coordinate system and the flange mark of the robot as hand and eye calibration results.
In this embodiment, in the process of performing hand-eye calibration on the robot based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain the hand-eye calibration result, a first transformation matrix between the base of the robot and the first shaft joint, a second transformation matrix between the first shaft joint and the second shaft joint, and a third transformation matrix between the second shaft joint and the third shaft joint may be determined based on the parameters to be calibrated of the robot. Hand-eye calibration may then be performed on the robot based on the first, second and third transformation matrices, and the conversion relation between the base coordinate system in the axis joint coordinate system after hand-eye calibration and the visual coordinate system, together with the conversion relation between the motion terminal coordinate system and the flange mark of the robot, may be determined as the hand-eye calibration result, where the flange mark is deployed on the end tool of the robot and may be referred to as the flange Marker. The shaft joints include the first shaft joint, the second shaft joint and the third shaft joint. The first transformation matrix may be used to characterize the conversion relation between the first shaft joint and the base. The second transformation matrix is used to characterize the conversion relation between the first shaft joint and the second shaft joint. The third transformation matrix is used to characterize the conversion relation between the second shaft joint and the third shaft joint. The parameters to be calibrated may comprise Denavit-Hartenberg parameters (DH parameters for short), which may include the link length, link twist angle, joint angle, link offset and other parameters of the robot.
Optionally, a table of parameters to be calibrated, that is, a DH parameter table, can be obtained according to parameters such as the length of each link and the angle of each joint of the robot, wherein the link length, link twist angle, joint angle and link offset of each shaft joint of the robot can be filled in the DH parameter table. The link length may be used to represent the length of the common perpendicular between the joint axes of two adjacent shaft joints. The link twist angle may be used to represent the included angle between the joint axes of two adjacent shaft joints. The joint angle may be used to represent the angle between two adjacent links about their common axis. The link offset may be used to represent the distance between two adjacent links along the common axis.
Alternatively, the first transformation matrix, the second transformation matrix, and the third transformation matrix may be determined according to the DH parameter table as follows:
where the first matrix may be used to represent the first transformation matrix between the base and the first shaft joint; the joint angles with index i = 1, 2, 3 can be used to represent the joint angles corresponding to the first, second and third shaft joints; L can be obtained from the three-dimensional drawing of the robot; the second matrix may be used to represent the second transformation matrix between the first and second shaft joints; and the third matrix may be used to represent the third transformation matrix between the second and third shaft joints.
Alternatively, based on the first, second and third transformation matrices determined above, the conversion relation between the base coordinate system and the flange coordinate system, i.e. the positive kinematic solution of the robot, may be derived by the following formula:
where the resulting matrix can be used to represent the conversion relation between the base coordinate system and the flange coordinate system.
Optionally, based on the first, second and third transformation matrices, hand-eye calibration is performed on the robot, the conversion relation between the base coordinate system of the robot and the visual coordinate system is determined, and the conversion relation between the flange Marker and the flange coordinate system can also be determined; the two calibrated conversion relations can therefore be determined as the hand-eye calibration result.
As an optional embodiment, step S102, performing tool calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system, to obtain a tool calibration result, includes: determining a second conversion relation between a tool center point of an end tool of the robot and a flange mark of the robot and a third conversion relation between the tool center point and a base coordinate system in an axis joint coordinate system based on parameters to be calibrated of the robot; and based on the second conversion relation and the third conversion relation, calibrating the tool of the robot to obtain a tool calibration result.
In this embodiment, in the process of calibrating a tool on the robot based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a tool calibration result, a second conversion relation between a tool center point of a terminal tool of the robot and a flange mark and a third conversion relation between the tool center point and a base coordinate system in the axis joint coordinate system may be determined based on parameters to be calibrated of the robot, and based on the second conversion relation and the third conversion relation, tool calibration may be performed on the robot to obtain the tool calibration result, where the second conversion relation may be a conversion matrix from the tool center point to the flange mark. The third transformation relationship may be a transformation matrix of the tool center point in the base coordinate system.
Optionally, a visual identification Marker may be provided on the tool calibration fixture (calibration tool), together with a mounting clamping groove seat for the osteotomy power saw blade. When the calibration tool is clamped at the designated position of the osteotomy power saw blade, the pose of the visual identification Marker of the current calibration tool in the visual coordinate system can be recorded, and the pose of the current flange Marker in the visual coordinate system can also be recorded. Factory calibration is carried out to obtain the conversion relation between the reference position at which the clamping groove seat clamps the osteotomy power saw blade and the calibration tool Marker.
Alternatively, the transformation matrix from the tool center point (Tool Center Point, abbreviated as TCP) of the end tool to the flange Marker, and the transformation matrix of the tool center point in the robot base coordinate system, can be calculated by the following formulas:
where the quantities appearing in the formulas may be used to represent, respectively: the second conversion relation; the third conversion relation; the pose of the flange Marker in the visual coordinate system; the pose of the visual identification Marker of the current calibration tool in the visual coordinate system; the factory-calibrated conversion relation between the reference position at which the clamping groove seat clamps the osteotomy power saw blade and the calibration tool Marker; the transformation relation between the base coordinate system and the flange coordinate system; and the transformation relation between the flange Marker and the flange coordinate system.
Optionally, the second conversion relationship and the third conversion relationship may be calibrated, so that the second conversion relationship and the third conversion relationship obtained after the final calibration may be used as tool calibration results.
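A minimal sketch of this step, under the assumption that the chain runs flange Marker to camera to calibration tool Marker to saw-blade reference position (the function and argument names below are illustrative, not the patent's notation), might look like this:

```python
import numpy as np

def tool_calibration(T_vis_flange_marker, T_vis_calib_marker, T_calib_marker_tcp,
                     T_base_flange, T_flange_flange_marker):
    """Express the tool centre point (TCP) first in the flange-Marker frame and then
    in the robot base frame by chaining the recorded camera poses with the factory
    and hand-eye calibration results (all arguments are 4x4 homogeneous matrices;
    T_a_b denotes the pose of frame b expressed in frame a, an assumed convention)."""
    # TCP in the flange-Marker frame: flange Marker <- camera <- calibration Marker <- TCP.
    T_flange_marker_tcp = (np.linalg.inv(T_vis_flange_marker)
                           @ T_vis_calib_marker
                           @ T_calib_marker_tcp)
    # TCP in the base frame: base <- flange <- flange Marker <- TCP.
    T_base_tcp = T_base_flange @ T_flange_flange_marker @ T_flange_marker_tcp
    return T_flange_marker_tcp, T_base_tcp
```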
As an optional embodiment, step S102, after performing hand-eye calibration on the robot based on the conversion relationship among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a hand-eye calibration result, and performing tool calibration on the robot to obtain a tool calibration result, the method includes: determining the image characteristic position of a target object needing to execute osteotomy operation behaviors; and registering the robot based on the conversion relation between the image characteristic position and the visual coordinate system to obtain a registration result.
In this embodiment, in the process of registering the robot based on the conversion relations among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain the registration result, the image feature position of the target object can be determined, and registration can be performed on the robot based on the conversion relation between the image feature position and the visual coordinate system to obtain the registration result, where the target object can be a patient with a damaged knee joint who needs a total knee joint replacement operation. The image feature position may be the feature position of a local three-dimensional (3D) image of the human body.
Optionally, a position conversion relation between the feature position of the 3D image of the human body part and the visual coordinate system can be established; registration is then carried out based on this conversion relation, and a registration result indicating successful registration can be obtained.
According to the embodiment of the application, the parameters to be calibrated of the robot can be calibrated by determining the relative position conversion relation and/or posture conversion relation, satisfying the knee joint surgery requirement information, between the visual coordinate system of the visual acquisition device and the axis joint coordinate system and motion terminal coordinate system of the robot, so as to obtain a calibration result. After the robot is calibrated, the target motion plane on which the osteotomy operation is to be executed can be determined, and the target positions that the shaft joints need to reach can be solved from the condition that the motion terminal of the robot reaches the target motion plane, so that each shaft joint of the robot can be controlled to move to its corresponding target position to execute the osteotomy operation. During execution of the osteotomy operation, whether the motion track of the shaft joints deviates can be detected in real time; if a deviation occurs, the motion track can be re-planned for error processing, which improves the accuracy of the target position. By taking the knee joint surgery requirement information into account during calibration and taking the errors arising during execution of the osteotomy operation into account, the technical problem of low accuracy of knee joint surgery performed by the robot is solved, and the technical effect of improving the accuracy of the knee joint surgery performed by the robot is achieved.
Example 2
The technical solution of the embodiment of the present application will be illustrated in the following with reference to a preferred embodiment.
In the related art, robots for performing knee joint surgery mostly adopt a seven-degree-of-freedom universal robot to determine the osteotomy plane. However, when such a universal robot is used for one specific task, the effective utilization of its working space and degree of automation is very low, and its large volume and mass affect the position layout of the various instruments in the knee joint operating room, so the actual requirements of knee joint surgery are difficult to meet, and the technical problem of low accuracy of robot-performed knee joint surgery still exists.
In a related art, a method for identifying a mechanical arm DH parameter based on a least square method is provided, including the following steps: determining an initial DH parameter of the mechanical arm according to the configuration and structural parameters of the mechanical arm, and constructing an error model of the mechanical arm according to a differential motion principle; based on the initial DH parameter of the mechanical arm, carrying out self-calibration on the mechanical arm by using a calibration plate, and recording encoder values of joints of the mechanical arm corresponding to each group of points; parameterizing the manipulator error model by a least square method in combination with the encoder values; obtaining the tail end positions of all groups of points according to the mechanical arm error model parameters and the mechanical arm initial DH parameters, and calculating the difference between the tail end positions and the absolute positions of fixed points; and identifying the error model parameters of the mechanical arm by comparing the difference value with a set threshold value. The method can solve the technical problem of large error of the mechanical arm through multiple calibration, and achieves the technical effect of reducing the error of the mechanical arm.
In another related art, a method for calibrating DH parameters of a medical robot is provided, including: establishing a kinematic model of the medical robot and an imaging model of the camera; establishing an error equation based on system position information and posture information determined by a kinematic model of the medical robot and an imaging model of the camera; and determining an objective function of the optimization problem according to the error equation, and solving the objective function to realize DH parameter calibration of the medical robot. The calibration of the parameters of the medical robot is realized by the method, so that the technical problem that an error equation can be built only through the position information in the related technology is solved.
However, since the calibration of the robot by the knee joint operation requirement information is not considered, there is still a technical problem that the accuracy of the robot for performing the knee joint operation is low.
In order to solve the above problems, the present application provides a control method for a three-degree-of-freedom surface planning robot applied to total knee replacement surgery. The method may comprise the following steps: the parameters to be calibrated of the robot can be calibrated by determining the relative position conversion relation and/or posture conversion relation, satisfying the knee joint surgery requirement information, between the visual coordinate system of the visual acquisition device and the axis joint coordinate system and motion terminal coordinate system of the robot, so as to obtain a calibration result. After the robot is calibrated, the target motion plane on which the osteotomy operation is to be executed can be determined, and the target positions that the shaft joints need to reach can be solved from the condition that the motion terminal of the robot reaches the target motion plane, so that each shaft joint of the robot can be controlled to move to its corresponding target position to execute the osteotomy operation. During execution of the osteotomy operation, whether the motion track of the shaft joints deviates can be detected in real time; if a deviation occurs, the motion track can be re-planned for error processing, which improves the accuracy of the target position. By taking the knee joint surgery requirement information into account during calibration and taking the errors arising during execution of the osteotomy operation into account, the technical problem of low accuracy of knee joint surgery performed by the robot is solved, and the technical effect of improving the accuracy of the knee joint surgery performed by the robot is achieved.
Embodiments of the present application are further described below.
Fig. 2 is a flowchart of a control method of a three-degree-of-freedom surface planning robot applied to a total knee replacement surgery according to an embodiment of the present application, and as shown in fig. 2, the method may include the steps of:
step S202, performing hand-eye calibration, tool calibration and registration on the robot.
In the technical solution provided in the above step S202, for the robot to be calibrated, three methods, namely hand-eye calibration, tool calibration and registration, can be adopted to determine the conversion relations between the various coordinate systems. During calibration the conversion relations among the coordinate systems are refined continuously, and the result obtained when calibration is finished, namely the final conversion relations among the coordinate systems, constitutes the calibration result. By combining the several calibration methods, the technical effect of improving the accuracy of knee joint surgery performed by the robot is achieved.
Optionally, the knee joint surgery requirement information requires that the robot be able to position the end tool on the motion terminal onto the target plane where the osteotomy is performed. There are six degrees of freedom in space, namely three translational degrees of freedom and three rotational degrees of freedom. When the plane is uniquely determined, it locks three of them, namely the translational degree of freedom along the Y axis, the rotational degree of freedom about the X axis and the rotational degree of freedom about the Z axis, so the pose of the unique plane can be controlled based on these three degrees of freedom.
Optionally, to meet the knee joint surgery requirement information, the robot requires at least three degrees of freedom to position the different planes. Balancing space utilization against execution efficiency, a serial topology of three mutually perpendicular rotational shaft joints can be designed. FIG. 3 is a schematic view showing the configuration of each axis joint and the definition of the coordinate systems of a robot according to the embodiment of the present application. As shown in FIG. 3, the origins of the coordinate systems of the bottom base and the first axis joint of the robot are the same point, so the base coordinate system and the first axis joint coordinate system share this origin; likewise, the origins of the coordinate systems of the second axis joint and the third axis joint are the same point, and the second and third axis joint coordinate systems are defined accordingly. It should be noted that the description here is given by way of example only, and the origin of each coordinate system and the directions of the coordinate axes are not particularly limited.
Optionally, a table of parameters to be calibrated, that is, a DH parameter table, can be obtained according to parameters such as the length of each link and the angle of each joint of the robot, wherein the link length, link twist angle, joint angle and link offset of each shaft joint of the robot can be filled in the DH parameter table. The link length may be used to represent the length of the common perpendicular between the joint axes of two adjacent shaft joints. The link twist angle may be used to represent the angle between the joint axes of two adjacent shaft joints. The joint angle may be used to represent the angle between two adjacent links about their common axis. The link offset may be used to represent the distance between two adjacent links along the common axis.
For example, a DH parameter table of the three-degree-of-freedom surface planning robot for total knee replacement surgery as shown in the following table may be obtained, where i may be used to represent the i-th shaft joint. The DH parameter table is filled with the link length, link twist angle, joint angle and link offset of each shaft joint of the robot.
TABLE 1 DH parameter Table of three degree of freedom surface planning robot for total knee replacement surgery
Alternatively, the first transformation matrix, the second transformation matrix, and the third transformation matrix may be determined according to the DH parameter table as follows:
where the first matrix may be used to represent the first transformation matrix between the base and the first shaft joint; the joint angles with index i = 1, 2, 3 can be used to represent the joint angles corresponding to the first, second and third shaft joints; L can be obtained from the three-dimensional drawing of the robot; the second matrix may be used to represent the second transformation matrix between the first and second shaft joints; and the third matrix may be used to represent the third transformation matrix between the second and third shaft joints.
Alternatively, based on the first, second and third transformation matrices determined above, the conversion relation between the base coordinate system and the flange coordinate system, i.e. the positive kinematic solution of the robot, may be derived by the following formula:
where the resulting matrix can be used to represent the conversion relation between the base coordinate system and the flange coordinate system, i.e. the positive kinematic solution of the robot.
Optionally, based on the first, second and third transformation matrices, hand-eye calibration is performed on the robot, the conversion relation between the base coordinate system of the robot and the visual coordinate system is determined, and the conversion relation between the flange Marker and the flange coordinate system can also be determined; the two calibrated conversion relations can therefore be determined as the hand-eye calibration result.
Optionally, FIG. 4 is a schematic diagram of a tool calibration fixture according to an embodiment of the present application. As shown in FIG. 4, a visual identification Marker may be provided on the tool calibration fixture (calibration tool), together with a mounting clamping groove seat for the osteotomy power saw blade. When the calibration tool is clamped at the designated position of the osteotomy power saw blade, the pose of the visual identification Marker of the current calibration tool in the visual coordinate system can be recorded, and the pose of the current flange Marker in the visual coordinate system can also be recorded. Factory calibration is carried out to obtain the conversion relation between the reference position at which the clamping groove seat clamps the osteotomy power saw blade and the calibration tool Marker.
Alternatively, the transformation matrix from the tool center point (Tool Center Point, abbreviated as TCP) of the end tool to the flange Marker, and the transformation matrix of the tool center point in the robot base coordinate system, can be calculated by the following formulas:
where the quantities appearing in the formulas may be used to represent, respectively: the first conversion relation; the second conversion relation; the pose of the flange Marker in the visual coordinate system; the pose of the visual identification Marker of the current calibration tool in the visual coordinate system; the factory-calibrated conversion relation between the reference position at which the clamping groove seat clamps the osteotomy power saw blade and the calibration tool Marker; the transformation relation between the base coordinate system and the flange coordinate system; and the transformation relation between the flange Marker and the flange coordinate system.
Optionally, the first conversion relationship and the second conversion relationship may be calibrated, so that the first conversion relationship and the second conversion relationship obtained after the final calibration may be used as tool calibration results.
Optionally, a position conversion relation between the feature position of the 3D image of the human body part and the visual coordinate system can be established; registration is then carried out based on this conversion relation, and a registration result indicating successful registration can be obtained.
Step S204, defining a target position or a target motion plane.
In the technical solution provided in the above step S204 of the present application, the transformation relation of the osteotomy target position or target motion plane in the visual coordinate system may be defined, where an index may be used to represent the sequence number of the osteotomy target position or target motion plane. Based on the transformation relation of the osteotomy target position or target motion plane in the visual coordinate system, the transformation relation of the osteotomy target position or target motion plane in the base coordinate system of the robot can be determined.
In the embodiment of the application, the target positions of the shaft joints corresponding to the osteotomy target position can be determined by inverse kinematics, so that 75% of the parameters of the homogeneous transformation matrix used by a conventional robot system can be eliminated, which reduces the amount of real-time calculation and achieves the technical effect of improving the calculation efficiency of real-time tracking of the robot.
Optionally, the robot may need to determine six target motion planes at the knee joint to guide the doctor in performing the osteotomy operation. The six target motion planes and the pose conversion matrices of the femur and tibia Markers are obtained through registration and surgical planning. The transformation relation between the robot base coordinate system and the visual coordinate system can be determined through hand-eye calibration, and the pose of the femur and tibia Markers in the visual coordinate system can be obtained in real time.
Optionally, based on the first pose relation matrix, the pose relation, and the calibrated conversion relation between the base coordinate system and the visual coordinate system, the position conversion matrix of the target motion plane in the base coordinate system can be calculated through the following formula:
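As an illustrative sketch (the frame names and the T_a_b convention, meaning the pose of frame b in frame a, are assumptions for readability rather than the patent's symbols), the chaining reads:

```python
import numpy as np

def target_plane_in_base(T_base_vision, T_vision_bone_marker, T_bone_marker_plane):
    """Bring a planned target motion plane into the robot base coordinate system by
    chaining the hand-eye calibration result, the real-time femur/tibia Marker pose
    and the plane pose from surgical planning (all 4x4 homogeneous matrices)."""
    return T_base_vision @ T_vision_bone_marker @ T_bone_marker_plane
```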
FIG. 5 is a schematic view showing the state of a knee joint before osteotomy according to an embodiment of the present application. As shown in FIG. 5, for the knee joint state before the target motion plane is determined, the target motion plane of the desired osteotomy can be determined by the above method, and the position conversion matrix of the target motion plane in the base coordinate system can be determined. FIG. 6 is a schematic view of the state of the knee joint after osteotomy according to an embodiment of the present application. As shown in FIG. 6, the plane dividing the knee joint into two parts may be the target motion plane, whose position conversion matrix in the base coordinate system is determined as described above.
Alternatively, the data of the position conversion matrix may be simplified. The 4×4 homogeneous transformation matrix contains 16 parameters in total, of which 12 are effective, and the position and posture information of all spatial degrees of freedom can be completely expressed through these 12 effective parameters, for example as shown in the following formula:
where one block may be used to represent the position information and the other block the posture information. If the XOZ plane is set as the target motion plane, the 12 effective parameters can be processed so that the parameters unrelated to the plane are removed and only the information expressing the target motion plane is retained; a new plane conversion relation of the target motion plane in the base coordinates, comprising four parameters, can then be obtained, as shown below:
where the four parameters are given by the retained entries of the position and posture information above.
optionally, the simplified planar conversion relation of the target motion plane in the base coordinate systemThe data input as robot motion control is further located or tracked. Compared with single positioning, the effect of simplifying the data processing is more obvious for the tracking mode with high real-time requirement.
Step S206, calculating the inverse kinematics solution and selecting among the multiple inverse solutions.
In the technical solution provided in the above step S206 of the present application, an analytical solution of the inverse kinematics of the robot may be calculated by a fast solution method, that is, the angle of each shaft joint when the robot is positioned at the target position or plane is calculated. The optimal solution is then selected from the multiple groups of calculated inverse solutions according to the corresponding rules.
Optionally, after the real-time plane conversion relation between the target motion plane and the base coordinate system is obtained, the robot can move its end to the target motion plane by controlling the angle of each shaft joint. With the shaft joint angles of the robot as parameters, the pose of the end of the robot can be obtained from the positive kinematic model in the configuration design of the robot.
Alternatively, the positive kinematic model may be represented by the following formula:
alternatively, the principle of solving the positive kinematic model is as follows: obtaining a robot axis joint angle when the tip plane (set as the tip XOZ plane) of the robot is overlapped with the target motion planeIs a general solution expression of (2). The two planes can be said to coincide when the end XOZ plane of the robot is parallel to the target movement plane and the distance of the end point from the target movement plane is equal to zero.
Optionally, the vector formed by the first three rows of the second column of the matrix may represent the normal vector of the end XOZ plane, and the normal vector of the target motion plane can likewise be taken as a unit vector; if the two unit vectors are equal, the end XOZ plane is parallel to the target motion plane. Requiring in addition that the distance from the end point to the target motion plane be equal to zero yields the following formulas (K)-(N):
(K)
(L)
(M)
(N)
The above equations are transcendental equations, which are solved as follows:
From (K), it can be obtained that:
(O)
Combining (L) and (N), it can be obtained that:
(P)
(Q)
Combining (M) and (Q), it can be obtained that:
(R)
(S)
Combining (P) and (S), it can be obtained that:
(T)
In summary, based on (O), (R) and (T), the final result, i.e. the general solution, can be obtained:
wherein:
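Because the closed-form expressions (O), (R) and (T) rely on symbols that are not reproduced here, the sketch below is a numerical stand-in for the same coincidence condition (the end XOZ normal equals the target normal and the end point lies on the target plane); it assumes a forward-kinematics function fk(theta) returning the 4x4 base-to-end transform (for example a wrapper around the DH chain sketched earlier) and is an illustration, not the patent's analytic general solution.

```python
import numpy as np
from scipy.optimize import minimize

def plane_alignment_error(theta, fk, target_plane):
    """Residual that vanishes when the end XOZ plane coincides with the target plane:
    the end-plane normal (second rotation column) must match the target normal and
    the end point must lie on the target plane."""
    T = fk(theta)
    n_end, p_end = T[:3, 1], T[:3, 3]
    n_t, d_t = target_plane[:3], target_plane[3]
    normal_err = n_end - n_t
    dist_err = float(np.dot(n_t, p_end)) - d_t
    return float(np.dot(normal_err, normal_err) + dist_err ** 2)

def solve_joint_angles(fk, target_plane, theta_init):
    """Numerically search for shaft joint angles placing the end plane on the target plane."""
    result = minimize(plane_alignment_error, theta_init, args=(fk, target_plane))
    return result.x
```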
alternatively, the rule for selecting the target solution from the general solutions is as follows: (1) According to the current guidance, the robot motion can not collide and interfere with the robot or other physical objects; (2) the current solution should not exceed a preset shaft boundary limit; (3) A set of solutions is selected that move from the current location to the target location at the fastest speed and consume the least energy.
Optionally, after acquiring the current shaft joint angles of the mechanical arm, the weight coefficients of the first shaft joint, the second shaft joint and the third shaft joint of the robot are determined according to the comprehensive efficiency and energy consumption principle, and the group of inverse solutions for which the weighted joint-angle change relative to the current angles is smallest (the index running over the different groups of inverse solutions) is selected as the optimal solution:
step S208, planning movement to a target movement plane and displaying in real time.
In the technical scheme provided in the step S208, track planning is performed according to inverse kinematics solution, and the axis joint of the robot is driven to move to the target position and the tool position is displayed in real time.
Optionally, the three-degree-of-freedom surface planning robot for total knee replacement surgery designed in the embodiments of the present application requires smooth, collision-free motion. Compared with Cartesian space planning, planning the trajectory in the axis joint space uses a simpler algorithm, moves more efficiently, and avoids the motion-singularity problem of the mechanism, so the trajectory planning should be carried out in the axis joint space.
Optionally, for a system in which the multiple axes of the robot are mutually coupled, the load inertia of each axis joint changes continuously during motion, so conventional trapezoidal or parabolic planning cannot meet the requirement well; cubic/quintic polynomial planning can be used instead, which accommodates, to a certain extent, the continuously changing load inertia of each axis joint.
For example, conventional quintic polynomial planning can be adopted, and the six boundary conditions comprising the position, velocity and acceleration of the start state and the end state are used to solve the following planning model:
Optionally, the above planning model may be substituted into the following constraints:
where the six boundary quantities denote, respectively, the position, velocity and acceleration of the initial state and the position, velocity and acceleration of the end state.
Optionally, after the planning model is substituted into the constraint conditions, the planned target motion track can be solved to obtain the following formula:
optionally, after obtaining the target motion trail, the target motion trail may be issued to each axis joint of the robot, so as to guide the robot to perform the motion.
Optionally, with the conversion matrix from the tool center point to the flange Marker already obtained, the conversion relation of the flange Marker in the visual coordinate system can be determined in real time, so the real-time position of the end tool in the visual coordinate system may be calculated by the following formula and displayed in the display interface:
optionally, in the embodiment of the application, aiming at the situation that human body micro motion can occur in the total knee joint replacement operation, the three-degree-of-freedom plane planning robot system should adjust the position in real time to ensure the accuracy of positioning, so that the embodiment of the application can design a tracking algorithm to perform error processing on inaccurate actual pose data.
FIG. 7 is a schematic diagram of the tracking control logic according to an embodiment of the present application. As shown in FIG. 7, the expected pose data is output via the V-B module; the inverse solution module performs the calculation of the inverse kinematics solution and the selection among the multiple inverse solutions described above, and is followed by the robot motion module. The first cycle does not contain the error processing module. In the moving process of the robot, the actual pose data of the end tool in the visual coordinate system can be acquired in real time and compared with the expected pose data, thereby determining whether error handling is required.
Optionally, for the three-degree-of-freedom plane planning robot in the embodiment of the present application, the design objective is to determine a plane, so two measures of error are designed: the angle error and the distance error. The angle error may be the space angle between the desired plane normal vector and the normal vector of the plane in which the end tool is currently located. The distance error may be the projection distance, onto the normal vector of the desired plane, between a feature point of the desired plane and the current plane of the end tool.
Alternatively, the angle error may be processed by the error processing module in the tracking control logic as follows: the rotation axis and the rotation angle between the desired plane normal vector and the current plane normal vector are calculated according to the following formula, and the rotation matrix can then be calculated from the Rodrigues rotation formula:
where the antisymmetric matrix generated by the rotation axis is used in the Rodrigues formula.
Alternatively, the distance error may be processed by the error processing module in the tracking control logic as follows: since the measured distance is strongly influenced by the plane angle, the smaller the angle error, the more accurate the distance error, and the corresponding quantity is taken as the distance error parameter.
Optionally, from the input expected matrix and the error parameter obtained in the previous cycle period, the corresponding module can calculate the pose issuing matrix of the current cycle according to the following formula:
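To tie the modules of FIG. 7 together, the skeleton below treats every callable as a placeholder standing in for one block of the figure (none of these names come from the patent): the first cycle sends the expected pose directly, and later cycles run the error processing before the inverse solution and the robot motion.

```python
import numpy as np

def tracking_loop(get_expected_pose, get_actual_pose, error_module,
                  solve_inverse, move_robot, cycles):
    """Skeleton of the tracking control logic: compare the measured end-tool pose in
    the visual frame with the expected pose each cycle and correct the command."""
    command = None
    for k in range(cycles):
        expected = get_expected_pose()
        if k == 0 or command is None:
            command = expected                       # first loop: no error processing module
        else:
            actual = get_actual_pose()               # end-tool pose in the visual coordinate system
            if not np.allclose(actual, expected, atol=1e-4):
                command = error_module(expected, actual, command)
        theta = solve_inverse(command)               # inverse solution + multiple-solution selection
        move_robot(theta)
```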
step S210, performing osteotomy operation in the target motion plane by means of the end tool.
In the solution provided in the above step S210 of the present application, the in-plane osteotomy may be performed by using the end tool loaded on the flange.
Alternatively, the end tool may be mounted on a flange (motion terminal) of the robot via a planar hinge linkage, and the doctor may grasp the end tool along the hinge plane, i.e., the target motion plane, for osteotomy operations while the robot positions and tracks the target motion plane.
According to the embodiment of the application, the parameters to be calibrated of the robot can be calibrated by determining the relative position conversion relation and/or posture conversion relation, satisfying the knee joint surgery requirement information, between the visual coordinate system of the visual acquisition device and the axis joint coordinate system and motion terminal coordinate system of the robot, so as to obtain a calibration result. After the robot is calibrated, the target motion plane on which the osteotomy operation is to be executed can be determined, and the target positions that the shaft joints need to reach can be solved from the condition that the motion terminal of the robot reaches the target motion plane, so that each shaft joint of the robot can be controlled to move to its corresponding target position to execute the osteotomy operation. During execution of the osteotomy operation, whether the motion track of the shaft joints deviates can be detected in real time; if a deviation occurs, the motion track can be re-planned for error processing, which improves the accuracy of the target position. By taking the knee joint surgery requirement information into account during calibration and taking the errors arising during execution of the osteotomy operation into account, the technical problem of low accuracy of knee joint surgery performed by the robot is solved, and the technical effect of improving the accuracy of the knee joint surgery performed by the robot is achieved.
Example 3
According to the embodiment of the application, a control device of the robot is also provided. The control device of the robot may be used to execute the control method of the robot in embodiment 1.
Fig. 8 is a schematic view of a control device of a robot according to an embodiment of the present application. As shown in fig. 8, the control device 800 of the robot may include: a calibration unit 802, a first determination unit 804, a second determination unit 806 and a processing unit 808.
The calibration unit 802 is configured to calibrate the robot based on a conversion relationship between a visual coordinate system of a visual acquisition device of the robot, an axis joint coordinate system of the robot, and a motion terminal coordinate system, so as to obtain a calibration result, where the conversion relationship includes position information and/or posture information corresponding to knee joint operation requirement information.
The first determining unit 804 is configured to determine a target motion plane of the robot based on the calibration result, where the target motion plane is a plane where a position for performing an osteotomy operation that satisfies the knee joint operation requirement information is located.
A second determining unit 806, configured to determine, based on the target movement plane, a target position of an axis joint of the robot corresponding to the knee joint surgery requirement information, where the target position is used to represent a position and an angle at which the axis joint is located when performing the osteotomy operation.
And the processing unit 808 is used for controlling the shaft joint to move to the target position to execute the osteotomy operation, responding to the error of the movement track of the shaft joint in the process of executing the osteotomy operation, and performing error processing on the movement track to obtain a processing result.
Optionally, the apparatus may include: and a third determining unit for determining a first conversion relation between an osteotomy target position or a visual coordinate system and the target motion plane on the target motion plane, wherein the osteotomy target position is used for representing a position for executing osteotomy operation behaviors.
Alternatively, the second determining unit 806 may include: the first processing module is used for determining a first pose conversion matrix between the target motion plane and the femur and tibia mark based on a first conversion relation and determining a pose relation under a visual coordinate system based on femur and tibia mark, wherein the first pose conversion matrix is used for representing the pose conversion relation between the target motion plane and the femur and tibia mark; the first determining module is used for determining a position conversion matrix of the target motion plane on the base coordinate system based on the first pose relation matrix, the pose relation and the conversion relation between the base coordinate system and the visual coordinate system in the calibrated axis joint coordinate system; and the second determining module is used for determining the target position based on the position conversion matrix.
Optionally, the second determining module may include: the first processing submodule is used for simplifying the position conversion matrix to obtain a plane conversion relation of the simplified target motion plane in the base coordinate system; the first determining submodule is used for determining a positive kinematic model of a plane conversion relation based on positive kinematics in the process of controlling a motion terminal of the robot to a target motion plane; the second determining submodule is used for determining the general solution of the positive kinematic model when the motion terminal is coincident with the target motion plane; the selecting sub-module is used for selecting a target solution meeting preset conditions from general solutions, wherein the target solution is used for representing the angle of the shaft joint when the target position is positioned or the target motion plane where the target position is positioned.
Optionally, the second determining module may further include: a third determining submodule, configured to determine a solution in the general solutions as the target solution in response to the solution causing no collision interference with the target object while the robot is controlled to move, the solution not exceeding the axis boundary limits, and, based on the solution, the speed when moving to the target position being greater than or equal to a speed threshold and the energy consumed being less than or equal to an energy threshold.
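As a minimal sketch of this screening step, the function below filters candidate joint-space solutions by the four criteria named above. The predicate functions and thresholds are hypothetical placeholders, since the application does not disclose how collisions, axis limits, speed, or energy are evaluated.

def select_target_solution(general_solutions, collides, within_axis_limits,
                           speed_of, energy_of, speed_threshold, energy_threshold):
    # Return the first candidate solution that passes all screening criteria.
    for q in general_solutions:
        if collides(q):                      # collision interference with the target object
            continue
        if not within_axis_limits(q):        # axis boundary limits
            continue
        if speed_of(q) < speed_threshold:    # speed when moving to the target position
            continue
        if energy_of(q) > energy_threshold:  # energy consumed when moving to the target position
            continue
        return q                             # target solution
    return None                              # no candidate satisfies the preset condition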
Optionally, the processing unit 808 may include: a first acquisition module, configured to acquire start and end state data of the shaft joint before and after the osteotomy operation behavior is performed; a planning module, configured to perform polynomial planning of a target degree based on the start and end state data to obtain a planning model, where the planning model is used to plan a trajectory for controlling the movement of the robot; a second processing module, configured to substitute the planning model into the constraint conditions to obtain a planned target motion trajectory; and a control module, configured to control the shaft joint to move to the target position according to the target motion trajectory.
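A common choice for polynomial planning of a target degree between start and end states is a fifth-order (quintic) polynomial whose six coefficients are fixed by position, velocity, and acceleration at both ends. The sketch below is only one possible reading of the planning module; the quintic degree and the numeric example are assumptions, not values from the application.

import numpy as np

def quintic_coefficients(q0, v0, a0, q1, v1, a1, T):
    # Coefficients of q(t) = c0 + c1*t + ... + c5*t^5 matching position,
    # velocity and acceleration at t = 0 and t = T.
    A = np.array([
        [1, 0, 0,    0,      0,       0],
        [0, 1, 0,    0,      0,       0],
        [0, 0, 2,    0,      0,       0],
        [1, T, T**2, T**3,   T**4,    T**5],
        [0, 1, 2*T,  3*T**2, 4*T**3,  5*T**4],
        [0, 0, 2,    6*T,    12*T**2, 20*T**3],
    ])
    b = np.array([q0, v0, a0, q1, v1, a1])
    return np.linalg.solve(A, b)

def sample_trajectory(coeffs, T, steps=100):
    # Evaluate the planned joint trajectory at evenly spaced time samples.
    t = np.linspace(0.0, T, steps)
    powers = np.vstack([t**k for k in range(6)])
    return coeffs @ powers

# Assumed example: move one axis joint from 0 rad to 0.5 rad in 2 s, at rest at both ends.
coeffs = quintic_coefficients(0.0, 0.0, 0.0, 0.5, 0.0, 0.0, T=2.0)
trajectory = sample_trajectory(coeffs, T=2.0)

Constraint conditions (for example, velocity or acceleration limits) would then be checked against the sampled trajectory before it is issued to the shaft joint.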
Optionally, the apparatus may further include: a fourth determining unit, configured to determine actual position data of the end tool of the robot in the visual coordinate system based on a second conversion relation, which represents the conversion between the tool center point of the end tool of the robot and the base coordinate system in the axis joint coordinate system of the robot, and on the conversion relation between a flange mark of the robot, which is deployed on the end tool of the robot, and the visual coordinate system; and a display unit, configured to display the actual position data on a display interface of the robot.
Optionally, the processing unit 808 may further include: the second acquisition module is used for acquiring actual pose data of the end tool of the robot under a visual coordinate system in the process of executing osteotomy operation behaviors, wherein the actual pose data are used for representing a motion trail; the comparison module is used for comparing the actual pose data with the expected pose data to obtain a comparison result, wherein the comparison result is used for indicating whether the actual pose data is identical with the expected pose data or not; and the third processing module is used for carrying out error processing on the actual pose data to obtain a processing result in response to the comparison result that the actual pose data is different from the expected pose data.
Optionally, the third processing module may include: a fourth determination submodule for determining a rotation matrix after angle error processing based on the rotation axis and the rotation angle of the robot and the actual plane normal vector and the expected plane normal vector of the end tool; and a fifth determining submodule, configured to determine a pose issuing matrix after distance error processing based on the expected matrix of the robot and the error parameter in the previous cycle period, where the pose issuing matrix is a processing result.
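One conventional way to realize the angle-error step is to rotate the actual plane normal of the end tool onto the expected plane normal about the axis given by their cross product (Rodrigues' formula). The sketch below is an assumption about how such a rotation matrix could be formed, not the computation disclosed by the application.

import numpy as np

def rotation_from_axis_angle(axis, angle):
    # Rodrigues' formula: rotation matrix for a unit rotation axis and an angle in radians.
    k = axis / np.linalg.norm(axis)
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def angle_error_rotation(n_actual, n_expected):
    # Rotation that carries the actual plane normal onto the expected plane normal.
    n_a = n_actual / np.linalg.norm(n_actual)
    n_e = n_expected / np.linalg.norm(n_expected)
    axis = np.cross(n_a, n_e)
    if np.linalg.norm(axis) < 1e-9:  # normals already (anti)parallel; no well-defined axis
        return np.eye(3)
    angle = np.arccos(np.clip(np.dot(n_a, n_e), -1.0, 1.0))
    return rotation_from_axis_angle(axis, angle)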
Alternatively, the calibration unit 802 may include: a first calibration module, configured to perform hand-eye calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a hand-eye calibration result; and a second calibration module, configured to perform tool calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a tool calibration result, where the calibration result includes the hand-eye calibration result and the tool calibration result.
Alternatively, the first calibration module may include: a fifth determining submodule, configured to determine, based on the parameters to be calibrated of the robot, a first transformation matrix between the base of the robot and a first axis joint, a second transformation matrix between the first axis joint and a second axis joint, and a third transformation matrix between the second axis joint and a third axis joint of the robot, where the axis joints include the first axis joint, the second axis joint, and the third axis joint, the first transformation matrix is used to characterize the transformation relation between the first axis joint and the base, the second transformation matrix is used to characterize the transformation relation between the first axis joint and the second axis joint, and the third transformation matrix is used to characterize the transformation relation between the second axis joint and the third axis joint; and a sixth determining submodule, configured to perform hand-eye calibration on the robot based on the first, second, and third transformation matrices, and to determine, as the hand-eye calibration result, the calibrated conversion relation between the base coordinate system in the axis joint coordinate system and the visual coordinate system, and the conversion relation between the motion terminal coordinate system and the flange mark of the robot.
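The three transformation matrices chain the base to the third axis joint. A common parameterization for such a chain is standard Denavit-Hartenberg parameters; the sketch below uses assumed DH rows purely for illustration, since the application does not disclose the parameters to be calibrated.

import numpy as np

def dh_transform(theta, d, a, alpha):
    # Standard Denavit-Hartenberg transform between two consecutive axis joints.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Assumed joint angles and DH rows (theta, d, a, alpha); placeholder values only.
q = [0.1, -0.3, 0.25]
dh_rows = [(q[0], 0.30, 0.00, np.pi / 2),  # base   -> first axis joint  (first transformation matrix)
           (q[1], 0.00, 0.40, 0.0),        # first  -> second axis joint (second transformation matrix)
           (q[2], 0.00, 0.35, 0.0)]        # second -> third axis joint  (third transformation matrix)

T_01, T_12, T_23 = (dh_transform(*row) for row in dh_rows)
T_03 = T_01 @ T_12 @ T_23  # base coordinate system -> third axis joint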
Optionally, the second calibration module may include: a seventh determining submodule, configured to determine, based on parameters to be calibrated of the robot, a second conversion relationship between a tool center point of an end tool of the robot and a flange mark of the robot, and a third conversion relationship between the tool center point and a base coordinate system in an axis joint coordinate system; and the calibration sub-module is used for calibrating the tool of the robot based on the second conversion relation and the third conversion relation to obtain a tool calibration result.
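Once a flange-to-tool-center-point offset (the second conversion relation) is known, the tool center point in the base coordinate system (the third conversion relation) follows by composing it with the flange pose. The 40 mm offset below is a placeholder for illustration, not a calibrated value.

import numpy as np

# Assumed flange pose in the base coordinate system (e.g., from the calibrated kinematic chain).
T_base_flange = np.eye(4)

# Second conversion relation: tool center point relative to the flange mark (placeholder offset).
T_flange_tcp = np.eye(4)
T_flange_tcp[2, 3] = 0.040  # 40 mm along the flange z-axis, assumed for illustration

# Third conversion relation: tool center point in the base coordinate system.
T_base_tcp = T_base_flange @ T_flange_tcp
tcp_position = T_base_tcp[:3, 3]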
Optionally, the apparatus may further include: a fifth determining unit, configured to determine the image feature position of a target object on which the osteotomy operation behavior is to be performed; and a registration unit, configured to register the robot based on the conversion relation between the image feature position and the visual coordinate system to obtain a registration result.
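Registration between image feature positions and the visual coordinate system is commonly posed as a rigid point-set alignment. The SVD-based (Kabsch) sketch below is one standard way to compute such a transform and is offered only as background; the application does not state which registration algorithm is used.

import numpy as np

def rigid_registration(image_points, vision_points):
    # Least-squares rigid transform (R, t) mapping image feature positions to
    # the visual coordinate system, given paired N x 3 point sets.
    p = np.asarray(image_points, dtype=float)
    q = np.asarray(vision_points, dtype=float)
    p_c, q_c = p.mean(axis=0), q.mean(axis=0)
    H = (p - p_c).T @ (q - q_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection in the SVD solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = q_c - R @ p_c
    return R, t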
In the embodiment of the present application, the calibration unit calibrates the robot based on the conversion relation among the visual coordinate system of the visual acquisition device of the robot, the axis joint coordinate system of the robot, and the motion terminal coordinate system to obtain a calibration result, where the conversion relation includes position information and/or posture information corresponding to the knee joint operation requirement information; the first determining unit determines a target motion plane of the robot based on the calibration result, where the target motion plane is the plane in which the position for performing the osteotomy operation behavior satisfying the knee joint operation requirement information is located; the second determining unit determines, based on the target motion plane, a target position of the axis joint of the robot corresponding to the knee joint operation requirement information, where the target position is used to represent the position and angle of the axis joint when the osteotomy operation behavior is performed; and the processing unit controls the axis joint to move to the target position to perform the osteotomy operation and, in response to an error in the motion trajectory of the axis joint during the osteotomy operation, performs error processing on the motion trajectory to obtain a processing result. This solves the technical problem of low accuracy of robot-performed knee joint surgery and achieves the technical effect of improving the accuracy of robot-performed knee joint surgery.
Example 4
According to an embodiment of the present application, there is also provided a computer-readable storage medium including a stored program, wherein the program, when run, executes the control method of the robot described in Example 1.
Example 5
According to an embodiment of the present application, there is also provided a processor for running a program, wherein the program, when run, executes the control method of the robot described in Example 1.
Example 6
According to an embodiment of the present application, there is further provided an electronic device. FIG. 9 shows an electronic device according to an embodiment of the present application. As shown in FIG. 9, the electronic device includes a processor, a memory, and a program stored in the memory and executable on the processor, and the processor implements the following steps when executing the program: acquiring the visual coordinate system of the visual acquisition device, the axis joint coordinate system of the robot, and the motion terminal coordinate system; calibrating the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a calibration result, where the conversion relation includes position information and/or posture information corresponding to the knee joint operation requirement information; determining, based on the calibration result, a target position of the axis joint of the robot corresponding to the knee joint operation requirement information, where the target position is used to represent the position and angle of the axis joint when the osteotomy operation is performed; and controlling the robot to move to the target position to perform the osteotomy operation.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (16)

1. A control method of a robot, comprising:
calibrating the robot based on a conversion relation among a visual coordinate system of visual acquisition equipment of the robot, an axial joint coordinate system of the robot and a motion terminal coordinate system to obtain a calibration result, wherein the conversion relation comprises position information and/or posture information corresponding to knee joint operation requirement information;
Determining a target motion plane of the robot based on the calibration result, wherein the target motion plane is a plane where a position for executing osteotomy operation behaviors meeting the knee joint operation requirement information is located;
determining a target position of an axial joint of the robot corresponding to the knee joint operation requirement information based on the target motion plane, wherein the target position is used for representing the position and the angle of the axial joint when the osteotomy operation behavior is executed;
and controlling the shaft joint to move to the target position to execute the osteotomy operation, and, in response to an error in the movement track of the shaft joint in the process of executing the osteotomy operation, performing error processing on the movement track to obtain a processing result.
2. The method of claim 1, wherein after determining the target motion plane of the robot based on the calibration result, the method further comprises:
determining, on the target motion plane, an osteotomy target position and a first conversion relation between the visual coordinate system and the target motion plane, wherein the osteotomy target position is used for representing a position for executing the osteotomy operation behavior.
3. The method according to claim 2, wherein determining, based on the target motion plane, a target position of an axial joint of the robot corresponding to the knee joint surgery requirement information comprises:
determining, based on the first conversion relation, a first pose conversion matrix between the target motion plane and a femur and tibia mark, and determining a pose relation of the femur and tibia mark in the visual coordinate system, wherein the first pose conversion matrix is used for representing the pose conversion relation between the target motion plane and the femur and tibia mark;
determining a position conversion matrix of the target motion plane in a base coordinate system based on the first pose conversion matrix, the pose relation, and the calibrated conversion relation between the base coordinate system in the axial joint coordinate system and the visual coordinate system;
and determining the target position based on the position conversion matrix.
4. The method according to claim 3, wherein determining the target position based on the position conversion matrix comprises:
simplifying the position conversion matrix to obtain a simplified plane conversion relation of the target motion plane in the base coordinate system;
in the process of controlling the motion terminal of the robot to the target motion plane, determining a forward kinematic model of the plane conversion relation based on forward kinematics;
determining a general solution of the forward kinematic model when the motion terminal is coincident with the target motion plane;
and selecting a target solution meeting a preset condition from the general solutions, wherein the target solution is used for representing the angle of the shaft joint when positioned at the target position or on the target motion plane where the target position is located.
5. The method of claim 4, wherein selecting a target solution from the general solutions that satisfies a preset condition comprises:
in response to a solution in the general solutions causing no collision interference with a target object in the process of controlling the robot to move, the solution not exceeding an axis boundary limit, and, based on the solution, the speed when moving to the target position being greater than or equal to a speed threshold and the consumed energy being less than or equal to an energy threshold, determining the solution as the target solution.
6. The method of claim 1, wherein controlling the shaft joint to move to the target position to perform the osteotomy operation comprises:
acquiring start and end state data of the shaft joint before the osteotomy operation is executed and after the osteotomy operation is executed;
performing polynomial planning of a target degree based on the start and end state data to obtain a planning model, wherein the planning model is used for planning a track for controlling the robot to move;
substituting the planning model into constraint conditions to obtain a planned target motion trail;
and controlling the shaft joint to move to the target position according to the target movement track.
7. The method according to any one of claims 1 to 6, further comprising:
determining actual position data of an end tool of the robot in the visual coordinate system based on a second conversion relation for representing a conversion condition between a tool center point of the end tool of the robot and a base coordinate system in an axis joint coordinate system of the robot and a conversion relation between a flange mark of the robot and the visual coordinate system, wherein the flange mark is used for being deployed on the end tool of the robot;
and displaying the actual position data on a display interface of the robot.
8. The method of claim 1, wherein, in response to an error in the motion trajectory of the shaft joint during performance of the osteotomy operation, performing error processing on the motion trajectory to obtain a processing result comprises:
acquiring actual pose data of an end tool of the robot under the visual coordinate system in the process of executing the osteotomy operation behavior, wherein the actual pose data are used for representing the motion trail;
comparing the actual pose data with expected pose data to obtain a comparison result, wherein the comparison result is used for indicating whether the actual pose data is identical with the expected pose data or not;
and responding to the comparison result that the actual pose data is different from the expected pose data, and performing error processing on the actual pose data to obtain the processing result.
9. The method of claim 8, wherein, in response to the comparison result being that the actual pose data is different from the expected pose data, performing error processing on the actual pose data to obtain the processing result comprises:
determining a rotation matrix after angle error processing based on a rotation axis and a rotation angle of the robot and an actual plane normal vector and an expected plane normal vector of the end tool; and, in addition, the method comprises the steps of,
And determining a pose issuing matrix after distance error processing based on the expected matrix of the robot and error parameters in the previous cycle period, wherein the pose issuing matrix is the processing result.
10. The method of claim 1, wherein calibrating the robot based on a conversion relation among a visual coordinate system of a visual acquisition device of the robot, an axis joint coordinate system of the robot, and a motion terminal coordinate system to obtain a calibration result comprises:
performing hand-eye calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system and the motion terminal coordinate system to obtain a hand-eye calibration result, and performing tool calibration on the robot to obtain a tool calibration result, wherein the calibration result comprises the hand-eye calibration result and the tool calibration result.
11. The method of claim 10, wherein performing hand-eye calibration of the robot based on the conversion relationship among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a hand-eye calibration result comprises:
determining a first transformation matrix between a base of the robot and a first shaft joint, determining a second transformation matrix between the first shaft joint and a second shaft joint, and determining a third transformation matrix between the second shaft joint and a third shaft joint of the robot based on parameters to be calibrated of the robot, wherein the shaft joints comprise the first shaft joint, the second shaft joint and the third shaft joint, the first transformation matrix is used for representing a transformation relationship between the first shaft joint and the base, the second transformation matrix is used for representing a transformation relationship between the first shaft joint and the second shaft joint, and the third transformation matrix is used for representing a transformation relationship between the second shaft joint and the third shaft joint;
and performing hand-eye calibration on the robot based on the first transformation matrix, the second transformation matrix and the third transformation matrix, and determining, as the hand-eye calibration result, the calibrated conversion relation between the base coordinate system in the axis joint coordinate system and the visual coordinate system and the conversion relation between the motion terminal coordinate system and the flange mark of the robot.
12. The method of claim 10, wherein performing tool calibration on the robot based on the conversion relationship among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a tool calibration result comprises:
determining a second conversion relation between a tool center point of an end tool of the robot and a flange mark of the robot and a third conversion relation between the tool center point and a base coordinate system in the axis joint coordinate system based on parameters to be calibrated of the robot;
and carrying out tool calibration on the robot based on the second conversion relation and the third conversion relation to obtain a tool calibration result.
13. The method according to claim 10, wherein after performing hand-eye calibration on the robot based on the conversion relation among the visual coordinate system, the axis joint coordinate system, and the motion terminal coordinate system to obtain a hand-eye calibration result, and performing tool calibration on the robot to obtain a tool calibration result, the method further comprises:
determining the image characteristic position of a target object needing to execute the osteotomy operation behavior;
And registering the robot based on the conversion relation between the image characteristic position and the visual coordinate system to obtain a registration result.
14. A control device for a robot, comprising:
the calibration unit is used for calibrating the robot based on a conversion relation among a visual coordinate system of visual acquisition equipment of the robot, an axis joint coordinate system of the robot and a motion terminal coordinate system to obtain a calibration result, wherein the conversion relation comprises position information and/or posture information corresponding to knee joint operation requirement information;
the first determining unit is used for determining a target motion plane of the robot based on the calibration result, wherein the target motion plane is a plane where a position for executing osteotomy operation behaviors meeting the knee joint operation requirement information is located;
a second determining unit, configured to determine, based on the target movement plane, a target position of an axial joint of the robot corresponding to the knee joint operation requirement information, where the target position is used to represent a position and an angle at which the axial joint is located when the osteotomy operation behavior is performed;
And the processing unit is used for controlling the shaft joint to move to the target position to execute the osteotomy operation, responding to the error of the movement track of the shaft joint in the process of executing the osteotomy operation, and carrying out error processing on the movement track to obtain a processing result.
15. A processor, characterized in that the processor is adapted to run a program, wherein the program when run performs the control method of the robot according to any one of claims 1 to 13.
16. An electronic device comprising one or more processors and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of controlling a robot of any of claims 1-13.
CN202310964547.2A 2023-08-02 2023-08-02 Robot control method and device, processor and electronic equipment Active CN116672031B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310964547.2A CN116672031B (en) 2023-08-02 2023-08-02 Robot control method and device, processor and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310964547.2A CN116672031B (en) 2023-08-02 2023-08-02 Robot control method and device, processor and electronic equipment

Publications (2)

Publication Number Publication Date
CN116672031A true CN116672031A (en) 2023-09-01
CN116672031B CN116672031B (en) 2023-12-19

Family

ID=87791309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310964547.2A Active CN116672031B (en) 2023-08-02 2023-08-02 Robot control method and device, processor and electronic equipment

Country Status (1)

Country Link
CN (1) CN116672031B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117921684A (en) * 2024-03-22 2024-04-26 北京壹点灵动科技有限公司 Control method and device of mechanical arm, storage medium and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170340389A1 (en) * 2016-05-27 2017-11-30 Mako Surgical Corp. Preoperative planning and associated intraoperative registration for a surgical system
CN113842213A (en) * 2021-09-03 2021-12-28 北京长木谷医疗科技有限公司 Surgical robot navigation positioning method and system
CN115089302A (en) * 2022-06-27 2022-09-23 苏州微创畅行机器人有限公司 Surgical robot system and method
CN115381526A (en) * 2022-08-10 2022-11-25 南京鼓楼医院 Robot system for assisting in osteotomy around acetabulum
WO2023280310A1 (en) * 2021-07-09 2023-01-12 武汉联影智融医疗科技有限公司 Surgical robot and control method therefor
CN115844546A (en) * 2023-02-23 2023-03-28 北京壹点灵动科技有限公司 Bone cutting method, device, storage medium and processor
CN116277035A (en) * 2023-05-15 2023-06-23 北京壹点灵动科技有限公司 Robot control method and device, processor and electronic equipment

Also Published As

Publication number Publication date
CN116672031B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
Fujita et al. Passivity-based dynamic visual feedback control for three-dimensional target tracking: Stability and $L_2$-gain performance analysis
D'Souza et al. Learning inverse kinematics
Yoshimi et al. Active, uncalibrated visual servoing
CN116672031B (en) Robot control method and device, processor and electronic equipment
CN109153125A (en) For orienting the method and industrial robot of industrial robot
CN116277035B (en) Robot control method and device, processor and electronic equipment
CN105082161A (en) Robot vision servo control device of binocular three-dimensional video camera and application method of robot vision servo control device
Li et al. Performance of surgical robots with automatically generated spatial virtual fixtures
CN113524201B (en) Active adjusting method and device for pose of mechanical arm, mechanical arm and readable storage medium
Gondokaryono et al. An approach to modeling closed-loop kinematic chain mechanisms, applied to simulations of the da vinci surgical system
Šuligoj et al. Medical applicability of a low-cost industrial robot arm guided with an optical tracking system
CN114654466B (en) Automatic calibration method, device, system, electronic equipment and storage medium
Shen et al. Automatic camera calibration for a multiple-sensor integrated coordinate measurement system
Wu et al. Leveraging vision and kinematics data to improve realism of biomechanic soft tissue simulation for robotic surgery
CN115179297A (en) Method and system for controlling joint limit of joint in combined obstacle avoidance mode through position and posture of surgical robot
Maric et al. Unsupervised optimization approach to in situ calibration of collaborative human-robot interaction tools
Fried et al. Uncalibrated image-based visual servoing approach for translational trajectory tracking with an uncertain robot manipulator
Vahrenkamp et al. Planning multi-robot grasping motions
Sun et al. Adaptive fusion-based autonomous laparoscope control for semi-autonomous surgery
Hanses et al. Hand-guiding robots along predefined geometric paths under hard joint constraints
Tauscher et al. High-accuracy drilling with an image guided light weight robot: autonomous versus intuitive feed control
Sun et al. Development of a novel intelligent laparoscope system for semi‐automatic minimally invasive surgery
CN116392253A (en) Active positioning method and system applied to surgical robot
CN113814978B (en) Robot control method, robot control device, robot, and storage medium
CN115813556A (en) Surgical robot calibration method and device, surgical robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant