CN113413214B - Surgical robot force feedback method and device based on mixed reality guidance - Google Patents

Surgical robot force feedback method and device based on mixed reality guidance

Info

Publication number
CN113413214B
CN113413214B (application CN202110564837.9A)
Authority
CN
China
Prior art keywords
surgical instrument
mechanical arm
tail end
surgical
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110564837.9A
Other languages
Chinese (zh)
Other versions
CN113413214A (en)
Inventor
陈晓军
涂朴勋
郭妍
李东远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN202110564837.9A priority Critical patent/CN113413214B/en
Publication of CN113413214A publication Critical patent/CN113413214A/en
Application granted granted Critical
Publication of CN113413214B publication Critical patent/CN113413214B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A61B34/76Manipulators having means for providing feel, e.g. force or tactile feedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A61B34/77Manipulators with motion or force scaling
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108Computer aided selection or customisation of medical implants or cutting guides
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Veterinary Medicine (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Urology & Nephrology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a force feedback method and device for a surgical robot based on mixed reality guidance, wherein the method comprises the following steps: acquiring a base image, and delineating in it a path region in which the surgical instrument at the end of the robotic arm may move and a forbidden region that the instrument must not enter, the forbidden region being a dynamic region; acquiring an interaction force, and controlling the speed of the end-of-arm surgical instrument according to this input force; acquiring the position of the end-of-arm surgical instrument; controlling the end-of-arm surgical instrument to move along the path region; and, when the distance between the position of the end-of-arm surgical instrument and the forbidden region is smaller than a preset margin value, generating a feedback resistance according to that distance, the magnitude of the feedback resistance being inversely related to the distance. Compared with the prior art, the invention improves the control accuracy of the robot, among other advantages.

Description

Surgical robot force feedback method and device based on mixed reality guidance
Technical Field
The invention relates to a robot control method, in particular to a force feedback method and device for a surgical robot based on mixed reality guidance.
Background
With the development of medical image processing, optical positioning and tracking, and robot control technology, computer-assisted surgical robot systems are increasingly widely used in clinical surgery. Compared with traditional free-hand operation and surgical navigation systems, the surgical robot has the following advantages. (1) High precision: based on accurate optical positioning and a precise mechanical structure, the operation can be executed accurately along the planned surgical path. (2) Stable performance: the speed and pose of the robot are governed by a control system and are not affected by subjective human experience. (3) Less invasiveness: the surgical field does not need to be exposed over a large area, which helps relieve the patient's intraoperative pain and reduces postoperative complications.
However, for operations with a narrow surgical field partially covered by dense, important soft tissue, safety concerns limit the large-scale clinical application of surgical robots. At present, surgical robots mostly adopt force control and human-robot cooperative control to ensure intraoperative safety. For example, Chinese patent CN 202010724934.5 discloses a safety-boundary and force control method for a surgical robot that constrains the robot's motion by combining position information and force sensing data from the arm end. This method provides no intraoperative visual information and lacks operator participation, so its safety is limited by the effectiveness of the control method. Chinese patent CN 201910535890.9 discloses a teleoperated robot combining virtual technology and 3D printing, which performs the operation on a model through a control end that guides a remote operating end. It is essentially a teleoperation robot: the virtual technology is used only for imaging, safety constraints are lacking, and operation safety depends heavily on operator experience.
In summary, although some existing surgical robots provide safety constraints, they usually adopt automatic safety control and lack a safety control strategy guided by intraoperative mixed reality imaging.
Disclosure of Invention
The invention aims to provide a force feedback method and device for a surgical robot based on mixed reality guidance. Regions delineated through mixed reality technology assist in controlling the motion of the robot, and on this basis a feedback resistance is applied according to the distance between the surgical instrument and the forbidden region, which improves the user experience and the control precision of the robot and protects vulnerable soft tissues.
The purpose of the invention can be realized by the following technical scheme:
a surgical robotic human feedback method based on mixed reality guidance, comprising:
acquiring a basic image, and dividing a path area for moving the surgical instrument at the tail end of the mechanical arm and a forbidden area for forbidding the surgical instrument at the tail end of the mechanical arm to enter into the basic image, wherein the forbidden area is a dynamic area;
obtaining an interaction force, and controlling the speed of the surgical instrument at the tail end of the mechanical arm according to the input force;
acquiring the position of a surgical instrument at the tail end of a mechanical arm;
controlling the surgical instrument at the tail end of the mechanical arm to move along the path region;
when the distance between the position of the surgical instrument at the tail end of the mechanical arm and the forbidden area is smaller than a preset allowance value, generating feedback resistance according to the distance between the position of the surgical instrument at the tail end of the mechanical arm and the forbidden area, wherein the magnitude of the feedback resistance is inversely related to the distance between the position of the surgical instrument at the tail end of the mechanical arm and the forbidden area.
The margin value is positively correlated with the end speed of the robotic arm at the current moment.
The feedback resistance is specifically:

F = α·(1/d_i − 1/d_max), if d_i < d_max; F = 0, if d_i ≥ d_max

wherein F is the feedback resistance, α is a constant, d_i is the distance from the position of the end-of-arm surgical instrument to the forbidden region, d_max is the margin value, and P_i is the node of the forbidden region nearest to the surgical instrument.
Controlling the end-of-arm surgical instrument to move along the path region is specifically: after the end-of-arm surgical instrument moves to the starting point of the path region, the 4 degrees of freedom of the arm end other than forward-backward translation and rotation are constrained, so that the surgical instrument can only move from the starting point to the end point along the path region.
The speed of the end-of-arm surgical instrument is specifically:

V_{t+1} = A·V_t + (I_{6×6} − A)·B⁻¹·F

wherein V_{t+1} is the end speed of the robotic arm at time t+1, A is a constant matrix, V_t is the end speed of the robotic arm at time t, I_{6×6} is the 6×6 identity matrix, B is the damping, and F is the input force.
A surgical robot force feedback device based on mixed reality guidance comprises a processor, a memory and a program, wherein the processor implements the above method when executing the program.
Compared with the prior art, the invention has the following beneficial effects:
1) Delineating regions through mixed reality technology assists in controlling the motion of the robot; on this basis, a feedback resistance is applied according to the distance between the surgical instrument and the forbidden region, improving the user experience and the control precision of the robot and protecting vulnerable soft tissues.
2) An intraoperative safety mechanism is established on both the visual and the haptic side, which is safer than traditional methods.
3) Mixed reality and the robot help the doctor complete the operation more intuitively and flatten the learning curve.
Drawings
FIG. 1 is a schematic diagram of an implementation system of the present invention;
FIG. 2 is a schematic diagram of a method for implementing the present invention;
fig. 3 is a diagram of a mixed reality guided security policy.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation manner and a specific operation process are given, but the scope of the present invention is not limited to the following embodiments.
A force feedback method for a surgical robot based on mixed reality guidance is implemented, in the form of a computer program, by a robot and its control system as shown in Fig. 1. The whole control system consists of two parts: a mixed reality safety strategy and a force-feedback-aware safety strategy.
The mixed reality safety strategy is realized through an optical see-through head-mounted display worn by the surgeon and consists of four parts: (1) spatial virtual-real fusion of important soft and hard tissues, so that by observing virtual subcutaneous tissue the surgeon gains a more intuitive spatial perception of tissues that are not directly visible, helping avoid intraoperative injury to important tissues; (2) real-time superposition of the preoperatively planned path onto the real surgical field, so that the surgeon knows the preoperative plan more clearly and can visually judge path deviations that may occur during the operation; (3) dynamic updating of the virtual tissue and the virtual planned path, so that intraoperative displacement of the patient does not affect the precision of the virtual-real fusion and the dynamic superposition; (4) real-time display of important intraoperative parameters, so that the surgeon can visually monitor the accuracy of the procedure.
The force-feedback-aware safety strategy is realized by a six-axis force sensor and consists of three parts: (1) free traction, executed while the surgical robot moves to the starting point of the surgical plan: the surgeon freely drags the robotic arm to the starting point of the planned path; (2) a guidance virtual fixture, executed during the operation, which ensures that the end surgical tool moves from the starting point to the end point along the preoperatively planned path; (3) a forbidden-region virtual fixture, executed during the operation, which prevents the surgical instrument from moving into a preset forbidden region, ensuring surgical safety.
Both the mixed reality safety strategy and the force feedback perception strategy take the surgeon as the main agent, keeping him or her in the safety control closed loop throughout, with safety control executed through human-computer interaction. For the mixed reality strategy, the interaction modes are mainly: (1) gesture commands, used to complete the preoperative workflow, initialize the virtual tissue, and move the components of the mixed reality virtual control panel; (2) voice commands, used to complete the various intraoperative operations, ensuring that they are carried out in a sterile environment. For force feedback perception, the interaction modes are mainly: (1) the operation interface, through which the preoperative workflow is completed at the workstation; (2) manual interaction, in which the surgeon performs the operation along the planned path by dragging the robot.
As shown in fig. 2, the method mainly comprises the following steps:
Acquiring a base image, and delineating in it a path region in which the end-of-arm surgical instrument may move and a forbidden region that the end-of-arm surgical instrument must not enter; both regions are three-dimensional, and the forbidden region is a dynamic region.

Acquiring an interaction force, and controlling the speed of the end-of-arm surgical instrument according to this input force; the speed of the end-of-arm surgical instrument is specifically:

V_{t+1} = A·V_t + (I_{6×6} − A)·B⁻¹·F

where V_{t+1} is the end speed of the robotic arm at time t+1, A is a constant matrix, V_t is the end speed of the robotic arm at time t, I_{6×6} is the 6×6 identity matrix, B is the damping, and F is the input force. The constant matrix is specifically:

A = M·(M + B·T)⁻¹

where T is the time interval and M is the mass of the robotic arm. A speed-limiting function is also set for the robotic arm: when the end input force exceeds the set upper threshold, the end speed is kept unchanged, and when the end input force is below the set threshold, the arm end remains stable. This makes the free traction process more compliant.
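The discrete admittance law above can be sketched as follows. A scalar mass M and damping B on every axis, and reading the speed limit as a clamp on the input-force magnitude, are assumptions; the patent describes the limit only qualitatively:

```python
import numpy as np

def admittance_step(v, f, M=2.0, B=20.0, T=0.01, f_max=30.0):
    """One step of V_{t+1} = A*V_t + (I - A)*B^{-1}*F for a 6-DOF end
    effector, with A = M/(M + B*T) applied per axis (derived by
    discretising M*dv/dt + B*v = f). Forces above f_max are clamped so
    the end speed stops growing -- one plausible reading of the
    speed-limit rule."""
    f = np.asarray(f, float)
    norm = np.linalg.norm(f)
    if norm > f_max:                      # clamp: speed saturates
        f = f * (f_max / norm)
    A = M / (M + B * T)
    return A * np.asarray(v, float) + (1.0 - A) / B * f
```

Under a constant force the speed converges to F/B, the usual damped steady state, so the traction feels compliant rather than jerky.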
Acquiring the position of the end-of-arm surgical instrument.

Controlling the end-of-arm surgical instrument to move along the path region.
When the distance between the position of the end-of-arm surgical instrument and the forbidden region is smaller than a preset margin value, generating a feedback resistance according to that distance, the magnitude of the feedback resistance being inversely related to the distance. In one embodiment, the feedback resistance is specifically:

F = α·(1/d_i − 1/d_max), if d_i < d_max; F = 0, if d_i ≥ d_max

wherein F is the feedback resistance, α is a constant, d_i is the distance from the position of the end-of-arm surgical instrument to the forbidden region, d_max is the margin value, and P_i is the node of the forbidden region nearest to the surgical instrument. When d_i exceeds d_max, the reaction force received at the arm end is set to 0, so that the free traction and guidance-fixture functions are not disturbed.
In some embodiments, the margin value is positively correlated to the tip speed of the robotic arm at the current time.
Controlling the end-of-arm surgical instrument to move along the path region is specifically: after the end-of-arm surgical instrument moves to the starting point of the path region, the 4 degrees of freedom of the arm end other than forward-backward translation and rotation are constrained, so that the surgical instrument can only move from the starting point to the end point along the path region.
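The guidance fixture above can be sketched as a projection of the 6-DOF end-effector twist onto the path axis. The projection itself is an assumed implementation; the text states only which two freedoms remain:

```python
import numpy as np

def constrain_to_path(twist, axis):
    """Keep only translation along the planned-path axis and rotation
    about it; the other 4 degrees of freedom are zeroed by projecting
    the linear and angular velocity onto the axis.
    twist: [vx, vy, vz, wx, wy, wz]; axis: 3-vector along the path."""
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    v = np.asarray(twist[:3], float)   # linear velocity
    w = np.asarray(twist[3:], float)   # angular velocity
    return np.concatenate([np.dot(v, a) * a, np.dot(w, a) * a])
```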
As shown in Fig. 3, dynamic superposition of the virtual and real soft and hard tissues is achieved by obtaining the transformation between the three-dimensional virtual model and the patient/reference coordinate system. The principle is as follows. First, the mixed reality glasses are registered to obtain the transformation T_{glasses←tracker} between the glasses and the optical positioning tracker. The optical tracker acquires the patient pose T_{tracker←patient} in real time. The transformation T_{patient←model} between the patient and the virtual model in the workstation is fixed. The transformation from the virtual model to the mixed reality glasses is therefore:

T_{glasses←model} = T_{glasses←tracker} · T_{tracker←patient} · T_{patient←model}

By continuously refreshing T_{tracker←patient} during the operation, accurate dynamic fusion of the virtual model with the actual tissue is achieved. The preoperatively planned path is likewise superimposed on the virtual model.
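The transformation chain can be sketched as a product of homogeneous matrices; the 4×4 convention and the helper names are assumptions:

```python
import numpy as np

def model_to_glasses(T_glasses_tracker, T_tracker_patient, T_patient_model):
    """Compose the virtual-real fusion chain: only T_tracker_patient is
    refreshed during the operation as the tracker reports new patient
    poses; the other two transforms are fixed after registration."""
    return T_glasses_tracker @ T_tracker_patient @ T_patient_model

def translation(t):
    """Pure-translation 4x4 homogeneous transform (illustrative helper)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T
```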
The important intraoperative parameters include the angle error and the distance error. The angle error is the three-dimensional angle between the axis of the actual surgical instrument and the virtual planned path; the distance error is the distance between the tip of the surgical instrument and the planned path. The angle error is calculated as:

θ = arccos( |(P_2 − P_1)·(Q_2 − Q_1)| / (‖P_2 − P_1‖·‖Q_2 − Q_1‖) )

where P_1 and P_2 are any two distinct points on the planned path in the virtual model coordinate system, Q_1 is the instrument tip, and Q_2 is any point on the instrument axis different from Q_1.

The distance error is calculated as:

d = ‖(Q_1 − P_1) × (P_2 − P_1)‖ / ‖P_2 − P_1‖

i.e. the distance from Q_1 to the virtual planned path.
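The two error metrics can be written out directly from their definitions (standard vector-angle and point-to-line formulas; the function names are illustrative):

```python
import numpy as np

def angle_error_deg(p1, p2, q1, q2):
    """3-D angle between the planned path (p1->p2) and the instrument
    axis (q1->q2), in degrees; the absolute dot product makes the
    result direction-independent."""
    u = np.asarray(p2, float) - np.asarray(p1, float)
    v = np.asarray(q2, float) - np.asarray(q1, float)
    c = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

def distance_error(p1, p2, q1):
    """Distance from the instrument tip q1 to the planned-path line."""
    u = np.asarray(p2, float) - np.asarray(p1, float)
    w = np.asarray(q1, float) - np.asarray(p1, float)
    return np.linalg.norm(np.cross(w, u)) / np.linalg.norm(u)
```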
The method and system were applied in a teaching test, in which a surgeon completed a zygomatic implant surgery model teaching experiment using the surgical robot and its control system. The implementation details are as follows. First, a CT image is imported into the workstation, the implant path is manually planned preoperatively, and segmentation and three-dimensional reconstruction of the important soft and hard tissues are completed. Intraoperatively, registration among the mixed reality glasses, the patient, and the image is completed, the surgical instrument at the end of the surgical robot is calibrated, the transformation between the virtual soft-and-hard-tissue model and the mixed reality glasses is established, and the mixed reality surgical environment is finally built. The surgeon first moves the robotic arm from its initial position to the starting point of the planned path in free traction mode, and then completes the operation under the action of the guidance and forbidden-region virtual fixtures. The experimental results show that the average entry-point error of the implant is 0.97 ± 0.19 mm, the exit-point error is 1.57 ± 0.37 mm, and the angle error is 1.43° ± 0.43°, which meets the requirements of clinical surgery.

Claims (4)

1. A surgical robotic force feedback device based on mixed reality guidance, comprising a processor, a memory, and a program, wherein the processor when executing the program implements the steps of:
acquiring a base image, and delineating in it a path region in which the end-of-arm surgical instrument may move and a forbidden region that the end-of-arm surgical instrument must not enter, the forbidden region being a dynamic region;
acquiring an interaction force, and controlling the speed of the end-of-arm surgical instrument according to this input force;
acquiring the position of the end-of-arm surgical instrument;
controlling the end-of-arm surgical instrument to move along the path region;
when the distance between the position of the end-of-arm surgical instrument and the forbidden region is smaller than a preset margin value, generating a feedback resistance according to that distance, the magnitude of the feedback resistance being inversely related to the distance;
the feedback resistance is specifically:
Figure FDA0003862758670000011
wherein: f is feedback resistance, alpha is constant, d i Distance of the position of the end-of-arm surgical instrument from the forbidden zone, d max To a margin value, P i Is a handThe surgical instruments are positioned at the nodes nearest to the forbidden region.
2. A surgical robotic force feedback device based on mixed reality guidance as claimed in claim 1, wherein the margin value is positively correlated with the tip speed of the robotic arm at the current time.
3. A surgical robotic force feedback device based on mixed reality guidance as claimed in claim 1, wherein controlling the end-of-arm surgical instrument to move along the path region is specifically: after the end-of-arm surgical instrument moves to the starting point of the path region, the 4 degrees of freedom of the arm end other than forward-backward translation and rotation are constrained, so that the surgical instrument can only move from the starting point to the end point along the path region.
4. A surgical robotic force feedback device based on mixed reality guidance as claimed in claim 1, wherein the velocity of the robotic end-of-arm surgical instrument is specifically:
V_{t+1} = A·V_t + (I_{6×6} − A)·B⁻¹·F

wherein V_{t+1} is the end speed of the robotic arm at time t+1, A is a constant matrix, V_t is the end speed of the robotic arm at time t, I_{6×6} is the 6×6 identity matrix, B is the damping, and F is the input force.
CN202110564837.9A 2021-05-24 2021-05-24 Surgical robot force feedback method and device based on mixed reality guidance Active CN113413214B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110564837.9A CN113413214B (en) 2021-05-24 2021-05-24 Surgical robot force feedback method and device based on mixed reality guidance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110564837.9A CN113413214B (en) 2021-05-24 2021-05-24 Surgical robot force feedback method and device based on mixed reality guidance

Publications (2)

Publication Number Publication Date
CN113413214A CN113413214A (en) 2021-09-21
CN113413214B true CN113413214B (en) 2022-12-30

Family

ID=77712819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110564837.9A Active CN113413214B (en) 2021-05-24 2021-05-24 Surgical robot force feedback method and device based on mixed reality guidance

Country Status (1)

Country Link
CN (1) CN113413214B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113855254B (en) * 2021-10-24 2023-11-03 北京歌锐科技有限公司 Medical equipment and operation method based on medical equipment
CN114098981B (en) * 2021-11-24 2024-05-07 东南大学 Double-arm-coordinated head and neck auxiliary traction operation robot and control method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103961179A (en) * 2014-04-09 2014-08-06 深圳先进技术研究院 Surgical instrument moving simulation method
CN106781941A (en) * 2016-11-24 2017-05-31 北京理工大学 A kind of method and its system for simulating microtrauma puncture operation
CN110421547A (en) * 2019-07-12 2019-11-08 中南大学 A kind of tow-armed robot collaboration impedance adjustment based on estimated driving force model
CN111870349A (en) * 2020-07-24 2020-11-03 前元运立(北京)机器人智能科技有限公司 Safety boundary and force control method of surgical robot
CN112497208A (en) * 2020-10-22 2021-03-16 西安交通大学 Mobile operation robot general control method based on full-state impedance controller

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040243147A1 (en) * 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
US7198630B2 (en) * 2002-12-17 2007-04-03 Kenneth I. Lipow Method and apparatus for controlling a surgical robot to mimic, harmonize and enhance the natural neurophysiological behavior of a surgeon
DE102008041867B4 (en) * 2008-09-08 2015-09-10 Deutsches Zentrum für Luft- und Raumfahrt e.V. Medical workstation and operating device for manually moving a robot arm
US8781630B2 (en) * 2008-10-14 2014-07-15 University Of Florida Research Foundation, Inc. Imaging platform to provide integrated navigation capabilities for surgical guidance
SG10201501706YA (en) * 2010-03-05 2015-06-29 Agency Science Tech & Res Robot Assisted Surgical Training
CN107703933B (en) * 2016-08-09 2021-07-06 深圳光启合众科技有限公司 Charging method, device and equipment of robot
US11071594B2 (en) * 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
CN108527305A (en) * 2018-04-11 2018-09-14 Nanjing University of Science and Technology Force feedback master-slave control method and system for live-line working robots based on virtual reality technology
AU2020219858A1 (en) * 2019-02-08 2021-09-30 The Board Of Trustees Of The University Of Illinois Image-guided surgery system
CN110653821B (en) * 2019-10-10 2023-03-24 Shanghai Electric Group Co., Ltd. Control method, system, medium and equipment for mechanical arm
CN112587244A (en) * 2020-12-15 2021-04-02 深圳市精锋医疗科技有限公司 Surgical robot and control method and control device thereof


Also Published As

Publication number Publication date
CN113413214A (en) 2021-09-21

Similar Documents

Publication Publication Date Title
US11730544B2 (en) Surgical systems and methods for facilitating ad-hoc intraoperative planning of surgical procedures
US11337766B2 (en) Robotic surgical system and methods utilizing a cutting bur for bone penetration and cannulation
CN113413214B (en) Surgical robot manpower feedback method and device based on mixed reality guidance
US20080221520A1 (en) Positioning System for Percutaneous Interventions
KR20170125360A (en) A method and apparatus for using a physical object to manipulate corresponding virtual objects in a virtual environment
KR20170093200A (en) System for robot-assisted medical treatment
CN115916091A (en) Multi-arm robotic system capable of performing multi-port endoscopic surgery
Li et al. Autonomous multiple instruments tracking for robot-assisted laparoscopic surgery with visual tracking space vector method
CN103479430A (en) Image guiding intervention operation navigation system
CN109996510B (en) Systems and methods for controlling tools having articulatable distal portions
Da Col et al. Scan: System for camera autonomous navigation in robotic-assisted surgery
Li Intelligent robotic surgical assistance for sinus surgery
Cavusoglu Telesurgery and surgical simulation: Design, modeling, and evaluation of haptic interfaces to real and virtual surgical environments
CN113633387A (en) Surgical field tracking supporting laparoscopic minimally invasive robot touch force interaction method and system
Webster III Design and mechanics of continuum robots for surgery
Beasley et al. Increasing accuracy in image-guided robotic surgery through tip tracking and model-based flexion correction
Schleer et al. Evaluation of different modes of haptic guidance for robotic surgery
CN117323019A (en) Three-operation-arm robot system for urinary puncture operation
CN112168197B (en) Positioning method and navigation system for elbow joint external fixation rotating shaft
CN210250064U (en) Craniomaxillofacial surgery robot system based on optical navigation and force feedback control
Zhang et al. Study on the control method and optimization experiment of prostate soft tissue puncture
EP4070753A1 (en) Handle for guiding a robotic arm of a computer-assisted surgery system and a surgical tool held by said robotic arm
Andreff et al. Epipolar geometry for vision-guided laser surgery
Zhou et al. The study of using eye movements to control the laparoscope under a haptically-enabled laparoscopic surgery simulation environment
CN112932669B (en) Mechanical arm control method for executing retina layer anti-seepage tunnel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Chen Xiaojun

Inventor after: Tu Puxun

Inventor after: Guo Yan

Inventor after: Li Dongyuan

Inventor before: Tu Puxun

Inventor before: Chen Xiaojun

Inventor before: Guo Yan

Inventor before: Li Dongyuan

GR01 Patent grant