CN112618029A - Surgical robot and method and control device for guiding surgical arm to move - Google Patents

Surgical robot and method and control device for guiding surgical arm to move

Info

Publication number
CN112618029A (application CN202110011217.2A)
Authority
CN
China
Prior art keywords: instrument, field, image, view, end instrument
Legal status: Pending
Application number: CN202110011217.2A
Other languages: Chinese (zh)
Inventors: 高元倩, 王建辰, ***
Current Assignee: Shenzhen Edge Medical Co Ltd
Original Assignee: Shenzhen Edge Medical Co Ltd
Application filed by Shenzhen Edge Medical Co Ltd
Priority to CN202210284003.7A (CN114652449A)
Priority to CN202110011217.2A (CN112618029A)
Publication of CN112618029A
Priority to PCT/CN2021/092697 (WO2022147935A1)

Classifications

    • A61B 34/37 Master-slave robots
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 17/3478 Endoscopic needles, e.g. for infusion
    • A61B 34/71 Manipulators operated by drive cable mechanisms
    • A61B 34/74 Manipulators with manual electric input means
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/70 Determining position or orientation of objects or cameras
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A61B 2034/743 Keyboards
    • A61B 2034/744 Mouse

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a surgical robot, a method for guiding a surgical arm to move, and a control device thereof. The surgical robot has a plurality of operation arms, including a camera arm with an image end instrument and a surgical arm with an operation end instrument. The method comprises the steps of: acquiring a target position that the operation end instrument is expected to reach; and adjusting the field of view of the image end instrument to move toward the target position while ensuring that the operation end instrument always remains within that field of view. The surgical robot reduces or even avoids the problem of the operation end instrument of the surgical arm moving out of the field of view provided by the image end instrument of the camera arm, thereby ensuring surgical safety.

Description

Surgical robot and method and control device for guiding surgical arm to move
Technical Field
The invention relates to the field of medical instruments, in particular to a surgical robot and a method and a control device for guiding a surgical arm to move.
Background
Minimally invasive surgery is a surgical approach in which operations are performed inside a body cavity using modern medical instruments such as laparoscopes and thoracoscopes together with related equipment. Compared with traditional surgery, minimally invasive surgery offers advantages such as smaller wounds, less pain and faster recovery.
With the progress of science and technology, minimally invasive surgical robot technology has gradually matured and become widely applied. A surgical robot includes a master console and a slave operation device. The slave operation device has a plurality of operation arms, including a camera arm with an image end instrument and a surgical arm with an operation end instrument. The master console includes a display and a handle. The physician operates the handle to control the movement of the camera arm or the surgical arm under the field of view provided by the camera arm and shown on the display.
In general, movement of the camera arm itself, and movement of the surgical arm within the field of view of the camera arm, are considered safe. In some procedures, however, the surgical arm inevitably moves out of the camera arm's field of view, for example when the surgical arm is inserted into or withdrawn from the patient's abdominal cavity. Such insertion or withdrawal is usually performed blindly, relying on the doctor's experience. Because doctors' experience differs and patients' physical conditions differ, operations based on experience alone can easily lead to unexpected and unsafe situations, so it is desirable to reduce, or even avoid, moving the surgical arm outside the field of view of the camera arm.
Disclosure of Invention
Accordingly, it is desirable to provide a surgical robot, a method of guiding the movement of the surgical arm, and a control device thereof, which can reduce or even avoid the problem that the operation end instrument of the surgical arm moves out of the field of view provided by the image end instrument of the camera arm, and can ensure the safety of the surgery.
In one aspect, the present invention provides a method of guiding movement of a surgical arm in a surgical robot having a plurality of manipulator arms including a camera arm having an end-of-image instrument and a surgical arm having an end-of-manipulation instrument, the method comprising the steps of: acquiring a target position expected to be reached by the operating terminal instrument; adjusting the field of view of the image end instrument to move towards the target position and ensuring that the operational end instrument is always within the field of view of the image end instrument.
Wherein, prior to the step of adjusting the field of view of the image tip instrument to move toward the target position and ensuring that the operative tip instrument is always within the field of view of the image tip instrument, the method further comprises: determining whether the operative tip instrument is within a field of view of the image tip instrument; adjusting the field of view of the image end instrument to position the manipulation end instrument within the field of view of the image end instrument when the manipulation end instrument is not positioned within the field of view of the image end instrument; and when the operation end instrument is positioned in the visual field of the image end instrument, adjusting the visual field of the image end instrument to move towards the target position, and ensuring that the operation end instrument is always positioned in the visual field of the image end instrument.
Wherein the step of determining whether the operation end instrument is within the field of view of the image end instrument comprises: acquiring an operation image within the field of view of the image end instrument; and determining whether the operation end instrument is within the field of view of the image end instrument by using image recognition to identify whether the operation end instrument appears in the operation image.
Wherein the step of determining whether the operative tip instrument is within the field of view of the image tip instrument comprises: acquiring the current position of the operating terminal instrument; converting a field of view of the image tip instrument into a range of positions; determining whether the operative tip instrument is within a field of view of the image tip instrument by determining whether the current position is within the range of positions.
Wherein adjusting the field of view of the image end instrument so that the operation end instrument is within the field of view of the image end instrument comprises: acquiring the current position of the operation end instrument; and adjusting the field of view of the image end instrument, by changing camera parameters of the image end instrument according to the current position of the operation end instrument, so that the operation end instrument is within the field of view of the image end instrument, the camera parameters including a field angle and/or a depth of field.
Wherein adjusting the field of view of the image end instrument so that the operation end instrument is within the field of view of the image end instrument comprises: acquiring the current position of the operation end instrument; and adjusting the field of view of the image end instrument, by changing the pose of the image end instrument (its position and/or posture) according to the current position of the operation end instrument, so that the operation end instrument is within the field of view of the image end instrument.
Wherein adjusting the field of view of the image end instrument to move toward the target position comprises: acquiring the current position of the operating terminal instrument; determining an adjustment direction of a field of view of the image tip instrument based on the current position and the target position of the manipulation tip instrument; incrementally adjusting movement of a field of view of the image tip instrument toward a target position of the manipulation tip instrument according to the adjustment direction.
Wherein the step of incrementally adjusting the field of view of the image tip instrument to move toward the target position of the manipulation tip instrument in accordance with the adjustment direction is: adjusting the movement of the field of view of the image end instrument to the target position of the operating end instrument in a manner of gradually adjusting the field angle and/or the depth of field of the image end instrument according to the adjustment direction.
Wherein the step of incrementally adjusting the field of view of the image tip instrument to move toward the target position of the manipulation tip instrument in accordance with the adjustment direction is: adjusting the movement of the field of view of the image end instrument to the target position of the operation end instrument in a manner of gradually adjusting the position and/or posture of the image end instrument according to the adjustment direction.
Wherein the step of obtaining a target location that the manipulation tip instrument is expected to reach comprises: obtaining an input mode of operation, the mode of operation including a first mode of operation for guiding insertion of the operative tip instrument to a target location and a second mode of operation for guiding withdrawal of the operative tip instrument to the target location; and determining a target position expected to be reached by the operation terminal instrument according to the acquired operation mode.
Wherein, when the acquired operation mode is a first operation mode, the step of determining a target position where the operation tip instrument is expected to reach according to the acquired operation mode includes: acquiring a target field of view of the image tip instrument; determining a target location where the manipulation tip instrument is expected to reach based on the target field of view.
Wherein two or more of the operative tip instruments configured to perform the first mode of operation have different target positions.
Wherein two or more of the operating tip instruments configured to perform the first mode of operation have different target locations with a safe distance between them.
The surgical robot comprises a puncture outfit, the proximal end of the puncture outfit is connected with the distal end of the surgical robot, the distal end of the puncture outfit is used for being inserted into and fixed at an incision, the puncture outfit is used for guiding the surgical arm to be inserted into a human body through the incision, and when the obtained operation mode is a second operation mode, the step of determining the target position which is expected to be reached by the operation terminal instrument according to the obtained operation mode is as follows: acquiring the position of the associated point associated with the puncture outfit as the target position.
Wherein the association point associated with the puncture instrument as the target position is located on the puncture instrument or on an extension of a shaft of the puncture instrument and on a distal side of the puncture instrument.
Wherein the image end instrument has a safe distance to the target location.
Wherein the method comprises: acquiring the current position of the operation end instrument; and adjusting the field of view of the image end instrument back to an initial field of view after the operation end instrument has substantially moved from the current position to the target position, the initial field of view being the field of view before the image end instrument first began to be adjusted toward the target position.
Wherein the step of adjusting the field of view of the image end instrument to move toward the target position comprises: recording, in real time, the camera parameters and pose changes of the image end instrument at each moment while the field of view moves toward the target position; and the step of adjusting the field of view of the image end instrument back to the initial field of view comprises: gradually adjusting the field of view of the image end instrument back to the initial field of view according to a field-of-view adjustment mode in which the camera parameters and pose at each moment are adjusted back to the camera parameters and pose at the adjacent previous moment.
Wherein the step of adjusting the field of view of the image end instrument to move toward the target position comprises: acquiring and recording the camera parameters and pose corresponding to the field of view of the image end instrument at that time; and the step of adjusting the field of view of the image end instrument back to the initial field of view comprises: directly restoring the field of view of the image end instrument to the initial field of view according to the recorded camera parameters and pose.
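As an illustration only (not part of the original disclosure), the record-and-restore behaviour described in the three preceding paragraphs can be sketched as follows in Python; the ViewState fields, the apply_view callback and the numeric types are assumptions, not the patent's actual data structures.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ViewState:
    """Camera parameters and pose of the image end instrument at one moment (illustrative)."""
    field_angle: float      # degrees
    depth_of_field: float   # millimetres
    position: tuple         # (x, y, z) in the reference coordinate system
    orientation: tuple      # e.g. (roll, pitch, yaw)

class ViewHistory:
    """Records view states while the field of view moves toward the target position."""
    def __init__(self, initial: ViewState):
        self.states: List[ViewState] = [initial]   # states[0] is the initial field of view

    def record(self, state: ViewState) -> None:
        self.states.append(state)

    def restore_stepwise(self, apply_view: Callable[[ViewState], None]) -> None:
        # Step the view back from each moment to the adjacent previous moment
        # until the initial field of view is recovered.
        while len(self.states) > 1:
            self.states.pop()
            apply_view(self.states[-1])

    def restore_directly(self, apply_view: Callable[[ViewState], None]) -> None:
        # Jump straight back to the recorded initial field of view.
        apply_view(self.states[0])
        self.states = self.states[:1]
```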
Wherein the method comprises the following steps: inhibiting movement of the manipulation tip instrument when the manipulation tip instrument is not within a field of view of the image tip instrument.
Wherein the method further comprises: detecting whether a starting instruction is acquired; and when the starting instruction is acquired, judging whether the operation terminal instrument is positioned in the visual field of the image terminal instrument.
The surgical robot comprises a power mechanism for mounting and driving the operating arm, and the starting command is generated by triggering the surgical arm when the surgical arm is mounted to the power mechanism.
Wherein the method comprises the following steps: acquiring an accessible interval of a visual field of the image terminal instrument, wherein the accessible interval of the visual field refers to a space set of all visual fields; acquiring the current position of the operating terminal instrument; judging whether the current position of the operation terminal instrument is located in an accessible interval of the visual field of the image terminal instrument; and entering a step of adjusting the movement of the visual field of the image end instrument to the target position when the current position of the operation end instrument is located in the reachable interval of the visual field of the image end instrument.
Wherein the image tip instrument moves synchronously with movement of the manipulation tip instrument under conditions that ensure that the manipulation tip instrument is always within a field of view of the image tip instrument.
Wherein, under the condition of ensuring that the operation end instrument is always within the field of view of the image end instrument, the movement of the image end instrument lags behind the movement of the operation end instrument.
Wherein the fields of view of the image end instrument at adjacent moments are respectively a first field of view and a second field of view, the first field of view and the second field of view are constrained to have an overlapping area, and the operation end instrument is constrained to move toward the target position through the overlapping area.
Wherein the method comprises: when the operation end instrument moves, controlling the operation end instrument to move substantially linearly in the direction from its current position to the target position.
Wherein the method comprises: calculating a deviation angle between the moving direction of the operation end instrument and the line connecting the current position of the operation end instrument to the target position; and generating at least a resistance force that resists movement of the operation end instrument in the deviating direction when the deviation angle reaches a deviation threshold.
Wherein the magnitude of the resistance force is positively correlated with the magnitude of the deviation angle.
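Purely as an illustrative sketch of the deviation-angle and resistance scheme described above: the threshold and gain values below are assumptions, and the vector representation of the moving direction and positions is hypothetical.

```python
import math

def deviation_angle(move_dir, current_pos, target_pos):
    """Angle (radians) between the instrument's moving direction and the
    line from its current position to the target position."""
    line = [t - c for t, c in zip(target_pos, current_pos)]
    dot = sum(m * l for m, l in zip(move_dir, line))
    norm = (math.sqrt(sum(m * m for m in move_dir))
            * math.sqrt(sum(l * l for l in line)))
    if norm == 0.0:
        return 0.0
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def resistance(angle, threshold=math.radians(10), gain=2.0):
    """Zero below the threshold; grows with the deviation angle above it
    (the threshold and gain are illustrative assumptions)."""
    return 0.0 if angle < threshold else gain * (angle - threshold)
```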
In another aspect, the invention provides a computer-readable storage medium storing a computer program configured to be loaded by a processor and to execute steps implementing a method according to any one of the embodiments described above.
In another aspect, the present invention provides a control device for a surgical robot, including: a memory for storing a computer program; and a processor for loading and executing the computer program; wherein the computer program is configured to be loaded by the processor and to execute steps implementing the method according to any of the embodiments described above.
In another aspect, the present invention provides a surgical robot comprising: operation arms including a camera arm having an image end instrument and a surgical arm having an operation end instrument; and a controller coupled to the operation arms and configured to perform the steps of the method according to any of the embodiments described above.
The surgical robot and the method and the control device for guiding the surgical arm to move have the following beneficial effects:
When the operation end instrument is within the field of view of the image end instrument, the field of view of the image end instrument is adjusted to move toward the target position while ensuring that the operation end instrument always remains within that field of view. The moving field of view can thus gradually guide the operation end instrument toward the target position, keeping the operation end instrument of the surgical arm observable at all times and thereby ensuring the safety and reliability of the surgery.
Drawings
FIG. 1 is a schematic structural diagram of a surgical robot according to an embodiment of the present invention;
FIG. 2 is a partial schematic view of an embodiment of the surgical robot of FIG. 1;
FIG. 3 is a flow chart of an embodiment of a method of controlling a surgical robot;
FIG. 4 is a schematic structural diagram of an operation arm and a power unit in the surgical robot;
FIG. 5 is a flow chart of a method of guiding movement of a surgical arm in a surgical robot;
FIG. 6 is a schematic diagram illustrating one embodiment of a method for guiding movement of a surgical arm in a surgical robot;
FIGS. 7-10 are flow diagrams of one embodiment of a method of guiding movement of a surgical arm in a surgical robot;
FIGS. 11-20 are schematic views of an embodiment of a guidance state of a manipulation tip instrument in a surgical arm, respectively;
FIGS. 21-25 are flow diagrams of one embodiment of a method of guiding movement of a surgical arm in a surgical robot;
fig. 26 is a schematic structural diagram of a control device of a surgical robot according to an embodiment of the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "disposed on" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. When an element is referred to as being "coupled" to another element, it can be directly coupled to the other element or intervening elements may also be present. As used herein, the terms "vertical," "horizontal," "left," "right," and the like are for purposes of illustration only and are not intended to represent the only embodiments. As used herein, the terms "distal" and "proximal" are used as terms of orientation that are conventional in the art of interventional medical devices, wherein "distal" refers to the end of the device that is distal from the operator during a procedure, and "proximal" refers to the end of the device that is proximal to the operator during a procedure. The terms "first/second" and the like as used herein denote one element and a class of two or more elements having common characteristics.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. The term "each" as used in the present invention includes one or more than two.
Fig. 1 to 2 are schematic structural diagrams and partial schematic diagrams of a surgical robot according to an embodiment of the present invention.
The surgical robot includes a master console 2 and a slave operation device 3 controlled by the master console 2. The master console 2 has a motion input device 21 and a display 22. A surgeon operates the motion input device 21 to send control commands to the slave operation device 3, causing it to perform the corresponding operations, and observes the operation region through the display 22. The slave operation device 3 has a driving arm, which comprises a robot arm 30 and one or more operation arms 31 detachably attached to the distal end of the robot arm 30. The robot arm 30 includes a base and a connecting assembly connected in sequence, the connecting assembly having a plurality of joint assemblies. The operation arm 31 comprises a connecting rod 32, a connecting assembly 33 and an end instrument 34 connected in sequence; the connecting assembly 33 has a plurality of joint assemblies, and the pose of the end instrument 34 is adjusted by adjusting the joint assemblies of the operation arm 31. The end instruments 34 include an image end instrument 34A and an operation end instrument 34B. The image end instrument 34A is used to acquire images within its field of view, and the display 22 is used to display those images. The operation end instrument 34B is used to perform surgical operations such as cutting and stapling. The operation arm with the image end instrument 34A is referred to herein as a camera arm 31A, and the operation arm with the operation end instrument 34B as a surgical arm 31B.
The surgical robot shown in fig. 1 is a single-hole surgical robot: each operation arm 31 is inserted into the patient through the same puncture instrument 4 installed at the distal end of the robot arm 30. In a single-hole surgical robot, the surgeon generally only controls the operation arms 31 to complete the basic surgical procedure. The operation arm 31 of a single-hole surgical robot should therefore have both positional degrees of freedom (positioning) and attitude degrees of freedom (orientation) so that its pose can be changed within a certain range. For example, the operation arm 31 has a horizontal movement degree of freedom x, a vertical movement degree of freedom y, a rotation degree of freedom α, a pitch degree of freedom β and a yaw degree of freedom γ, and can also realize a forward/backward movement degree of freedom z (a feed degree of freedom) under the drive of the distal joint assembly of the robot arm 30, i.e. the power mechanism 301. In some embodiments, the operation arm 31 may additionally be provided with one, two or even more redundant degrees of freedom to realize further functions. For example, the power mechanism 301 has a guide rail and a power portion slidably disposed on the guide rail, with the operation arm 31 detachably mounted on the power portion: the sliding of the power portion on the guide rail provides the forward/backward degree of freedom z of the operation arm 31, while the power portion drives the joint assemblies of the operation arm 31 to realize the remaining five degrees of freedom (i.e. [x, y, α, β, γ]).
The surgical robot also includes a controller. The controller may be integrated in the master console 2 or in the slave console 3. Of course, the controller may also be independent of the master console 2 and the slave console 3, which may be deployed locally, for example, or in the cloud, for example. The controller may be configured with one or more processors.
The surgical robot further includes an input. The input may be integrated into the main console 2. The input section may also be integrated in the slave operating device 3. Of course, the input unit may be independent of the master console 2 and the slave console 3. The input unit may be, for example, a mouse, a keyboard, a voice input device, or a touch panel. In one embodiment, a touch screen is used as the input unit, and the touch screen may be disposed on an armrest of the main console 2, for example.
The operating arm 31 also includes sensors that sense joint variables of the joint assembly. The sensors include an angle sensor for sensing the rotational movement of the joint assembly and a displacement sensor for sensing the linear movement of the joint assembly, and the sensors can be adapted according to the type of the joint assembly.
A controller is coupled to the sensors and to an input and display 22.
For example, as shown in fig. 3, a storage unit 311 is installed on the abutting surface of the driving box 310 of the operation arm 31 that abuts the power portion 302 of the power mechanism 301, and a reading unit 303 matched with the storage unit 311 is installed on the abutting surface of the power portion 302 that abuts the driving box 310. The reading unit 303 is coupled to the controller; when the operation arm 31 is mounted on the power portion 302, the reading unit 303 communicates with the storage unit 311 and reads the relevant information from it. The storage unit 311 is, for example, a memory or an electronic tag, and stores, for example, the type of the operation arm, the parts of the operation arm that can be configured as the target portion, a kinematic model of the operation arm, and the like. The storage unit 311 of the camera arm 31A additionally stores camera parameters.
Fig. 4 is a schematic structural diagram of a surgical robot according to another embodiment of the present invention, more specifically a multi-hole surgical robot. The main difference between the multi-hole surgical robot shown in fig. 4 and the single-hole surgical robot shown in fig. 1 lies in their slave operation devices. In the multi-hole surgical robot shown in fig. 4, the driving arm of the slave operation device has a robot arm 110, adjusting arms 120, manipulators 130 and operation arms 150 connected in sequence. The adjusting arms 120, manipulators 130 and operation arms 150 are equal in number, being two or more, for example four. The distal end of the robot arm 110 has an orientation platform, the proximal ends of the adjusting arms 120 are all connected to the orientation platform, and the proximal end of each manipulator 130 is connected to the distal end of an adjusting arm 120. The manipulator 130 detachably connects the operation arm 150 and has a plurality of joint assemblies. Each manipulator 130 has a power mechanism on which the operation arm 150 is mounted and by which it is driven. In a multi-hole surgical robot, different operation arms 150 are inserted into the patient through different puncture instruments. Compared with the operation arm 31 of a single-hole surgical robot, the operation arm 150 of a multi-hole surgical robot generally has fewer degrees of freedom; typically it has only attitude degrees of freedom (orientation), and although a change of attitude generally also affects position, the effect is small and can be neglected in some situations. The change of the position of the operation arm 150 is generally realized with the aid of the manipulator 130; since the manipulator 130 and the operation arm 150 are linked to realize the change of pose, the two can be regarded as one manipulator assembly, equivalent to the operation arm 31 in the single-hole surgical robot.
According to the configuration, the motion input device 21 can input a pose command, including a position command and an attitude command, to control the change of the pose of the distal end of a first portion of the driving arm. The distal end of the first portion typically refers to the end instrument, or to the joint assembly associated with the end instrument; a change of the pose of the end instrument generally corresponds to a change of the pose of that joint assembly.
In the surgical robot shown in fig. 1, the driving arm includes a robot arm and an operation arm, the proximal end of the operation arm is mounted at the distal end of the robot arm, and the distal end instrument is mounted at the distal end of the operation arm. According to a configuration, the first portion may be configured to be an operating arm; alternatively, the first portion may be configured as an integral part of the robotic arm and the handling arm.
Correspondingly, in the surgical robot shown in fig. 4, the driving arm includes a robot arm, an adjusting arm, a manipulator and an operation arm; the proximal end of the adjusting arm is mounted at the distal end of the robot arm, the proximal end of the manipulator is mounted at the distal end of the adjusting arm, the proximal end of the operation arm is mounted at the distal end of the manipulator, and the end instrument is mounted at the distal end of the operation arm. According to the configuration, the first portion may be configured as the operation arm; alternatively, the first portion may be configured as the whole formed by the manipulator and the operation arm; alternatively, the first portion may be configured as the whole formed by the robot arm, the adjusting arm, the manipulator and the operation arm.
It can be understood that, in both the single-hole surgical robot shown in fig. 1 and the multi-hole surgical robot shown in fig. 4, the mechanical arm is generally used to adjust the pose of the end instrument in a wide range, and the operation arm is used to finely adjust the pose of the end instrument, for example, the mechanical arm and the like are used to position before operation, and the operation is mainly performed by controlling the operation arm during operation. Of course, in some embodiments, the specific function may also be realized by combining the corresponding arm structures such as the mechanical arm and the operation arm to cooperatively move together. Depending on the configuration, more than one of the end instruments may be configured as a controlled end instrument to accept control of the motion-input device.
In one embodiment, the present invention provides a method for guiding movement of a surgical arm in a surgical robot, which can be performed by a controller and is applicable to various types of surgical robots. As shown in fig. 5, the method includes the steps of:
In step S11, a target position that the operation end instrument is expected to reach is acquired.
The position and posture of different objects involved in the present invention, including but not limited to the image end instrument of the camera arm and the operation end instrument of the surgical arm, are described based on the same reference coordinate system. The reference coordinate system includes, but is not limited to, a base coordinate system of the surgical robot, and may be other coordinate systems that can be used as a reference, for example, a coordinate system of the main operating table, which is converted from the base coordinate system.
The target position may be a position currently located within the field of view of the end-of-image instrument, or a position not currently located within the field of view of the end-of-image instrument, which may be specifically determined according to the requirements during the surgical procedure.
Step S12, adjusting the field of view of the image end instrument to move towards the target position and ensuring that the manipulation end instrument is always within the field of view of the image end instrument.
It is a precondition for performing this step S12 that the operation end instrument is located within the field of view of the image end instrument. That is, when the manipulation end instrument is positioned within the field of view of the image end instrument, the field of view of the image end instrument is adjusted to move toward the target position, and it is ensured that the manipulation end instrument is always positioned within the field of view of the image end instrument.
In step S12, adjusting the movement of the field of view of the image tip instrument toward the target position may be performed in response to the manipulation tip instrument moving toward the target position. Preferably, adjusting the movement of the field of view of the end-of-image instrument toward the target position is performed in response to movement of the manipulation end instrument within the field of view of the end-of-image instrument toward the target position. The operative tip instrument is always located within the field of view of the image end instrument, for example, one or more points on a part of the operative tip instrument, such as its tip or middle or tail end, may be always located within the field of view of the image end instrument, or the entire operative tip instrument may be always located within the field of view of the image end instrument.
For example, as shown in fig. 6, as the field of view of the image end instrument moves from the current position of the operation end instrument toward the target position, suppose it passes through fields of view 0, 1, 2, …, n-1, n, while the operation end instrument passes through positions P0, P1, P2, …, Pn-1, Pn, where P0 and P1 lie in field of view 0, P1 and P2 lie in field of view 1, …, and Pn-1 and Pn lie in field of view n. The operation end instrument thus always moves within the field of view of the image end instrument. Throughout the process, the field of view of the image end instrument keeps approaching the target position and the operation end instrument keeps approaching the target position, until the field of view reaches the target position and the operation end instrument has been guided to it.
In the above steps S11 to S12, when the operation end instrument is within the field of view of the image end instrument, the field of view of the image end instrument is adjusted to move toward the target position while the operation end instrument is kept within that field of view at all times. The moving field of view therefore gradually guides the operation end instrument to the target position, the operation end instrument of the surgical arm remains observable throughout, and the safety and reliability of the surgery are ensured.
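A minimal sketch of the guidance loop implied by steps S11 to S12 is given below for illustration. All of the callables (get_instrument_pos, get_view_center, set_view_center, is_in_view) are hypothetical interfaces to the robot and camera controller, and the step size, tolerance and polling interval are assumed values.

```python
import time
import numpy as np

def guide_to_target(get_instrument_pos, get_view_center, set_view_center, is_in_view,
                    target, step=2.0, tol=5.0):
    """Nudge the field-of-view centre toward `target` (mm, reference frame) only while
    the operation end instrument is still inside the view, so the instrument is guided
    step by step and never leaves sight."""
    target = np.asarray(target, dtype=float)
    while np.linalg.norm(np.asarray(get_instrument_pos()) - target) > tol:
        if is_in_view(get_instrument_pos()):
            center = np.asarray(get_view_center(), dtype=float)
            direction = target - center
            dist = np.linalg.norm(direction)
            if dist > 1e-6:
                set_view_center(center + direction / dist * min(step, dist))
        time.sleep(0.05)  # wait for the instrument to follow before the next adjustment
```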
Before proceeding to step S12, with continued reference to fig. 5, specifically before the step of adjusting the movement of the field of view of the end-of-image instrument to the target position, it may include:
in step S13, it is determined whether the operation end instrument is positioned within the field of view of the image end instrument.
If the manipulation end instrument is located within the field of view of the image end instrument, proceeding to step S12; and if the operation end instrument is not located within the field of view of the image end instrument, the process proceeds to step S14. Step S14 is: the field of view of the image tip instrument is adjusted so that the manipulation tip instrument is within the field of view of the image tip instrument.
There are various methods of implementing the above-described step S13, i.e. of determining whether the operation end instrument is within the field of view of the image end instrument; the present invention illustrates two of them below.
In one embodiment, as shown in fig. 7, the step S13 may include:
step S131, acquiring an operation image in the visual field of the image end instrument.
In step S132, it is determined whether the operation end instrument is positioned within the field of view of the image end instrument by recognizing whether the operation end instrument is positioned within the operation image.
In this step S132, if it is recognized that the operation terminal instrument exists within the operation image, it is determined that the operation terminal instrument is located within the field of view of the image terminal instrument; and if it is recognized that the operation end instrument does not exist in the operation image, it is judged that the operation end instrument is not located in the field of view of the image end instrument.
To better perform image recognition, a neural network may be trained for image recognition. For example, the trained neural network may be a convolutional neural network.
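For illustration only, a detection step of this kind might be wrapped as below; the detector callable stands in for any trained classifier (such as the convolutional neural network mentioned above), and the 0.5 threshold is an assumption.

```python
def instrument_in_view(operation_image, detector) -> bool:
    """Sketch of steps S131/S132: `detector` is a hypothetical trained model that
    returns the probability that the operation end instrument appears in the image
    captured within the image end instrument's field of view."""
    probability = detector(operation_image)   # hypothetical model interface
    return probability >= 0.5                 # decision threshold is an assumption
```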
In another embodiment, as shown in fig. 8, the step S13 may further include:
step S131', the current position of the operation tip instrument is acquired.
The current position of the operative tip instrument may be obtained using positive kinematic calculations in conjunction with a kinematic model of the surgical arm and joint variables from joint components in the surgical arm. These joint variables may be detected by sensors at the respective joint components. In other embodiments, when the end-of-image instrument is, for example, a stereoscopic end-of-image instrument, the current position of the operating end instrument may be scanned and identified by adjusting the stereoscopic end-of-image instrument, for example, by identifying the position of the operating end instrument relative to the end-of-image instrument, and the current position of the operating end instrument in the reference coordinate system may be determined by coordinate system transformation.
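As a hedged illustration of the forward-kinematics route to the current position, the toy chain below uses simple homogeneous transforms; the joint layout, link lengths and joint values are assumptions and do not represent the patent's kinematic model of the surgical arm.

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans(x, y, z):
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def tip_position(joint_angles, link_lengths):
    """Forward kinematics for a toy chain of revolute joints (illustrative only):
    each joint rotates about z, each link extends along x. Returns the tip position
    in the base (reference) coordinate system."""
    T = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ rot_z(theta) @ trans(length, 0.0, 0.0)
    return T[:3, 3]

# Example: three joint values read from the sensors at the joint assemblies (values assumed).
print(tip_position([0.1, -0.3, 0.2], [100.0, 80.0, 60.0]))
```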
Step S132', convert the field of view of the image end instrument into a range of positions.
The field of view is a region with actual boundaries and can therefore be converted into a range of positions, for example a range of positions in the reference coordinate system.
Step S133' determines whether the operation end instrument is positioned within the field of view of the image end instrument by determining whether the current position is positioned within the position range.
In this step S133', if the current position of the operation end instrument is located within the position range corresponding to the field of view of the image end instrument, it is determined that the operation end instrument is located within the field of view of the image end instrument; and if the current position of the operation end instrument is not located in the position range corresponding to the visual field of the image end instrument, judging that the operation end instrument is not located in the visual field of the image end instrument.
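A simplified sketch of steps S131' to S133' is shown below; it models the field of view as a cone bounded by near and far depths, which is an assumption made only to keep the example short (a real implementation would use the actual frustum of the image end instrument).

```python
import numpy as np

def point_in_view(point, cam_pos, view_dir, field_angle_deg, near, far):
    """Return True if `point` lies inside the position range corresponding to the
    field of view, modelled here as a cone between the `near` and `far` depths along
    `view_dir` with half-angle field_angle/2 (a simplification)."""
    cam_pos = np.asarray(cam_pos, dtype=float)
    view_dir = np.asarray(view_dir, dtype=float)
    view_dir = view_dir / np.linalg.norm(view_dir)
    v = np.asarray(point, dtype=float) - cam_pos
    depth = float(np.dot(v, view_dir))             # distance along the viewing axis
    if not (near <= depth <= far):
        return False
    radial = np.linalg.norm(v - depth * view_dir)  # distance off the viewing axis
    return radial <= depth * np.tan(np.radians(field_angle_deg) / 2.0)
```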
In some embodiments, the two approaches may be combined to mutually verify whether the operative end instrument is within the field of view of the imaging end instrument. For example, when the result determined by the image recognition and the result determined by the position detection do not coincide, the adjustment of the field of view of the image end instrument may be stopped first for safety, and the adjustment of the field of view of the image end instrument may be continued after the confirmation instruction of the doctor is acquired. This process can also be used to calibrate the neural network for image recognition to improve the accuracy of its determination.
The step S132 of adjusting the field of view of the image end instrument so that the operation end instrument is positioned within the field of view of the image end instrument may be implemented in various ways.
In one embodiment, as shown in fig. 9, the step S132 may include:
in step S1321, the current position of the operation tip instrument is acquired.
Step S1322: adjusting the field of view of the image end instrument, by changing the camera parameters of the image end instrument according to the current position of the operation end instrument, so that the operation end instrument is within the field of view of the image end instrument.
The camera parameters include the field angle and/or the depth of field. This approach can be used when it is determined in advance that adjusting only the camera parameters allows the field of view to cover both the current position and the target position of the operation end instrument, so that the pose of the image end instrument can be kept unchanged.
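For illustration, the geometric check below estimates the smallest field angle that would cover a set of points (for example the current position and the target position of the operation end instrument) with the camera pose held fixed; the conical view model is an assumption.

```python
import numpy as np

def required_field_angle(cam_pos, view_dir, points):
    """Smallest full field angle (degrees) that keeps every point in `points`
    inside a conical field of view while the camera pose stays fixed."""
    cam_pos = np.asarray(cam_pos, dtype=float)
    view_dir = np.asarray(view_dir, dtype=float)
    view_dir = view_dir / np.linalg.norm(view_dir)
    half_angle = 0.0
    for p in points:
        v = np.asarray(p, dtype=float) - cam_pos
        n = np.linalg.norm(v)
        if n < 1e-9:
            continue  # point coincides with the camera position; skip it
        angle = np.arccos(np.clip(np.dot(v, view_dir) / n, -1.0, 1.0))
        half_angle = max(half_angle, angle)
    return np.degrees(2.0 * half_angle)

# If the result exceeds the lens's maximum field angle, the pose must be adjusted instead.
```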
In another embodiment, as shown in fig. 10, the step S132 may further include:
step S1321', the current position of the operation tip instrument is acquired.
Step S1322', adjusting the field of view of the image end instrument by changing the pose of the image end instrument so that the manipulation end instrument is within the field of view of the image end instrument, according to the current position of the manipulation end instrument.
The pose comprises a position and/or a posture. This approach can be used when it is determined in advance that adjusting only the pose of the image end instrument allows the field of view to cover the area from the current position of the operation end instrument to the target position, so that the camera parameters of the image end instrument can be kept unchanged.
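A corresponding sketch for the pose-based adjustment is given below; aiming the viewing direction at the midpoint between the current position and the target position is only one possible choice, used here as an assumption rather than the patent's prescribed rule.

```python
import numpy as np

def look_at_direction(cam_pos, current_pos, target_pos):
    """Viewing direction that aims the image end instrument at the midpoint between
    the operation end instrument's current position and the target position, so both
    tend to fall in the view without changing the camera parameters (illustrative)."""
    midpoint = (np.asarray(current_pos, dtype=float) + np.asarray(target_pos, dtype=float)) / 2.0
    d = midpoint - np.asarray(cam_pos, dtype=float)
    return d / np.linalg.norm(d)
```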
For example, a multi-hole surgical robot is shown in fig. 11 to 13. Assume that the current position of the image tip instrument of the camera arm is B0, the current position of the operation tip instrument of the surgical arm is a0, the field of view of B0 is the target field of view, and a0 is located outside the target field of view, as shown in fig. 11. In one embodiment, the field of view may be adjusted by adjusting camera parameters of the image end instrument, such as the field angle, as shown in FIG. 12, keeping the current position B0 of the image end instrument constant, so that A0 falls within the adjusted field of view. In one embodiment, the field of view may be adjusted by adjusting the pose of the end-of-image instrument, such as position B1, so that A0 falls within the adjusted field of view, as shown in FIG. 13, while keeping the camera parameters of the end-of-image instrument unchanged.
Of course, the same applies to a single-hole surgical robot, see fig. 14 to 16. Assume that the current position of the image end instrument of the camera arm is B0, the current position of the operation end instrument of the surgical arm is A0, the field of view at B0 is the target field of view, and A0 is outside the target field of view, as shown in fig. 14. In one embodiment, as shown in fig. 15, the field of view is adjusted by adjusting the position of the image end instrument (for example to B1), keeping the camera parameters of the image end instrument unchanged, so that A0 falls within the adjusted field of view. As shown in fig. 16, the field of view may instead be adjusted by adjusting the posture of the image end instrument to B1, again keeping the camera parameters unchanged, so that A0 falls within the adjusted field of view. Of course, in some cases the current position B0 of the image end instrument may be kept unchanged and the field of view adjusted by changing a camera parameter of the image end instrument, such as the field angle, so that A0 falls within the adjusted field of view; this case is not illustrated.
As shown in fig. 12 and 13, when the field of view of the image end instrument includes the current position and the target position of the manipulation end instrument, the field of view of the image end instrument may be kept constant, and the manipulation end instrument may be guided to move from the current position to the target position directly according to the field of view.
When the field of view of the image end instrument does not include the target position of the operation end instrument, as in fig. 15, the field of view of the image end instrument may first be kept unchanged, as shown in fig. 17, and the operation end instrument guided to move from A0 to A1 within that field of view. The position of the operation end instrument is then held while the field of view is adjusted toward the target position, for example by adjusting the position of the image end instrument to B2, while ensuring that the operation end instrument remains within the field of view. The operation end instrument is then guided to move again within the new field of view, and these steps are repeated periodically until the operation end instrument reaches the target position.
Alternatively, when the field of view of the image end instrument does not include the target position of the operation end instrument, as in fig. 16, the field of view of the image end instrument may first be kept unchanged and the operation end instrument guided to move from A0 to A1 within that field of view. As shown in fig. 19, the position of the operation end instrument is then held while the field of view is adjusted toward the target position by adjusting the position and posture of the image end instrument (to posture B2), ensuring that the operation end instrument remains within the field of view; the operation end instrument is then guided to move toward the target position within the new field of view, and these steps are repeated periodically until the operation end instrument reaches the target position.
Fig. 11-20 illustrate embodiments of surgical arm insertion procedures, although these methods or principles are applicable even to surgical arm retraction procedures.
In some embodiments, the two methods may also be combined so that the field of view of the image end instrument moves to better cover the current position and/or target position of the operation end instrument. For example, the pose of the image end instrument may be adjusted preferentially; alternatively, the camera parameters of the image end instrument may be adjusted preferentially. The preferentially adjusted object (the pose of the image end instrument or its camera parameters) can be configured according to instructions entered by the physician. If the pose of the image end instrument is adjusted preferentially, the pose is adjusted as far as possible to move the field of view toward the current position and/or target position of the operation end instrument; if, when the adjustment reaches its limit, the field of view still does not cover the current position and/or target position of the operation end instrument, the camera parameters of the image end instrument are then adjusted so that the field of view covers them. When the camera parameters of the image end instrument are adjusted preferentially, the camera parameters are adjusted as far as possible to move the field of view toward the current position and/or target position of the operation end instrument; if, when the adjustment reaches its limit, the field of view still does not cover the current position and/or target position of the operation end instrument, the pose of the image end instrument is then adjusted so that the field of view covers them.
In some embodiments, even when only the pose of the image end instrument is adjusted, priorities can be set within it: for example, the posture may be adjusted preferentially, or the position may be adjusted preferentially. Similarly, even when only the camera parameters of the image end instrument are adjusted, priorities can be set within them: for example, the field angle may be adjusted preferentially, or the depth of field may be adjusted preferentially. These preferentially adjusted objects (posture versus position within the pose, and/or field angle versus depth of field within the camera parameters) can likewise be configured according to instructions entered by the physician.
In some embodiments, multiple levels of priority may be configured for adjusting the field of view of the image end instrument, and the corresponding parameters of the image end instrument are adjusted in stages according to the configured priorities until the field of view covers the current position and/or target position of the operation end instrument. For example, the first level of priority is to adjust the pose of the image end instrument before its camera parameters, the second level is to adjust the posture before the position within the pose, and the third level is to adjust the field angle before the depth of field within the camera parameters. Assuming that the field of view can be made to cover the current position and/or target position of the operation end instrument by combining adjustment of the pose and the camera parameters of the image end instrument, the overall working process is approximately as follows (see the sketch after these steps):
firstly, the posture of the image end instrument is adjusted until its field of view reaches its reachable limit;
if the field of view can now cover the current position and/or target position of the operation end instrument, the adjustment ends; otherwise, the position of the image end instrument is adjusted until the field of view reaches its reachable limit;
if the field of view can now cover the current position and/or target position of the operation end instrument, the adjustment ends; otherwise, the field angle of the image end instrument is further adjusted until the field of view reaches its reachable limit;
if the field of view can now cover the current position and/or target position of the operation end instrument, the adjustment ends; otherwise, the depth of field of the image end instrument is adjusted until it reaches its limit, at which point the field of view can cover the current position and/or target position of the operation end instrument.
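Purely as an illustration, the staged loop above can be sketched as follows. All of the names (guide_field_of_view, adjust_posture_step, field_covers, and so on) are hypothetical placeholders for the robot-specific kinematics and camera interfaces; this is a minimal sketch of the priority-ordered adjustment under those assumptions, not a prescribed implementation.

```python
# Minimal sketch of the staged, priority-ordered field-of-view adjustment.
# Every adjust_* helper (hypothetical) nudges one parameter of the image end
# instrument and returns False once that parameter has reached its limit.

def guide_field_of_view(camera, target_points, field_covers):
    """Adjust the camera in priority order until its field of view covers
    every point in target_points, or every adjustment stage is exhausted."""
    stages = [
        camera.adjust_posture_step,         # 1st priority: posture (orientation)
        camera.adjust_position_step,        # 2nd priority: position
        camera.adjust_field_angle_step,     # 3rd priority: field angle
        camera.adjust_depth_of_field_step,  # 4th priority: depth of field
    ]
    for step in stages:
        while not field_covers(camera.field_of_view(), target_points):
            if not step():                  # this stage reached its reachable limit
                break
        if field_covers(camera.field_of_view(), target_points):
            return True                     # adjustment finished
    return False                            # coverage not achievable with this camera
```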
In one embodiment, as shown in fig. 21, the step S12 of adjusting the field of view of the image end instrument to move to the target position includes:
In step S121, the current position of the operation end instrument is acquired.
In step S122, the adjustment direction of the field of view of the image end instrument is determined according to the current position and the target position of the operation end instrument.
The adjustment direction is always tangent, at a passing point, to a path planned from the current position of the operation end instrument to the target position. For example, when the planned path is the straight line connecting the current position and the target position, the adjustment direction is the extension direction of that line, more specifically the direction from the current position toward the target position.
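For the straight-line case just described, the adjustment direction reduces to the unit vector from the current position to the target position. A minimal sketch, assuming NumPy and a hypothetical function name:

```python
import numpy as np

def adjustment_direction(current_pos, target_pos, eps=1e-9):
    """Unit vector pointing from the operation end instrument's current
    position toward the target position (the tangent of a straight path)."""
    current_pos = np.asarray(current_pos, dtype=float)
    target_pos = np.asarray(target_pos, dtype=float)
    direction = target_pos - current_pos
    norm = np.linalg.norm(direction)
    if norm < eps:                 # already at the target position
        return np.zeros_like(direction)
    return direction / norm
```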
In step S123, the field of view of the image end instrument is adjusted to move incrementally toward the target position of the operation end instrument along the adjustment direction.
In step S123, constraints may be imposed on the movement of the field of view of the image end instrument. For example, a passing point as described above may be constrained to lie at a specific location within the field of view, such as the center of the field of view or a point having a specific positional relationship with it. Other constraints may also be imposed; for example, after the field of view moves, a point having a specific positional relationship with the field of view may be required to remain at a specific distance, in a specific direction, from the real-time-updated current position of the operation end instrument.
In one embodiment, the fields of view of the image end instrument at two adjacent moments are a first field of view and a second field of view, respectively. An overlapping region is required between the first field of view and the second field of view, and the operation end instrument is constrained to move toward the target position through this overlapping region, so that the operation end instrument always remains within the field of view.
In step S123, the field of view of the image end instrument may be moved toward the target position of the operation end instrument along the adjustment direction by gradually adjusting the field angle and/or the depth of field of the image end instrument. Alternatively, the field of view may be moved toward the target position by gradually adjusting the position and/or posture of the image end instrument along the adjustment direction. Of course, the two may also be combined; reference may be made to the embodiments described above for adjusting the field of view of the image end instrument so that the operation end instrument lies within it, which are not repeated here.
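One way to picture the incremental movement of step S123, together with the overlap constraint of the preceding embodiment, is the sketch below, in which the field of view is simplified to a circle given by its centre and radius; the names and the step cap are illustrative assumptions only.

```python
import numpy as np

def step_field_of_view(center, radius, direction, step, overlap_factor=1.5):
    """Move a circular field of view (center, radius) one increment along
    `direction`. The step is capped so that consecutive fields of view keep an
    overlapping region through which the operation end instrument can move."""
    # Two equal circles overlap when their centres are closer than 2 * radius;
    # capping the step at overlap_factor * radius (< 2) guarantees an overlap.
    step = min(step, overlap_factor * radius)
    new_center = np.asarray(center, dtype=float) + step * np.asarray(direction, dtype=float)
    return new_center, radius
```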
Typically, different operational requirements correspond to different target positions of the operation end instrument. In one embodiment, as shown in fig. 22, the step S11 of acquiring the target position that the operation end instrument is expected to reach includes:
In step S111, an input operation mode is acquired.
The operation mode includes, but is not limited to, a first operation mode and a second operation mode. For example, the first operation mode is used to guide the operation end instrument as it is inserted toward the target position; applicable scenarios include, but are not limited to, inserting the surgical arm into the body from outside the patient before the operation. The second operation mode is used to guide the operation end instrument as it is withdrawn to the target position; applicable scenarios include, but are not limited to, replacing the surgical arm during the operation and withdrawing the surgical arm at the end of the operation.
In step S112, the target position that the operation end instrument is expected to reach is determined according to the acquired operation mode.
In an embodiment, when the acquired operation mode is the first operation mode, as shown in fig. 23, the step S112 of determining the target position that the operation end instrument is expected to reach according to the acquired operation mode includes:
In step S1121, a target field of view of the image end instrument is acquired.
The target field of view may be, for example, a field of view confirmed by the physician at a certain moment; the operation is usually performed within the target field of view. For example, before the operation the physician typically inserts the camera arm into the patient and uses its image end instrument to observe and determine a field of view suitable for the operation; after a confirmation instruction triggered by the physician is received, the field of view at the moment the confirmation instruction was generated is taken as the target field of view.
In step S1122, the target position that the operation end instrument is expected to reach is determined based on the target field of view.
The target position is a point having a specific positional relationship with the target field of view. Examples include, but are not limited to, the center of the target field of view, or a point offset from that center that lies on the extension direction of the links of the surgical arm. The target position is, for example, the point M in the target field of view in figs. 11 and 14; in fig. 11 the point M is illustrated as not being the center of the target field of view, while in fig. 14 it is illustrated as the center.
In one embodiment, when more than two operation end instruments are configured to perform the first operation mode, they have different target positions; further, a safe distance is typically maintained between the different target positions to avoid collisions between the operation end instruments. For example, one of the target positions may be the center point of the target field of view, while the others are specific points away from that center.
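A possible way to assign distinct target positions with a guaranteed safe distance is sketched below: the first instrument receives the centre of the target field of view and the remaining ones are placed on a ring around it. The ring layout and the function name are assumptions for illustration, not part of this disclosure.

```python
import numpy as np

def assign_target_positions(view_center, n_instruments, safe_distance):
    """Place the first target position at the target field-of-view centre (a
    3-D point) and the remaining ones on a ring around it, so that any two
    target positions are at least `safe_distance` apart."""
    center = np.asarray(view_center, dtype=float)
    positions = [center]
    n_ring = n_instruments - 1
    if n_ring <= 0:
        return positions
    # Ring radius large enough that ring neighbours are also separated by at
    # least safe_distance (centre-to-ring separation equals the radius).
    radius = safe_distance
    if n_ring > 1:
        radius = max(radius, safe_distance / (2.0 * np.sin(np.pi / n_ring)))
    for k in range(n_ring):
        angle = 2.0 * np.pi * k / n_ring
        offset = radius * np.array([np.cos(angle), np.sin(angle), 0.0])
        positions.append(center + offset)
    return positions
```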
In one embodiment, when more than two operation end instruments are to be guided to their target positions, they may be guided one by one: after one operation end instrument reaches its target position, the next is guided, until all of them have been guided to their target positions.
The surgical robot includes a puncture device whose proximal end is detachably connected to the distal end of the surgical robot and whose distal end is inserted into and fixed at the incision; the puncture device guides the surgical arm into the body through the incision. In one embodiment, when the acquired operation mode is the second operation mode, the step S112 of determining the target position that the operation end instrument is expected to reach according to the acquired operation mode may be: acquiring the position of a point associated with the puncture device as the target position. The point associated with the puncture device may be located on the puncture device itself, or on the distal-side extension of the axis of the puncture device, which generally has a cylindrical insertion portion; the axis here generally refers to the central axis of that insertion portion.
In the above embodiments, a safe distance is maintained between the image end instrument and the target position to prevent a collision between the image end instrument and the operation end instrument.
In some embodiments, as shown in fig. 24, the method of the present invention may further comprise:
In step S141, the current position of the operation end instrument is acquired.
In step S142, when the operation end instrument has substantially moved from the current position to the target position, the field of view of the image end instrument is adjusted to return to the initial field of view.
Whether the operation end instrument has substantially moved from the current position to the target position can be judged automatically by the system or by the physician. For automatic judgment, the system may determine that the current position (updated in real time) has substantially reached the target position when the deviation between the current position and the target position is less than a deviation threshold. The physician's judgment should generally also be conditional: for example, a confirmation instruction entered by the physician indicating that the current position has substantially reached the target position takes effect only when the deviation between the current position and the target position is less than the deviation threshold; alternatively, the system may allow the physician to enter the confirmation instruction only when that deviation is less than the deviation threshold, the threshold check serving as a preliminary judgment.
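The automatic judgment reduces to a simple deviation-threshold test; a minimal sketch, in which the threshold value is an arbitrary placeholder:

```python
import numpy as np

def substantially_reached(current_pos, target_pos, deviation_threshold=0.002):
    """True when the operation end instrument's current position is within the
    deviation threshold (placeholder value, in metres) of the target position."""
    deviation = np.linalg.norm(np.asarray(target_pos, dtype=float) -
                               np.asarray(current_pos, dtype=float))
    return deviation < deviation_threshold
```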
The initial field of view refers to the field of view at the moment just before the image end instrument was first adjusted. For example, for an operation end instrument performing the first operation mode, the initial field of view is the target field of view confirmed by the physician; for an operation end instrument performing the second operation mode, the initial field of view may likewise be a target field of view confirmed by the physician.
In the above step S142, the purpose of adjusting the image end instrument to return to the initial field of view can be achieved in various ways.
In one embodiment, the step S12 of adjusting the field of view of the image end instrument to move toward the target position includes: recording, in real time, the camera parameters of the image end instrument and the changes in its pose at each corresponding moment while the field of view moves toward the target position. Accordingly, in step S142, the step of adjusting the field of view of the image end instrument to return to the initial field of view restores the field of view step by step in reverse, according to the recorded camera parameters and poses of the image end instrument at the corresponding moments.
Assume that from time T0 to time Tn, the instruction parameters used to adjust the field of view of the image end instrument at each of those times are as follows:

Time                    T0    T1    T2    T3    ...    Tn
Instruction parameters  C0    C1    C2    C3    ...    Cn
Here T0 is the moment before the field of view of the image end instrument starts moving toward the target position, Tn is the moment at which the field of view reaches the target position, the moments between T0 and Tn are intermediate moments, and C0 to Cn are the instruction parameters corresponding to times T0 to Tn. The instruction parameters include one or more of the camera parameters and the pose, depending on the manner in which the field of view of the image end instrument is adjusted. The initial field of view is the one obtained with instruction parameter C0.
In one embodiment, the field of view of the image end instrument may be restored to the initial field of view step by step, by adjusting the camera parameters and pose at each later moment back to those of an adjacent earlier moment, where "adjacent" may mean consecutive moments or moments spaced apart. For example, all of the instruction parameters from Cn back to C0, corresponding to times Tn to T0, may be executed in reverse to restore the field of view of the image end instrument to the initial field of view. Alternatively, only the instruction parameters corresponding to some of the times between Tn and T0 may be executed in reverse; for example, executing the instruction parameters Cn, C3 and C0 corresponding to times Tn, T3 and T0 restores the field of view to the initial field of view. This approach is simple and fast.
In another embodiment, the step S12 of adjusting the field of view of the image end instrument to move toward the target position includes: acquiring and recording the camera parameters and pose corresponding to the current field of view of the image end instrument. Accordingly, in step S142, the field of view of the image end instrument is restored directly to the initial field of view according to those recorded camera parameters and pose. That is, referring to the instruction parameters in the table above, the instruction parameter C0 corresponding to time T0 can be executed directly to restore the field of view of the image end instrument to the initial field of view. This approach is simpler and faster.
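The two restoration strategies can be sketched as follows; `camera.apply` is a hypothetical interface that drives the image end instrument to a recorded set of instruction parameters.

```python
def restore_initial_view_stepwise(camera, recorded_params):
    """Replay the recorded instruction parameters Cn, ..., C0 in reverse so the
    field of view retraces its path back to the initial field of view."""
    for params in reversed(recorded_params):   # Cn, Cn-1, ..., C0
        camera.apply(params)

def restore_initial_view_direct(camera, recorded_params):
    """Jump straight back to the initial field of view by applying C0 only."""
    camera.apply(recorded_params[0])
```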
In some embodiments, as shown in fig. 25, the method of the present invention may further comprise:
In step S151, it is determined whether the operation end instrument is positioned within the field of view of the image end instrument.
In step S152, when the operation end instrument is not located within the field of view of the image end instrument, the operation end instrument is prohibited from moving.
Through the above steps S151 to S152, the operation end instrument can be prevented from moving out of the field of view of the image end instrument, thereby further ensuring safety.
In an embodiment, before the step S151, that is, before the step of determining whether the operation end instrument is located within the field of view of the image end instrument, the method further includes:
In step S150, it is detected whether a start instruction has been acquired.
When the start instruction is acquired in step S150, the process proceeds to step S151, in which it is determined whether the operation end instrument is positioned within the field of view of the image end instrument.
The start instruction may be triggered, for example, when the surgical arm is mounted on the power mechanism, or by a confirmation instruction entered by the physician after the surgical arm is mounted on the power mechanism. For example, a sensor such as a distance sensor is installed on the power mechanism; when the surgical arm is mounted on the power mechanism, the sensor detects the distance between the surgical arm and the power mechanism, and when the controller judges that this distance is less than or equal to a preset value, generation of the start instruction is triggered.
In some embodiments, the method of the present invention may further comprise: acquiring the reachable region of the field of view of the image end instrument, the reachable region being the spatial union of all attainable fields of view; acquiring the current position of the operation end instrument; judging whether the current position of the operation end instrument lies within the reachable region of the field of view of the image end instrument; and, when it does, proceeding to the step of adjusting the field of view of the image end instrument to move toward the target position. When the current position of the operation end instrument lies within the reachable region, it can be ensured that the corresponding operation can be completed within the field of view. The situation in which the current position of the operation end instrument lies outside the reachable region of the field of view can occur in a multi-port surgical robot; in that case the physician may be prompted to adjust the remote center of motion of the surgical arm, so that after the adjustment the current position of the operation end instrument lies within the reachable region of the field of view of the image end instrument.
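A rough sketch of this reachable-region gate is given below, with the reachable region approximated as a union of axis-aligned boxes; the representation is an assumption for illustration, since the real region depends on the camera arm's kinematics.

```python
import numpy as np

def within_reachable_field(current_pos, reachable_boxes):
    """True if the operation end instrument's current position lies inside at
    least one axis-aligned box approximating the union of all attainable
    fields of view of the image end instrument."""
    p = np.asarray(current_pos, dtype=float)
    for lower, upper in reachable_boxes:
        if np.all(p >= np.asarray(lower, dtype=float)) and np.all(p <= np.asarray(upper, dtype=float)):
            return True
    return False
```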
In some embodiments, the operation end instrument moves synchronously with the movement of the field of view of the image end instrument, i.e., the two movements occur simultaneously, under the condition that the operation end instrument is always kept within the field of view of the image end instrument. This is not shown in the figures.
In some embodiments, the movement of the operation end instrument lags behind the movement of the field of view of the image end instrument, under the condition that the operation end instrument is always kept within the field of view of the image end instrument, as shown in figs. 15, 17 and 18, or in figs. 15, 19 and 20.
In some embodiments, the method of the present invention may further comprise: when the operation end instrument moves, controlling it, or in other words constraining it, to move substantially linearly from its current position toward the target position. Such a constraint is particularly suitable for inserting the surgical arm during surgical preparation, since it is closer to the existing manner of operation.
In some embodiments, the method of the present invention may further comprise: calculating the deviation angle between the moving direction of the operation end instrument and the line connecting its current position and the target position; and, when the deviation angle reaches a deviation threshold, generating at least a resistance force that hinders movement of the operation end instrument in the deviating direction. The connecting line is, for example, the straight line between the current position and the target position.
In one embodiment, the magnitude of the resistance is positively correlated with the magnitude of the deviation angle, for example linearly or in a stepped manner.
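The deviation-angle resistance might be computed as in the following sketch; the threshold and gain values are illustrative assumptions, and the linear profile corresponds to the linear positive correlation mentioned above.

```python
import numpy as np

def resistance_force(move_dir, current_pos, target_pos,
                     deviation_threshold_deg=15.0, gain=0.5):
    """Scalar resistance magnitude that grows linearly with the angle between
    the instrument's moving direction and the line from its current position
    to the target position, once that angle exceeds the threshold."""
    move_dir = np.asarray(move_dir, dtype=float)
    line = np.asarray(target_pos, dtype=float) - np.asarray(current_pos, dtype=float)
    cos_a = np.dot(move_dir, line) / (np.linalg.norm(move_dir) * np.linalg.norm(line) + 1e-12)
    angle_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    if angle_deg < deviation_threshold_deg:
        return 0.0
    return gain * (angle_deg - deviation_threshold_deg)   # linear positive correlation
```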
In one embodiment, a computer readable storage medium is provided. The computer readable storage medium stores a computer program configured to be loaded by a processor and executed to implement the following steps: acquiring a target position that the operation end instrument is expected to reach; and, when the operation end instrument is within the field of view of the image end instrument, adjusting the field of view of the image end instrument to move toward the target position while ensuring that the operation end instrument is always within the field of view of the image end instrument.
In one embodiment, a control device for a surgical robot is provided. As shown in fig. 26, the control device may include: a processor (processor)501, a Communications Interface (Communications Interface)502, a memory (memory)503, and a Communications bus 504.
The processor 501, the communication interface 502, and the memory 503 communicate with each other via a communication bus 504.
The communication interface 502 is used for communicating with other devices, such as various sensors, motors, solenoid valves, other clients, or network elements of a server.
The processor 501 is configured to execute the program 505, and may specifically perform relevant steps in the foregoing method embodiments.
In particular, program 505 may include program code comprising computer operating instructions.
The processor 501 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), one or more Integrated Circuits (ICs) configured to implement embodiments of the present invention, or a Graphics Processing Unit (GPU). The control device comprises one or more processors, which may be of the same type, such as one or more CPUs or one or more GPUs, or of different types, such as one or more CPUs together with one or more GPUs.
The memory 503 stores a program 505. The memory 503 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 505 may specifically be configured to cause the processor 501 to perform the following operations: acquiring a target position that the operation end instrument is expected to reach; and, when the operation end instrument is within the field of view of the image end instrument, adjusting the field of view of the image end instrument to move toward the target position while ensuring that the operation end instrument is always within the field of view of the image end instrument.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any such combination should be regarded as falling within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art may make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of guiding movement of a surgical arm in a surgical robot, the surgical robot having a plurality of manipulator arms including a camera arm having an image end instrument and a surgical arm having an operation end instrument, the method comprising the steps of:
acquiring a target position that the operation end instrument is expected to reach;
adjusting the field of view of the image end instrument to move toward the target position, and ensuring that the operation end instrument is always within the field of view of the image end instrument.
2. The method of claim 1, wherein, before the step of adjusting the field of view of the image end instrument to move toward the target position and ensuring that the operation end instrument is always within the field of view of the image end instrument, the method further comprises:
determining whether the operation end instrument is within the field of view of the image end instrument;
when the operation end instrument is not within the field of view of the image end instrument, adjusting the field of view of the image end instrument so that the operation end instrument is within the field of view of the image end instrument;
and when the operation end instrument is within the field of view of the image end instrument, adjusting the field of view of the image end instrument to move toward the target position and ensuring that the operation end instrument is always within the field of view of the image end instrument.
3. The method of claim 2, wherein determining whether the operation end instrument is within the field of view of the image end instrument comprises:
acquiring an operation image within the field of view of the image end instrument;
and determining whether the operation end instrument is within the field of view of the image end instrument by using image recognition to determine whether the operation end instrument appears in the operation image.
4. The method of claim 2, wherein determining whether the operation end instrument is within the field of view of the image end instrument comprises:
acquiring the current position of the operation end instrument;
converting the field of view of the image end instrument into a position range;
and determining whether the operation end instrument is within the field of view of the image end instrument by determining whether the current position is within the position range.
5. The method of claim 2, wherein adjusting the field of view of the image end instrument so that the operation end instrument is within the field of view of the image end instrument comprises:
acquiring the current position of the operation end instrument;
and adjusting the field of view of the image end instrument, by changing camera parameters of the image end instrument according to the current position of the operation end instrument, so that the operation end instrument is within the field of view of the image end instrument, the camera parameters comprising a field angle and/or a depth of field.
6. The method of claim 2, wherein adjusting the field of view of the image end instrument so that the operation end instrument is within the field of view of the image end instrument comprises:
acquiring the current position of the operation end instrument;
and adjusting the field of view of the image end instrument, by changing the pose of the image end instrument according to the current position of the operation end instrument, so that the operation end instrument is within the field of view of the image end instrument, the pose comprising a position and/or a posture.
7. The method of claim 1, wherein adjusting the field of view of the image end instrument to move toward the target position comprises:
acquiring the current position of the operation end instrument;
determining an adjustment direction of the field of view of the image end instrument according to the current position and the target position of the operation end instrument;
and incrementally adjusting the field of view of the image end instrument to move toward the target position of the operation end instrument according to the adjustment direction.
8. A computer-readable storage medium, characterized in that it stores a computer program configured to be loaded by a processor and executed to implement the steps of the method according to any one of claims 1 to 7.
9. A control device for a surgical robot, comprising:
a memory for storing a computer program;
and a processor for loading and executing the computer program;
wherein the computer program is configured to be loaded by the processor and executed to implement the steps of the method according to any one of claims 1 to 7.
10. A surgical robot, comprising:
a plurality of manipulator arms including a camera arm having an image end instrument and a surgical arm having an operation end instrument;
and a controller coupled to the manipulator arms and configured to perform the steps of the method according to any one of claims 1 to 7.
CN202110011217.2A 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move Pending CN112618029A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202210284003.7A CN114652449A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move
CN202110011217.2A CN112618029A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move
PCT/CN2021/092697 WO2022147935A1 (en) 2021-01-06 2021-05-10 Surgical robot, method for same to guide movement of surgical arm, and control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110011217.2A CN112618029A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210284003.7A Division CN114652449A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move

Publications (1)

Publication Number Publication Date
CN112618029A true CN112618029A (en) 2021-04-09

Family

ID=75290832

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110011217.2A Pending CN112618029A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move
CN202210284003.7A Pending CN114652449A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210284003.7A Pending CN114652449A (en) 2021-01-06 2021-01-06 Surgical robot and method and control device for guiding surgical arm to move

Country Status (2)

Country Link
CN (2) CN112618029A (en)
WO (1) WO2022147935A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113334391A (en) * 2021-08-06 2021-09-03 成都博恩思医学机器人有限公司 Method and system for controlling position of mechanical arm, robot and storage medium
WO2022147935A1 (en) * 2021-01-06 2022-07-14 深圳市精锋医疗科技有限公司 Surgical robot, method for same to guide movement of surgical arm, and control device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115741780B (en) * 2022-11-29 2024-06-11 中国电子科技集团公司第四十四研究所 Multi-axis mechanical arm device capable of operating pulling mechanism

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105188592A (en) * 2013-03-15 2015-12-23 Sri国际公司 Hyperdexterous surgical system
CN106859742A (en) * 2017-03-21 2017-06-20 北京阳光易帮医疗科技有限公司 A kind of puncturing operation navigation positioning system and method
CN109195544A (en) * 2016-07-14 2019-01-11 直观外科手术操作公司 Secondary instrument control in computer-assisted remote operating system
CN109330699A (en) * 2018-07-31 2019-02-15 深圳市精锋医疗科技有限公司 Mechanical arm, from operation apparatus assembly and operating robot
CN109330685A (en) * 2018-10-31 2019-02-15 南京航空航天大学 A kind of porous abdominal operation robot laparoscope automatic navigation method
CN110464471A (en) * 2019-09-10 2019-11-19 深圳市精锋医疗科技有限公司 The control method of operating robot and its end instrument, control device
CN110893118A (en) * 2018-09-12 2020-03-20 微创(上海)医疗机器人有限公司 Surgical robot system and method for controlling movement of robot arm
US20200337777A1 (en) * 2018-01-11 2020-10-29 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for surgical route planning
CN111867511A (en) * 2018-01-17 2020-10-30 奥瑞斯健康公司 Surgical robotic system with improved robotic arm
CN111991085A (en) * 2020-10-08 2020-11-27 深圳市精锋医疗科技有限公司 Surgical robot, graphical control device thereof and graphical display method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10025285A1 (en) * 2000-05-22 2001-12-06 Siemens Ag Fully automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions
US8620473B2 (en) * 2007-06-13 2013-12-31 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
DE102011005917A1 (en) * 2011-03-22 2012-09-27 Kuka Laboratories Gmbh Medical workplace
DE102014226240A1 (en) * 2014-12-17 2016-06-23 Kuka Roboter Gmbh System for robot-assisted medical treatment
JP7197511B2 (en) * 2017-05-25 2022-12-27 コヴィディエン リミテッド パートナーシップ Systems and methods for detecting objects within the field of view of an image capture device
EP3678581A4 (en) * 2017-09-05 2021-05-26 Covidien LP Robotic surgical systems and methods and computer-readable media for controlling them
GB2608751B (en) * 2018-10-03 2023-06-14 Cmr Surgical Ltd Methods and systems for providing assistance to a user of a surgical robot system
CN112618029A (en) * 2021-01-06 2021-04-09 深圳市精锋医疗科技有限公司 Surgical robot and method and control device for guiding surgical arm to move
CN114795491A (en) * 2021-01-06 2022-07-29 深圳市精锋医疗科技股份有限公司 Surgical robot and method and control device for guiding surgical arm to move


Also Published As

Publication number Publication date
WO2022147935A1 (en) 2022-07-14
CN114652449A (en) 2022-06-24


Legal Events

Code  Title / Description
PB01  Publication
SE01  Entry into force of request for substantive examination
CB02  Change of applicant information
      Address after: 518000 2b1901, phase II, smart home, 76 Baohe Avenue, Baolong community, Baolong street, Longgang District, Shenzhen City, Guangdong Province
      Applicant after: Shenzhen Jingfeng Medical Technology Co.,Ltd.
      Address before: 518000 2b1901, phase II, smart home, 76 Baohe Avenue, Baolong community, Baolong street, Longgang District, Shenzhen City, Guangdong Province
      Applicant before: SHENZHEN JINGFENG MEDICAL TECHNOLOGY Co.,Ltd.
RJ01  Rejection of invention patent application after publication (application publication date: 20210409)